{"id":341,"date":"2024-03-07T11:00:00","date_gmt":"2024-03-07T12:00:00","guid":{"rendered":"http:\/\/www.washnow.me\/?p=341"},"modified":"2024-03-08T14:21:26","modified_gmt":"2024-03-08T14:21:26","slug":"how-nvidia-beat-everyone-else-in-the-ai-race","status":"publish","type":"post","link":"http:\/\/www.washnow.me\/index.php\/2024\/03\/07\/how-nvidia-beat-everyone-else-in-the-ai-race\/","title":{"rendered":"How Nvidia beat everyone else in the AI race"},"content":{"rendered":"
[Image caption: Nvidia, founded in California in 1993, originally made chips mainly used for gaming. | Jakub Porzycki/NurPhoto via Getty Images]

The company you might not have heard of is now worth $2 trillion, more than Google or Amazon.

Only four companies in the world are worth over $2 trillion: Apple, Microsoft, the oil company Saudi Aramco, and, as of 2024, Nvidia. It's understandable if the name doesn't ring a bell. The company doesn't exactly make a shiny product attached to your hand all day, every day, as Apple does. Nvidia designs a chip hidden deep inside the complicated innards of a computer, a seemingly niche product that more of the world relies on every day.

Rewind the clock to 2019, and Nvidia's market value was hovering around $100 billion. Its incredible speedrun to 20 times that already enviable size was enabled by one thing: the AI craze. Nvidia is arguably the biggest winner in the AI industry. ChatGPT-maker OpenAI, which catapulted this obsession into the mainstream, is currently worth around $80 billion, and according to market research firm Grand View Research, the entire global AI market was worth a bit under $200 billion in 2023. Both are just a paltry fraction of Nvidia's value. With all eyes on the company's jaw-dropping evolution, the real question now is whether Nvidia can hold on to its lofty perch. First, though, here's how the company got to this level.

From games to crypto mining to AI

In 1993, long before uncanny AI-generated art and amusing AI chatbot convos took over our social media feeds, three Silicon Valley electrical engineers launched a startup that would focus on an exciting, fast-growing segment of personal computing: video games.

Nvidia was founded to design a specific kind of chip called a graphics card, also commonly called a GPU (graphics processing unit), that enables the output of fancy 3D visuals on the computer screen. The better the graphics card, the more quickly high-quality visuals can be rendered, which is important for things like playing games and video editing. In the prospectus filed ahead of its initial public offering in 1999, Nvidia noted that its future success would depend on the continued growth of computer applications relying on 3D graphics. For most of Nvidia's existence, game graphics were its raison d'être.

Ben Bajarin, CEO and principal analyst at the tech industry research firm Creative Strategies, acknowledged that Nvidia had been "relatively isolated to a niche part of computing in the market" until recently.

Nvidia became a powerhouse selling cards for video games, now an entertainment industry juggernaut that made over $180 billion in revenue last year, but it realized it would be smart to branch out from just making graphics cards for games. Not all its experiments panned out. Over a decade ago, Nvidia made a failed gambit to become a major player in the mobile chip market; today Android phones use a range of non-Nvidia chips, while iPhones use Apple-designed ones.

Another play, though, not only paid off, it became the reason we're all talking about Nvidia today. In 2006, the company released CUDA, a parallel computing platform and programming model that, in short, unleashed the power of its graphics cards for more general computing. Its chips could now do a lot of heavy lifting for tasks unrelated to pumping out pretty game graphics, and it turned out that graphics cards could run huge numbers of calculations in parallel far better than the CPU (central processing unit), the chip often called the central "brain" of a computer. This made Nvidia's GPUs great for calculation-heavy tasks like machine learning (and crypto mining). 2006 was the same year Amazon launched its cloud computing business; Nvidia's push into general computing came just as massive data centers were popping up around the world.
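To make that GPU-versus-CPU point concrete, here is a minimal, illustrative CUDA sketch, not taken from Nvidia's documentation or any product code: a kernel that adds two arrays, where the work is spread across thousands of lightweight GPU threads instead of looping on a single CPU core. The function name, array sizes, and launch configuration are all hypothetical choices for the example.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles one element, so the "loop" runs in parallel
// across thousands of threads instead of sequentially on one CPU core.
__global__ void addArrays(const float *a, const float *b, float *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        out[i] = a[i] + b[i];
    }
}

int main() {
    const int n = 1 << 20;                 // ~1 million elements (illustrative size)
    const size_t bytes = n * sizeof(float);

    // Allocate and fill host (CPU) memory.
    float *hA = new float[n], *hB = new float[n], *hOut = new float[n];
    for (int i = 0; i < n; ++i) { hA[i] = 1.0f; hB[i] = 2.0f; }

    // Allocate device (GPU) memory and copy the inputs over.
    float *dA, *dB, *dOut;
    cudaMalloc(&dA, bytes);
    cudaMalloc(&dB, bytes);
    cudaMalloc(&dOut, bytes);
    cudaMemcpy(dA, hA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    addArrays<<<blocks, threads>>>(dA, dB, dOut, n);
    cudaDeviceSynchronize();

    // Copy the result back and spot-check it.
    cudaMemcpy(hOut, dOut, bytes, cudaMemcpyDeviceToHost);
    printf("out[0] = %f\n", hOut[0]);      // expect 3.0

    cudaFree(dA); cudaFree(dB); cudaFree(dOut);
    delete[] hA; delete[] hB; delete[] hOut;
    return 0;
}
```

The same pattern, many small independent arithmetic operations done at once, is why GPUs turned out to be such a natural fit for the matrix math behind machine learning and for crypto mining.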
