‘Why Google Still Needs Nvidia; The AWS-OpenAI Marriage That Could Have Been’ – The Information

https://www.theinformation.com/articles/why-google-still-needs-nvidia-the-aws-openai-marriage-that-could-have-been

Google unveiled its next-generation artificial intelligence chip at its annual enterprise software conference, Google Cloud Next, on Tuesday. But it still felt the need to tout a different kind of win: a partnership with Nvidia to offer that company’s state-of-the-art AI chips through Google Cloud alongside its own hardware. Nvidia CEO Jensen Huang even appeared on stage at the conference—wearing his signature leather jacket of course—to field questions from Google Cloud CEO Thomas Kurian about how the company’s chips would benefit Google’s customers.

The dynamic reflects a cold reality for Google: many AI engineers prefer Nvidia's hardware, which is in short supply. But beggars can't be choosers. Given the industry-wide shortage of such specialized hardware, those same developers will take computing capacity wherever they can get it, according to several attendees. “There’s never enough compute, and it’s never at a low enough cost,” said Saurabh Baji, senior vice president of engineering at AI startup Cohere, at a panel discussion on Tuesday.

That has Google playing a balancing act. It needs Nvidia’s hardware to draw customers to its platform, but it also wants to drive usage of its Tensor Processing Units, the specialized chips in which it has invested heavily in recent years. Google is taking steps to make it easier for developers to use both: its software for building large machine-learning models will now work on Nvidia hardware in addition to Google’s own chips, the company announced Tuesday.
