Google Training Gemini On Its Own Chips Reveals Another Of Its Advantages

Google on Wednesday unveiled its highly anticipated new artificial intelligence model Gemini, an impressive piece of software that can solve math problems, understand images and audio, and mimic human reasoning. But Gemini also reveals Google’s unique advantage over other AI players: Google trained it on its own chips, designed in-house, not the highly coveted GPUs the rest of the industry is scrambling to stockpile.

As the AI arms race has heated up, GPUs, or graphics processing units, have become a powerful currency in Silicon Valley. The scrum has turned Nvidia, a company founded 30 years ago and primarily known for gaming, into a trillion-dollar behemoth. The White House has clamped down on chip exports to China in an attempt to keep the AI prowess of a foreign adversary at bay.

But analysts say the fact that Google DeepMind, the tech giant’s AI lab, trained its marquee AI model on custom silicon highlights a major advantage large companies hold over upstarts, in an age when giants like Google and Microsoft are already under intense scrutiny for their market dominance.

Google’s compute hardware is so effective it was able to produce the industry’s most cutting-edge model, apparently one-upping OpenAI’s ChatGPT, which was largely built using Nvidia GPUs. Google claims that Gemini outperforms OpenAI’s latest model, GPT-4, in several key areas, including language understanding and the ability to generate code. Google said its TPUs allow Gemini to run “significantly faster” than earlier, less-capable models.

“If Google is delivering a GPT-4 beating model trained and run on custom silicon, we believe this could be a sign that AI tech stacks vertically integrated from silicon to software are indeed the future,” Fred Havemeyer, head of U.S. AI research at the financial services firm Macquarie, wrote in a note to clients. Havemeyer added, however, that Google is uniquely positioned to make use of custom chips like few others can, flexing its “scale, budget, and expertise.”

“Google showed that it’s at least possible,” Havemeyer told Forbes. “We think that’s really interesting because right now the market has been really constrained” by access to GPUs.

Big tech companies have been developing their own silicon for years, hoping to wean themselves off dependence on the chip giants. Google has spent nearly a decade developing its own AI chips, called Tensor Processing Units, or TPUs. Aside from helping to train Gemini, the company has used them to help “read” the names on signs captured by its roving Street View cameras and to develop protein-folding health tech for drug discovery. Amazon has also launched its own AI accelerator chips, called Trainium and Inferentia, and Facebook parent Meta announced its own chip, MTIA, earlier this year. Microsoft is reportedly working on custom silicon as well, code-named Athena. Apple, which has long designed its own silicon, unveiled a new chip earlier this year called R1, which powers the company’s Vision Pro headset.

Lisa Su, CEO of the chip giant AMD, which has a smaller share of the GPU market, has shrugged off concerns that big tech customers could someday be competitors. “It’s natural,” she told Forbes earlier this year. She said it makes sense for companies to want to build their own components as they look for efficiencies in their operations, but she was doubtful big tech companies could match AMD’s expertise built up over decades. “I think it’s unlikely that any of our customers are going to replicate that entire ecosystem.”

Google’s new model has the potential to shake up the AI landscape. The company is releasing three versions of Gemini with varying levels of sophistication. The most powerful version, Gemini Ultra, a model that can analyze text and images, will be released early next year. The smallest version, Gemini Nano, will be used to power features on Google’s flagship Pixel 8 Pro smartphone. The mid-level version, Gemini Pro, is now being used to power Bard, the generative chatbot the company launched earlier this year. The bot initially garnered a lukewarm reception, generating an incorrect answer in a promo video and wiping out $100 billion of Google parent Alphabet’s market value. Gemini could be Google’s best shot at overtaking OpenAI, after the startup’s bout of instability last month, when CEO Sam Altman was ousted and reinstated in a matter of days.

Google also used the Gemini announcement to unveil the newest version of its custom chips, the TPU v5p, which Google will make available to outside developers and companies to train their own AI. “This next generation TPU will accelerate Gemini’s development and help developers and enterprise customers train large-scale generative AI models faster, allowing new products and capabilities to reach customers sooner,” Google CEO Sundar Pichai and DeepMind cofounder Demis Hassabis said in a blog post.

Gemini is the outcome of a massive push inside Google to ship AI products faster. Last November, the company was caught flat-footed when OpenAI released ChatGPT, a surprise hit that captured the public’s imagination. The frenzy triggered a “code red” inside Google and prompted cofounder Sergey Brin, long absent after leaving his day-to-day role at the company in 2019, to begin coding again. In April, the company merged its two research labs, Google Brain and DeepMind, which had famously operated separately, in an attempt to accelerate product development.

“These are the first models of the Gemini era and the first realization of the vision we had when we formed Google DeepMind earlier this year,” Pichai said. “This new era of models represents one of the biggest science and engineering efforts we’ve undertaken as a company.”


