Google Cloud CEO Thomas Kurian says new AI chips and Gemini models can lower costs and help Google catch up in cloud computing.
In short: Google says its in-house AI chips and Gemini models will help Google Cloud compete more closely with Amazon and Microsoft.
Google Cloud CEO Thomas Kurian said Google is leaning on its in-house AI work to gain ground in the cloud market, where Amazon Web Services and Microsoft Azure are bigger players. Cloud computing is basically renting computing power and storage from large data centers instead of running your own servers (like renting a warehouse instead of building one).
Google announced two new versions of its TPU chips this week. TPU stands for Tensor Processing Unit, a special computer chip made to handle AI work. One new chip is aimed at training AI models, the long, expensive process of teaching an AI system using lots of data. The other is built for inference, the process of running a trained model to produce answers quickly (like using a trained brain to answer questions).
Kurian argued that owning the chips and the AI models, like Google’s Gemini models, means Google does not have to buy as much from outside suppliers such as Nvidia. He also said this can improve profits, since less revenue goes to chip and model partners.
The Financial Times report also notes growing tension with Nvidia. Nvidia CEO Jensen Huang has questioned Google’s performance claims and said Google has not put its chips through independent tests. Kurian responded that many AI labs use TPUs, and that customers can choose what hardware to buy.
For everyday people, this matters because the cloud runs many of the apps and online services people use daily. If Google can offer AI computing at a lower cost, businesses may be able to build and run AI features more cheaply, which can affect prices, speed, and how widely these tools show up in products.
Source: Financial Times