Meta will use millions of Amazon’s Graviton chips on AWS to run AI systems after training, a shift from the GPU-heavy approach used to build models.
In short: Meta has agreed to use millions of Amazon’s Graviton computer chips through AWS to run more of its AI work.
Amazon said Meta signed a deal to use millions of AWS Graviton chips. Graviton is a CPU, or central processing unit, which is the general-purpose “main brain” chip inside many computers.
This is notable because much of today’s AI boom has centered on GPUs, chips that are often used to train large AI models. Training is like teaching the model by showing it huge amounts of data. After that, companies still need lots of computing power to actually use the model, for example to answer questions, write code, or handle multi-step tasks.
Amazon says Graviton is designed to handle more of this “after training” work, including AI agents. AI agents are systems that can take actions across steps, a bit like a digital assistant that not only answers, but also plans and carries out tasks.
The deal also highlights competition between the big cloud companies. Meta previously signed a six-year deal with Google Cloud, reported to be worth over $10 billion, even though it has long used AWS and also uses Microsoft Azure. Amazon also makes its own AI-focused chip, Trainium, but TechCrunch noted that Anthropic has already agreed to spend heavily on AWS with a focus on Trainium.
For regular people, this is about where AI services get their computing power and how much it costs. If companies can run AI more cheaply and reliably using different chips, it can affect the price, speed, and availability of the AI features that show up in everyday apps.
Source: TechCrunch AI