At a Milken conference panel, executives said AI demand is hitting limits in chip supply, energy use, and real-world data for robotics and vehicles.
In short: Several AI and tech leaders said the AI boom is running into real-world limits, including not enough chips, not enough power, and not enough useful data.
Five people from different parts of the AI industry spoke with TechCrunch at the Milken Global Conference in Beverly Hills. They included the CEO of ASML, which makes key machines used to manufacture advanced computer chips, and leaders from Google Cloud, Perplexity, Applied Intuition, and a startup called Logical Intelligence.
One message was that demand is outpacing supply. ASML CEO Christophe Fouquet said chip manufacturing is speeding up, but the market could still be “supply limited” for the next two to five years. In plain terms, even big buyers may not be able to get all the chips they want.
Google Cloud COO Francis deSouza said the power needed to run AI systems is also becoming a constraint. He confirmed Google is exploring data centers in space as one possible response, mainly because of access to energy. He also noted that cooling computers in space is harder because there is no air to carry heat away.
Applied Intuition CEO Qasar Younis said his bottleneck is not chips but real-world data for machines like cars, drones, and other equipment. He said you still need to send systems into the real world to learn from what happens, and simulation alone does not replace that.
The panel also highlighted two other pressure points. One is trust and control as AI “agents” (software that can take steps on your behalf, like a junior assistant) are used inside companies. Another is whether new approaches, like Logical Intelligence’s “energy-based models” (a different kind of AI meant to learn rules, not just predict the next word), can reduce the need for massive computing power.
Source: TechCrunch AI