Nvidia and AMD still dominate AI chips. Public information does not clearly show an AI hardware startup named Amp that could challenge them.
In short: a handful of big companies still control AI chip making, and it is unclear whether “Amp” is actually an AI hardware challenger.
Nvidia and AMD control most of the chips used to build and run today’s AI systems. These chips are called GPUs, which are like very fast calculators that can do lots of small math problems at the same time. Those calculations are needed for both training an AI model (teaching it) and inference (using it to answer questions).
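The “lots of small math problems at the same time” above can be made concrete with a toy sketch (illustrative plain Python, not actual GPU code): the core work of an AI model is matrix multiplication, where every output cell is an independent multiply-and-sum, which is exactly what a GPU computes in parallel.

```python
def matmul(a, b):
    """Multiply two matrices given as lists of rows (toy CPU version)."""
    rows, inner, cols = len(a), len(b), len(b[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            # Each (i, j) cell depends only on row i of a and column j of b,
            # so a GPU can compute all cells simultaneously; this loop
            # does them one at a time, which is why CPUs fall behind at AI scale.
            out[i][j] = sum(a[i][k] * b[k][j] for k in range(inner))
    return out

weights = [[1.0, 2.0], [3.0, 4.0]]   # hypothetical model weights
inputs  = [[1.0, 0.0], [0.0, 1.0]]   # identity matrix as demo input
print(matmul(weights, inputs))        # -> [[1.0, 2.0], [3.0, 4.0]]
```

Training repeats this kind of multiplication billions of times while adjusting the weights; inference runs it once per answer, which is why both depend on the same chips.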
As of 2026, Nvidia is widely estimated to hold about 80 to 90 percent of the market for AI accelerator chips. AMD has been gaining ground with its MI300 series, which targets large AI training workloads. Google and Amazon are also significant players: each builds its own custom chips (Google’s TPUs; Amazon’s Trainium and Inferentia) for use inside its cloud services.
Against this backdrop, a New York Times story says “Amp hopes to create an alternative” to the giants. But based on publicly available information as of May 2026, there is no clear evidence of a company named Amp building AI compute hardware that competes with Nvidia or AMD.
Instead, “Amp” shows up in unrelated areas. There is Amp from Sourcegraph, an AI coding assistant that edits software. There are also guitar amp simulator products and an older marketing-testing product called Amp.ai. None of these are AI chip makers.
If you are looking for real alternatives to Nvidia, watch AMD’s next MI-series chips and specialized chips built to run AI models quickly, such as Groq’s inference hardware. Also watch how hard Google and Amazon push their in-house chips, since they can deploy them at scale inside their own data centers.
Source: NYTimes