A Wired report shows robots getting better at handling objects, but explains why real-world “physical smarts” are still hard to achieve.
In short: Robots can now do more human-like actions, but making them reliably handle everyday objects is still a major challenge.
Wired describes a new wave of robots that look surprisingly careful when using their “hands”, such as a claw with two pincers. In one example, a robot reaches for a light bulb, slows down before touching it, gently grabs it, chases it when it rolls away, and screws it into a socket.
The point is not just that robots can move; it is that they are starting to adjust as they go. That looks a bit like the moment chatbots got noticeably better at conversation, when systems like ChatGPT made text responses feel more natural to many people.
But Wired also raises a basic question: do these robots actually understand the physical world, or are they getting by on narrow skills that only work in certain settings? Handling objects sounds simple, but the real world is messy: a light bulb can slip, a chicken nugget can be greasy, and a table can be cluttered. It is like the difference between a person who can cook in any kitchen and a person who can only follow one recipe on one stove.
Watch for how well these robots perform outside of demos, especially in new places with new objects and unexpected problems. If robots can handle more variety without lots of setup, they could become more useful in warehouses, restaurants, and even homes. If not, they may stay limited to tightly controlled jobs where everything is positioned just right.
Source: Wired