Blog — Jan. 27, 2026
By Neil Barbour
Robots took a decidedly unhurried step forward in the physical AI era at CES from Jan. 6–9, 2026, as over 140,000 people poured into Las Vegas to bear witness to an emerging class of automatons folding laundry, carrying free weights and playing ping pong.
There are two key dynamics pushing functional humanoid robots closer to reality. The first is perhaps something everyone saw coming: As AI has become more capable over the past half decade, the industry has naturally wondered what all that advanced reasoning would look like in the real world. Having a computer write a poem is great and all, but wouldn’t a robot doing chores be more useful?
That's where the term physical AI comes into play: bringing AI into the physical world in the form of machinery. The same large-model approach that helps something like ChatGPT string words together coherently can be used to help a robot figure out its next step, or a driverless car its next turn.
The issue, so far at least, is that the robots are a little sluggish. Watching LG Corp.'s CLOiD try to do laundry was an exercise in extreme patience as the robot painstakingly put a single towel in a washing machine, taking a little over 30 seconds to do so.
They’ll have to speed that up a bit if they wish to deliver a more practical humanoid helper agent.
Perhaps that's where NVIDIA Corp. comes in. We were invited to a media Q&A with President and CEO Jensen Huang, during which he highlighted the potential of physical AI and the work his company is doing behind the scenes with its Cosmos Reason 2 visual language model.
The second dynamic advancing practical robotics revolves around good old-fashioned engineering. As Elon Musk famously said of his own robotics ambitions, "Hands are hard," reflecting how difficult it was to build a robot that could actually pick something up. The current wave of robots can move, lift and carry things in dynamic environments in ways that their predecessors could not.
For instance, WIRobotics' Allex is a robot capable of lifting 30 kilograms with hands that can also high-five a human and make heart hands. This functionality is the result of a complex array of sensors that can feel up to 100 grams of fingertip force, as well as 20 joints with 15 degrees of freedom. There are also advancements in balancing and "backdrivability" in the arms, waist and neck that allow Allex to be pliable, nimble and responsive.
AgiBot Innovation (Shanghai) Technology Co. Ltd. offered a similar demo at NVIDIA's booth, and we talked to smaller startups such as Matrix and Ensuring Operational Safety Ltd., which are working specifically on robotic hands.
But will any of this actually make it to market? Most of the teams we talked to said their robots would be working their way into research or pilot programs over the next few years, with few details on timing and even fewer on pricing.
That means humanoid robots could, in the not-too-distant future, start slotting into warehouse and manufacturing settings, but it seems far less likely that the average consumer will soon have access to another set of hands to take on the drudgery of daily chores.
Instead, consumers will have to make do with pricey robotic pets, which we saw plenty of on the floor.