ARTICLE | December 9, 2025

The true transformers

Digital intelligence comes alive when it enters the real world

By Eugene Chuvyrov, Innovation Engineering; Nick Diaz, Innovation Engineering; and Ian Krieger, APJ Innovation Officer


Imagine an offshore wind farm where a damaged turbine triggers a systemic outage. Today, repairing it would involve assembling a human crew, arranging transportation, and hoping things don’t get worse before the workers arrive to assess the problem and fix the turbine.

Soon, however, a different scenario may play out. Even before the turbine fails, a digital agent detects an anomaly in performance and activates a field service drone. The AI-powered robot autonomously identifies, assesses, and repairs the issue, ensuring the turbine operates normally without downtime. Best of all, the system learns from the experience to prevent similar problems in the future.

Such a reality may be possible sooner than you think, thanks to physical AI, which combines AI's reasoning capabilities with a robot’s ability to act in the world. Generative AI has transformed how we process information and make decisions, and physical AI will extend those capabilities into factories, hospitals, farms, power plants, and more. 

When digital intelligence comes alive in the physical world, the question won’t be whether it will reshape operations. Rather, it will be whether you and your organization are ready to capitalize on it. Translating these technological advances into business gains requires the ability to orchestrate digital and physical workflows as a unified system. Without that integration, even the most sophisticated robots will remain isolated tools, impressive but expensive and limited. 

For decades, robots excelled at repetition but struggled with adaptation. A robot trained to pick and sort apples in a well-lit warehouse might fail at the same task outdoors on an overcast day, for example. The problem wasn't with the hardware, but with the intelligence layer behind it. Traditional robotics required programmers to anticipate every possible scenario and code explicit instructions for how the bot should respond to each one, a process that couldn't scale.

Two AI-fueled advances are quickly changing this. The first is vision-language-action models, which teach robots to see, understand, and act in the moment. These systems process visual information, interpret natural language, and translate both into physical actions. When you tell a robot in plain language to pick up an apple, it can now understand the command, recognize an apple and its location in three-dimensional space, and figure out how to grasp it—all without having seen that exact apple before. Robots using these models can respond to commands, such as "Lift legs higher when climbing stairs," and immediately adjust their behavior. No code rewrites. No downtime.
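To make the idea concrete, here is a minimal, illustrative Python sketch of a vision-language-action control cycle, assuming a toy setup: the names (VLAPolicy, Action) and the stubbed decision logic are hypothetical placeholders, not any vendor’s actual model or API.

    # Illustrative vision-language-action (VLA) control loop.
    # VLAPolicy, Action, and the rule inside predict() are invented for this
    # sketch; a real VLA model runs a learned network over camera frames and
    # text instead of the stub below.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Action:
        joint_targets: List[float]   # desired joint positions (radians)
        gripper_closed: bool         # True = close the gripper

    class VLAPolicy:
        """Maps (camera image, natural-language instruction) to a low-level action."""

        def predict(self, image_pixels: List[List[float]], instruction: str) -> Action:
            # Stand-in logic so the sketch runs end to end.
            wants_grasp = "pick up" in instruction.lower()
            return Action(joint_targets=[0.0, 0.5, -0.3], gripper_closed=wants_grasp)

    def control_loop(policy: VLAPolicy, instruction: str, steps: int = 3) -> None:
        for t in range(steps):
            frame = [[0.0] * 4 for _ in range(4)]        # placeholder camera frame
            action = policy.predict(frame, instruction)  # see, understand, act
            print(f"step {t}: joints={action.joint_targets} grip={action.gripper_closed}")

    control_loop(VLAPolicy(), "Pick up the apple on the left")

The structure, not the stub, is the point: one model consumes pixels and plain language together and emits motor commands, so changing the instruction changes the behavior without rewriting code.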

The second breakthrough addresses the data problem in robotics. Until now, a robot transporting cargo at an airport would need rigid, preprogrammed paths and behaviors, requiring time-consuming field training that made its use costly and limited. Today, simulation technology can create hyper-realistic virtual environments, known as digital twins, where robotic systems learn from thousands of real-world scenarios in a fraction of the time. With the knowledge gained from these simulations, AI robots can hit the airport tarmac running, so to speak, ready to deal with contingencies such as varied weather conditions and physical obstacles.
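As a rough illustration of how a digital twin compresses training, here is a toy Python sketch, assuming an invented simulator: it randomizes tarmac conditions (weather, obstacles) and tunes a single policy parameter over thousands of simulated episodes. The scenario fields, scoring, and learning rule are made up for illustration; real pipelines use physics engines and learned policies.

    # Toy domain-randomized training loop in a simulated "digital twin".
    # Every name and number here is illustrative, not a real training stack.
    import random
    from dataclasses import dataclass

    @dataclass
    class Scenario:
        weather: str        # randomized condition the robot must handle
        n_obstacles: int    # randomized clutter on the simulated tarmac

    def sample_scenario(rng: random.Random) -> Scenario:
        return Scenario(
            weather=rng.choice(["clear", "rain", "fog", "night"]),
            n_obstacles=rng.randint(0, 8),
        )

    def run_episode(scenario: Scenario, caution: float) -> float:
        # Toy "simulator": the best score comes from matching caution to difficulty.
        difficulty = scenario.n_obstacles / 8 + (0.0 if scenario.weather == "clear" else 0.4)
        return 1.0 - abs(caution - difficulty)

    def train(episodes: int = 5000, seed: int = 0) -> float:
        rng = random.Random(seed)
        caution = 0.5                         # single learnable parameter
        for _ in range(episodes):
            s = sample_scenario(rng)
            # Nudge the parameter toward whatever scored better this episode
            # (a crude stand-in for gradient-based policy learning).
            if run_episode(s, caution + 0.01) > run_episode(s, caution):
                caution += 0.01
            else:
                caution -= 0.01
        return caution

    print(f"learned caution level after simulated training: {train():.2f}")

Because every episode is simulated, the loop can rehearse thousands of weather and obstacle combinations in seconds, which is what lets robots arrive in the field already exposed to conditions they have never physically encountered.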

Consider how 1X trained humanoid robots for household chores. By simulating domestic environments, the company adapted robots faster than ever to tasks such as vacuuming. Stanford researchers extended this approach, rapidly training robots on 1,000 household activities and then deploying them in real homes, where they performed reliably despite never having seen those spaces before.

Robotic adoption is increasing. More than 542,000 industrial robots were installed globally in 2024 alone, more than double the number from a decade ago. But here's where most organizations miss the point: Physical AI isn't valuable in isolation.

A warehouse robot that can perfectly pick items isn’t useful if it can't communicate with the inventory management system. The real transformation happens when digital agents, physical robots, and human workers operate as a team.

The industrial robotics market is projected to grow from $55.1 billion in 2025 to $291.1 billion by 2035, but revenues won’t grow evenly. Organizations treating physical AI as stand-alone automation will see incremental efficiency gains at best. However, those that embed it enterprisewide in operational workflows will unlock exponential value—not just faster processes, but entirely new capabilities.

Manufacturing giants are already using digital twin technologies to design fully AI-driven robotic factories. At the same time, the robot training process has shortened from years to months. Buzzy startup Figure AI is building general-purpose humanoid robots, and it announced plans in April to deploy more than 200,000 of them by 2029. These robots already work on BMW factory floors, operating in spaces and using tools designed for humans without requiring facility redesigns.

What does this mean for business leaders? While agentic AI will transform millions of jobs, organizations currently face worker shortages that digital automation cannot fill on its own. American manufacturing alone faces a projected shortage of 2.1 million workers by 2030.

In short, there is a necessary role for robots in this future. Physical AI will handle dangerous tasks, extend working hours beyond human limitations, and perform precision work at a scale previously unattainable, while human workers oversee and manage such systems. However, this transformation occurs only when AI-powered robots are integrated into workflows and aligned with broader business goals and strategic priorities.

When robots operate alongside digital agents and human operators, businesses will transform workflows to deliver new value. Instead of just fixing an isolated wind turbine, workflows augmented by digital and physical AI will produce insights applicable across the entire business.

The success of tomorrow’s business leaders will be determined by how open they are to the transformative power promised by physical AI. The path forward requires accepting that robots will move from isolated automation tools to team members in a workforce that spans humans, digital AI agents, and thinking machines, and it requires fundamentally reimagining how work gets done.
