According to Forbes, a recent Deloitte report shows enterprise AI use has grown by 50% in the last year, but the percentage of people using it in their daily workflow remains stuck below 60%. This gap is critical as AI moves into core infrastructure, like the “AI factories” being built by Microsoft, NVIDIA, and Meta. These specialized data centers face massive energy demands, with AI workloads projected to drive data center energy use up by as much as 160% by 2030. Consultants from Dassault Systèmes and Schneider Electric argue that AI-driven tools like virtual twins are essential for managing this complexity in industrial and supply chain settings, where companies like AB InBev and New Balance are using AI to shift from static planning to continuous decision-making.
The adoption gap is the real story
Here’s the thing: a 50% jump in use sounds incredible. But that stalled daily workflow number is a massive red flag. It tells me that a ton of companies are doing AI “pilots” or “POCs” (proofs of concept) that never actually make it into the hands of people doing the real work. They’re checking the “we use AI” box without getting the actual benefit. That’s expensive theater. The article nails it by saying the next phase is about translating intelligence into action. But man, that’s the hard part. It’s way easier to buy a shiny new AI API than it is to rebuild a workflow so a human actually trusts and uses its output.
AI factories and the energy monster
So we’re building these “AI factories.” Basically, data centers on steroids, packed with GPUs. The energy stats are wild. A single AI search query can use ten times the power of a traditional Google search? That’s not sustainable at scale. It feels like we’re building a monster and then scrambling to find a leash. The proposed solution—using AI virtual twins to model and manage these facilities—is clever. It’s using the problem to help solve the problem. But it also adds another layer of complexity and cost. Can every company afford a perfect digital replica of their data center? Probably not. This feels like it will further divide the AI haves and have-nots. The big players like NVIDIA can invest in this, but what about everyone else?
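To make the virtual-twin idea concrete, here's a minimal sketch of what the simplest possible model looks like: estimating facility power draw for a GPU cluster under different efficiency scenarios. All the figures (per-GPU wattage, utilization, PUE values) are illustrative assumptions I'm plugging in, not vendor data, and a real digital twin would model thermals, workload schedules, and much more.

```python
# Toy "virtual twin" sketch: estimate facility power draw for an AI
# cluster. All numbers below are illustrative assumptions.

def facility_power_kw(num_gpus, gpu_watts, utilization, pue):
    """IT load scaled by utilization, then multiplied by PUE
    (Power Usage Effectiveness) to account for cooling and
    power-delivery overhead."""
    it_load_kw = num_gpus * gpu_watts * utilization / 1000
    return it_load_kw * pue

# Scenario: 1,000 GPUs at an assumed 700 W each, 80% utilization,
# comparing a well-run facility (PUE 1.3) to an inefficient one (1.6).
efficient = facility_power_kw(1000, 700, 0.80, 1.3)
inefficient = facility_power_kw(1000, 700, 0.80, 1.6)

print(f"efficient facility:   {efficient:.0f} kW")
print(f"inefficient facility: {inefficient:.0f} kW")
print(f"overhead difference:  {inefficient - efficient:.0f} kW")
```

Even this crude model shows why the twin pays for itself at scale: the entire difference between the two scenarios is overhead, not compute, which is exactly the slack an AI-managed facility is supposed to squeeze out.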
Industrial AI: urgency meets old problems
The industrial section hits on a pain point I see all the time. There’s huge pressure to decarbonize and modernize, but the talent and skills gap is a brick wall. You can have the best AI analytics platform in the world, but if your team doesn’t trust it or know how to use it, it’s a very expensive dashboard. The call for “open, vendor-agnostic software platforms” is key. Lock-in with a proprietary system can kill an AI project faster than bad data. This is also where robust, reliable hardware at the edge matters: you need industrial-grade computers that can run these AI models in harsh environments, not just in the cloud. The physical interface for this intelligence has to be as durable and integrated as the software behind it.
From spreadsheets to continuous decisions
The supply chain part is so true. We’re trying to solve 2024’s volatility with 1994’s tools—spreadsheets. Anand Srinivasan’s quote says it all: “The challenge isn’t automation; it’s the absence of strategy.” AI can’t fix a broken process; it just automates the brokenness faster. His point about “clean, intentional data” is the unsexy foundation everyone wants to skip. But look, if your data is trapped in emails and Excel files, your AI is just making a fancy guess. The shift from static planning to continuous decision-making is the real promise. But it requires a cultural change, not just a software purchase. Are organizations ready for AI to constantly suggest changes, to manage routine exceptions without human input? That’s a big leap from a monthly planning meeting.
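What "managing routine exceptions without human input" actually means is worth making concrete. Here's a hedged sketch of the core pattern: a continuous loop where small deviations from plan are replanned automatically inside a tolerance band, and anything outside it gets escalated to a human. The 10% threshold and the order records are hypothetical examples, not anything from the article.

```python
# Sketch of a continuous decision loop: routine exceptions are
# resolved automatically within a tolerance band; anything outside
# it is escalated to a human planner. Threshold and records are
# hypothetical.

ROUTINE_TOLERANCE = 0.10  # auto-adjust if demand shifts <= 10% (assumed)

def decide(order):
    """Return ('auto', new_qty) for routine shifts,
    or ('escalate', reason) for ones a human should review."""
    planned, actual = order["planned"], order["actual_demand"]
    shift = abs(actual - planned) / planned
    if shift <= ROUTINE_TOLERANCE:
        return ("auto", actual)  # routine: replan to the new signal
    return ("escalate", f"demand shifted {shift:.0%}, needs review")

orders = [
    {"sku": "A1", "planned": 100, "actual_demand": 105},
    {"sku": "B2", "planned": 200, "actual_demand": 150},
]
for order in orders:
    print(order["sku"], decide(order))
```

The leap from a monthly planning meeting is that this loop runs every time a signal arrives, and the cultural question is where the organization sets that tolerance band, and whether anyone trusts the "auto" branch.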
My take? The article is right about the direction. AI is moving into the guts of how things are built, moved, and powered. But the staggering energy costs, the persistent skills gaps, and the messy reality of corporate data are huge speed bumps. We’re good at generating intelligence. Now we have to learn how to use it without burning out the grid or our teams. The companies that figure out the human and operational side of this equation will win. The ones just chasing the tech trend will have a very expensive lesson coming.
