
An increasing number of customers say that the most interesting thing about artificial intelligence is no longer the intelligence, but what it quietly gets done in the background. In the space of a few years, AI has advanced from automating back‑office chores to drafting passable prose and even writing code.
Senior Vice President of Alibaba Cloud Intelligence.
From bots to board priorities
The first corporate encounters with AI were a marriage of convenience with automation. Organizations deployed machine-learning models and “bots” to shave seconds off workflows, route support tickets, or flag fraud, the latter being especially useful as the threat landscape intensified daily.
By 2025 though, AI had stopped being deployed on side projects and had become a default part of how large organizations operate. According to McKinsey’s latest State of AI research, about four‑fifths of organizations now use AI in at least one business function, and around seven in ten report using generative AI tools regularly.
Those figures mark a dramatic jump from that early‑stage experimentation just two years earlier. The same survey notes that generative AI adoption has moved from “early curiosity” to “broad‑based use,” with more than 70% of respondents deploying it across functions such as operations, marketing, and customer service.
What began as a technical trial has, in effect, become a management preoccupation in most global enterprises.
Perhaps more interestingly, the financial story has shifted. Across McKinsey’s 2024 surveys, a growing share of executives reported that gen AI use cases were raising revenue within the business units deploying them, and not just trimming costs.
By late 2024, a meaningful subset of leaders reported revenue increases of more than 10% in certain functions attributable to gen AI. At that point, AI made the leap from being a CTO’s side project to becoming a CEO agenda item.
Foundation models fade into the background
If 2025 was the year of foundation models, then 2026 will be the year in which those models disappear into an organization’s tech architecture. Large language models have become infrastructure: while they are vital, they are no more strategically differentiating on their own than a database engine.
The trajectory was predictable, aided as it was by platforms explicitly positioned to lower the barrier to adopting advanced models and to expose them through serverless, on‑demand cloud services.
In hindsight, that moment marked the beginning of models becoming a commodity for the many, rather than an asset for a few.
Since then, similar model hubs and APIs have proliferated. The technical frontier continues to advance, but competitively the conversation has moved up the stack, from “Which model do you have?” to “What system have you built around it, and what business can it run without your intervention?”
The rise of the AI agent
Arguably, the most interesting systems now are not chatbots but agents. Today, an AI agent does not merely answer a question; it interprets goals, calls tools, and weaves together a sequence of decisions over time.
For instance, users in China can simply speak or type a request such as “Order 40 cups of coffee, half Americano, half latte”, “Recommend high‑performance robot vacuums under RMB 3,000” or “Plan a family trip to Sanya for Chinese New Year” into an existing LLM app.
Once the user confirms, the app handles everything, from comparing flights and hotels via Fliggy, reserving a private dining room by calling the restaurant, to ordering local meals through Taobao Flash Sale with AI‑assisted payment, filtering and recommending tailored products from Taobao, and even accessing public‑service applications via Alipay, all within a single interface and without switching between apps.
But what begins with holidays will not end there. That very same pattern – goal, context, tools, action – is being applied to shopping journeys and other service applications.
In some firms, a “single task” chatbot for customer service has already given way to a fleet of specialized agents working behind the scenes: one to summarize interactions, another to adjust account settings, a third to flag anomalies for human review; an AI assistant in every sense of the word.
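The goal, context, tools, action loop behind those agents can be sketched in a few lines. This is a purely illustrative toy, not any vendor's framework: the `Tool` and `Agent` classes, the keyword-based routing, and the sample tools are all assumptions standing in for the LLM-driven planning a production system would use.

```python
# Minimal sketch of the goal -> context -> tools -> action pattern.
# All names here (Tool, Agent, the sample tools) are illustrative,
# not a real agent framework's API.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    run: Callable[[str], str]

@dataclass
class Agent:
    tools: dict[str, Tool]
    context: list[str] = field(default_factory=list)  # memory across steps

    def act(self, goal: str) -> str:
        # A real agent would ask an LLM which tool fits the goal; here we
        # route on a keyword to keep the sketch self-contained and runnable.
        for tool in self.tools.values():
            if tool.name in goal.lower():
                result = tool.run(goal)
                self.context.append(f"{tool.name}: {result}")
                return result
        return "no tool matched"

summarize = Tool("summarize", "Summarize an interaction", lambda g: f"summary of '{g}'")
flag = Tool("flag", "Flag an anomaly for human review", lambda g: "escalated to human")

agent = Agent(tools={t.name: t for t in [summarize, flag]})
print(agent.act("flag the duplicate refund for review"))  # escalated to human
```

A fleet of specialized agents, as described above, is essentially several such loops running side by side, each with its own narrow tool set and its own slice of the shared context.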
From efficiency lever to growth engine
This evolution has forced boards to rethink how they talk about AI. For years, it was framed as a productivity driver, with gen AI’s early success in drafting documents and code reinforcing that perception. But the data now coming back from use cases shows a more nuanced picture.
A closer look at McKinsey’s analysis of gen AI deployments in 2024 shows revenue gains at least as frequently as cost reductions, with the largest reported top‑line impacts in areas such as supply chain, marketing, and service operations.
In follow‑up surveys, a rising share of respondents indicated revenue increases of 10% or more in business functions using gen AI, particularly in service operations. It is important to note that those are not speculative projections but self‑reported P&L effects over 12 months.
With that in mind, it is no surprise that CEOs are not asking “How many processes have we automated?” but “Which revenue pools can an AI system open up that we could not economically touch before?”
An AI agent that can serve millions of micro‑segments, each with a different product configuration or journey, starts to look less like a chatbot and more like a new distribution channel.
Model-as-a-service comes to the masses
Underpinning this shift is a quiet revolution in access. Platforms like ModelScope demonstrated that hundreds of high‑quality models – covering vision, language and speech – could be exposed as cloud services, with serverless scaling and pay‑as‑you‑go pricing.
By abstracting away infrastructure management, these offerings allow even modest teams to orchestrate complex AI workflows without standing up their own research labs.
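In practice, "abstracting away infrastructure" means a model becomes a single authenticated HTTP call, billed per request rather than per provisioned server. The sketch below is hypothetical throughout: the endpoint URL, the `model` and `input` fields, and the key format are placeholders, not any provider's actual schema. It builds the request without sending it, to show how little surface area a small team actually has to manage.

```python
# Hypothetical sketch of calling a hosted model behind a serverless,
# pay-as-you-go API. The endpoint, request fields, and key format are
# placeholders, not a real provider's schema.
import json
from urllib import request

API_URL = "https://models.example.com/v1/infer"  # placeholder endpoint

def build_request(model: str, prompt: str, api_key: str) -> request.Request:
    """Build (but do not send) an inference request. Billing is per call,
    so there is no cluster to provision or keep warm."""
    body = json.dumps({"model": model, "input": prompt}).encode("utf-8")
    return request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("open-vision-7b", "Describe this image", "sk-demo")
print(req.get_full_url())  # https://models.example.com/v1/infer
```

Swapping vision for speech or language is, from the caller's side, just a different value in the `model` field, which is precisely what makes hundreds of models feel like one service.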
The effect is to level the AI playing field. Competitive advantage no longer comes from owning the “best” model in isolation. It comes from proprietary data, tightly designed workflows, domain‑specific agents and, increasingly, from the ability to earn and retain user trust when software can act on their behalf.
What boards must do next
For business leaders, this year’s AI debate should be less about pilots and more about architecture. The strategic questions are now:
Which parts of the business are ready to be expressed as goals and constraints that an agent can act upon?
Which systems expose the APIs and guardrails needed for safe autonomy?
Which revenue lines could be re‑imagined as always‑on services delivered by software?
There is also an organizational reckoning looming, given that agentic systems cut across silos. Take, for instance, an AI travel agent.
Booking flights, hotels, and restaurants demands data and decision rights from multiple business units. The lesson is that the firms set to benefit most from AI will be those willing to redesign processes, incentive structures, and risk controls around AI‑native ways of working, rather than sprinkling agents on top of legacy workflows.
AI has grown up astonishingly fast. From automating invoices to autonomously managing complex journeys, it has progressed from an efficiency tool to a partner in growth.
The next competitive frontier will belong to companies that treat agents not as novelties or threats, but as colleagues – ones who never sleep, never get bored of routine, and can, if properly directed, turn strategy into action at machine speed.
This article was produced as part of TechRadar Pro Perspectives, our channel to feature the best and brightest minds in the technology industry today.
The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/pro/perspectives-how-to-submit