After I write my annual predictions, I keep a little file of stories that relate to my prognostications. The most active one so far – if you tune out my opening line that “this is not going to be a normal year” – is #3: “2025 will not be the year AI agents take off.” It may be hard to recall, but by the end of last year, AI agents and “the agentic web” were all the rage, pushed as the Next Big Thing by just about everyone who had a stake in tech’s Numbers Go Up economy.
But it struck me that there was a lot of wood to chop between the hand-waving of tech optimists and the reality of how complex systems actually work. I noted that the most significant structural impediment was Big Tech’s business model, which is reliant on consumer advertising and enterprise subscriptions and sales. Agents, as I pointed out in Where’s The Business Model in Chat-Based Search?, will likely undermine traditional consumer advertising models employed by Google and Meta. As for the enterprise, well, interoperability has been the bugaboo and the holy grail of enterprise software for as long as enterprise software has existed. Without protocols that allow developers to integrate across diverse systems, agents are never going to take off.
It takes years, not weeks, for such protocols to emerge and gain widespread support. Earlier this year I wrote about Anthropic’s MCP, which addresses a core issue: data connectivity. (OpenAI recently announced support for MCP.) But MCP doesn’t address a host of other integration issues, including user interface, directory services, communication handling, and many other dull-but-important tasks. Aware of this problem, Google this week announced another protocol: A2A.
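Before getting to A2A, it’s worth making MCP’s “data connectivity” concrete. Here’s a minimal sketch of an MCP server built with Anthropic’s open source Python SDK, exposing a single tool an agent could call; the connector name and the customer-lookup tool are hypothetical stand-ins for whatever system of record an enterprise would actually wire up.

```python
# Minimal sketch of an MCP server using Anthropic's Python SDK (the `mcp` package).
# The server name and tool below are illustrative placeholders, not a real connector.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("crm-connector")  # hypothetical connector name

@mcp.tool()
def lookup_customer(customer_id: str) -> str:
    """Return a (stubbed) customer record for the given ID."""
    # A real connector would query the underlying system of record here.
    return f"Customer {customer_id}: status=active, plan=enterprise"

if __name__ == "__main__":
    mcp.run()  # any MCP-capable client can now discover and call the tool
```

That’s the whole pitch: a standard way to hand data and tools to a model. Everything else on my list – interface, directories, agent-to-agent coordination – sits outside MCP’s scope.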
Launched at Google’s Cloud Next conference, A2A stands for Agent2Agent, an open source effort guided by Google and already endorsed by more than 50 enterprise software firms. “The A2A protocol will allow AI agents to communicate with each other, securely exchange information, and coordinate actions on top of various enterprise platforms or applications,” Google wrote in its announcement. “We believe the A2A framework will add significant value for customers, whose AI agents will now be able to work across their entire enterprise application estates.”
The phrase “will now be able to” is doing a lot of work in that statement, as A2A is a novel protocol that, at present, is more of a promise than a reality. Even Google admits as much – the very next sentence shifts from the present tense to an imagined future:
“This collaborative effort signifies a shared vision of a future when AI agents, regardless of their underlying technologies, can seamlessly collaborate to automate complex enterprise workflows and drive unprecedented levels of efficiency and innovation.”
Ah yes, that fabled “shared vision of the future.” One in which every piece of software, every customer interaction, every sale, and all the data linking them together hold hands and live as one. I’ve been hearing that story since I covered the “executive information system” beat in the late 1980s. We’ve come a long way in the past 40 years, but we’ve a long way yet to go.
A2A is an exciting and needed addition to the “agentic web” stack, but as I said, it takes years for protocols to gain traction. Perhaps Google’s offering will take off – when it ships, according to Google’s announcement, “later this year.”
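For a sense of what A2A actually specifies, its basic building block is an “agent card” – a machine-readable description an agent publishes so other agents can discover it and decide what to delegate to it. The sketch below renders one as a Python dict; the field names paraphrase Google’s early draft spec and should be treated as assumptions, and the billing agent itself is purely hypothetical.

```python
# Illustrative A2A "agent card" -- the self-description an agent publishes so other
# agents can discover and call it. Field names paraphrase the early draft spec and
# should be treated as assumptions; the agent and its skill are hypothetical.
agent_card = {
    "name": "billing-agent",
    "description": "Answers billing questions and issues credits for enterprise accounts.",
    "url": "https://agents.example.com/billing",  # endpoint other agents would call
    "version": "0.1.0",
    "capabilities": {"streaming": True, "pushNotifications": False},
    "skills": [
        {
            "id": "issue-credit",
            "name": "Issue account credit",
            "description": "Apply a credit to a customer account after validating policy.",
        }
    ],
}
```

Requests between agents then travel over ordinary HTTP (JSON-RPC, per Google’s draft spec) – exactly the kind of dull-but-important plumbing that takes years to standardize.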
—

John–loved your book on search. What are your thoughts on Google AgentSpace and/or Glean as the unifying layer and interface for enterprise AI/agents? They enable unified data access/index, immense understanding of user intent, and actions across systems. Really feels similar to how Google built a “Database for Intentions” on the consumer side.
Glean/AgentSpace are building an unparalleled “Database of Intentions” for the enterprise – knowing what users need to do, what they try to do, and where they succeed or fail. They understand the effective workflows of the most productive employees and can propagate best practices through automation. If Google search aggregated consumer clickstream/intent data and transformed patterns in those clickstreams into advertising revenue, these enterprise AI/search interfaces will aggregate workflow/tacit-knowledge data and transform it into revenue via agents that do the work for business users.
Really think there’s something interesting here, and it’s always blown my mind that no one has won the universal interface for the enterprise. Microsoft, Salesforce, Google, etc. all compete, but no one has truly won. The average enterprise runs 100+ apps. Completely different from how consumer turned out, where there are only a few major properties and they all continue to look more and more similar.
I’m starting to think of the constellation of horizontal SaaS apps (Salesforce, Workday, ServiceNow, etc.) as the “10 blue links” of the enterprise internet. Glean/Google AgentSpace are becoming the Google Search/Chrome – the primary interface through which users find and act on information and, increasingly, get work done for them. This will relegate a lot of these enterprise SaaS players to backend API calls/databases, and eventually it’s possible they become largely obsolete, much like many of the independent open web publishers. More and more of the info, work, etc. will originate and be done inside Glean/AgentSpace over time.
Appreciate your thoughts always!