In this interview from Google Cloud Next 2026, Sarbjeet Johal, founder and chief executive officer of Stackpane, joins theCUBE's John Furrier to discuss Google's full-stack AI momentum and what enterprises must prioritize as the industry shifts from AI experimentation to agentic execution.

Johal ties Google's sharply improved market standing (stock up 111% over the past year) to its vertical integration advantage: by building its own TPUs, the company avoids the steep GPU margins paid to third-party suppliers, giving it structurally better AI economics than most cloud peers. He details two new TPU generations announced at the show, one for training with 2.7x price-to-performance gains and one for inference delivering a 5x latency improvement, underscoring Google's push to own the full hardware stack from silicon to Gemini.

Johal also shares a three-part framework for matching AI to the right system type (systems of record, engagement and innovation), arguing that generative AI fits most naturally where language, not deterministic outcomes, drives value. The discussion unpacks the rising cost reality of production AI, including the hidden token expenditures incurred every time a model is swapped out and retested at scale.

Furrier highlights open table formats such as Iceberg and the data lakehouse as the single highest-leverage point for unifying data feeds and letting agents operate at full speed. Both analysts identify agent governance as the defining wave ahead, mirroring how DevSecOps unlocked enterprise cloud adoption, and flag change management, not technology, as the primary barrier slowing organizations down. From mapping the nascent primitives of an agentic AI stack to developing the "triple threat" skillset of building, operating and investing, the conversation charts a clear-eyed path through the execution risks of moving too slowly, or too fast, in the current AI cycle.