In this NYSE Wired + theCUBE conversation from AI Factories – Data Centers of the Future, Vivek Mohindra, special advisor to the vice chairman at Dell Technologies and national speaker on AI for the U.S. State Department, joins theCUBE’s Dave Vellante to share a global view on AI deployment. Mohindra breaks down what he’s hearing from thousands of customers, partners and regulators – from Korea to Geneva – explaining why enterprises are less worried about an AI “bubble” and more focused on use cases, data readiness and not being left behind. He contrasts customer urgency with government concerns around competitiveness, regulation and skilling, and explains how enterprise AI adoption is increasingly centered on inferencing across a hybrid estate that spans public cloud, on-prem data centers and increasingly powerful PCs and edge devices.
The discussion then turns to what AI factories mean for infrastructure, sovereignty and economics as Dell helps organizations rethink everything from GPU choices to where workloads run. Mohindra shares survey data showing that 74% of customers expect OEMs like Dell to guide their AI roadmaps, highlights why many enterprises see PCs as the most cost- and energy-efficient inferencing node, and explains how sovereign AI strategies depend on countries owning best-of-breed hardware while opening AI capacity to small and midsize businesses. He also unpacks energy as a critical constraint for gigawatt-scale AI data centers, outlines why governments must rethink education and skilling for an AI-first world, and describes AI factories evolving into a distributed system of massive training hubs, smaller edge sites and “mini AI factories” on devices – an architecture he believes can unlock multi-trillion dollar productivity gains relative to today’s CapEx.
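The "multi-trillion dollar productivity gains relative to today's CapEx" claim can be sanity-checked with the rough figures Mohindra cites later in the interview: roughly $110 trillion in global GDP, a $10 trillion to $15 trillion annual productivity impact, and give-or-take $400 billion a year in AI CapEx. A minimal back-of-envelope sketch, using only those publicly discussed estimates (none of these are Dell figures):

```python
# Back-of-envelope sketch of the AI ROI arithmetic discussed in the interview.
# All inputs are the rough, publicly discussed estimates Mohindra cites,
# not Dell data.

GLOBAL_GDP_T = 110.0          # global GDP, in trillions of USD
PRODUCTIVITY_LOW_T = 10.0     # low end of productivity-gain estimates, $T/year
PRODUCTIVITY_HIGH_T = 15.0    # high end of productivity-gain estimates, $T/year
ANNUAL_CAPEX_T = 0.4          # ~$400B/year of AI CapEx, in trillions

def roi_multiple(gain_t: float, capex_t: float) -> float:
    """Annual productivity gain expressed as a multiple of annual CapEx."""
    return gain_t / capex_t

def gdp_uplift_pct(gain_t: float, gdp_t: float) -> float:
    """Productivity gain as a share of global GDP, in percent."""
    return 100.0 * gain_t / gdp_t

low_multiple = roi_multiple(PRODUCTIVITY_LOW_T, ANNUAL_CAPEX_T)    # 25x
high_multiple = roi_multiple(PRODUCTIVITY_HIGH_T, ANNUAL_CAPEX_T)  # 37.5x
low_uplift = gdp_uplift_pct(PRODUCTIVITY_LOW_T, GLOBAL_GDP_T)      # ~9.1%

print(f"ROI multiple: {low_multiple:.0f}x to {high_multiple:.1f}x of annual CapEx")
print(f"Implied GDP uplift at the low end: {low_uplift:.1f}%")
```

Even at the low end, the implied uplift of roughly 9% matches Vellante's "fairly modest 10% GDP uplift" characterization, and the annual gain runs about 25 times the annual CapEx.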
>> Hey everybody. We're back at the New York Stock Exchange for the NYSE Wired plus theCUBE AI Factory series. Vivek Mohindra is here. He's a special advisor to the vice chairman, Jeff Clarke, at Dell Technologies, and a longtime collaborator. Great to see you, man. Thanks so much-
Vivek Mohindra
>> Hey, thanks for having me over, Dave....
Vivek Mohindra
>> for coming here live.
Vivek Mohindra
>> Of course. Wouldn't miss it for the world. This is fun.>> At our new studio here with Wired and theCUBE. And so you've been on a whirlwind tour. We were in Washington DC last week, listening to Jensen. You were in Korea recently. Of course, that's a hot spot right now, in a good way.
Vivek Mohindra
>> In a good way.>> Good stuff happening there. You're here in New York, talking about just the future of AI at the Financial Times Forum tomorrow.
Vivek Mohindra
>> Tomorrow, yeah.>> So what are you seeing in terms of just AI around the world? Everybody's talking about bubble, everybody's talking about CapEx. People are trying to figure it out. What do you think from your former VC, your strategy brain and just having been around long enough to see these things evolve, what's your take on where we're at?
Vivek Mohindra
>> Yeah. Oh, thank you. Thank you, Dave, for having me.>> You bet.
Vivek Mohindra
>> It's a phenomenal set. I'm really delighted to be here. It's been a long time in the making. Hey, look, as you mentioned, I've been traveling a fair bit, and I was spending a lot of time with our customers as well. So I was in Korea. I spent time with 3,000 of our customers and partners, and then the US Embassy had hosted me. As I think I mentioned to you, I'm a national speaker on AI for the State Department. So they had hosted me, and I met with regulators and entrepreneurs. I'm going to do the panel with the FT tomorrow, and then I'm headed to Geneva to do the same thing, with the US Embassy hosting me over there. If I zoom out, what I'm detecting are two different things: one from customers, and one from a regulator's perspective. From the customer's perspective, it is all about not wanting to be left behind. Nobody's talking about a bubble. Those who started early are already beginning to see some productivity and other gains, so there are converts. Those who haven't started are basically worrying about being left behind. So those are the sentiments from the customers, and by and large, they're saying, "Hey, look, how do we get started? Help us figure out the use case, the data," all the things you and I know about and have talked about. The regulators are a really interesting domain. The regulators, by and large, are worried about two things. One, how is that particular country positioned vis-a-vis the others? And number two, how do they balance a regulatory framework that puts emphasis on safety with innovation? So those are the two big themes I hear from governments. And then the third theme usually shows up in countries like India, which have massive, young populations. It's all around the implications of AI for skilling and for jobs. So those are the three things I hear. But look, we've been in this industry a long, long time. I have fundamentally been tracking it for three decades now.
We have these waves where the hardware accelerates and all the hardware gets deployed, and the software catches up and fills it all up, and then the hardware accelerates some more. I don't think we are in a bubble, just given how early we are in the deployment of these use cases, and I can't see an end to it in the short term.>> We were talking to Michael Dell at the Dell Tech Summit. He said, "Look, of course we've been around long enough to know sometimes bubbles burst, but we don't see any signs yet that there's a problem, certainly not an oversupply." Every hyperscaler that's spending on CapEx is talking about limited compute.
Vivek Mohindra
>> Correct.>> You guys, I'm sure, would sell every Blackwell that you can get your hands on.
Vivek Mohindra
>> And other variants, not only Blackwell, other variants as well.>> So explain those other variants, because we're so enamored of what NVIDIA of course is doing, but the enterprise doesn't necessarily need thousands and thousands of GPUs. So what are you seeing in terms of enterprise AI adoption?
Vivek Mohindra
>> Yeah, so enterprise AI adoption comes up in pretty much all my conversations when I'm talking to customers. And when I think about it, as you well know, we have training and we have inferencing. These massive, tens-of-thousands-of-GPUs, GB300-equivalent, gigawatt-scale data centers are by and large focused on training. Enterprises, and the value they derive, are largely driven by inferencing workloads. But when you look at inferencing itself, the conversation I end up having with enterprises really goes along the lines of thinking about the entirety of the estate they have to execute these workloads on, which is fundamentally hybrid. There are some of these workloads that they will do in the public clouds. Many of them are lucky that they continue to maintain an on-prem presence, which in many ways is at their disposal right now. And then there are these PCs and edge devices, which are getting more and more performant as these models become more capable of being executed at the edge too. So when I think about inferencing workloads, many of them do not need the GB300 class of GPUs. There's a whole spectrum that they can actually use, ranging down to what I just mentioned in terms of PCs and workstations and the role they play. And the more sophisticated enterprises I speak with, both at our EBCs and when I travel all over the world, are thinking about the entirety of the estate with that level of sophistication. They're beginning to figure out that a PC fundamentally is the most cost- and energy-efficient node to execute these workloads. And if they have the power and space budget within their on-prem data centers, they can actually put a lot of capacity in there to execute other types of workloads. And that's why I think it's not going to be all about these GB300s for the enterprises.
The smarter ones are thinking about the entirety of the portfolio and about the different types of GPUs they could use.>> When you speak to these enterprises, I'm interested, actually, in who you're talking to. Obviously Dell's peeps are the IT folks, but you're talking to a much broader audience these days. So what's the persona or role that you're speaking to, and where are they struggling in terms of AI adoption? What are the barriers that they're seeing, and how are they overcoming them?
Vivek Mohindra
>> Yeah, so, as you correctly point out, the adoption of AI is predominantly being driven by business people, with the IT decision makers and others playing a role in facilitating it and providing the right parameters around it. The people we've actually been talking to have expanded beyond what it used to be. We used to talk to a lot of IT folks, obviously, but now we are beginning to talk to more and more business-centric people, either people who are running companies or business leaders. When I was in Korea, for example, I was struck by how many of these business folks were showing up for these types of discussions, because this is such an imperative to drive both growth and productivity for all of them that they need to show up. So those are the people we're talking to. The challenges, interestingly, are the same ones we discovered on our own when we started our own journey three years ago. Where do they focus the use cases? How do they get the data in shape? How do they think about the ideal place to execute this workload? And many of them are running into issues related to talent. They don't have the talent to figure out which use cases, and they don't have the talent to figure out what data to put in place. And here's an interesting stat, which I didn't expect as I had these discussions. We did our own survey of customers: we surveyed 5,000 customers late last November, and we repeated that again earlier this year. 74% of them are actually expecting somebody like Dell, an OEM, to provide them guidance and advice on all of these issues. Because what they primarily tell me is, "Vivek, the issue is anybody else who shows up has a particular tilt to what they'll tell us to do.
But with you guys, you have the broadest portfolio in the industry, with servers, storage, networking, data protection, and PCs. And we know that you're not going to sell us an expensive rack if we can execute something on a PC. And similarly, if you don't think a PC's good enough, you'll come and tell us what we ought to do." So we are finding we are getting drawn into more and more of these discussions with business folks who are looking to us to help them decide what they ought to do. That is something I would have expected when I was a partner at McKinsey decades ago, but I did not expect it for somebody like us. But we are being given that mandate.>> Well, plus I think you guys are ahead of most organizations just in terms of urgency. I mean, Michael Dell's been pretty forthcoming about how he stood up in front of the entire company and said, "If we don't disrupt ourselves, we're going to be out of business."
Vivek Mohindra
>> Absolutely.>> "We're going for it in AI." So you've probably got more experience than the average company. Financial services are obviously leaning in. They tend to be the canary in the coal mine, if you will. I want to come back and ask you about the edge, and I want to tie it in to US competitiveness. You saw last week an interesting announcement between Nvidia and Nokia, and when Jensen announced that, he basically said, "Look, we lost the 4G, 5G race, if you will, and we're going to win that back." Interesting that he chose Nokia. Nokia ironically picked up Bell Labs when AT&T was broken apart.
Vivek Mohindra
>> That's right.>> Thank you, US government. Whether you like it or not, that's what the result was. But the point being that the US needs to regain its competitiveness in the telco industry. How does that affect the edge opportunity for you? Does it matter to Dell if the US dominates that? Does it favor you because you're a US-based company? What does it mean to have that sort of telecommunications infrastructure generally, and then specifically, having US companies be more competitive?
Vivek Mohindra
>> Well, look, there are a lot of different elements to that, Dave. I think for us, we're a global company. We sell to all companies globally, as you well know. And what we basically advise companies all over the world, and countries too, as we get into these sovereign AI types of issues is, look, if you own these assets, meaning the servers and storage and the AI factories, if you own them using, let's say, Dell gear, then you own it and you control your own destiny. So there is a part of that which speaks to competitiveness, and it really builds upon, okay, what do you do with these factories, which different countries are thinking about differently? Now, to your question around the Nokia announcement that Jensen and Justin made last week, look, I think there are all these different industries which suddenly lots of countries are realizing are critical to their competitiveness. So telecommunications is one. We've had a lot of discussion and debate about rare earth elements recently, which I'm sure you're tracking and following.>> Of course. Energy is-
Vivek Mohindra
>> Energy is a big one as well. So as far as we are concerned, I think it is good for countries to think strategically about how they harness AI to really get the most out of it for the residents and citizens of that country. And just like we think about how we transform ourselves with AI, countries have to think about how they transform and form their own competitive advantage based on that. So it's a long-winded way of saying, I'm not surprised by this. We will see more and more of this show up, because AI is fundamentally a massive enabler to re-architecting and rethinking all kinds of industries, including telecom. I mean, we used to have all these different network equipment providers a long time ago. It's heavily consolidated down now, right?>> Yeah. You basically have Huawei, Ericsson and Nokia.
Vivek Mohindra
>> And Nokia, yeah.>> And okay, Huawei's going to dominate China. We don't want Huawei to dominate outside of China. Obviously Ericsson's a player there. Nokia and Nvidia could be a really interesting combination. So I want to come back to energy, because everybody's talking about energy as a big blocker. Land, water and power are the big three. I feel as though, and you know this from the strategy work that you've done, for countries that do well, there's a proportional relationship with their access to cheap energy.
Vivek Mohindra
>> Absolutely.>> So it seems to me that that problem will be solved. I don't see the US not solving that problem. You're seeing data centers going up, and they're putting liquefied natural gas capabilities right there at the data center. There's a lot of concern in Virginia, for instance, that households are paying more, but ultimately I think that will get resolved. How much of a blocker is it? We know it's a critical input.
Vivek Mohindra
>> Yeah.>> Do you agree that that problem will be solved by the innovators?
Vivek Mohindra
>> It is, but it's a matter of time scale. Energy is a really, really important discussion in my view, and this comes up when people ask me about different countries and how they're positioned. If you are in a country which at the beginning of the decade had less than half of the energy generation capacity of the US and now has two and a half times as much, then the constraint you're operating in is very different. The constraint you're operating with in terms of AI workloads, and how you think about the power effectiveness of these different GPUs and the teraflops per watt, is a very different domain. For us in the US, because of the various policies we have followed, we are in an energy-constrained environment. No doubt about that. Now, okay, how do you solve that? The good news is that the government is focused on that and is solving it. But we are beginning to find second- and third-order effects creeping up. Because if you look at a training facility, 150 megawatts seemed like yesterday's data center. The peak to valley during the checkpointing phase of training is 15 megawatts. There's not enough inertial mass in the energy generation environment unless you're running gas turbines and the like. So it's certainly leading to a lot of very complicated things that we've got to sort out, including transmission and distribution. I don't know if you've ever Googled the Odessa incident; you should. You'll begin to find out what really happens when the frequency fluctuates on the order of a tenth or a couple of tenths of a hertz, and what kind of domino effect it can set up on the transmission side. So I think energy will be solved, and my personal view is it'll take an all-in approach in terms of all forms of energy that we'll have to bring to bear.>> Well, it seems the current administration agrees with you.
Vivek Mohindra
>> Absolutely.
Vivek Mohindra
>> And understands that that is a critical input to AI success. I want to come back to sovereign AI. What are you seeing there? Interesting, I'm just thinking about your comment that the constraints are different depending on where you are. Here, it's energy. In China, it's access to GPUs or software, whatever it is. Other countries may have regulatory environments that are constricting. I know there's a big discussion in the United States right now; we don't want to do this state by state. Europe is probably going to do this country by country, or maybe not. Maybe they'll have a pan-European EU policy that gets adopted around the world. We've seen that before. So what are you seeing in terms of sovereign AI? How real is it? What's your thinking around the timeframe in which this actually contributes to meaningful market momentum?
Vivek Mohindra
>> I think it's real. I think all the governments globally are thinking about it. Now, not all of them are thinking about it the right way, in my view. Some of them are thinking about sovereign AI as a need to control the entirety of the stack, all the way down to semiconductors. And I have made the point to them privately that, because of that, they may find out that they're left behind, because there's no human way that each and every major country globally can control it all the way down to semiconductors or servers and the rest. So you've got to think about sovereign AI in a very interesting and different way, which is: what defines the competitive advantage for you as a country? And by and large, all of that resides on top of the hardware layer in many ways. So most countries would be better off acquiring the best-of-breed hardware equipment and owning it in their control. Whatever sovereign data center they set up, the entirety of the equipment is theirs to do with whatever they want, and then they can innovate on top of that. I think this is going to accelerate very significantly, because many of these countries are beginning to realize that these large data centers are available to the largest companies because of the way the economics work. But a big driver of job creation in most of these economies tends to be the small and medium businesses. That's where the job creation occurs. That's where a lot of economic growth comes from. And in order to provide those resources to small and medium businesses, countries have to fundamentally think about sovereign AI and provide that to these enterprises. I think this is going to pick up very significantly all over the world.>> So I want to ask you about labor. A big theme right now is scaling without labor. Of course, there is in some respects a labor shortage. On the other hand, you're seeing concern, particularly about white-collar workers.
You're seeing these layoffs, although I think most analysts would agree those layoffs are not directly AI-related. Maybe there's a little bit of AI in there, but it's more kind of re-looking at the portfolio and adjusting it a little bit post-COVID. Okay. But we know it's coming. We know there's going to be a job shift. I think most people in tech believe that more jobs will ultimately be created. But oftentimes when these waves hit, there's that uncomfortable period. How do you see that playing out? Is it a big concern amongst the customers that you talk to? And certainly presumably, depending on where you are around the world, governments must be concerned about this. What are your thoughts on that?
Vivek Mohindra
>> Governments are concerned about it. Look, I'm an optimist, and I'll tell you why. 60% of the roles that exist today did not exist in 1940. 85% of the roles created since 1960 were created because of technology innovation. And when you think about all these technology changes that occur, take, let's say, spreadsheets in our industry. When spreadsheets came along, there were a whole bunch of people within companies who were called calculators, whose sole job was to add numbers and calculate them. Those jobs clearly disappeared with spreadsheets, but a whole new profession of financial planning came into existence, because you could do a lot of these things with spreadsheets. So my fundamental view, again, I'm on the optimist end of the spectrum, is that some jobs will go away, some jobs will change, and whole new categories of jobs, which we cannot even envision today, will come into existence. Think about context engineering and prompt engineering. Those jobs didn't exist. So I'm fundamentally a believer that whole new categories of jobs will come about. The point I make to a lot of these governments is that they have a role to play in making sure they can actually empower the population. And what does that look like? It looks like rethinking the education system in K through 12, which Dell is participating in with our tech student group program. You've got to rethink how university education occurs, and companies need to rethink how they skill their own employees to be able to take advantage of these tools.
And I frequently make the point, with these agentic technologies coming into sharper focus, that all of a sudden you may realize you don't need to go to grad school to become a really good software programmer, because with the rise of these tools, you could literally go to a two-year community college and come out with really, really great coding skills. So these jobs, which were not previously accessible, become much more accessible to the population. But the governments do have a big role to play in how they programmatically think about the skilling side of it.>> So I want to close, Vivek, by asking you about the AI factory and how that evolves in your mental model. Obviously the hyperscalers are spending hundreds and hundreds of billions of dollars in CapEx. And I'm convinced, maybe not every NeoCloud player makes it, but NeoCloud as a category is going to survive, in my view. The AI factory, Jensen says, is manufacturing tokens. I like to say it's manufacturing intelligence. And today, we think about these massive data centers that are going in, but to come back to our edge conversation: like the cloud, which is no longer just a bunch of remote services somewhere, it's this ever-expanding thing we sometimes call super-cloud. And the same with the AI factory. It's going to be expansive: not only these big giant data centers, but also the edge. Inferencing, as we know, is going to account for most of the actual work, and probably most of the energy consumption and intelligence manufacturing. So how do you see the AI factory of the future? What does that look like?
Vivek Mohindra
>> Look, I think you hit it. It is not a monolithic definition. An AI factory is a notion that manifests in several different footprints, and I think that's what we will see. There will be some workloads which are super latency-sensitive and/or don't require heavy processing, and those will move to, for lack of a better term, a mini AI factory, which is really your PC or the workstation that you have at the edge. Then there will be other, much smaller footprint AI factories, not gigawatt scale, perhaps not even megawatt scale, which will be dispersed across a country's footprint as edge AI factories. And then of course, there will be these gigawatt-scale AI factories for training workloads. Just like any other factory, we will begin to see manifestations of these factories in various form factors based upon what a company and a country need to do. I think that's just a natural evolution.>> And the power that you're going to have on your laptops and your devices, you heard Sam Altman the other day say you're going to be able to run GPT-whatever, 5-
Vivek Mohindra
>> Absolutely.... >> completely on-prem or on a device, and that's going to drive amazing productivity. That's what we're all waiting for, right? This productivity boom where the money that we're spending is a fraction of the value that it's creating, and that's the bet that we're all making, right?
Vivek Mohindra
>> It's a bet. I think it's a pretty safe bet too, because if you think about it, global GDP is on the order of $110 trillion right now. The various estimates that people have range roughly between a $10 trillion and $15 trillion impact in terms of productivity improvement. When you look at the CapEx that's currently going in, it's about, give or take, $400 billion a year. So when you actually start running the math based on that and think about the power it can actually unleash, it's not hard to imagine that the ROI is actually terrific. And ultimately, that is really what we are counting on.>> And that assumption, that assumes a fairly modest 10% GDP uplift.
Vivek Mohindra
>> Yeah. That's exactly->> Could be much higher than that.
Vivek Mohindra
>> Could be much higher than that. I know several economists are getting paid a lot of money to sort that out, but you're exactly right. Could be much higher than that.>> Vivek, thanks so much for coming on. I really appreciate it.
Vivek Mohindra
>> Dave, pleasure. Yeah, thank you. It was a great conversation. I love it.>> Wonderful having you here face to face.
Vivek Mohindra
>> Absolutely. I'm delighted to be here. Thank you.>> Thank you. Thank you for watching the NYSE Wired plus theCUBE's coverage of AI factories. We'll be right back, right after this short break.