In this conversation from theCUBE + NYSE Wired: AI Factories – Data Centers of the Future, Mindy Cancila, vice president of corporate strategy at Dell Technologies, joins theCUBE’s John Furrier in the NYSE studio to unpack how AI factories are redefining enterprise infrastructure. Cancila shares fresh Dell research on how organizations are shifting both generative and agentic AI from proof-of-concept into production, with the majority planning to run on-prem to protect proprietary data while driving productivity and new revenue. She explains how Dell’s experience with large-scale training providers and a long-standing enterprise installed base is shaping AI-ready data center designs that can support demanding, GPU-intensive workloads at scale.
The discussion digs into what it really takes to operationalize AI factories in the enterprise: navigating power and talent constraints, co-designing next-generation architectures with customers and aligning AI investments to clear ROI and governance frameworks. Cancila highlights why this is not a “fast follower” market, how more than 3,000 customers are already tapping Dell AI factories, and where edge and “physical AI” use cases are emerging across financial services, research, healthcare, robotics, retail and manufacturing. From rethinking hyper-converged versus disaggregated designs to treating data and networks as strategic assets, she outlines how AI factories are becoming a new unit of value in the data center and a cornerstone of digital strategy for the next era of compute.
Mindy Cancila, Dell Technologies
>> Welcome back with theCUBE here at our NYSE studios on the East Coast. I'm John Furrier, host of theCUBE. We also have our Palo Alto studio connecting the West Coast and the East Coast: Silicon Valley and Wall Street, tech and money. AI factories are changing the game on how people are extracting value out of this new AI wave. As AI is infused into all businesses, whether they're clouds, neoclouds or enterprises, it's certainly changing the game. Mindy's back here in person, Vice President of Corporate Strategy, Dell Technologies. Great to see you, Mindy. Thanks for coming in.
Mindy Cancila
>> Always nice to see you. Happy to be here. The energy, I love it all.
John Furrier
>> I love how you brought your daughters in with a show floor exchange tour and certainly after. But I want to get into the agent conversation because last time you were here, you shared some data around POCs and POC adoption, and the hottest topic right now is AI infrastructure. But there's also enabling the conversation around what sits on top of it because as tokens come out of the factory, a lot of that software layer around the data and the agents are now hitting full stride. You're starting to see enterprises figure out that there's workflows they can create agents for. You're starting to see ROI come in. You're starting to hear conversations shift from cost savings to revenue generating, which means the adoption and the value is being generated. So what's your take on this wave?
Mindy Cancila
>> Yeah, you are spot on. Every conversation I have continues to be deep in AI. People want to talk about: what are my use cases, how am I going to do this, and how am I going to make sure that I have the right environment so that I can really drive that productivity and revenue generation? I think last time I mentioned one of the stats out of our survey that said 84% of organizations are looking to do generative AI on-prem. An interesting statistic I don't think I shared last time, which ladders onto that, is that 58% of those same customers said this year they expect to move those generative AI use cases from POC into production. By comparison with agentic, same story. We're seeing something like 68% of organizations already think they're mature in agentic, which for a technology that, gosh, I think in John Roese's prediction was the buzzword for last year. It's gone from buzzword to immediate implementation. That prediction could not have been more correct. Not only that, 76% of those organizations expect to move their agentic workflows from POC into production this year. The aggregate of that, and what it means for businesses across the board, is that every organization is really focused on the right set of use cases that are going to, just as you said, move from ROI into generating revenue. And recently, I had the benefit of being with some financial services institutions, and when we were talking about their journey and what they're focused on, it was all about AI infrastructure. Because if you pair that story, as you eloquently said: if I'm focused on generative or agentic AI and I want to do that on-prem, I need to figure out that infrastructure stack. So that's certainly the momentum that we're seeing.
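The survey figures Cancila cites can be restated as a quick sketch. This is purely illustrative; the numbers are the ones quoted in this conversation, and the dictionary keys are invented labels, not Dell's survey fields:

```python
# Figures quoted in this conversation (percent of survey respondents).
survey = {
    "gen_ai_on_prem": 84,                 # plan to run generative AI on-prem
    "gen_ai_poc_to_prod_this_year": 58,   # expect gen AI POCs to reach production this year
    "agentic_mature": 68,                 # consider themselves mature in agentic AI
    "agentic_poc_to_prod_this_year": 76,  # expect agentic POCs to reach production this year
}

# Every figure quoted here represents a majority of respondents.
majority = [name for name, pct in survey.items() if pct > 50]
print(majority)
```

The point the figures make together: on-prem preference plus majority POC-to-production timelines is what drives the infrastructure conversation.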
John Furrier
>> Yeah, in New York, what really strikes me coming here from Silicon Valley is that the concentration of customer base for Dell and all the vendors is very high. You've got multiple diverse industries. Financial services is always an early adopter, but you have other domains that are taking advantage of the supercomputing capabilities of AI factories. NVIDIA talks about this all the time, and I want to get your thoughts on this, because you're starting to see, well, Dell is in a unique position. You have a relationship with almost all of these customers. I mean, they've been buying Dell PCs and servers for generations. You have an installed base, but now they're going to run bigger loads than they've ever seen the performance on before. This is where the AI factories come in. So how are these industries evolving to add onto, I mean, it's not a pivot. There's no pivot. There's no changing direction. It's a trajectory, and the angle of trajectory is now going up and to the right.
Mindy Cancila
>> Yeah, I'm really happy to hear you acknowledge the different types of customers. I think a lot of folks are getting confused when we think about classical segmentation of customers across the AI workloads. Their footprint from an infrastructure perspective looks radically different. It reminds me of the hyperscaler conversation in the early days. Today, Dell is supporting the bulk of a lot of those AI workloads that are being driven by those standing up training and GPU-as-a-service. Those model providers doing training, that's a critical foothold, because it allows us to understand the infrastructure they need and the scale, which is radically different than what we see today in the enterprise, their requirements, their talent. Those types of things give us, as you mentioned, early indicators as we move to those that tend to lead with these technology transitions. So now we get into the enterprise. As we move from training into inferencing, that's really where we see this foothold grow and scale. As you said, we see a lot of it. It tends to start with research and academia, then goes into financial services, these really large, sophisticated organizations. And what I hear from those customers is two core things around the infrastructure conversation, which, by the way, a third of those survey respondents said: help me with my infrastructure. That's our game. We're definitely there and are here to take that call. When we think about what they need, it comes down to two things. They talk about power, and it's really a broader power conversation around their data center, and they talk about their talent. And I think those become two really interesting areas that we can help them solve.
John Furrier
>> It's interesting on the strategy side. It's pretty clear from the boardroom: go AI, we need to get in this market. There's been some commentary here on theCUBE during this series where, I'll just paraphrase, it's not a fast-follower market, meaning if you're not on it, don't wait for others to get a lead, because you could be left behind; as they call it, accelerated computing is key. That's one trend: accelerated computing in a market where fast following doesn't work. The other one is extreme co-design, which NVIDIA and Dell talk a lot about with the AI factory. Talk about that piece, because as you co-design with the enterprises, their infrastructure ask could range from, hey, I want a PowerEdge server, maybe PowerScale and some Spectrum-X. You're doing all the blocking and tackling. Dell's a great supplier. But the bigger conversation is: what's my infrastructure for the next generation? So there's really a co-design vibe in the relationship with customers. That's not common. That was not the ecosystem 10 years ago. You'd get a great supplier relationship: here's a good discount if you buy in volume, here's some good TCO, check, check, check. It's shifted to: here's performance, here's a new ROI calculation. Take me through your thoughts on that and how you think about that frame, because that's a system design. That's like building a server. You've got to think about everything: power and cooling, energy envelope, footprint, energy management.
Mindy Cancila
>> Yeah, I'll talk about both of those. The first one I think is a really good point. With these technology transitions, we tend to say: get started, get your feet wet, evolve the talent, focus on the things that customers are going to be really looking to do. In this market, that's not the case. You need to double down and start thinking about your right use cases, the ones that are going to deliver an ROI. You can't do everything, so you need to be very focused on what you're going to do right now. That then leads into, again, that infrastructure conversation. And as far as I'm concerned, this requires an ecosystem of partners to really bring the full stack together. It's not easy when you look at all of the things from the infrastructure up, all the way into MLOps and orchestration and governance. This is going to take some time for organizations to really build out the right architecture that works for their business model. It varies by segment. What those training providers are going to do looks different than what an SMB or a large enterprise organization will do. But thinking through the right use cases and making sure you're putting that architecture in place, upskilling your talent, putting a governance framework together, those become the real challenges that I see most organizations needing to prioritize.
John Furrier
>> Now, as agents have become the first leg of the ROI journey, and remember, Michael Dell presented at the investor meeting that physical AI is still around the corner.
Mindy Cancila
>> Yeah.
John Furrier
>> That's the full convergence of physical and digital. But right now, it's get the infrastructure, get the performance things you mentioned. Now the agents come on top. That's going to be the proving ground. What are your thoughts there with customers around how they're thinking about agentic? There have been surveys out there, which I throw shade on, saying all these projects are failing. Now, if you factor in every experiment in POCs as a failure, I mean, how many times did Thomas Edison fail along the way? But there are workloads going into production faster now than a year ago. So there is this whole: what's the low-hanging fruit? What runs on an AI factory? When should I use an AI factory? What's your thinking around that? What's Dell's strategy and position there?
Mindy Cancila
>> Yeah, our strategy's really born out of what we've learned as a company, and I feel it's right. I've been privileged to be at the forefront of that and really see how our strategy has unfolded. And really what we've done is tied the ROI associated with the use cases into our governance model. So you can do a bunch of-
John Furrier
>> What does that mean? What does that mean?
Mindy Cancila
>> So when you think about it, the very first step in our governance framework is: is this going to deliver a meaningful ROI? And then what you would do is take all of those use cases and prioritize the ones that are going to deliver the most ROI the fastest, or the ones that are going to deliver the most ROI but have a multi-year runway. I think a lot of people are peanut-butter spreading, assuming all AI use cases are alike, and I am not suggesting that people shouldn't be focused on using AI tools to help them better search or create content. To me, we're already beyond that conversation. That's table stakes. I'm talking about the use cases that really ingest your core sensitive data. If they're agentic, they take the unique facets of your organization and how you do work, and they marry those things together. Those use cases are the ones you really want to put at the very beginning of your governance framework, to make sure that if they're not going to deliver an ROI, you put them in the queue and wait, or really challenge the teams to think broader and more creatively about what they could deliver. So I think those two conversations, interestingly to me, are coming together: how do I govern all of this, and what is my ROI?
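The ROI-first gate Cancila describes can be sketched as a simple triage rule: advance use cases with meaningful ROI, ordered by how fast the return lands, and queue the rest for rework. This is a hypothetical illustration of the idea, not Dell's actual framework; the fields, example use cases and figures are all invented for the sketch:

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    est_annual_roi: float  # estimated annual return in dollars (assumed input)
    years_to_value: int    # runway before the ROI lands
    uses_core_data: bool   # ingests proprietary/sensitive data

def triage(cases: list[UseCase]) -> tuple[list[UseCase], list[UseCase]]:
    """First gate of a hypothetical governance framework:
    advance use cases with meaningful ROI; queue the rest for rethinking."""
    advance = [c for c in cases if c.est_annual_roi > 0]
    queued = [c for c in cases if c.est_annual_roi <= 0]
    # Prioritize: fastest time-to-value first, then largest return.
    advance.sort(key=lambda c: (c.years_to_value, -c.est_annual_roi))
    return advance, queued

cases = [
    UseCase("content search assistant", 50_000, 1, False),
    UseCase("agentic claims workflow", 2_000_000, 1, True),
    UseCase("multi-year supply-chain agent", 5_000_000, 3, True),
    UseCase("novelty chatbot", 0, 1, False),
]
advance, queued = triage(cases)
print([c.name for c in advance])  # highest priority first
print([c.name for c in queued])   # sent back for a broader rethink
```

The design choice mirrors the quote: zero-ROI items aren't killed, they go in the queue until the team can make a broader case.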
John Furrier
>> I was talking with Kyle from Dell, who sees a lot of customers, and in talking to Michael Dell and Jeff Clarke at the last Dell Tech World, he said, "We made good..." Jeff Clarke said, "We made some good bets, and one was engineering the right things with Ihab and the team and John Roese." And then Michael Dell made the bet, I think at Dell Tech World about four years ago. He presented the end-to-end architecture, core, cloud, edge, whatever the ordering was. So those things come into play when you think about the workflows. So when you look at the successful AI projects, they have this end-to-end workflow that's scoped. They understand it. It's part of their business fabric. And usually there's a lot of mundane toil or undifferentiated heavy lifting, a perfect wheelhouse for agents. What's your view on that? Are you seeing more that people are picking these use cases, applying some governance, security? Oh, the other bet made at Dell was that people aren't going to put their proprietary data in the large language models. That's true.
Mindy Cancila
>> That's what we have already seen, right?
John Furrier
>> Yeah.
Mindy Cancila
>> That's the main driver of that 84% who want to do generative AI on-prem, one hundred percent. And agentic is an extension of that.
John Furrier
>> So all good bets.
Mindy Cancila
>> Yeah.
John Furrier
>> How is that flowering up in terms of value proposition with AI factory? Can you share some stats or observations around what's trending in that direction?
Mindy Cancila
>> Yeah. I mean, we already have more than 3,000 customers today using the AI factory who are already getting benefit out of those use cases. They're in POC. They're moving into production. When I think about the most strategic thing on the Dell side, we've always said data is a differentiator. Generative AI loves data. It consumes it. It uses it. It creates it. When you think about what that means in terms of opportunity, these industries we've always supported, think academia, think research, think medical, these happen at the edge. They still have a data center component. They're going to leverage models that have been trained, and we're selling into those environments. But when you think about the hospital of the future, the education environment of the future, when you think about retail, manufacturing, banking, these have all already pushed things out to the edge. That means they are latency sensitive. They have the most critical, sensitive data. Organizations need and want that really focused, simple button with the AI factory. We think that's an opportunity to bring storage into the conversation. It pulls in that power and cooling conversation and all the things that we can deliver to innovate.
John Furrier
>> Mindy, it's fun to watch Dell over the years, because you guys are smart and Michael's smart. You guys have a great team. I think it was a couple of years ago Michael said, "The edge is the most exciting thing." This was a couple of years ago. I'm doing a study right now for MWC coming up around something pretty controversial called the hyper-converged edge, meaning that if you take 2G, 3G, 4G, 5G and 6G, and collapse that with Wi-Fi and Ethernet into a DGX box or a Dell AI factory, like this big of a box. We saw Jensen: here's an AI factory you can hold in your hand. That will change the game at the edge. Now you've got edge to factory. What's your reaction to that? Because if that happens, you can bring training to the edge, add inference and collapse a lot of these bottlenecks around, am I losing licensed spectrum, unlicensed spectrum, Ethernet, backhaul? I mean, this is like the holy grail if this hits. What are your thoughts on that?
Mindy Cancila
>> Yeah, and I know when you say hyper-converged in that facet, you're really leaning into the network. One of the things that we've seen, for a number of reasons, and that we believe is right, is that organizations went into a hyper-converged model because it did a few things for them that were valuable at a point in time. It simplified deployment, configuration and management. It brought facets together into an easier button. The reality is we're hitting a point with generative AI where, just as we centralize and decentralize decade to decade, the benefit of bringing those things into a singular layer where they have to scale one-to-one is breaking. So we're actually seeing a pivot from the traditional frame of hyper-converged back into disaggregated architectures. Within that, the network is doing exactly what you said: it has the potential to give us low latency, which we know is what we need in these edge-based deployments. You think about high-frequency trading, you think about the environments that really have that performance and I/O set of requirements. The network becomes really interesting again, and it's an area to really drive the growth and scale at the edge that's critical.
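The latency argument for keeping some workloads at the edge can be reduced to a budget check: if a workload's end-to-end latency budget is smaller than the round trip to a centralized data center, it has to run locally. The thresholds below are invented for illustration, not measured figures from anyone in this conversation:

```python
# Assumed round-trip latencies (milliseconds); purely illustrative numbers.
WAN_ROUND_TRIP_MS = 40.0   # edge site <-> centralized data center
EDGE_ROUND_TRIP_MS = 5.0   # on-site edge deployment

def place(workload: str, latency_budget_ms: float) -> str:
    """Hypothetical placement heuristic: if the budget can't absorb a
    WAN round trip, the workload stays at the edge."""
    return "edge" if latency_budget_ms < WAN_ROUND_TRIP_MS else "core"

print(place("high-frequency trading", 2))             # tight budget -> edge
print(place("overnight model retraining", 3_600_000)) # hours of slack -> core
```

Real placement decisions would also weigh data gravity, sovereignty and power, which is the broader point made above; latency is just the easiest factor to quantify.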
John Furrier
>> And it also validates the thesis of small language models that are domain-specific. I can walk into a retail store. Okay, John's got facial recognition. That's John. That goes to the AI factory, pulls down my model or my agent. I could program it to say, when I go to the store, oh, that coat you want is on rack 4 right now. So these are use cases that don't exist today.
Mindy Cancila
>> Right. But I think we're getting there. I mean, I've spent a lot of time recently deep in robotics, so if anybody wants to geek out on robotics, feel free to join me in that one. That'd be another interesting AI prong of a conversation. But that physical AI conversation is really that. It's the ability to bring all of these things together, the digital physical right around me as I need it and want the information. It's all about that data. The network becomes super important that we're doing the right types of things to deliver that latency.
John Furrier
>> It's funny you brought up robotics. One of our other popular series is the AI robotics series, and what it looks like is hardware nerds meet software nerds at the low levels, but as fully encapsulated AI factories. Like a car, an autonomous car is basically a rolling factory.
Mindy Cancila
>> Yeah.
John Furrier
>> It's got chips. And so when you get into robotics, the software piece becomes super cool, but that's what Dell does. You guys are bringing AI software into the factory, AKA the hardware.
Mindy Cancila
>> Yeah, that's exactly right, and actually I love what you said about robotics being the hardware and the software. I actually think the part that's the most amazing about robotics is that it has a third leg to that stool, and it's the person. It's the human. It's: how do I want to interact with this? What am I trying to accomplish? What problems can you solve for me? I dislike invoking the idea of Jetsons thinking, but the promise of all of the things that we imagined, you can really start to see those coming to life.
John Furrier
>> I did an interview with NVIDIA's lead of healthcare at GTC two weeks ago, and she and I were talking about the AI operating room, so this is just healthcare, but now that is a full AI factory enabled-
Mindy Cancila
>> That's right....
John Furrier
>> Use case. But they also talk to other hospitals in the network. So the network is also a data network, not just moving packets around. So when you have those physical needs, you need the horsepower. Again, this is why I think life sciences, robotics, healthcare and med tech are going to explode: they're going to see for the first time the ability to get compute they never had before, which will actually spawn more software. Are you seeing that on the radar now, or is that still developing, because these verticals are ripe with innovation?
Mindy Cancila
>> Yeah, in my view, it's early, but it's there. I mean, my view is that when we talk about robotics, it is a subset of AI. It's one of the AI workloads that we want to help capture and cultivate. It's all the industries you mentioned. I would add entertainment to that. I mean, we love to use and talk about the use case with McLaren. They're creating digital twins of their physical drivers. These things are happening today. You think about retail, the things they're doing to help you have a better shopping experience, that's improving their engagement with the customer. You get a better experience. I've long said that's a win-win. When your customer's happy and your bottom line is better, that's really where there's goodness.
John Furrier
>> Well, I'm super excited by the momentum, and again, I really appreciate you sharing here on theCUBE. I guess my question, last question is what are you optimizing for these days in your job? You've got a great job. You've got to look at the portfolio, squint through the strategy. You're playing 3D chess, you're watching the trends. I mean, you got a lot going on. How are you managing that? How are you optimizing your time?
Mindy Cancila
>> Yeah. I think Jeff helps us do that; our COO is very good at directing the things he'd like us to focus on. Our team has the benefit of being deep in the business, and what an interesting time with AI. I mean, all the things that you just talked about are going to require a new compute architecture and a lot more compute. They're going to generate a lot more data. That data is really fueled by Dell Technologies storage, so there are good opportunities for us in that space. And then when we get to think about the growth opportunities, things like robotics, really bringing those to life, that's where my passions lie.
John Furrier
>> Final, final question, just to wrap up on topic. Dave Vellante and I are looking at the whole enterprise market. We think this year is going to be very network-based. I love that point. But we think it's going to open up pretty big as agents start to get clear economic lines of sight on value proposition, with quantifiable metrics and governance, working backwards from governance and security. What's your view on the enterprise market right now for AI factory adoption? They don't have one hundred billion dollars of CapEx. Not everyone's JPMorgan Chase. Not everyone's got $50 billion like Anthropic announced yesterday, but they do have budgets. They have to play the cards they're dealt. How are they rolling out the factories? What does the forecast look like?
Mindy Cancila
>> Yeah. On my side, what I see is that these organizations, whether, as you mentioned, they're JPMorgan, a really large, deeper-pockets organization, or they're early in their journey, this is the year that organizations are really taking those generative AI and agentic use cases and moving them into production. We see that in the data from the research we do. We see it in the customer conversations we're having, and we know this is really the time. This is their opportunity to harness that ROI conversation and get moving on the most critical use cases to deliver business value.
John Furrier
>> Mindy, thank you so much for coming into the studio in person. I'm glad you're in New York. We can grab you and come in.
Mindy Cancila
>> Yeah, my pleasure. Anytime. Always happy to be here. Love the time with you. Happy to see Dave again hopefully soon, and really appreciate you having me.
John Furrier
>> Let's put a pin in the robotics one. Love to come back to this. One of our top areas.
Mindy Cancila
>> Sounds great. Happy to have that conversation.
John Furrier
>> I'm John Furrier, host of theCUBE, with Dave Vellante. This is our AI factory series, where we're unpacking this power source that's going to power the tokens, and it's the infrastructure that's going to bring in the new software models, AI native, and provide a lot of value and extract a lot of value and ROI. All that's happening. We're doing our part to bring it to you here on theCUBE. Thanks for watching.