In this theCUBE + NYSE Wired: AI Factories – Data Centers of the Future interview, theCUBE’s Dave Vellante sits down with Mindy Cancila, VP of Corporate Strategy at Dell, to unpack why on-prem and hybrid AI are gaining momentum and how data gravity, latency, security and ROI are shaping deployment choices. Cancila shares fresh insights from Dell’s research, including a notable data point: 58% of surveyed large enterprises report moving from POC to production – paired with a more precise focus on generative AI outcomes. The discussion explores where customers are seeing wins today (coding assistants and packaged capabilities) and why the next phase is about treating AI as a thought partner across data center, edge and devices.
The conversation dives into enterprise patterns – fine-tuning models and scaling inference, emerging agentic workflows and the architectural realities of compute, storage, networking and multi-cloud. Cancila outlines Dell’s approach: simplify end-to-end complexity with solutions and services, partner broadly (Dell isn’t trying to be an LLM provider), and lead with data – emphasizing quality, secure collection and placement via platforms and investments spanning PowerScale, ObjectScale, edge/NativeEdge and services. They also touch on hybrid/multi-cloud operations (sometimes called “super cloud”), how model churn raises adoption complexity and how Dell’s AI Factory efforts bring evaluated models and integrated stacks to customers. If you’re mapping AI factories to enterprise outcomes, this conversation offers practical signals on infrastructure choices, success metrics and what’s next.
Mindy Cancila, Dell Technologies
Mindy Cancila of Dell Technologies explores the future of artificial intelligence deployment strategies. Cancila, vice president of corporate strategy at Dell Technologies, joins Dave Vellante of SiliconANGLE Media at the New York Stock Exchange to discuss AI Factories and the evolving data centers of the future. Drawing from a wealth of experience as a Gartner analyst, Cancila provides valuable insights into AI and hybrid cloud opportunities from both market and strategic perspectives.
In this episode, Cancila shares expertise on how enterprises navigate the landscape of on-premises and hybrid AI deployments. She details her team's research findings on AI utilization in major organizations, revealing that 58% have successfully transitioned from proof-of-concept to production. The conversation addresses the intricacies of AI integration and the dynamics between data proximity, security challenges, and return on investment considerations.
Key takeaways from the discussion include insights into how enterprises strategically leverage AI to drive innovation and efficiency. Cancila emphasizes the importance of tailored solutions and partnerships in overcoming the complexities associated with AI technology, highlighting Dell's efforts in simplifying these processes for organizations. The evolution of AI from traditional models to modern applications such as agentic AI is highlighted as a promising pathway for future adoption.
>> Hi, everybody. Welcome back to the New York Stock Exchange. We're here at the Buttonwood podium overlooking the options exchange. My name is Dave Vellante, and you're watching AI Factories, Data Centers of the Future, and we're thrilled to have Mindy Cancila here. She's the Vice President of Corporate Strategy at Dell, and a longtime friend and a great market watcher. Mindy, awesome to see you, thanks so much for coming in remotely to our studio. All right, so let's just get right into it. We've been talking all week here about the trend toward on-prem AI, what's happening in the on-prem stack. So, you are a former Gartner analyst, you look at the market, and of course you're in strategy now. You've looked at the market for cloud as an analyst, and you're definitely helping shape Dell strategy now. What does that look like? What's the on-prem and hybrid AI opportunity look like? What do deployments look like? What's the migration, bringing data or AI to the data? What are you seeing out there?
Mindy Cancila
>> Yeah, well first, it's really great to be here. As you mentioned, I love these emerging technology conversations. Really enjoy talking with customers and understanding where and how they're going to intersect those technologies, how they'll think about them, and really, how they can use them to drive value. As you mentioned, I started back as a cloud analyst and so I spent a lot of time with enterprise organizations just seeking to understand, what exactly are they going to do with public cloud and how are they going to think about it? How will they think about their on-premises data centers and what will evolve to an on-prem cloud-like architecture? And what I found as I worked with so many of those customers, is that cloud, at least, was a top-down mandate. It was like, let's get some things in the cloud, let's check some boxes, and then we'll put in place this decade-long strategy to get to where we think we want to be. When I think about AI and I talk to customers, it looks completely different. My team, you mentioned corporate strategy within Dell, we do a lot of external research and we spend a lot of time talking with customers. And in a recent survey that my team had fielded, this is global, it was large enterprise organizations, we found, with a pretty large N I should add, a large sample size, that 58% of survey respondents have already successfully transitioned from POC into production. That is a staggering number, 58%. And when we think about that survey, because this is one that we're doing year on year, and we're actually doing it more frequently because of how quickly the market is moving. Last year, customers had said they were 64%, so you take two things from that. One is that it looks like it's slowed, but not really, because what we're finding is there's a bit more maturity in what they understand. A year ago, a lot of customers when you said generative AI, they talked about the traditional AI work they've been doing for decades.
Whereas now, they're getting much more prescriptive and what they're doing is actual generative AI. Not only that, they're talking more about what success looks like. It's one thing to transition something from POC to production, it's entirely different to transition it and to say, "That worked and that delivered what I wanted it to." And from what I see, customers are getting more prescriptive about defining and measuring that as they accelerate that adoption. And I know we'll talk more about success factors and how to make sure you do things right. But at the intersection of this multi-cloud and AI conversation, there's a natural connector, where organizations are looking across their data centers, their public and hybrid cloud environments, and trying to figure out, what are the types of workloads I need to run, and where? They're getting more specific and purposeful and we just know a lot more than we did when many started down this journey a decade back. So now, that conversation is really progressing into on-prem. Why on-prem? And it really comes back to the data. You'll hear me say that a lot, the data, the gravity of it, the latency associated with the things that they need to do, and then it'd be remiss of me not to tack on long-term security challenges, we're always concerned with security. But then that ongoing cost conversation, which again, we see really pivoting from just, what does this cost, to what am I actually going to get out of this from an ROI perspective?
Dave Vellante
>> So let me ask you a follow-up, Mindy, if I can, because you see the MIT study, it says everybody's failing on AI, and that 95% figure I thought was overstated. But when I talk to customers, they're having great successes, which somewhat aligns with your data. I don't know about the exact percentages, but they're having real successes in coding. The productivity in coding use cases is off the charts, a lot of sales and marketing use cases as well, but what are you finding in the customer data in terms of where they're having those successes?
Mindy Cancila
>> Yeah, in terms of what use cases?
Dave Vellante
>> Yeah. Yes.
Mindy Cancila
>> When I look at the AI use cases and I think about the things that organizations are really advancing, it's sort of like in the early days with cloud conversations, we would start out and talk about software as a service. In terms of degree of complexity, those were some of the easiest places to get started. Coding assistants and the things that are packaged-in capabilities look very similar. These are the types of workloads that you can naturally take advantage of, something that's embedded within an application, and shift the way that you're getting value. Basically with a coding assistant, you're enabling your organization to get started with a much more complete and successful set of capabilities. Those earlier in their careers tend to really love and learn from that, but that's very different than when we look at the broad set of use cases where I believe we're really going to see value, and those really come back to the data. AI from my view, is all about the data. AI loves data, it loves to consume it, it loves to create it. And so when you think about the applications in your environment, they solve different problems and they each have unique architectural requirements, when you think about the compute, the storage, the networking implementation. And so what we find is, one size is not going to fit all. Much like you don't have one application that solves every business problem, you're not going to have one AI use case that solves every business challenge or thing that you're looking to harness from an advantage perspective. And as those models continue to evolve and you think about the use cases tied back to specific data sets, we're going to evolve from training into inferencing, and that's going to happen at the edge, on PCs, on smaller devices. And so while we've been on this specific path with generative AI, here comes agentic. And agentic, these agentic workflows, they're going to happen in the data center, they're going to be all around you.
When we talk about AI, we talk about thinking about AI as a thought partner. So going back to your question in coding, initially you're using that specific generative AI set of capabilities to help you create code faster. You might create a prompt and plug it in and ask it to do something for you. Now, we're evolving from AI being sort of a tool that would do something very specific and targeted that I defined, into leveraging AI as a thought partner. And that thought partner is going to be in your data center, it's going to be at the edge, and it's going to be all across all of your devices.
Dave Vellante
>> Well, and that deal that Intel and NVIDIA just struck is, we think, an accelerant for on-prem and enterprise AI. You've got a massive X-86 install base, and it just got an on-ramp into AI. So the public cloud obviously has dominated, startups and enterprise AI initially, and certainly IT momentum and growth has been there for over a decade. What are the drivers that are getting customers to really start deploying private AI, on-prem AI, whatever you want to call it? They're moving AI to the data, they're saying they don't want to move data into the cloud, at least for certain data. Why? Is it cost? Is it performance? All of those above? What are you hearing from customers?
Mindy Cancila
>> Yeah, I hear that it's all of the above. And I mean, when I think about what we learned and what we went through in our own experience, first, I think it's super important for organizations to start with the question you asked. What am I trying to accomplish? And that's not a finite, discrete answer, there's going to be multiple answers there. So again, as I was saying, one size isn't going to fit all. When you think about the things that you're trying to achieve, most organizations are trying to further their automation journey. They're trying to take a set of capabilities and drive a next layer of automation on top of that to drive efficiencies, performance gains, productivity gains. And I feel like that was sort of the initial starting point for most organizations, but we're moving quickly far beyond that. As a matter of fact, I would wager if organizations aren't doing those things today, they are quickly falling behind. We're now moving into the mode, as I was saying, where AI is becoming this thought partner. We're moving from doing things more efficiently that we already did, that traditional take the automation steps that you're doing and automate where it makes sense to automate. This is like, how do I think about leveraging this, this being AI, as a way to drive incremental revenue? How do I think about AI as a way to move the needle in areas to solve problems I never could have before? My favorite use cases are when you hear these real feel-good ones in the medical industry. You think about healthcare and the things that they're able to find faster that they maybe never would've been able to have found before.
But as we move beyond academia and core research and those types of use cases into mainstream enterprises, we're starting to find it in the core traditional bones. Even within Dell, I look at the use cases where we're applying AI: sales, software development, we talked about coding assistants, supply chain. Those core use cases, these are the bones of our company. We're leveraging generative AI and looking into agentic for lots of different cases to actually solve problems we were never able to solve before, in a far more efficient way. So that use case conversation is quickly evolving, and to me, agentic is just an accelerant on a whole different scale.
Dave Vellante
>> Well, that may be instructive for my next question, which is, what does that AI stack on-prem look like? You've got the AI factory with NVIDIA, and so you've got the hardware, you've got the CUDA stack, but you're doing a lot of stuff internally as well. So I'm curious, what does that on-prem AI look like in a steady state? Compute storage, networking, the data stack, governance, security, you've got applications on top. In a steady state, Mindy, how do you envision that?
Mindy Cancila
>> Yeah, I would say AI is super complex, and this isn't just an application for leveraging a data set, it's all of these things stitched together. I go back to my days in the data center, and this is sort of that best case, solutions that really come together. It's compute, it's storage, it's multi-cloud, it doesn't live in a singular data center, it spans across, and that imposes unique networking requirements. On top of that, we have models, we have automation. It's not happening, as I mentioned, just in a data center, it's at the edge, it's on personal devices. That really begs for two things from my perspective. It's solutions that more easily stitch those things together, and services, which in our case are learned from what we've done in our own journey, that help bring all of this together. Again, it starts with the data. But when you think about the list of things that I just went through, Dell is one of the few vendors with breadth in the industry across all of those. And Michael has publicly said, "We're not looking to be a large language model provider." You don't do a lot of AI without models, which means we're going to forge partnerships at all layers and bring the breadth of our portfolio together with those partnerships. So, it's really intended to remove the complexity, learning from the things that we've done in our own journey. Again, it starts with the data, which means you look at the things that we're forging around the AI data platform. It's not about the volume of data, it's about quality data. So, how do you get that placed where you can process it and do it in a secure way where you can collect it? A lot of that data sits on Dell Technologies storage and you see us really investing in PowerScale and ObjectScale and driving better compression and performance for both AI workloads and non-AI workloads. And then bringing in the breadth of portfolio with things like edge and NativeEdge, and then of course wrapping all of that with services.
Dave Vellante
>> I'm glad you brought it up, because I've got a two-part question. You think about hybrid and multi-cloud, we sometimes call it super cloud. It's aspirational in a lot of cases, but sometimes hard to operationalize. We had Walmart in earlier this week and they have this kind of innovative triplet model, but they've had to do it, and it's Walmart, so they have massive capabilities like JPMC. We saw JPMC at Dell Tech World this year. They can actually do that engineering work to make hybrid and multi-cloud, super cloud work. What are you seeing for adoption patterns and how is Dell making it simpler, especially in the context of AI? And the second part of this question is, the future of AI factories. The future has to be multi-cloud, hybrid cloud. You're not going to just do it in one place or the other. So, how do you see that all playing out?
Mindy Cancila
>> Yeah, so on your first question, I mean, I think it's important to start to segment the types of customers. Again, we're a big company, so let me just start because I really want to focus on enterprise, but it would be remiss of me to not start with training. Dell is an infrastructure provider that sells gear to a lot of large types of vendors, whether they're neoclouds or global companies at massive scale, these training providers, GPU-as-a-service providers. And so, that training market is one where we're very present. That set of use cases of what they need because of the scale looks different than when you look at an enterprise, and it's where you were going with a JPMC or a Walmart. When we think about those types of customers, those large enterprise organizations, we know that they're going to do some training, but at scale, across all of the types of generative AI workloads they run, they're more likely to take a model, do some fine-tuning, and then do a lot of inferencing with it, and that's really the enterprise-class portfolio. We're in those earlier days beyond enterprise, think SMBs and small businesses. We're really looking at the enterprise part of the market. For those types of customers, getting into your second question, the types of things that I see and hear from customers that they need to make that easy, they need solutions that have already been brought together. I mean, gosh, we've talked about model evolution, so I'll just pick on that one. These models are moving so quickly. It reminds me back to my cloud days when customers used to say, "There's a new service every time I turn around and I don't know that I want to move to adopt these new services," or, "There are so many that it's complex for me to know where to even get started." A lot of the work that we're doing within AI Factory is learning from the things we've applied ourselves. We've done a tremendous amount of model evaluation.
We do that in our technology teams and our engineering teams, and we're taking those solutions and bringing them to our customers in a more streamlined way. That's what the AI Factory is really all about.
Dave Vellante
>> Exciting times, Mindy. I can't wait to see the next chapter. We'll see you soon around the block. Looking forward to catching up next month in Austin. Thank you so much for coming back to theCUBE.
Mindy Cancila
>> Great to see you as always. Cheers, Dave.
Dave Vellante
>> All right, you bet. And thank you for watching AI Factories, Data Centers of the Future, brought to you by Dell and NVIDIA. Thanks for watching. We'll be right back after this short break. TheCUBE plus NYSE Wired.