We just sent you a verification email. Please verify your account to gain access to Dell Technologies World 2025. If you don’t see the email, check your spam folder.
To sign in, enter the email address you used to register for the event. Once completed, you will receive an email with a verification link. Open the link to sign in to the site automatically.
Register For Dell Technologies World 2025
Please fill out the information below. You will receive an email with a verification link confirming your registration. Click the link to sign in to the site automatically.
You’re almost there!
We just sent you a verification email. Please click the verification button in the email. Once your email address is verified, you will have full access to all event content for Dell Technologies World 2025.
I want my badge and interests to be visible to all attendees.
Checking this box will display your presence on the attendee list, let other attendees view your profile, and allow them to contact you via 1:1 chat. Read the Privacy Policy. You can disable this preference at any time.
Select your Interests!
Sign in to gain access to Dell Technologies World 2025
Please sign in with LinkedIn to continue to Dell Technologies World 2025. Signing in with LinkedIn ensures a professional environment.
John Roese, chief AI officer and global chief technology officer for products and operations at Dell Technologies Inc., joins theCUBE’s John Furrier and Dave Vellante at Dell Technologies World 2025 to discuss the future of AI infrastructure. The conversation centers on how Dell is reimagining enterprise workflows through strategic AI integration.
Roese outlines Dell’s focus on embedding AI into core business processes to enhance productivity and innovation. He emphasizes that success in AI adoption hinges not only on the right technology but on tran...
>> Welcome back, everyone, to theCUBE's Live coverage here, day two at Dell Technologies World in Las Vegas. I'm John Furrier with Dave Vellante, and all our team is here getting all the data, sharing that data. Of course, Dell has the AI factories, and we're bringing all the live action here. Seamus Jones, director for Server Networking and Sustainability at Dell Technologies here. We're going to talk about smart and scalable unlocking efficiencies in AI architectures. Alyson Freeman, innovation lead, sustainability and ESG at Dell Technologies. Alyson, great to see you, Seamus, as always, a pleasure.>> Hi, John.>> We were talking before we came on camera. Dell has a lot of impact for profit initiatives going on. You're involved in that. Seamus, we know the work you do with the labs you've got going on and all the power and cooling. We've been talking on theCUBE many times, but sustainability is not just like a checkbox anymore. It's core to the mission of our world and society, but the data center and the energy and all the tokens that are being produced are all right in the wheelhouse of the discussion. So the conversation around energy, sustainability, climate change, it's all coming together. I know it's not a climate change conference, but the data centers will have a footprint impact, Alyson, on the world. This is a big, big deal.>> Well, and the good news is that everything that we're talking about for energy use, customers want to use less energy because it costs money or because they're power-capped, and they need to have as much compute as they possibly can for the power that they have available. All of that, the natural consequence of that is reducing emissions.
So depending on how different people are approaching it, they're all focusing on the same thing, which is as much compute as possible with the power that they have available.>> Seamus, you've got to take us through the announcements here because we were talking the other day about the innovations in cooling. I interviewed a company called Soliton. They've got cooling on the chip coming. They have an amazing small 122-terabyte drive. It looks about the size of my phone.>> Yeah, it's crazy.>> So there's a lot of efficiencies in the hardware and software. Take me through the key announcements here at Dell Tech World.>> Yeah, you absolutely nailed it earlier, talking about efficiency. Sustainability in our mindset means energy efficiency of the platforms from design as well as implementation. So when we look to design the new server platforms, we announced four new XE portfolio platforms, but those four platforms are actually purpose-built for AI infrastructure. They're purpose-built to deploy the multiple GPUs which are absorbing more power. We recognize the fact that those are drawing more power, generating more heat, and as a consequence, need things like an enclosed rear door heat exchanger, which is a brand new product that we're really excited to bring to market. That product specifically can reduce the power to cool by up to 60%. It is a hundred percent heat capture, which means you're not circulating that hot air in the rest of your data center. It's capturing it, cooling it in the cabinet, and able to recycle the air down to 32 or 36 degrees Celsius, which means you don't have to chill it to a lower temperature to circulate around, which means you're not spending the energy on the air conditioning units in a secondary loop.>> So that's a big saving. Now, just to drill down and not to take a tangent, but is that a new rack? Does that work on existing racks? Take me through the product.>> It is a completely new design that we brought to market. We're very excited about it.
It's on the show floor over here. It's getting a lot of attention. It's built to be able to be attached on the back end of our racks, so that way, we can make our existing XE portfolio even more sustainable in nature.>> So it works on existing racks?>> Absolutely. Well, it works on the IR 7000 rack, which is->> Okay, it's the open compute standard?>> Correct, yeah. That's the OC->> OCI.>> OCP.>> Alyson, talk about the innovations you're seeing. You and I were in a round table with the analysts. Seamus was there as well. There's a lot of sustainability goodness coming in and converging. I want to get the story in around the coral reefs because I think that's one of those things where there is some tech involved. You guys actually have a deployment, and I was fascinated because I didn't realize it was already out there doing good. So this is an AI for good story, but it's also a techie story.>> It really is. I'm so glad you asked about this. This is one of my favorite projects of all time. So AI, as you were saying, uses lots of energy, but AI can be part of the solution for how we optimize the data centers. And you mentioned coral reefs, and that's because we did a project with Scripps Institution of Oceanography at UC San Diego, and they go take photos of coral reefs around the world and then they turn them into 3D models to monitor their health over time. Turning those into models uses a lot of energy, and we have deployed an actual proof of concept that we call Concept Astro for the first time in trying to make sure that they run those workloads with lower cost and lower energy. So it uses AI to predict how long it's going to take to create that model and then feeds in information from the energy grid, a forecast of 72 hours into the future in five-minute increments about the cost of the energy that they need and the emissions associated with it.
So you can optimize when and where you run that workload, so whether it was in Scripps's data center in California or Seamus's lab in Texas, you can do your compute when it costs less or when it has lower emissions. And we haven't seen that optimization happen yet at a data center level scale, and that will save a lot of energy.>> We're trying to give transparency within that dashboard, so that way, we can understand where we have more power available and it can be run more efficiently.>> That's right. For workloads that are containerized and are not immediately needed to be run, we can optimize when and where we run them. And in this initial pilot, we saw a 20% cost savings and a 32% emissions reduction so far.>> Stuart's the champion of that. I won't say his last name because->> Stuart Sandin. Professor Sandin.>> You're doxing him now on theCUBE. Okay.>> Oh, no.>> I'm sure he is okay with that.>> I was trying to give him credit. Sorry, Stuart.>> Stuart Sandin in that case is his name. I'm sure. I'm just being polite. What I liked about, because he had the chill San Diego look, but I was really intrigued by some of his commentary. They're using the data, and this is something that we talked about at Supercomputing, Seamus, and I was riffing in my mind in the session where the data, we don't yet know what we don't know about the data. The data, you're learning about the configurations. You come back into product design, and so there's now a new loop of data coming in. Could you share some of that dynamic? Because what I noticed in that deployment, he's learning too from the data, so he's got software built in to understand workload management, when to use the energy windows. Like at home, he's like, "Don't use your washer dryer during these hours." Common sense.>> It is just like that, and you think about how much energy a washer or dryer uses and then how much a whole data center uses and what more of an impact you can have, but you're exactly right.
That is how it's working.>> There is a huge amount of learnings from Concept Astro, and the interesting thing is that we're doing this lighthouse type of project, so that way, we can take these learnings and then implement them into our products later on down the road so we can understand what is impactful for customers, what is impactful for the environment, and what is impactful for our portfolio products.>> I want to just put a pin in that. I want you to explain what the Concept Astro is, because I think this is the bigger picture.>> Agreed.>> That's one example of it and a good one, one of my favorites too. It's a digital twin. It's figuring out the real world optimization. Explain what it is. You guys are doing it. I don't believe it. Go ahead.>> Absolutely. Do you want to start?>> Yeah. So the digital twin aspect is we have to model out what would happen if you ran a workload at the exact time that you hit submit versus later, and so we need the digital twin of the data center to create that modeling. And it's using old style AI, machine learning, to decide how long it's going to take for that workload to run, and then agentic AI to run it autonomously without a human having to get involved.>> That way, we can effectively schedule it, so that way, it's running at the most efficient time. But it's really using AI for AI's sake and to be able to have an impact on the planet.>> So all these are examples of what I see as an algorithmic, software enabled, agentic way to manage resources.>> Fair.>> Okay. So take me through your vision on that. How does that play out in your mind's eye, or practically today?>> It matters for customers who are paying their energy bills as well, but it also matters for the grid operators. We want data centers to be responsible stewards of the grids they operate in. I certainly, if there's a data center in my community, there's a ton in Texas and Virginia, I want to make sure that my home isn't impacted by this.
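The Concept Astro scheduling idea described in this exchange — forecast energy price and grid emissions 72 hours out in five-minute increments, predict how long a workload will take, then pick when and where to run it — can be sketched roughly as follows. All names, data shapes, and the blended scoring function are illustrative assumptions for a toy model, not Dell's actual implementation.

```python
# Toy sketch of forecast-driven workload placement (hypothetical; not Dell's code).
import random

SLOT_MINUTES = 5
HORIZON_SLOTS = 72 * 60 // SLOT_MINUTES  # 864 five-minute slots over 72 hours

def best_slot(forecasts, duration_slots, emissions_weight=0.5):
    """Return (site, start_slot, score) minimizing a blended cost/emissions score.

    forecasts: {site: {"price": [...], "co2": [...]}}, each list HORIZON_SLOTS long.
    duration_slots: predicted job length (e.g. from an ML runtime predictor).
    """
    best = None
    for site, f in forecasts.items():
        for start in range(HORIZON_SLOTS - duration_slots + 1):
            window = range(start, start + duration_slots)
            cost = sum(f["price"][i] for i in window)   # $ over the run window
            co2 = sum(f["co2"][i] for i in window)      # gCO2 over the run window
            score = (1 - emissions_weight) * cost + emissions_weight * co2
            if best is None or score < best[2]:
                best = (site, start, score)
    return best

# Example: a 2-hour containerized job (24 slots) across two hypothetical sites.
random.seed(0)
forecasts = {
    site: {
        "price": [random.uniform(0.05, 0.30) for _ in range(HORIZON_SLOTS)],
        "co2":   [random.uniform(100, 500) for _ in range(HORIZON_SLOTS)],
    }
    for site in ("california_dc", "texas_lab")
}
site, start, score = best_slot(forecasts, duration_slots=24)
print(site, start)
```

In practice the runtime prediction, the grid forecast feed, and the autonomous execution Alyson describes would each be separate components; this sketch only shows the core "when and where" search.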
So we're really seeing a lot of positive feedback from grid operators, that if we can help balance that load and manage this transition into using more energy in a responsible and thoughtful way, then that's useful to everyone.>> One of my bullets, my sections of the blog post I wrote on my research note coming into Dell Tech World was new usage patterns and model efficiency, but what you guys are getting at is efficiency of the systems. What's the customer saying? What do you guys see from customers? What do the early returns look like? I know you've got the lab, you've got the digital twin, Concept Astro. What are the customers, I guess I won't say struggling with, or thinking through?>> The model wars are happening. These models are just proliferating, and they're coming at such a faster speed every day that it's one of those things that the model will actually impact the performance of the system and the energy that it consumes as well. And what we're trying to do is not only look at the energy consumption of each platform and each GPU, but understand, well, some customers don't need to run that application in a time-sensitive way. For other customers, it's absolutely paramount. When AI is your business, then it's absolutely paramount that you have that framework. We want to enable them. That's the large 72 rack, right? Whereas other customers, you can run AI on CPU. Some people don't even realize that. If you have a 20-billion-parameter model or smaller, you could run AI on CPU if it doesn't matter if you get an answer within, let's say, 10 minutes. So there is a horses for courses mentality, and what we try and do is understand what those performance metrics are and then give customers good advice on when to deploy an AI factory that is a full-fledged deployment reference architecture versus maybe even just repurposing an existing system and adding GPUs to it.>> Or even an AI PC with NPUs.>> Exactly.>> Yeah.
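Seamus's "horses for courses" guidance — smaller models with relaxed latency needs can run on existing CPU servers, while latency-sensitive or business-critical AI calls for GPU-backed deployments — could be captured in a toy sizing helper like this. The thresholds and tier names are illustrative assumptions, not Dell sizing guidance.

```python
# Hypothetical deployment-sizing heuristic based on the interview's rules of thumb.
def suggest_deployment(model_params_b: float, latency_tolerance_min: float,
                       ai_is_core_business: bool) -> str:
    """Suggest a deployment tier for an inference workload.

    model_params_b: model size in billions of parameters.
    latency_tolerance_min: how long (minutes) the user can wait for an answer.
    ai_is_core_business: whether AI performance is business-critical.
    """
    if ai_is_core_business:
        return "rack-scale GPU AI factory"           # full reference architecture
    if model_params_b <= 20 and latency_tolerance_min >= 10:
        return "existing CPU server"                 # batch-style inference is fine
    if model_params_b <= 20:
        return "repurposed server with added GPUs"   # small model, tighter latency
    return "purpose-built GPU platform"              # large model regardless of latency

print(suggest_deployment(7, 15, False))   # a 7B model, 15-minute tolerance
```

Real sizing would of course also weigh throughput, context length, quantization, and power budget; the point is only that the decision is a matrix, not a single answer.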
And one of the things that's interesting about Dell Tech World, Alyson, is that this year, it's very customer zero, you guys call it. I call it eating your own dog food, drinking your own champagne, where you guys are actually making efficiencies internally at Dell, deploying and getting your hands dirty with the action, and then bringing that to customers. So I have to ask, and I think the data center is the computer. I love that metaphor. I've been using it a lot, because it is a power envelope, and edge is a computer too. They're all compute things. It's not one thing, it's a system. As you look at the future, what do you guys see as the key design areas to optimize around? Obviously power envelope. Is it token demand? More reasoning, more intelligence means more tokens. Forget prompts. Machines are going to need tokens, software needs tokens, computer vision. There's a token tsunami coming.>> We're going to need optimized utilization and a lot more cooling options to meet customers where they're at, and that's something that you work on a lot.>> Absolutely. What happens is as those token counts increase, you look at coding for example. When you're looking at Anthropic or a coding architecture, when you make an addition or a subtraction to a code, you're going to run it right back through the AI framework, so that way, you can see, did that have a good result or not? Which means that you're consistently adding more and more token count to your AI deployment. As a consequence then, you're generating more power, you have to deal with more heat, and we're trying to make things more efficient for the platforms altogether. It's the models though that are really having a big challenge in this space.>> Talk about the relationship you guys have with your customers. I know we talked a lot about your labs, which are like a playground to get workload management, understand things. Alyson, you're doing these projects, you've got these digital twins going on.
What's the customer relationship? Has it evolved? Because their patterns are changing too. They have to, I won't say... In the old world, hardware refresh was a buzzword, right? Yeah, more money coming in for Dell, more servers.>> Which we don't mind.>> Which is a good thing, but not only is it a hardware refresh, it's a system reset.>> True.>> And what Jeff Clarke was saying on stage and here on theCUBE is he was wearing a shirt that had all this token stuff on it. He's a walking token factory, but these factories, they're going to be really producing a lot of value that will consume energy, so that means the data center is not just the rack anymore. A series of racks with power per rack. I always hear X power per rack. Well, that's if you're standing up a system like Nvidia, they're racked up, but sometimes you might configure other racks differently.>> Yeah. You're talking about scoping out to half a megawatt per rack. That's a huge scale out. That's what our framework can do. The reality is that today, the power consumption per rack, there's no denying it, AI is doing some amazing things for businesses, absolutely amazing things, but it's also drawing more power to do that. And what we are trying to do is do that in a responsible way that has an impact for customers, that actually draws a good ROI, so that way they can know that, okay, if I'm going to deploy this framework, it's going to give me a result for my business that's going to make sense. A big thing about our initiatives is to drive that efficiency within the platform.>> And I think there's two different types of customers that we meet with all the time, and one customer can be both, but the people that are trying to take what they have and optimize it and bring it into the future and the people that are building something brand new, and there's very different solutions and services that we think through for those types.>> Got it. And greenfield, brownfield.>> Greenfield, brownfield.
Yes.>> Old classic sense. Right, Alyson, I want to get this in there because I know you mentioned before we came on camera that you're doing some work. You guys have an impact mission.>> Yes.>> You're committed to share that.>> Driving human progress through technology, and we have a whole AI for human progress suite of projects. Concept Astro is one of them. There's also digital twins of humans for medical research. There's digital twins that you can see here on the expo floor. Some of them even do this interview thing with Hopeworks where there's an AI version that you can talk to and use it for interviewing skills for students that are learning about AI as well.>> That's super impactful. That digital assistant, effectively.>> That's the word I was trying to think of.>> It's a digital assistant that then can interact with the students, and it's really exciting to watch.>> You guys have this sales enablement thing nailed down. I heard use cases that Jeff Clarke came and talked about on theCUBE as well as in the keynote, that you guys have cleaned your data, you've ingested it in, you got the data fabric in Dell. The unstructured sales data now has agents. Why can't the salespeople be agents?>> That's taking it to a little extreme. At the moment, what we're doing is we're using AI to enable our salespeople so that even if they're the newest salesperson on the floor, they're some of the smartest.>> I wanted to ask Jeff Clarke if he thinks his job will be replaced by agents. I couldn't get it in. I would have loved to see his response to that. He's so clever; he's always got a quick comeback. All right, now summarizing. In all seriousness, the AI factories are coming. They're here. We've got 3,000 of them. I didn't even know that stat was legit. That sounded like I didn't know that you were out there, but they're out there.>> And more every day.>> And more coming. And it's data in, tokens out. That's the model. That's what's happening.
What do you guys see as opportunities where people watching could get involved? How can they get involved in their jobs? How should they think? If they're going to be capital budgeting around energy in the data center or edge, the whole mindset's changed. It seems like a new mindset on how to build and innovate has emerged. What do you guys see as opportunities for folks? How they engage with Dell, how they engage with their company, their own innovation plans?>> I can take that.>> Go ahead.>> One thing that I see immediately is that customers need... We spoke with dozens of customers this week. There is an initial need today to be able to run, deploy, and understand proof of concepts. Because customers aren't going to dip their toe into a large deployment for AI, which can be very costly and without an expected ROI. And really, if you do POCs in the cloud, you can't always translate that to an on-prem deployment that's productive and gives you the same ROI. So what we're trying to do is give customers a route to run POCs on-premises, either in our data centers or in theirs, through our customer solution centers. We can actually do that for them, with them, and we can ensure that, look, you can translate this to production. It's that translation from POCs to production that's hard.>> I think that's a huge opportunity, Seamus, because AI could help there too.>> Absolutely. Yes. Yeah.>> The physical AI convergence with digital has happened. You mentioned digital, so I love this topic. The world of physical assets, and I use that word intentionally because it could be money, it could be labor.>> Time.>> It could be factories like manufacturing, it could be theCUBE. It could be coral reefs.>> Yeah.
The number of times I say AI can help with that, you'd be shocked, in everyday life and at work.>> John, we could be being interviewed right now by an AI digital agent.>> During John Roese's keynote->> A digital assistant.>> It was suggested that we're in a simulation. Guys, thanks so much. Alyson, get the final word. What's the coolest thing you're working on right now that you're most excited about? What's the project? You already talked about the coral reef one. Chuck that one, as it's already been said. What's your second most exciting thing that's going on for you?>> Well, it's sort of the same one, so it's how do we grow Concept Astro? How do we use AI and digital twins to optimize things within the data center? This was workload scheduling and management and bringing in grid insights. What can we do next? Is it deciding what should go on PC workloads versus server workloads? Is it how do we optimize overall cooling for the whole data center? The possibilities are endless.>> Seamus, what's the coolest thing you're working on?>> The enclosed rear door heat exchanger, liquid cooling options. Just having more efficiency within the platforms. We're already doing planning for our 18th generation of servers, and when we look at the development life cycles of those, there are some really fun things to come.>> Yeah, you guys are moving fast as a company, and I like the North Star. Make it easier to consume and deploy the AI and bring more intelligence, smaller, faster, smarter.>> Absolutely.>> That's a good thing. Alyson, Seamus, thanks for coming on theCUBE.>> Thank you.>> Okay. I'm John Furrier for Dave Vellante, Savannah Peterson, Kristen Martin, Nicole, we're all on the ground all over the place here at Dell Tech World, getting the stories that matter for you here in theCUBE in real time. Thanks for watching.