Join Dan Roberts, co-founder and co-CEO of IREN, as he discusses the evolving landscape of AI factories in an engaging session with John Furrier of SiliconANGLE Media. This discussion is part of theCUBE's AI factory series at Media Week, held at NYSE CUBE studios. It delves into the realm of distributed computing and data centers in modern technological ecosystems.
Roberts shares expertise on the rising demand for AI factories, highlighting how these entities redefine the traditional data center model. They discuss the transition from Bitcoin mining to AI-driven solutions, explaining the growth trajectory of AI applications and the integral role of power and innovative data center designs. Insights from analysts at theCUBE Research further illuminate the strategies and advancements in AI infrastructure enhanced by companies such as Dell Technologies and NVIDIA.
The conversation also explores the adaptive approaches and strategic aspects of deploying AI infrastructure, emphasizing sustainability and efficiency. Roberts asserts that understanding customer needs, coupled with a long-term sustainable development strategy, is crucial for success in the AI factory domain. Further discussions examine the rapid expansion of AI use cases, the synergy between energy sources, and the diversification of cloud-based AI services.
Dan Roberts, IREN
In this theCUBE + NYSE Wired: AI Factories – Data Centers of the Future segment, theCUBE’s John Furrier sits down with Dan Roberts, co-founder and co-CEO of IREN, to unpack how AI factories are reshaping the data center model into a new asset class. Roberts details IREN’s shift from Bitcoin mining to GPU cloud at scale, highlighting a fully vertically integrated approach that owns land, substations, grid connections, buildings and servers to deliver resilient, low-cost service. He shares timely metrics on surging demand (“next to insatiable”), the rapid ramp ...
>> Hello, I'm John Furrier with theCUBE. We are here at our NYSE CUBE studios overlooking the Stock Exchange here in New York. And of course, we've got our Palo Alto studios connecting Silicon Valley and Wall Street, tech and money together again. We are here as part of the inaugural kickoff of our AI factory series during Media Week, an ongoing series. We feature the leaders in AI factories, the future of the data center, as distributed computing continues to expand and large-scale systems power it. Daniel Roberts is here, co-founder and co-CEO of IREN, doing extremely well on the public markets right now. Stock's at a 50-week high this week. Congratulations. Daniel, thanks for coming on theCUBE.
Dan Roberts
>> Thanks for having us, John.>> So you guys are really a great illustration of an AI factory business. You're also a great illustration of a data center business, because you have a lot of data centers, a lot of power. You've done some Bitcoin applications, but now AI factory and the energy story is, obviously, now understood. Stock prices almost literally straight up in the past eight months. Business is good. Everyone's talking about more power, data center stories, front page of all the top news outlets. This billion going to this state, another billion going over there, 100 billion going to OpenAI from NVIDIA. I mean, really, it showcases the fact that data center demand is at an all-time high. I mean, you got to love that.
Dan Roberts
>> Oh, look, it's a good time to be in the sector, absolutely. I think it's both sides of the coin. It's the demand side. We're seeing that continue to escalate, but also on the supply side, because what we're dealing with in AI factories is fundamentally a new asset class. So legacy data centers based in metropolitan areas just aren't geared up for the level of power required, the rack density and the architecture.>> Yeah. And we're seeing the enterprise market getting ready, and it hasn't fully opened up as we thought it would this year; we think maybe next year. But we see massive growth in the hyperscalers and then the rise of the neoclouds, or AI as a service, GPU as a service. Anywhere anyone can get horsepower or any kind of compute for these applications for training and inference, those are the top applications. How has your AI factory changed? What was the moment you knew, okay, Bitcoin's great, Bitcoin miners, they're bounded by power, AI is bounded by power. What was the moment that was like, "Okay, AI is here?" And then when did it click in on the AI factory piece? Take me through that mindset, because that's a huge business decision.
Dan Roberts
>> Yeah, look, it is. But seven years ago, when my brother and I founded this business, it was on the premise that the digitization of society was coming. We were moving into the cloud as humans and we're seeing all these exponential digital adoption curves where things are going zero to one overnight. So Bitcoin was worth nothing 15 years ago. Today, it's a $2 trillion asset class. Two years ago, AI was confined to the hallways of PhD researchers. Today, it's the next humanity-defining step change. I think at its fundamental core, it's really hard to build out data centers at scale, yet these end addressable markets are going vertical in terms of their appetite for compute. That's where we come in. We've got an enormous amount of data center capacity.>> Dan, it's funny, we just moved into a new studio here, opening up a new hub in New York. Obviously, having the access point in New York. We hear things in the hallway here at the NYSE, things like, "Maybe we should turn the data center into a commodities trade." Energy's there. So kind of like this is an asset class. I mean, there's already discussions of, "Maybe if I pre-buy the energy now from the data center or buy GPU cycles now, I can resell them." That's New York thinking. You guys are there with it, it's almost options. It's option pricing.
Dan Roberts
>> Yeah, it is. Because we're seeing that option be cashed in now. We're seeing next to insatiable appetite for GPU cloud capacity. We've expanded from 1,900 servers at the end of June and we've now got 23,000 either operating or being installed in the coming months, and we've got capacity to 3x that in short order as well.>> Talk about the data center topology. How many data centers, can you share a number? And what are you guys doing that others aren't? Because the old data center model was a bunch of REITs, real estate play, hosting providers and then cloud comes in, then they need capacity. And then you start to see the real systems design of a data center as not just a building with power and putting racks of servers on it. It was like, "Whoa, let's design this as a supercomputer." That is now obvious, but it wasn't seven years ago. You guys saw that. What do you guys do? Take us through the mechanics of your business.
Dan Roberts
>> Yeah. So I think the first thing we did was lock up a lot of power. So we've got almost three gigawatts of secured land and power, and we've got about 800 megawatts of operating data centers today. And because we've got those data centers operating, we're actively swapping out Bitcoin mining ASICs and installing NVIDIA GPUs every day at the moment to service AI training, AI inference, and even, to your earlier point, we're starting to see the emergence of some enterprise demand via our partners.>> It's interesting, you did all the hard work up front. Now, everyone is spending a lot of dough to do that. You guys have it.
Dan Roberts
>> That was a tough few years, but yeah, right now, we're in a really good spot.>> Tailwinds, an understatement. What's the role of the Dell AI factory? Because one of the things we're seeing, first of all, Jensen Huang coined the term AI factory, I think two years ago at GTC. And then this year, he said KV cache is the operating system, which was really a sign that networking is a part of it, so it's really in the weeds there. And then Dell co-opted that. So Dell used to sell servers, they still do, but now an AI factory is just a lot of servers and there's packaging, there's all kinds of things that they've done. What was the role of NVIDIA and Dell Technologies in your deployments and your success?
Dan Roberts
>> Yeah, look, Dell's role is far more broad than just the servers these days. They've been a really valuable partner on the AI factory side. So their air-cooled variant, the XE9680, has a lot of practical benefits over other OEMs, so we've had really good success with that. But as we move into liquid-cooled systems, the way they view the data center is similar to us. The data center is the unit of compute, and it's about how the whole data center interfaces together. So it's the servers, it's the network, it's the cooling, it's the GPU and how all that comes together in terms of reliability, resiliency. So we're in the process of rolling out an NVL72. So these are the Grace Blackwell 300s from NVIDIA, in partnership with Dell at the moment.>> Yeah. Well, we interviewed Michael Dell. We've been covering Dell for a long time, but a couple of folks years ago, Ihab and John Roese, saw this well, and they talk about it the same way you do. It's like a factory is a system, and then there's a collection of kinds of software on the scale-out servers and systems and fabrics and storage fabrics and network fabrics. But what's interesting is that they go by OCP standards. And what I noticed with your business, I saw the video of the time progression, when you do the swap-outs. It looks easy. I mean, I know it's not easy, but it's not hard. I mean, it's not like you're tearing the building down to the studs or, in this case, completely ripping it out. So talk about that switching cost, because this is where I think the AI factory plays well, because you can just drop it in. I know you guys do that. Take us through what that was like. Scope that plan, that transition to AI factories.
Dan Roberts
>> So we built multipurpose data centers, spec'd for AI factories from day one in terms of the end rack density levels. So for us today, we've got 160 megawatts of data centers ready to go for these AI factories. So we're actively swapping out those mining racks, installing Blackwells and a few AMD chips as well.>> And then the customer side. Okay. So you've got the factories up and running. It abstracts out the complexities. When you see factories, you think output value. What are the customer value points on the AI cloud servers? What are some of the things? And obviously, your door's probably being knocked down. What are some of the examples?
Dan Roberts
>> Yeah. And the thing I love about the term factories is every other prior industrial revolution has been defined by factories and the workers in it. And you come to the fourth industrial revolution, which is all about leveraging human intelligence. And within these AI factories, the workers are the GPUs, they're the clusters, and they don't clock off, they don't tire, they work, they can scale up to millions, and that's how we're scaling human intelligence. And the ability for us to do that is super exciting.>> And the performance side, you have to hit that SLA, you got to make sure you got good performance. How are those factories working? What's some of the result? Can you share some anecdotes or stories?
Dan Roberts
>> Yeah, so this is where we're relatively unique. We're fully vertically integrated. So we own the land, we own the substations for the power connect. We own the grid connection. We own the buildings all the way down to the server level. And what that allows us to do is to provide a very resilient, low-cost service to customers, because we don't have intermediaries. We don't have third-party co-location fees that we need to pay. We don't have to get on a call to a co-location partner under their SLA. We have boots on the ground, people in the data centers, our customers have direct access to those people, and it creates a really seamless, reliable service for it.>> And they get what they want. They get what they have during the-
Dan Roberts
>> At the end of the day, that's the most important thing.>> On the business model, one of the things that I love about your logo, it's green. Green is money, stock price is doing good. You guys are healthy on the business side, great momentum, so congratulations. But the sustainability piece is huge. Can you explain that portion of your business model?
Dan Roberts
>> Look, 100% of all power we've consumed in the last seven years, it's been from renewable energy sources. So it's been an absolute defining part of our business. And when you look at the projections for this data center industry, McKinsey forecasting another 100 gigawatts of data center demand over the next five years, you need to do that sustainably. And our view is go to the source of low-cost renewables and monetize that into GPU cloud.>> How do Dell and NVIDIA translate to that? Because people who aren't informed, they think that these AI clusters are power suckers. They just suck all the power out of the Earth. What's the partnership with NVIDIA and Dell around aligning with the eco-friendly approach?
Dan Roberts
>> Look, a lot of it comes down to efficiency. It's efficiency of the compute layer and it's efficiency of the ancillary power that you need to consume and the level of innovation, the progress they're making around that, the ability of us to integrate that into our data centers and keep that efficient is super important.>> Now that you got the nice AI factories going and the success of the business model, what's next? What's on the horizon? What are you guys looking at? You're vertically integrated, you control your own destiny, you got a lot of power, which is in high demand. Love that story. You got the eco-friendly, got the sustainability. What other AI initiatives are you guys looking at because you got the footprint and you got everything you need?
Dan Roberts
>> So you may have put an unintended pun in there, because we're developing liquid-cooled data centers called Horizon. Horizon 1, Horizon 2. They're capable of supporting 19,000.>> A lot on the horizon.
Dan Roberts
>> There is a lot happening, but the immediate focus is filling up the current 23,000 GPUs that we've ordered that can scale up in the very short term to 60,000. And then we've got our liquid-cooled facilities coming online for the end of this year capable of supporting the GB300s.>> What are you guys doing that's different than others? Because when you look at some of the things that are happening, you see people spending all this money and they're buying all this gear, they build it, they will come. That movie, Field of Dreams, is kind of a cliché. It's been around for a couple decades. There's a lot of people saying, "Whoa, whoa, they're buying a lot. Where's that capacity?" Most of the hyperscalers people know that they can go in and get a good nine-month projections, but people are worried that the risk may fall on the real estate or the vertical integrated. And so a lot of people think there might be a little bit of a bubble on some of those over the big money CapEx spends. What's your reaction to that? Because you guys are, again, you're controlling your own destiny and you own everything.
Dan Roberts
>> And I think that's key, because we can throttle up with demand, we can throttle back when demand slows down. But what we've seen noticeably in the last six weeks is demand going up, and it goes across the mid-market segment of the market. So AI labs, AI companies that are scaling up, a level of inference, enterprise via our demand partners. But what we're also seeing is appetite at the hyperscale level.>> And the thing about the news this week, or last week, that NVIDIA has a $100 billion investment with OpenAI, and all the data centers being built, they're breaking ground. They're building, so it'll take some time. And I was asked, "What does that mean?" I'm like, "Well, it means it's demand." So talk about that piece, because you guys are up and running. How long do you think that demand will be? I know you probably can't say, as a public company, without any forward-looking projections, but just your gut feel, the demand curve.
Dan Roberts
>> Look, right now, sitting here today, demand looks exceedingly strong. It's very robust. But at the end of the day, to forecast out a year, three years, five years, we don't know what's going to happen. This industry is so fast moving, but at the fundamental heart of what's happening in this sector is this dislocation between the real world and the digital world. So as I mentioned before, all these digital exponential growth trends, but the ability to build new data centers, to get more power online, it's very hard.>> Yeah, that's why I started with you guys being in the data center because even if demand might shift, say inference and reasoning changes or some sort of new architecture, there's still need for a data center. They don't go away. So the power and the physical plant is the asset. Did I get that right?
Dan Roberts
>> Spot on. And case in point is the ability for us to bootstrap this business with Bitcoin mining where we sold the Bitcoin each day, it was a cashflow play. And today, a higher and better use case for those data centers is emerging, so we're swapping out those ASICs, replacing them with NVIDIA GPUs to provide that GPU cloud service.>> Okay, so you guys got the nice playbook. What's different with you guys? How would you describe that?
Dan Roberts
>> Look, there's probably a couple of things. One is, I guess, we came early to the sector and we locked up a lot of power and land, so we've got an enormous amount of growth. So even those 23,000 GPUs I referenced that we've got on order or operating, that's less than 2% of our entire footprint, so enormous room for growth. And the other aspect I would say is our ability to execute and deliver has been noticed by both investors as well as customers. We've never missed a milestone. If we say we're going to deliver something by a certain date, we at least hit it, if not exceed it.>> You mentioned you started this company with your brother. How did that go? How's that going?
Dan Roberts
>> That's great. Yeah.>> It's great that brothers started the company. I started a company with my brother years ago.
Dan Roberts
>> Yeah. No, look, it's been a fantastic journey over the last seven years and we're loving it.>> Who goes in and does the cabling? That's what I want to know.
Dan Roberts
>> Fortunately, we've got people better at cabling than us today.>> Early days, you were probably in there. Again, final question. As you look at the factories, what we love about this is that we think it's a step-function change. I like the asset class angle, but the demand and the apps coming are still waiting. I mean, it's almost like the innovation is the infrastructure right now. That's why these data centers are so key. How do you see the developers and some of your use cases with customers? Are they chomping at the bit? Give me more. What are some of the patterns? Can you share any patterns that you're seeing with the market? I saw Sam Altman gave a talk and he's like, "Well, when we started ChatGPT, we didn't think it was going over here, but then people were just talking to it, and then we realized that was the use case." This is going back to 2018.
Dan Roberts
>> Yeah, there's probably two ways I'd think about this. One is the demand that we're seeing in real time in terms of the customer conversations right now, and this can change quickly. But right now, it's exceedingly robust and there's a lot of interest. The second one is, when I step back and look at this sector, AI generally, we're still in the first innings. We're talking about large language models. Have you tried to render a video or an image? Look at the time it takes to do that. That is an indicator of how much more capacity we need. We should be able to click a button and render an image in two seconds. Instead, we're sitting there for 20 or 30. So what does that need? That needs more GPU compute, that needs more AI factories. And as we get better at it, as we make it faster, as we make it cheaper, you unlock more demand and you get that self-fulfilling growth.>> Daniel, you are an AI factory builder. What's your advice for folks out there that need to do this for their business, whether it's a data center for an enterprise or an enterprise trying to figure out how to take their existing data centers? If they still have them, and most still do, most of the people with data like the banks and others. But they also have to work with services. So what's your advice, as an experienced factory builder, what's your best practice? Some people are like, "This is maybe too much CapEx." What's the reality?
Dan Roberts
>> The reality is never take a shortcut because it'll come back to bite you. Build for the long term, build sustainably, build with a long-term perspective, because that way, you underwrite a reliable platform. The second thing is customers are number one. Listen to them. Provide the most reliable, resilient, cost-efficient service, and you should do well.>> Well, thanks for coming in. Appreciate it.
Dan Roberts
>> Thanks, John.>> We got a factory builder here on theCUBE. The AI factory wave is here. As the factories get built, there'll be large, medium, and small ones, but they will be pumping out value tokens. Sustainable energy usage, a key part of the design. Again, these are large-scale supercomputers. We are in the supercomputer era. Of course, we're doing our part to bring that data to you. I'm John Furrier with theCUBE. Thanks for watching.