Sid Nag of Tekonyx participates in a discussion at theCUBE's New York Stock Exchange Studios. The event, titled "AI Factories - Data Centers of the Future," explores the transformational shift in data center architecture due to advancements in AI. Hosted by John Furrier, this conversation bridges insights from Wall Street and Silicon Valley.
The video examines how AI factories revolutionize data centers with supercomputing capabilities. Nag, drawing on his experience as a former Gartner analyst, shares insights on current trends and the challenges businesses face in adopting AI technologies. The discussion, anchored by theCUBE Research, highlights an evolving landscape in which enterprises navigate a complex mix of on-premises and cloud-based AI environments.
Key takeaways from the conversation include the importance of AI factories in creating scalable, intelligent environments that drive efficient outcomes, as explained by Nag. He contrasts traditional hardware-focused approaches with the contemporary need for integration and simplicity in AI infrastructure. According to Nag, while companies such as Dell and NVIDIA continue to innovate, it is crucial for chief information officers to focus on seamless, practical AI processes that align with their strategic goals.
Sid Nag, Tekonyx
In this interview from theCUBE + NYSE Wired: AI Factories – Data Centers of the Future, Tekonyx founder Sid Nag joins theCUBE’s John Furrier at the NYSE CUBE Studios to unpack how AI factories are reshaping enterprise infrastructure. Drawing on his decade-plus as a former Gartner analyst and hands-on experience across major vendors, Nag explains why the conversation must shift from chips and racks to scalable intelligence and outcomes. He frames the AI factory as an operating model, not just hardware, where data inputs, model training/fine-tuning/inferencing, ...
>> Welcome back. I'm John Furrier here, host of theCUBE at our New York Stock Exchange CUBE Studios. Of course, we've got our Palo Alto series connecting Wall Street and Silicon Valley, tech and money. As the world changes, as AI factories and large-scale infrastructure and more data centers are being built, the chips are getting faster, smaller, cheaper. Large scale-up and scale-out systems are now coming into data centers; what used to be a bunch of machines is now one unit, a supercomputer. We are living in the supercomputer era, and that is going to power an entirely new class of AI-native applications. And with that transformation, you're going to have people that are going to transform and build net-new capabilities. So to break this AI factory future, the data center, down, Sid Nag is here, former Gartner analyst for over a decade, now on his own with his own firm. Sid, great to see you. Last time I talked to you, you were a Gartner analyst. We were with a company called Emma at AWS. You're now out on your own. Tekonyx is the name of the firm?
Sid Nag
>> Yep.>> That's great. Congratulations.
Sid Nag
>> Thank you.>> Thank you. Super awesome. And you cover the area. You were at Gartner for over a decade. Before that, you've been in computing. You know this market. The computer revolution was storage, networking and compute. Then hyperconvergence came.
Sid Nag
>> Yeah.>> Then you had hyperscalers. And now we've introduced, or at least been discussing, a fourth pillar: compute, networking, storage, and database. If you look at Oracle, they're launching data centers. They're like, what does Oracle have to do with OpenAI? So you start to see that the data, the role of the data, the large-scale data centers with all the capex, is top of mind. Of course, some say it's a bubble. I think it is certainly bubble-ish. It's got some risks there. But this is pointing to the infrastructure change that's needed for the scale. NVIDIA calls it AI factories. Dell adopted that. I love the name because it implies stuff's pumping out, a better process producing better output. Remember the old days of Intel? So what is your analysis? Because if you look at the macro, cloud was great, multi-cloud, we covered that, sure, multiple clouds. But now you have this data center, neo-cloud thing going on, being built, but also the enterprises are looking at on-premises because that's where their data is. So now you have cross-environment. So what's your take on this? Set the stage for us. What's your assessment?
Sid Nag
>> Yeah, so I think you absolutely put it correctly here, right? I mean, the world is exploding with data. Think about what happened when ChatGPT first got introduced on this planet. People went crazy, and these LLM frontier models were massive in size. But I think, when you think about the enterprise trying to adopt all these technologies, they run into a lot of barriers. And that's really where the problem is. Everyone's read the MIT study: 95% of AI projects are failing, right? And I think they're failing because there's a lot of technology around us, and I think->> First of all, do you believe that study to be true, or do you think it's just a sample? It's natural. We see-
Sid Nag
>> Yeah, I don't know if it's as high as 95%, but I do know that there's a big chasm between what the vendors are doing, spending $500 billion a year building all these AI data centers, and what the buyers are looking for. And the buyers are saying, hey, that's great you guys are doing all that. But for me as a CIO to take that infrastructure that you're building on my behalf, or for me to use it instead of a rented model like the cloud model, I need to make sure that whatever I'm deploying with that infrastructure helps me build a better outcome. So I think it's less about focusing on the hardware and the infrastructure from an AI factory perspective; the AI factory should really think about how you drive a scalable intelligence environment for the CIO. That's what's going to really drive the adoption of AI, and that's what's going to drive the consumption of these AI data centers that people are spending trillions of dollars on. So I think of it as: you've got the data input, you've got the model training, you've got all the orchestration engines, you've got the digital applications, and the output is a scalable, intelligent recommendation. And this is not something new. CIOs have been doing that forever. They hire smart people on their staff and those people go and do it. But that's a very asymmetric process. Why is it asymmetric? Because each person has their own way of doing it, and the output they get is similar but not exactly the same. Whereas in an AI factory, that whole thing is automated, and it's predictable, it's efficient, it's scalable. There's governance around it, there's cost savings around it, and that's what the CIO is looking for. Give me an AI factory that drives scalable intelligence. Don't talk about the hardware, the chips and this and that Dell and NVIDIA are doing, in my opinion.>> Yeah. You like the factory idea?
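Nag's factory description (data inputs, model training, orchestration engines, digital applications, a scalable and intelligent recommendation out the end) can be sketched as a repeatable pipeline. This is a minimal illustrative sketch only; the stage functions, data names, and recommendation format are all hypothetical, not any vendor's actual product.

```python
# Minimal sketch of the "AI factory" idea Nag describes: the same inputs
# always flow through the same automated stages, so the output is
# predictable and repeatable, unlike the "asymmetric" human process.
# All stage names and data here are hypothetical illustrations.

def ingest(data_sources):
    # Stage 1: collect and normalize the data inputs.
    return sorted(set(data_sources))

def train_or_finetune(records):
    # Stage 2: stand-in for model training / fine-tuning / inferencing.
    return {"model": "domain-slm", "trained_on": len(records)}

def orchestrate(model_info):
    # Stage 3: orchestration turns model output into a recommendation,
    # with governance baked into the process rather than bolted on.
    return {
        "recommendation": f"scale workloads across {model_info['trained_on']} sources",
        "governed": True,
    }

def ai_factory(data_sources):
    records = ingest(data_sources)
    model_info = train_or_finetune(records)
    return orchestrate(model_info)

# The same inputs always yield the same recommendation, which is the
# predictability Nag contrasts with ad-hoc human processes.
out1 = ai_factory(["crm", "cmdb", "logs"])
out2 = ai_factory(["logs", "crm", "cmdb"])
print(out1 == out2)  # True
```

The point of the sketch is the symmetry: because every run passes through identical automated stages, input order and the individual operator no longer change the result.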
Sid Nag
>> I like the factory idea. I like->> Yeah, I do too. And I think Dell and NVIDIA get it. Look, the old days of selling servers are gone. But then, to quote Larry Ellison from his famous Churchill Club speech where he said the cloud is just a bunch of servers: Dell is still selling servers into the factory, because the factory needs servers. They just configure them differently.
Sid Nag
>> I mean, don't get me wrong, that's important stuff. Dell's got to build that. NVIDIA's got to build that. But the integration of all that to drive that output, which is scalable intelligence, for me as the CIO, to get the outcome that I'm looking for, that helps me drive my revenue stream->> Yeah, it's called outcomes.
Sid Nag
>> IT efficiency->> It's called outcomes.
Sid Nag
>> That's right.>> Yeah. And this is why I like your comment, because I think, again, this is really nuanced, but I want to just state it for a second. This is the evolution, and this is what some people don't get. NVIDIA and Dell sell stuff. NVIDIA's got chips and hardware and a ton of software with CUDA and other things. Dell sells servers and now a lot of software; talk to John Roese, he'll say the same thing. But they're in that game. So an AI factory is just designed to be one supercomputer, which is a collection of servers. When you had a server, you'd open up the hood. It's like a motherboard, a processor, other chips. I mean, they had stuff in there. There were subsystems and systems in there. So I think that's smart for Dell and NVIDIA and others, to think of it holistically, to abstract out the normal gear mindset of selling hardware and go in saying AI factory. And that's why I like it too. It's simple to understand. Give me a factory; it's going to pump out tokens and value.
Sid Nag
>> Yeah.>> Okay. You had me at hello? Okay, good. Now, what runs on the factory? This is where we're at. So it's super important to explain that they still have to innovate. NVIDIA still has to innovate, and so does Dell. But the next question is, okay, I'm a customer. What are they saying to you? You talk to a lot of clients and practitioners. What are you seeing from the customer standpoint on the AI factory? Because I'm hearing startups want to run on the factories, which is basically Dell and NVIDIA, or NVIDIA only, or Dell only. So what is the operating system? I mean, we heard Jensen Huang say that the KV cache is the operating system of the AI factory. He said that on stage at GTC. I'm like, whoa, that's networking. How does networking become an OS? Well, when you look at the big picture, it's connecting the Blackwells together. The interconnect is the coordination; it links and loads all the resources. That, to me, passes the operating system test.
Sid Nag
>> This reminds me of the cloud days. What is cloud? Everybody talked about cloud being an operating model.>> Yeah.
Sid Nag
>> It's not a physical thing, it's not a service. It's a model, operating model. And that's what the CIOs were looking for and they got that from the cloud. So now as you fast forward to the AI world, the same thing. What is the AI operating model for me? What is it? I mean, that is a factory, right?>> Yeah.
Sid Nag
>> If I can get an operating model using all these AI technologies, be it fine-tuning, training, inferencing, all the data inputs from multiple sources, whether it's sitting in an LLM or all the databases of record and CMDBs that I have, which I then put into the model, I want to create a domain-specific SLM that is pertinent to my industry, my specific company. And that is used to do this training, inferencing and fine-tuning. And then furthermore, I have all the orchestration engines, I have all the tools, the picks and shovels, that are integrated into the factory, and the "robot" that gives me this output that I can go and say to my people within the company, hey, this is the recommendation, as a CIO, that I'm making to you to change the way you operate. To me, that's a successful outcome of AI. If that's the way the world can think about it and all the vendors can respond to that, I think that's a winning ticket, right?>> And I was talking to a customer, I totally agree with you by the way, and customers validate that. I was talking to a customer of these vendors, an end user, as they call them, like a company, and they said beauty is in the eye of the beholder, meaning every enterprise is different. So it looks like a one-off, but it's not. They have unique needs. They've got workflows, they've got data, and they're not going to leave the cloud anytime soon. But they're also not going to want to move their data. If they can put an AI factory in their on-premises environment, that's great. Now, not all data centers will have the power requirements to run the Superchip, the Blackwell, and all these chips, but they'll use a NeoCloud. I just interviewed Tomoro.ai, a great company. They're highly focused on AI acceleration as a managed service. But now you have the combination of these NeoClouds. So what's your vision on that?
Because I think this is interesting, because if you look at the on-premises market, the mental model used to be a big data center, a glass house in the old mainframe days, power in a building. But it's not necessarily that anymore. It's single-tenant, fully secure. But I'll go where the GPUs are. If CoreWeave's got some GPUs, I'll take them.
Sid Nag
>> Yeah.>> Tomoro.ai's got-
Sid Nag
>> I think the vendors are very caught up in the innovation race, and I think that's the right thing for them to do. But I think you also have to step back and think about how the consumers, which is the end users, are going to view all that. So in other words, I'll give you a simple example. Let's say I go and invest in a particular frontier model, like, I don't know, Lower from Amazon. Six months later I decide, no, that was not the right thing for me. I want to go with Gemini, with Vertex AI, with Google. We don't have the tools today to roll back that investment and roll into the new world. Everybody's looking forward, but they're not thinking about how to make this thing backward compatible, right? I'll give you another example. I go to a vendor conference and hear about 15,000 different flavors of GPUs and CPUs and TPUs to support. But if I'm a CIO, I'm bringing an AI workload. You know what? Should I have to go and figure out which GPU or TPU? You serve it up for me, like a functions capability, or serverless, maybe. You could call it siliconless, right?>> Yeah, yeah, exactly.
Sid Nag
>> So those are the->> Make it easy.
Sid Nag
>> Make it easy. Here's the easy button. So give me those practical capabilities on top of all the fantastic innovation you're doing, Mr. Vendor. Give me those things so it makes my adoption journey simpler. I call that part of the factory.>> Yeah. Sid, it's great to have you on. You are now officially a contributor on theCUBE research team and the CUBE TV digital program.
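The "siliconless" easy button Nag asks for, where the CIO submits a workload and the platform picks the silicon, resembles serverless scheduling. A hypothetical sketch follows; the accelerator catalog, names, and matching rule are invented purely for illustration, not drawn from any real scheduler.

```python
# Hypothetical sketch of "siliconless" dispatch: the user describes the
# workload; a scheduler picks a suitable accelerator. The device catalog
# and matching rules below are invented for illustration.

ACCELERATORS = [
    {"name": "gpu-large", "memory_gb": 80,  "supports": {"training", "inference"}},
    {"name": "gpu-small", "memory_gb": 24,  "supports": {"inference"}},
    {"name": "tpu-pod",   "memory_gb": 128, "supports": {"training"}},
]

def place(workload):
    """Return the smallest accelerator that can run the workload."""
    candidates = [
        a for a in ACCELERATORS
        if workload["kind"] in a["supports"]
        and a["memory_gb"] >= workload["memory_gb"]
    ]
    if not candidates:
        raise RuntimeError("no accelerator fits this workload")
    # Bin-pack greedily: prefer the smallest device that fits.
    return min(candidates, key=lambda a: a["memory_gb"])["name"]

# The user never names a chip; the platform decides.
print(place({"kind": "inference", "memory_gb": 16}))   # gpu-small
print(place({"kind": "training",  "memory_gb": 100}))  # tpu-pod
```

The design point is the interface, not the matcher: the workload spec names requirements ("inference, 16 GB"), never a device, so the catalog can change underneath without touching the caller, which is the backward-compatibility property Nag says is missing today.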
Sid Nag
>> Thank you.>> You're so awesome. You've got such great experience, and you've got your own firm out there now. Put a plug in. What's the focus? What are you targeting? What are some of your objectives with the new opportunity that you're pursuing? Obviously, you're a seasoned veteran. You've got a clean sheet of paper. Give us-
Sid Nag
>> Thanks for the opportunity, first of all; I appreciate that. What I'm trying to do is look at the world a little bit differently than what I did when I was at Gartner, right? There was a very structured mechanism, and that's good. That has value. There's no doubt about it. What I'm bringing to the table now with this new thing of mine, this new firm, is really all my learnings from when I was a Bell Labs researcher. I was a scientist and researcher. Yann LeCun was in the same division as me in Holmdel, New Jersey. It's kind of funny, right? So I learned a lot then. And then I went on to work for a company like Dell, and I worked for Cisco and all these vendors. I was a general manager and I was a practitioner. So I'm bringing those findings into the equation. And then, of course, I learned a lot being an analyst at Gartner. So it's a mix of all those experiences->> It's a melting pot.-
Sid Nag
>> that I'm bringing, and I am hoping that I can combine all of those learnings in my career to bring a unique perspective to my clients and help them, as I call it, help the world build better products and services. That's my tagline.>> Yeah. Makes society better. Well, we'll certainly love to tap you on this AI factory topic. We're also doing a robotics AI series. Of course, we had a mixture-of-experts series, which is always fun. We're putting together a real good trust network here at theCUBE, and also the NYSE Wired community, which is an open network of players and leaders. So thanks for being part of the community. Appreciate it. Thanks for coming.
Sid Nag
>> Thank you, John, and congratulations on getting this NYSE studio going. It's an amazing thing.>> We've got New York and Palo Alto. We're going to pump finance money and content up to Silicon Valley and bring tech content here. By the way, there's a huge community in New York too. Of course, there's been a migration; it's a great tech scene here. So we love it. It's an access point. Two hubs. All right-
Sid Nag
>> Fantastic.>> We're doing our part, connecting Silicon Valley and Wall Street, but also bringing in our community, because getting the word out, telling the stories, to understand what an AI factory is, or even to understand these large shifts, needs real expertise in a domain, or a mixture of experts. We're doing our part. Thanks for watching. All right, thank you. We'll bring you in. So, AI factory: we should have you be a set piece, because what I want to do with AI factories next, this is not-