In this interview from theCUBE + NYSE Wired: AI Factories – Data Centers of the Future event, Glean co-founder and CEO Arvind Jain joins theCUBE’s John Furrier to unpack what’s really working in enterprise AI today and what comes next. Jain explains why knowledge access remains the first successful AI use case at scale and how Glean’s enterprise search brings AI into everyday work. He details the past year’s lessons with AI agents – from the need for guardrails, security, evaluation and monitoring to democratizing agent building so business owners (not just data scientists) can create production-grade agents.
The conversation dives into Glean’s vision of the enterprise brain powered by an enterprise graph, highlighting the importance of deep context, human workflows and behavior to reduce “noise” and drive outcomes. Jain outlines core building blocks – hundreds of enterprise integrations and a growing actions library – that let agents securely read company knowledge and take actions across systems (e.g., CRM updates, HR tasks, calendar checks). He discusses how organizations are standing up AI Centers of Excellence, prioritizing “top 10–20” agents across functions like engineering, support and sales, and why a horizontal AI data platform that unifies structured and unstructured data – accessed conversationally and stitched together via standards like MCP – sets the foundation for AI factory-scale operations. Looking ahead, Jain says Glean’s upgraded assistant is evolving from reactive tool to proactive companion that anticipates tasks and accelerates productivity.
Olzhas Amirov, Enegix Global
In this interview from theCUBE + NYSE Wired: AI Factories - Data Centers of the Future, Olzhas Amirov, chief business development officer of Enegix Global, joins theCUBE's John Furrier to discuss how vertically integrated energy ownership is becoming the defining competitive advantage in AI factory infrastructure. Amirov traces Enegix's evolution from its Eastern Hemisphere bitcoin mining origins — where it developed more than 250 megawatts of data center infrastructure — to its strategic pivot into AI factories, anchored by the acquisition of a Canadian oil and gas producer.
Keep Exploring
Why did the company acquire an oil-and-gas producer and convert it into a data-center/AI operation, and how does owning energy production fit into its vertical integration strategy?
What business are you in now?
What is your long-term competitive strategy, and what target electricity cost (in US cents per kilowatt-hour) are you aiming for?
Can your power supply scale — will you run out of power, and can this model be replicated using other power sources (e.g., renewables or nuclear)?
>> Welcome to theCUBE. I'm John Furrier, your host here at theCUBE's NYSE Studio, part of the NYSE Wired program, a Cube original. Of course, it's an open network of tech. Global network. Of course, we want to bring you all the action. We have Palo Alto Studio connecting Silicon Valley. Got a great guest here in our AI factory series. Little bit of crossover with some crypto infrastructure, but that's all merging. The data centers and the energy is the story. And energy, if you have it, you can do all kinds of cool things and build out the kind of systems that are needed to power this AI generation. Olzhas Amirov is here. Chief business development officer at Enegix Global. He's got a lot going on, so I'm looking forward to this conversation. Welcome to theCUBE. Thanks for coming on.
Olzhas Amirov
>> Thank you for the invitation, John.
John Furrier
>> We love talking to leaders who are on the frontier. And we were talking before we came on camera about the synergies around bitcoin mining and AI infrastructure. Very similar DNA. Get the data centers. Get that energy. Mine that bitcoin. Get those bitcoins. And the price of bitcoin's obviously where it is. And all the early pioneers made a lot of money. Things are good. It's harder to do. That's a great market, and we love bitcoin. AI is similar. All the winners that are going on this neocloud sector AI infrastructure are all coming from the same DNA. Energy, but purpose-built data centers that were designed and engineered for the AI systems. I mean, bitcoin had some specialty to it, but it wasn't as complex as AI. But the bottom line, you got the energy.
Olzhas Amirov
>> Yes.
John Furrier
>> You have it. You guys are doing it. Explain what you guys are doing because this is a really interesting story.
Olzhas Amirov
>> Thank you, John, for the question. I believe it's better to start with our story. We're quite new to the market. We're Enegix Global. We originally started in the Eastern Hemisphere and by this moment have developed more than 250 megawatts of data center infrastructure, both grid-connected and off-grid with our own generation. And as you said, we've been in the bitcoin mining business doing hosting, and we love this market and are still there, but we definitely see how big the AI market is and what the big opportunities over there are. What we are doing right now: we acquired an oil and gas producer in Canada and transformed it into a data center and AI factory company.
John Furrier
>> You bought an AI, I mean, oil and gas producer company?
Olzhas Amirov
>> Yes.
John Furrier
>> And you're taking that into your system?
Olzhas Amirov
>> Yes. Yes. That's exactly-
John Furrier
>> Explain that because this is where I think you mentioned it prior that you have your own energy.
Olzhas Amirov
>> Right.
John Furrier
>> And that's going to be a key part of the story. Explain the oil and gas relationship.
Olzhas Amirov
>> Everybody today speaks about vertical integration. We see all these players, including our peers, who were in bitcoin mining business before. They're transitioning into AI, but the story that they have is a grid connection. They-
John Furrier
>> And they got to get the energy. They don't make it.
Olzhas Amirov
>> At the end, they're renting. They're buying the power. And we want to own the power that we operate. That's basically the different story. And even speaking from the vertical integration standpoint, it's not about the power plant. The gas power plant, sorry. It's about the gas production itself. It's the source that at the end produces the energy. And we believe that, since we're in a business where energy is consumed in just large volumes, the long story and the safer play is about controlling your cost, which definitely means controlling the energy.
John Furrier
>> You guys have roots in Kazakhstan, and I mentioned that the data centers and AI data centers are converging. Sovereignty is a big discussion. Geopolitical. You're in Kazakhstan. You're in the Eastern Hemisphere. You got North America. You guys got a nice footprint. Explain the international and the geopolitical situation. First, at Kazakhstan, explain what's it like there right now. What's the climate? Headwind? Tailwind? Robust? And how do you guys connect that out going forward?
Olzhas Amirov
>> Well, thank you, John, for the question. Actually, I think it's important to point out that Kazakhstan is a separate story for now. We separate the ventures: the Eastern Hemisphere projects and the North American projects. Right now, speaking about AI factories, it's more about North America because we basically see that a lot of the demand is here. But speaking about Kazakhstan and everything we have there, the regulation in the country is really stable, so a lot of investors are coming in from the US, and a lot of big deals are happening. And from our standpoint, we've actually been in this market from our inception, since 2017, and faced different stages of the government's perspective on the bitcoin mining and data center business. We had the problems with ... Not problems, but misunderstandings with the government, where they didn't know how to regulate the industry. And that's where we helped back in 2023. We helped make bitcoin mining, the data center business, and crypto overall legal in our country. John, honestly, it's quite often about the taxes and the compliance.
John Furrier
>> I mean, most people think about bitcoin mining as chasing coins, kid in the garage, but you're talking about big, mega ... What do you got? 250 megawatts. The question I have for you ... What business are you in now? Sounds like you're an AI company. You're an AI factory producer.
Olzhas Amirov
>> Right.
John Furrier
>> Explain your business. What business are you in?
Olzhas Amirov
>> I believe the best way to explain ourselves is that we are an independent power producer. We have the data center infrastructure in the Eastern Hemisphere. But in Canada, it's an independent power producer where we can collaborate with hyperscalers and data center companies and be a data center ourselves. And that's really unique because previously it was all about the grid connection. Historically, everybody wanted to connect to the grid to find their end customer. Right now, we are living in such a, how to say, booming market that you don't need to connect to the grid to have a client. You can be your own client, and that's actually what we have.
Bitcoin mining is still our story. Our revenue is generated from bitcoin mining, and long term it's going to be AI.
John Furrier
>> AI and energy infrastructure is what we're talking about. I wrote a post last week called The Third Leg of the AI Infrastructure. I posted it on LinkedIn. I should post it on SiliconANGLE, but it was about energy is always the bounding function. That's what everyone talks about. It's true. Jensen has on the bottom of his chart at GTC, "Energy is the lowest level of Maslow's Hierarchy of Needs in the AI world." Shelter, food, energy, but money is also a constraint because there's a lot of expansion. You're sitting on a wave of demand on the AI infrastructure. Take us through how you're thinking about building out these systems because you have your own energy. You're going to vertically integrate. That's what you just said. What does that look like? Because all the major players don't want to get locked into, "Oh, NVIDIA's got a great rack. I spend millions of dollars, and that's great training." And then they go, "Oops, we need inference, and we designed everything for training." You're starting to see design going into a versatility mode saying, "Okay, I want to have the best system for training and inference." Inference obviously is expensive, and it's super important. How do you think about the design, given you got the energy?
Olzhas Amirov
>> Long term, what we believe and what our competitive moat is, as we said, is having the lowest energy cost. What we are targeting is below two cents per kilowatt-hour. Just something-
John Furrier
>> Which one?
Olzhas Amirov
>> Below two cents per kilowatt-hour. US cents. It's extremely low. And we believe that's the concrete foundation to grow into more valuable, value-generating ventures like, for example, hosting GPU companies. The companies that own GPUs.
John Furrier
>> You control your own destiny with the power. You get better unit economics.
Olzhas Amirov
>> Yes. And we believe that's the story how we can-
John Furrier
>> That's the story.
Olzhas Amirov
>> That's the story.
John Furrier
>> You're optimizing for energy optimization, consumption, efficiency.
Olzhas Amirov
>> And that's it. When we can show our expenses, we know how to get into, how to say, riskier projects, for example, buying the GPUs from NVIDIA, because we know that fundamentally we are secure. And that's basically how we differ from other companies, because most of them are buying the energy, so they're not controlling their cost, and they also have very high CapEx on their business, so it really makes it difficult-
John Furrier
>> High price to rent from the grid or other energy source. But also, NVIDIA talks a lot about their GPUs. And in fact, insiders talk openly. It's a public secret that a lot of the GPUs are underutilized, and it has nothing to do with the latency of the package. It's the energy availability. Energy is, again, constraining and bounding that piece. NVIDIA must like that. Take us through how your energy control point feeds the GPUs or makes the GPUs better.
Olzhas Amirov
>> Of course, there's a lot of, how to say, data center designs and companies that are helping us through this journey. We have several OEMs that are helping us design it all. But speaking from the infrastructure standpoint, we basically know that GPUs consume a lot of energy. And we already see the story even with the racks. The data centers that were developed three or four years ago have racks for up to 30 kilowatts per rack, and right now just one new GPU server consumes more than 100. And that's basically the constraint. What we believe is, once we secure the power, the hardest asset and the most, how to say, important asset within all this infrastructure ... I'm not telling you that the GPU is not important, but what I'm telling you is that new GPUs are announced every half a year to 12 months, so you always have to stay updated with the infrastructure they need. But the energy is the core. And we believe that, with the partners we work with, we can have this combination all in one place.
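The rack-density and electricity-cost figures in this exchange lend themselves to a quick back-of-envelope calculation. The sketch below is illustrative only: the 100-kilowatt rack draw and the sub-two-cent target come from the conversation, while the 8-cent grid rate is a hypothetical comparison point, not an Enegix disclosure.

```python
# Back-of-envelope: annual energy cost of one high-density GPU rack.
# The 100 kW figure and the 2-cent target are from the interview;
# the 8-cent grid rate is a hypothetical comparison, not real data.

RACK_POWER_KW = 100          # one modern GPU server/rack, per the interview
HOURS_PER_YEAR = 24 * 365    # continuous, year-round operation

def annual_energy_cost(rate_usd_per_kwh: float) -> float:
    """Annual electricity cost in USD for one rack at a given rate."""
    return RACK_POWER_KW * HOURS_PER_YEAR * rate_usd_per_kwh

owned = annual_energy_cost(0.02)   # owned generation: below 2 US cents/kWh
grid = annual_energy_cost(0.08)    # hypothetical grid purchase rate

print(f"Owned generation: ${owned:,.0f}/year per rack")
print(f"Grid purchase:    ${grid:,.0f}/year per rack")
print(f"Difference:       ${grid - owned:,.0f}/year per rack")
```

At these assumed rates, a single 100 kW rack running year-round consumes 876,000 kWh, so each cent per kilowatt-hour of rate difference moves the annual bill by several thousand dollars per rack, which is the unit-economics argument Amirov is making.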
John Furrier
>> You also have 21pool. Explain what that is. Is that part of the business or a separate company? Explain the relationship with 21pool.
Olzhas Amirov
>> Let me talk about the bitcoin mining story more broadly, then. Bitcoin mining is still our business. And you asked about the energy. The GPUs are not taking all the energy, but we always have bitcoin mining as an option to use our excess power. It's basically the client that we always have in place.
John Furrier
>> Basically, you have bitcoin as backup reserve to use the power to make more bitcoin, make more money.
Olzhas Amirov
>> Right.
John Furrier
>> We can't serve AI. Swap in the bitcoin.
Olzhas Amirov
>> Right.
John Furrier
>> That's what you basically have. You're utilizing the power.
Olzhas Amirov
>> We don't feel that utilizing the power will be a problem. We even have an option to connect to the grid. I'll return to the 21pool question later. But regarding the power, we always have an option to connect to the grid. The grid utilities in many countries, we already know, are interested in bringing not only the consumers, the data centers, but the suppliers to the market too. And they're interested in making it a fast track. If you are a data center producing your own power, or even your own gas for this power, you have a faster track to connect to the grid because, in this case, the grid is used just as a redundancy.
John Furrier
>> All right. How about the sovereignty angle? Because maybe this relates to you. But I'd imagine, from my standpoint, if I want to dig in digitally, I'm a country, given all the geopolitical and all the money to be made in AI and bitcoin, I'd love to have this model vertically integrated in my country. Is there demand there? Are you seeing that on your radar? Is that on your business plan?
Olzhas Amirov
>> Actually, we know that we can scale this project design, and our vision is basically finding the lowest-cost energy source within a region, either gas or renewable energies, and powering the data centers with it.
John Furrier
>> I'm fascinated. I love how you weave in the bitcoin as the backup or ... No AI demand? We'll go to the bitcoin. Either way, you win. Where are you guys now? Explain the progress. What are you hoping to do? You're in New York. Obviously, it's money conversations. Capital's important. Talk about the company status, where it is, the progress it's making, the plans, the funding. What are your goals?
Olzhas Amirov
>> We already proved our concept in Canada, in that we can produce energy below two cents per kilowatt-hour. We're planning to scale this story to higher and higher megawatts. Actually, our long-term plan is gigawatt-plus just over there. And we believe globally we can grow even more aggressively.
John Furrier
>> Are you looking for capital?
Olzhas Amirov
>> We are looking for capital. We're doing some private rounds right now, and I believe a public listing will be one of the options, even in the near future.
John Furrier
>> How many data centers do you have now?
Olzhas Amirov
>> We operate two data centers in Kazakhstan and one in Canada. But the Canada project, the one that will probably go public, is a different story. We can say it's one flagship project in Canada-
John Furrier
>> Well, congratulations. I love the story. I love how you've got the control-your-own-destiny with the power. And that's replicable across other opportunities. Do you run out of power with that relationship? Or what's the power envelope look like for you? In other words, can you replicate the model with other power sources?
Olzhas Amirov
>> For sure, we can. We can look into renewable energies. Right now, we see emerging technologies in nuclear generation. We believe those are all options we have.
John Furrier
>> You're doing the hard stuff first. Everyone else is going the other direction. Let's get the racks, and then where's the power? Congratulations. Thanks for coming on theCUBE. Appreciate it. I'm John Furrier, host of theCUBE. This is our AI Factory Series, one of the most popular series. AI infrastructure continues to boom along. And obviously the agents are about a year away on a full, steady state, we believe. But the AI infrastructure on a global scale ... It converges all that bitcoin mining, the sovereignty conversations, and AI infrastructure. All kind of come into one thing. It is infrastructure. It's powering tokens. It's powering value. And again, this is where the energy is the key. And you control the energy, you control your own destiny. We're going to see more of this for sure on theCUBE soon. Thanks for watching.