Kush Bavaria of Ornn, co-founder and chief executive officer, and Wayne Nelms of Ornn, co-founder and chief technology officer, join theCUBE at the NYSE Wired RAISE Summit 2026 to discuss compute marketplaces and artificial intelligence infrastructure. Bavaria explains Ornn’s mission to create standardized rails for transacting compute as an asset class and to build a platform to buy, sell and index compute hours; they list indices on platforms such as Bloomberg and operate with beta users ahead of a public launch. Nelms highlights the importance of market design and transparency, arguing compute should be treated like electricity and noting that clear pricing and settlement mechanisms are essential.
theCUBE hosts and analysts guide a conversation that covers compute indices, marketplace design and data-center economics while exploring the evolving inference versus training demand profile. They examine how capital allocation, rather than chips or power alone, emerges as the next bottleneck and how volatile inference demand drives marketplace innovation across AI compute markets.
This episode provides actionable insights for investors, infrastructure operators and AI startups that are evaluating compute procurement, pricing benchmarks and regulatory considerations. Viewers gain a practical perspective on market structure, liquidity and the role of standardized instruments in making compute a tradable asset.
Kush Bavaria & Wayne Nelms, Ornn
>> Welcome back. I'm John Furrier, host of theCUBE here at theCUBE's NYSE Studios. Of course, we have our Palo Alto studio connecting Silicon Valley to Wall Street. It's our AI factory series. Also, a preview for the RAISE Summit, where compute and neoclouds will be front and center, sovereignty, all the top conversations. As the AI infrastructure continues to build out, we've got a returning CUBE alumnus here. Kush Bavaria, co-founder and CEO of Ornn, good to see you again.
Kush Bavaria
>> It's great to be here again.
John Furrier
>> And Wayne's first time on, Wayne Nelms, co-founder and CTO. Ornn guys, congratulations. You've got a lot of momentum. It's been a short time since you were on, but you guys are in the hottest market right now: AI infrastructure build-out. I mean, it's still underfunded. I just wrote a blog post that the third leg of the stool is capital. Power and capital right now are the two hottest things funding the area: bounded by power, bounded by capital. Thanks for coming back on. Good to see you.
Kush Bavaria
>> Thank you for having us, John. It's exciting to be here again.
John Furrier
>> All right, give us the update. First, explain what you guys do, because you're building a platform for compute like electricity, which is pretty much the conversation we have on theCUBE all the time. When can we get to where inference and reasoning and all AI can be powered like electricity? Start.
Kush Bavaria
>> Look, I think compute has become more and more of an asset class. And if you look at it, every other commodity market, such as oil or electricity, as you've mentioned, has a way to transact and exchange that resource. And so we're building the rails to be able to transact and exchange compute, like you do with electricity. You can think of us as the power lines in an electrical grid, and the data centers are the generators.
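To make the commodity analogy above concrete, here is a minimal sketch of what a standardized compute contract could look like, modeled on a power-market delivery block. The class and its fields are invented for illustration; nothing here reflects Ornn's actual schema.

```python
from dataclasses import dataclass

# Hypothetical sketch: a standardized "compute hour" contract, analogous to
# a power-market delivery block. Field names are illustrative, not Ornn's.
@dataclass(frozen=True)
class ComputeHourContract:
    gpu_class: str         # e.g. "H100"
    region: str            # delivery location, like a node on the power grid
    start_hour: int        # delivery window start (hours since epoch)
    quantity: int          # number of GPU-hours delivered
    price_per_hour: float  # agreed price in USD per GPU-hour

    def notional(self) -> float:
        """Total settlement value of the contract."""
        return self.quantity * self.price_per_hour

c = ComputeHourContract("H100", "us-east", 480000, 64, 2.50)
print(c.notional())  # 160.0
```

Standardizing the fields is what makes two contracts comparable, and therefore tradable, in the way the conversation describes.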
John Furrier
>> The technology, you guys are fairly new. Wayne, CTO, you're building the platform.
Wayne Nelms
>> Yes.
John Furrier
>> Explain what your thesis is, what you guys are doing.
Wayne Nelms
>> Yeah. So effectively, like Kush mentioned, what we think is really valuable in the space is having the efficient transfer of compute as a resource, and having efficient transparency in the marketplace. Right now, today, no one has real insight into what compute is actually worth in the market based on where it's located, or how much it actually costs to transact between counterparties in the space. It's very unstandardized. And Ornn's mission is to make this process more standardized, so data center development is an easier, more seamless process.
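The price-transparency point above is essentially what a published index solves. A simple way to build one is a volume-weighted average of regional quotes; the sketch below uses made-up numbers and is not Ornn's methodology.

```python
# Hypothetical sketch: a volume-weighted index price built from regional
# quotes, the kind of transparency described above. All data is made up.
quotes = [
    # (region, price per GPU-hour in USD, traded volume in GPU-hours)
    ("us-east", 2.40, 1000),
    ("us-west", 2.60, 500),
    ("eu-central", 3.10, 250),
]

def vwap(quotes):
    """Volume-weighted average price across regional quotes."""
    total_value = sum(price * vol for _, price, vol in quotes)
    total_volume = sum(vol for _, _, vol in quotes)
    return total_value / total_volume

print(round(vwap(quotes), 4))  # 2.5571
```

Weighting by volume keeps a thin, expensive region from skewing the benchmark, which is why power and commodity indices are typically built this way.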
John Furrier
>> And it's such an evolutionary market. It feels like discovering fire or the wheel. It's like the big data center build out's happening. Everyone talks about the CapEx, but you guys are coming at it from a different angle. You're thinking, okay, how do we do this at scale? We think about AI factories, they're centralized, and then you got the edge coming online. We were riffing about metro factories, distributed computing all over the place. So that means everything's connected. They're going to need power. They're going to need compute. What's the plan? Where are you guys at? Take us through the progress of the venture.
Kush Bavaria
>> Yeah. So team size has tripled since the last time we spoke, so we've been growing at an incredible rate. I think the other good part is the market is starting to realize that, hey, compute is more of an asset class, and how it's transacted and moved matters. You look at investment-grade tenants and all these sorts of things people are talking about. As you mentioned, capital is the next bottleneck that's going to take place in the data center economy. Look, we have power now. We have the chips. What's the next thing? What's blocking? And really, if you drill down into it, it's capital. It's these hyperscaler tenant commitments that are happening.
John Furrier
>> Yeah. And the commitments are like multi-year, not 10 years. I mean, look at the data center builders. They're like real estate. Now that's moved into more transactional systems.
Kush Bavaria
>> Of course. And it's more than that. There are not a lot of people in the world that can afford a hyperscaler commitment. No startup today can afford a $10 billion, $20 billion commitment on a data center. And so it's finding creative ways to create a marketplace where people can transact and buy compute.
John Furrier
>> Talk about the startup, because the hottest area outside of infrastructure is the AI native startup. You're seeing people, small teams come into the market, and they're going to need tokens. They're going to need compute. They're going to need all that stuff. They don't have massive funding. How does that work? Take us through the use cases of how you envision the customers.
Wayne Nelms
>> Yeah. So right now, all these companies, you've mentioned all these new hot startups, the foundational labs, think OpenAI, Anthropic. What they're doing is raising a lot of debt capital, a lot of fundraising, just to build these data centers. But their core competency is not building infrastructure. Their core competency is training models that serve the world. And what we see today is there's no bifurcation of what a company really does, what it specializes in. Because the market is so unstructured, everyone wants to do everything. I think Ornn is sitting in this interesting position where we see all the labs and what they're trying to do, but we also see all the data center infrastructure players and what they're trying to do. And we can be the one in the middle and facilitate an easy transfer of compute.
John Furrier
>> All right. Take us through where you guys are at in the progress. What's the status of the platform, where you're at?
Kush Bavaria
>> So we work with beta users today who transact through our platform. And then we have a public launch coming soon on the actual platform and the way to transact compute. Our indices are also listed on many retail exchanges, as well as us doing our own transactions on the indices. We list on Bloomberg, for example, as one large platform.
John Furrier
>> What about the models? At Google Next last week, it was very clear that you started to see the decoupling of the models from the actual infrastructure. How do you guys view that?
Wayne Nelms
>> I mean, what we see is there are a bunch of players, again, on the infrastructure side, and a bunch of players on the model side. And being able to switch between models is something that every end user wants to be able to do. You might use OpenAI's GPT, you might use Anthropic's Opus. There are so many different models in the world that serve specialized use cases. And so having the compute be almost fungible across different infrastructure to serve different types of models is really where we see the future headed.
John Furrier
>> The agent wave is coming. We saw enterprises adopt this year, and at the end of last year, coding kind of broke through. I mean, everyone's been using coding assistants, but now the enterprises see real value. Any thoughts on how you see this agent market playing out, and how you guys are going to vector into it?
Kush Bavaria
>> Of course. I think inference is going to take on a lot of compute workloads. If you look at how inference demand is shaped, it's very volatile. I don't know when I'm going to click the button. I don't know when I'm going to generate code. And so it creates demand the same way you consume electricity, in a very volatile way. Compare that to training, where you know exactly when you're going to train the model. For inference, the demand for compute is unknown. And the market that we help create ties into where the market's heading.
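The training-versus-inference contrast above can be sketched numerically: a booked training run is a flat, known load, while inference arrivals are random. The toy simulation below uses invented numbers purely to illustrate the difference in volatility.

```python
import random

# Hypothetical sketch of the two demand profiles discussed above: training
# demand is scheduled and flat; inference demand arrives in random bursts.
random.seed(42)

hours = 24
training_demand = [100.0] * hours                  # a booked run: known, constant
inference_demand = [random.expovariate(1 / 100.0)  # bursty, unpredictable load
                    for _ in range(hours)]

def volatility(series):
    """Population standard deviation, as a rough volatility measure."""
    mean = sum(series) / len(series)
    return (sum((x - mean) ** 2 for x in series) / len(series)) ** 0.5

print(volatility(training_demand))       # 0.0: fully predictable
print(volatility(inference_demand) > 0)  # True: volatile
```

The same asymmetry is why electricity markets trade both long-dated baseload contracts and short-term spot blocks; the analogy in the conversation maps training to the former and inference to the latter.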
John Furrier
>> All right. Talk about the founding story. How did you guys come up with this? Were you just hanging around doing trades at MIT? How did it all come together?
Wayne Nelms
>> Let me go first. So I met Kush freshman year. We were best friends at MIT. I was a math major, and I ended up going into trading as a career, actually here in New York. Kush was calling me every weekend after work saying we should do something, we should do something. And this is my own version, but I think he really convinced me to jump ship and explore the AI infrastructure landscape.
Kush Bavaria
>> When you start a company, you know you want to start with only a few people. And Wayne and I have known each other for a very long time and were friends. So that's part of the story, right? Most companies succeed because the founders work together. And so, it's been incredible so far.
John Furrier
>> All right. So what was the hook, Wayne? What made you hang it up and jump into the arena?
Wayne Nelms
>> It's always about trust. I trusted Kush with everything, and I really believe that we could do something great together.
John Furrier
>> What was the sales pitch? Did you say, "Hey, look, we're going to do this. The market's massive"? I mean, first of all, the TAM is massive.
Kush Bavaria
>> I think that's part of the story, but also, if you look at where the world's heading, there are a lot of these old enterprises. If you stay at a current job, I think many companies will adopt AI, but you don't know which ones will, or at what rate. And I think there's going to be a lot of disruption in the world. There's a sort of revolution going on, and we want to be at the forefront of that revolution.
John Furrier
>> And when you get the investors in, you guys have seed investors, you got a round coming together, what was their reaction to the story and the vision?
Kush Bavaria
>> I think our investors are very favorable toward what we're doing, and they've been happy so far with the progress that we've made. I think they're most excited about the opportunity we're taking on. They understand, look, the work we do is not easy. And being able to go after that large goal and not be scared of it is, I think, part of the mission that we work with.
John Furrier
>> You think about companies like AWS, which was founded inside Amazon. They had their first use case. I remember when Amazon Web Services came out, it was very basic. The basic building blocks, and then it just dominated the market. A similar pattern is happening here in the compute market, where you don't have to boil the ocean. The market is in need. The demand for compute is so high. How do you guys view that? And what's the vision? Once you get the building blocks and core platform out there, how do you think about that evolution?
Kush Bavaria
>> I think the end goal looks very similar to the electricity exchanges that exist today. That's the end state of the market. Today we're starting with a very simple platform where you can buy, sell, and exchange compute hours and all these sorts of different things. We have our indices, of course, that we've listed on many exchanges. That's the start. And then obviously we take that and bring it into the future.
John Furrier
>> How do you guys view all this dormant energy that's lying around? It comes up a lot, but it's not really talked about in the mainstream. There's a lot of energy that's been disaggregated in the infrastructure footprint. I mean, just in telecom alone, if you look at AT&T and Verizon, they've got all these central offices, all these towers, all these other small data centers all over the world. You kind of have a grid.
Wayne Nelms
>> Almost. Yeah. So right now, power is fully a constraint for AI. I think people are getting very creative today with how to source energy from anywhere they can, so behind-the-meter energy generation is so important today. I think we'll see, especially as AI infrastructure becomes more robust, just how creative people can get with energy production.
John Furrier
>> So how do you guys view the whole market making side of it? Because again, if you're a startup, you're going to want to tap into this platform. What do you guys need to do from an ecosystem standpoint to put all this together?
Kush Bavaria
>> Look, for us, it's working with a lot of people in the market. We want to be as neutral as possible, and we work with everyone: every data center provider, and all of these AI app builders, companies, model trainers, everyone that exists on one side of the market or the other. For us, our goal is to work with as many people as possible. We build that whole ecosystem around us, and we become the central party where everything comes together.
John Furrier
>> Yeah. And like the market here, they're trading behind us on the floor, like electricity and futures. There is a lot of quant involved.
Wayne Nelms
>> Yeah.
John Furrier
>> Take us through that. How do you guys see that? Not a lot of people see that vision of the math behind the market.
Wayne Nelms
>> I mean, our backgrounds across the company are all finance, math, and CS. I think these skills are super valuable when you think about building infrastructure for a marketplace: market design, auction theory, making it as seamless and frictionless as possible. It requires a lot of math, like it or not. And we're super happy that we have a great group of guys who work with us, and we're super excited to keep going.
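To ground the auction-theory mention above, here is a minimal sketch of one textbook mechanism, a single-round uniform-price auction for a fixed supply of GPU-hours. The function and its rules are a generic illustration, not a description of how Ornn actually clears trades.

```python
# Hypothetical sketch: a single-round uniform-price (clearing-price) auction
# for a fixed supply of GPU-hours. A generic mechanism, not Ornn's.
def clear_auction(bids, supply):
    """Fill bids from highest price down until supply runs out.
    Every winner pays the lowest accepted price (the clearing price)."""
    bids = sorted(bids, key=lambda b: -b[1])  # (quantity, price), best price first
    filled, remaining = [], supply
    for qty, price in bids:
        if remaining <= 0:
            break
        take = min(qty, remaining)            # partial fill at the margin
        filled.append((take, price))
        remaining -= take
    clearing_price = filled[-1][1] if filled else None
    return clearing_price, filled

bids = [(50, 3.00), (40, 2.50), (30, 2.00)]   # (GPU-hours wanted, max price)
price, fills = clear_auction(bids, supply=80)
print(price)  # 2.5: the lowest accepted bid sets the price for everyone
print(fills)  # [(50, 3.0), (30, 2.5)]
```

Uniform pricing is common in electricity markets because it encourages truthful bidding at the margin, which is presumably part of why the power-market analogy keeps coming up in this conversation.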
John Furrier
>> Yeah. And you're seeing a lot of secondary markets for a lot of the GPUs too. I mean, there's going to be a lot of fluctuation. People want the lowest price possible for compute.
Wayne Nelms
>> Sure.
John Furrier
>> Okay. Final question. What are you guys optimizing for now? You said you're tripling in size. What are you looking for, talent-wise? Go-to-market is going to be coming, you've got a public launch coming. What are you looking for? Put a plug in.
Kush Bavaria
>> I think engineering and go-to-market are the two functions of the business, and we're looking for people who are talented and have done the work before. Look, we hire anyone who is talented. That's our only requirement: people who can actually build things.
John Furrier
>> What's the culture?
Kush Bavaria
>> I think our culture is very startup-y, but at the same time, I think we have this notion of like, we have sort of the heritage and class in the way we treat our employees in the company. And so building that culture is huge.
John Furrier
>> Guys, excited for what you guys are doing. Congratulations. We'll keep in touch. You're in New York, so we'll have you back on. Thanks for coming on theCUBE. Appreciate it.
Kush Bavaria
>> That's good. Thanks so much, John.
John Furrier
>> All right. AI infrastructure and compute, it's dominating the market. And again, this is an acceleration of agents coming. We're starting to see code development. The enterprise issue will probably crack with the agents. We're going to see massive growth. And again, the infrastructure continues to thunder away. I'm John Furrier for theCUBE. Thanks for watching.