In this interview from theCUBE + NYSE Wired: AI Factories, Faris Sbahi, chief executive officer and co-founder of Normal Computing, joins Craig Churchill, chief business officer of Normal Computing, to talk with theCUBE's John Furrier about how AI-powered chip design is reshaping the economics of next-generation AI infrastructure. Sbahi announces a $50M accelerator round led by Samsung, with strategic investors including Micron and Celesta, to scale Normal's mission of making custom silicon radically easier to design and deploy. He introduces thermodynamic computing — a new paradigm combining in-memory processing to resolve memory bandwidth bottlenecks with non-deterministic hardware matched to the approximate nature of intensive AI workloads like video generation. With inference now representing roughly two-thirds of all AI compute and a projected 49-gigawatt global power shortfall on the horizon, Sbahi frames tokens per dollar per watt as the defining efficiency metric of the AI era.
The conversation also explores Normal Computing's Electronic Design Automation (EDA) platform, which targets a chip-design discipline that has gone largely undisrupted for 40 years. Churchill points to a stark industry benchmark: first-pass silicon tape-out success rates have fallen to a historic low of 14%, and Normal's AI methodology — including an auto formalization technique that converts thousands of pages of engineering prose into machine-readable formal language — is designed to address that crisis directly. Sbahi illustrates the velocity of agentic workflows with a striking internal example: an engineer on paternity leave deployed agents for 43 consecutive days that collectively wrote 580,000 lines of production-quality code. From tackling a global shortage of roughly one million silicon engineers to maintaining a strategic advantage in EDA amid intensifying geopolitical competition, Sbahi and Churchill outline why the winners of the next infrastructure cycle will be defined not by the software they run, but by the systems they build.
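The tokens-per-dollar-per-watt framing can be made concrete with a toy calculation. All figures below are hypothetical illustrations chosen for round numbers, not vendor data or Normal Computing's claims:

```python
# Toy comparison of accelerators on a tokens-per-dollar-per-watt basis.
# All figures are hypothetical, not vendor data.

def tokens_per_dollar_per_watt(tokens_per_sec, cost_dollars, power_watts):
    """Normalize throughput by both capital cost and power draw."""
    return tokens_per_sec / (cost_dollars * power_watts)

# A hypothetical general-purpose GPU vs. a workload-specific ASIC
# delivering the same throughput at lower cost and power.
gpu  = tokens_per_dollar_per_watt(tokens_per_sec=10_000, cost_dollars=30_000, power_watts=700)
asic = tokens_per_dollar_per_watt(tokens_per_sec=10_000, cost_dollars=10_000, power_watts=70)

print(f"GPU : {gpu:.2e} tokens / ($ * W * s)")
print(f"ASIC: {asic:.2e} tokens / ($ * W * s)")
print(f"ASIC advantage: {asic / gpu:.0f}x")  # 30x with these toy numbers
```

The point of the compound denominator is that an accelerator can win on raw throughput yet lose once capital cost and power draw are both counted, which is why the interview treats the combined metric, rather than tokens per second alone, as the defining one.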
Subscribe for ongoing coverage of AI infrastructure, custom silicon, semiconductors and edge computing.
Faris Sbahi & Craig Churchill, Normal Computing
Keep Exploring

What does Normal do for semiconductor companies, and what performance metric and efficiency improvements is it targeting?
What is your perspective on the recent trend of faster chip tape-outs — driven by entrepreneurs and large players like NVIDIA — and the resulting pressure to boost U.S. productivity given historically low first-pass silicon success rates, and can AI help address this?
What methodological innovations are being pursued on the ASIC side, and what is "thermodynamic computing" — how does it work and how does it help with AI workloads such as video generation?
How do you expect the data center and related infrastructure market to evolve over the next two to three years, and what will be the main constraints?
How does a solution fit into the energy envelope and network/latency constraints when scaling GPU infrastructure — i.e., what is the "secret sauce" for enabling more GPUs given projected power shortfalls?
What does the business development and timetable look like for the transition to agent-driven AI and AI-native silicon/software systems?
What is auto formalization, and how does it translate human prose and block diagrams into formal language that purpose-built LLMs can use to improve chip design and verification?
>> Welcome back everyone. I'm John Furrier, host of theCUBE. We are here in theCUBE's NYSE studio. Of course, we have our Palo Alto studio connecting Silicon Valley and Wall Street. This is our AI Factory series. We talk to the leaders who are making it happen and building out the next generation AI infrastructure, that's creating a ton of value, creating a ton of tokens, and all the new AI native applications will sit on it from core to edge and everything in between. We've got two great guests here with some funding news. Faris Sbahi, he's the CEO and co-founder, and Craig Churchill, chief business officer of Normal Computing. Gentlemen, thank you for coming on the program.
Craig Churchill
>> It's my pleasure.
Faris Sbahi
>> Thanks for having us.
Craig Churchill
>> Yeah.
John Furrier
>> AI Factories is our hottest series, because there's more money being spent on AI infrastructure. Some say, a lot. Crusoe's SVP of engineering told me last week, they need more funding. So, a lot more action is coming into the AI infrastructure, because the demand is off the charts for tokens. Tokens are the new intelligence vehicle.
Faris Sbahi
>> Absolutely. Yeah, I think Jensen had a great equation that he shared last week, which is that, revenue equals tokens per watt times available gigawatts. And we're seeing the demand for tokens go up tremendously as we move to agentic AI and this sort of new paradigm of always on AI, decentralized AI. And so, we're seeing that, increasingly the constraint is essentially power, which is a big part of what we're focused on at Normal and part of the new-
John Furrier
>> He's got his favorite slide too, that I call the Maslow's hierarchy of needs of AI stack. Energy is at the bottom. That's like food and shelter.
Faris Sbahi
>> Yes.
John Furrier
>> It's kind of like, the energy is the bounding function.
Faris Sbahi
>> Correct.
John Furrier
>> What's happening, but yet the stack still kind of emerges, you got a data layer emerging. So, it's really robust, it's a full stack. It's almost like cloud, went half stack, now we're back to kind of full stack with energy. So, super exciting. And for what you guys have done and you got some funding news. So, first explain, what does Normal Computing do? Explain where you guys came from, because it's a great story and how Google was involved, you guys incubated the project.
Faris Sbahi
>> Yeah.
John Furrier
>> Now you got some funding news. Tell the story.
Faris Sbahi
>> For sure. Yeah. So, Normal builds AI software with the world's largest semiconductor companies. So, right now we're working with more than half the top 10 by revenue, semiconductor companies. And we help them design and scale custom silicon, especially silicon that you could consider to have been out of reach before working with us. And in addition to that, we're using our own software and new methodology to bring new kinds of ASICs to market ourselves. And the main metric we're focused on is exactly what we were talking about before, which is tokens per watt. So, we're helping to bring ASICs to market, that are multiple orders of magnitude, 100X, 1,000X more efficient for their respective workloads and use cases.
John Furrier
>> And the funding, big news.
Faris Sbahi
>> Yeah. So, we're really excited to announce... I'm sorry, losing my voice, John, already. Really excited to announce our $50 million accelerator round, which is being led by Samsung, with a lot of other amazing investors involved, strategic investors. Micron's in the round. We have Celesta in the round, which is one of the firms founded by Lip-Bu Tan, the current Intel CEO.
John Furrier
>> That was his gig. Now, it's a side gig.
Faris Sbahi
>> Yeah.
John Furrier
>> He's doing a good job. So, yeah, great investor.
Faris Sbahi
>> Yes. Yes.
John Furrier
>> You got some heavy hitter chip folks in this deal.
Faris Sbahi
>> For sure.
John Furrier
>> One of the things that's interesting to me is that, if you go back, even five years ago, the whole game was, you need volume, that's clear, you need volume in the chip business. Intel kind of licking their wounds because of it. But now you're starting to hear things like, we're taping-out superfast.
Faris Sbahi
>> Yes.
John Furrier
>> So, you're seeing, and also the pressure on the American agenda under the current political regime is, we need productivity in the U.S.
Faris Sbahi
>> Yes.
John Furrier
>> So, it's almost as if the entrepreneurs are filling the void, because now we're seeing new cycles of tape-out.
Faris Sbahi
>> Yes.
Craig Churchill
>> Yeah.
John Furrier
>> What is your take on that?
Craig Churchill
>> Yeah. And I think a lot of that pressure is being forced by the superplayers like Nvidia, who are doing exactly that. I think one of the challenges though, John, that we're also seeing with our customer base is that first-pass silicon tape-out success is at a historic low. According to Siemens, it's at like a 14% historic low.
Faris Sbahi
>> Yes.
Craig Churchill
>> And so that's one of the crises that we're actually looking to address, which is going in with AI for silicon engineering start to finish, and using innovation, like an Iron Man suit, if you will, for the verification engineers and the silicon engineers across the board, so they're not suffering that ill fate of late silicon surprises being existentially expensive.
John Furrier
>> All right. So, what's the secret sauce? Give us a state of the art, because this is a huge, not only national agenda, as I pointed out, but it's also the demand is off the charts. And also the way these systems are being built, it's not your classic, Hey, here's a motherboard, put a chip on, put some memory on it. These are engineered systems and the ASICs are the key because, one little advancement in, say, photonics, changes the Internet game. I mean, networking is becoming the new silent bottleneck.
Faris Sbahi
>> Yeah.
John Furrier
>> But you start to see now, okay, the pressure to configure around GPUs. I mean, Jensen is like, "Don't make those GPU idle, because that's a sin in the world of infrastructure. Making those GPUs hum is what the goal is, pumping out those tokens."
Faris Sbahi
>> Correct. Yes.
John Furrier
>> What's your thoughts on that?
Faris Sbahi
>> Yeah, absolutely. So, broadly, we're making a few different methodological innovations. On the ASIC side, we're pioneering a new paradigm, which we call thermodynamic computing. So, the idea of thermodynamic computing is, there's two key aspects. One is, making it possible to process where the data lives, right? Which means that you're doing actual processing in the memory, in the memory cells themselves, which helps to resolve some of those bandwidth issues, which are the main reason that in many cases for LLM inference, you're not maximizing the usage of the chip itself. The second aspect is, we're targeting workloads. If you think about AI video generation, some of these most intensive workloads and models, a good example is actually Sora, where OpenAI just shut down Sora a few days ago, because it was costing them $15 million a day to run, and that was against $2.1 million in cumulative revenue. The thing about those workloads is that they are sort of noisy and approximate workloads. But we're running them right now on hardware that's 100% deterministic. And so, the idea with thermodynamic computing, the second aspect of it is, we are going to use hardware that is also non-deterministic, so the physics of the hardware matches the workload itself, and so, I don't need to maintain these very expensive, deterministic, zero-entropy states. It's a real breakthrough, and it connects to some of the early scientific work we did when we were spinning out of Google, and now we're actually bringing this technology to market. We did our first tape-out last year, which we called CN101, where we proved out the physics, and that was a huge impetus for this round. Our next phase of growth is going through the next round of tape-outs towards production silicon in the data center. And I know you're, as you were saying before the show, very enthusiastic about the edge and decentralized compute. That could be what the future looks like, physical AI.
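The intuition that noisy, approximate workloads can tolerate non-deterministic hardware can be illustrated with a toy numerical sketch. The Gaussian noise model and the 1% figure below are illustrative assumptions for the sake of the example, not a model of Normal Computing's actual device physics:

```python
# Toy illustration: inject analog-style noise into a matrix multiply
# (the core operation in AI inference) and measure the relative error.
# The 1% Gaussian noise model is an assumption for illustration only.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((256, 512))   # stand-in for activations
w = rng.standard_normal((512, 512))   # stand-in for weights

exact = x @ w  # what fully deterministic hardware computes

# Emulate a noisy compute substrate: each output element carries
# roughly 1% relative noise.
noisy = exact * (1.0 + 0.01 * rng.standard_normal(exact.shape))

rel_err = np.linalg.norm(noisy - exact) / np.linalg.norm(exact)
print(f"relative error from 1% analog noise: {rel_err:.3%}")  # ~1%
```

For a generative workload that is itself built around sampling noise, a perturbation at this scale is typically indistinguishable from the stochasticity the model already uses, which is the argument for not paying the energy cost of maintaining fully deterministic states.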
John Furrier
>> I think it's 100% going to be the future. I mean, disaggregated is just distributed nodes. I mean, that was a big theme, GTC, disaggregated storage, disaggregated... But all that's the theme. The other theme is unification, which implies connectivity and kind of harmony between the systems. So, it's a systems architecture and switching and co-packaged optics, the trend in co-packaged optics speaks to kind of what's laying right out in front of us, which is networking.
Faris Sbahi
>> Correct.
John Furrier
>> That's a huge issue. So, you get disaggregated.
Faris Sbahi
>> Correct.
John Furrier
>> You need superfast networking.
Faris Sbahi
>> Yup.
John Furrier
>> You got to have processing and memory all there. And then you got the energy bounded problem and form factor.
Faris Sbahi
>> Yes.
John Furrier
>> Not everyone could have a monster data center with liquid cooling.
Faris Sbahi
>> Correct.
John Furrier
>> This is kind of where it's going.
Faris Sbahi
>> And I think, in that sense, there might be a slight... I think the market is starting to pick up on it now, but we're going to see, I still think, a lot more winners than people realize, in this kind of journey and this space in general. Because of the way that things are evolving towards heterogeneous data centers, also edge, networking, all these areas, the world is evolving in a way, that the data center is going to look completely different from what it does right now, in two to three years. Right? And you're going to see a bunch of different players who are coming with custom silicon, different form factors, different solutions, that are enabling this kind of future. But I just wanted to add that, from our perspective, the number one constraint, which relates to this, is energy. It's all about energy in the end. So, we have to somehow find a way... We're expected in about two years to have a 49 gigawatt shortfall of power.
John Furrier
>> Yeah.
Faris Sbahi
>> And right now, most of the solutions are targeting just finding ways to uncover more energy, make more energy available. I think that's great.
John Furrier
>> How are you guys fitting into that energy envelope? Because it used to be a data center conversation, but now it's more a pick-your-node-in-the-network conversation. And so, everything that comes out, whether it's a new CPO product or switching, if they're not reducing the hops, if they're not reducing the energy... Because that energy is going to take away from how many GPUs you can load. Also, the networking increases latency, which is what the GPUs don't want. You need low latency, super great power. Where do you guys fit in there? What's the secret sauce? How do you see that unfolding?
Faris Sbahi
>> It's all about enabling custom silicon and this idea of radical hardware-software co-design. So, the idea is that we're going to have an ASIC for more or less every workload in the data center, right? Right now, we're still using a small number of silicon designs. We're seeing that NVIDIA is starting to disaggregate. So, now they have two large buckets of training and inference, and it's going to disaggregate much, much more, to the point where you want to have a custom system, a custom solution, for more or less every application that you're running in the data center and beyond. The only way that we can get to that sort of scale is by making it much easier to design custom chips. And that's why the AI software and the methodology that we're pioneering right now with our partners is right at the center. We need to have thousands more products than we do right now.
John Furrier
>> What's the timetable? Because obviously you guys are pedaling as fast as you can, obviously.
Faris Sbahi
>> Yes.
John Furrier
>> You got partners, you got Micron, you got Samsung, both legends in producing silicon. So, you got that done. Ecosystem is also radically moving fast too. So, you almost have the pressure points as everyone's going to meet in the middle. I just did a story this morning on, Oracle's database is now fully agentic, but their philosophy to keep it all within the database, because it's Oracle. So, it's not a bad design because they have all the data there. So, why not put the agentic origination there?
Faris Sbahi
>> Makes a lot of sense.
John Furrier
>> Okay. So, if that's going to run, they're going to have a lot of agents running around. So, if you got tokens per watt, which is now the key metric.
Faris Sbahi
>> Correct.
John Furrier
>> Agents-
Faris Sbahi
>> We like to say tokens per dollar per watt or...
John Furrier
>> Tokens per dollar per watt, because it factors in energy's money too.
Faris Sbahi
>> Yes.
John Furrier
>> Thanks for that correction.
Faris Sbahi
>> Right.
John Furrier
>> The tokens, agents eat tokens for lunch and dinner. Breakfast, lunch, and dinner. They can feed on tokens. So, whether it's Oracle having a zillion agents or anyone having agents, they got to be fed tokens.
Faris Sbahi
>> Yes.
John Furrier
>> So, now they've got the ecosystem doing stuff. You have vector surge, but now you're starting to see agentic boom.
Faris Sbahi
>> Correct.
John Furrier
>> You got a hungry market.
Faris Sbahi
>> Yes.
John Furrier
>> What's the biz dev look like? How's the business developing? What's the timetables?
Faris Sbahi
>> Well, let me give a kind of technology perspective on the timetable. I'm sure Craig will chime in to add additional color. I think that this change is going to happen much faster than people expect. Right? So, just to give an analogy, look at what happened in the software markets a few weeks ago, where basically Anthropic effectively wiped $285 billion in market value from SaaS stocks, or Software as a Service stocks. The reason for that was that they showed that developing an application is much easier than we thought. Now that we have these agentic tools, now that AI has reached a certain threshold on the benchmarks and the intelligence scale, software is becoming commoditized. To a large extent... It's becoming commoditized and also democratized, I would want to say. To a large extent, that's part of what we're enabling with products like Normal EDA. Which is, we want to make it radically easier for folks to develop custom silicon solutions using AI. And a large part of that depends on agents, and sort of taking this problem of, hey, we have a shortage of silicon engineers, that's about a million right now, and we're going to make up for that with agents, with AI that's able to actually do discovery and innovation on its own. And this is not a 5 to 10 year plan. This is like a two to three year plan, right now, from our perspective.
Craig Churchill
>> Yeah. Yeah. And John, you'll know this from old. Whenever a new technology comes about, there's always this kind of rush to embrace the technology and let a thousand flowers bloom inside organizations, which is like the kiss of death in many respects.
John Furrier
>> Some of those flowers don't turn into flowers, they're thorns.
Craig Churchill
>> Exactly. Exactly, right. So, as we're dealing with customers and clients in the market, they're kind of at that stage. And in the silicon engineering space, especially, they've allowed it to happen, but now they're coming to us and they're saying, "Guys, we need a proper solution here to radically change the methodology and radically change the way in which we're designing workflows for silicon engineering."
John Furrier
>> And I think, on that thousand flowers blooming, I had a big practitioner series yesterday with Mayfield Fund in theCUBE. We're doing a special, we're doing a lot more conversations with people who are the customers, ultimately, of all the AI native companies that you guys are going to enable. The common theme there was, yeah, it's a double-edged sword. And the old expression was garbage in, garbage out. Now it's like, bad data in, bad agents out. So, you have a problem with the thousand flowers blooming, because they're not all flowers. There are a thousand seeds being thrown on the soil, and some thorns will grow, weeds will grow, but ultimately there'll be winners. That was one theme. I'm also writing a story right now and doing an analysis on the Y Combinator demo day. And if you look at all the companies that are there presenting, AI is disappearing into everything. So, technology is the market here, and Wall Street now. Silicon Valley used to be the market for the tech, but it's now converging with the capital markets. And so, my takeaway, and I want to get your reaction, because I think this is kind of in line with your thesis, is that it's not about the software you run, it's the systems you run.
Faris Sbahi
>> Correct.
John Furrier
>> So, to your point about the commoditization, when the SaaS apocalypse happened on the Anthropic announcement, which I think was way overblown, but to illustrate the point.
Faris Sbahi
>> Fair enough.
John Furrier
>> Is software going to be commoditized? Yes. However, if you're a SaaS player and you don't change your product, you're screwed.
Faris Sbahi
>> Yes, correct.
John Furrier
>> Okay. So, that's the lesson, that's a telegraphed sign. Look it. If you're going to sit around like Workday, you're going to get disrupted, because soon... I get the data moat.
Faris Sbahi
>> Yes.
John Furrier
>> So, Salesforce is protected now, but they might not be. So, I'm expecting all of them to move to an AI native. But if AI goes into everything, it's not about the software you run, it's the systems you run.
Faris Sbahi
>> Correct.
John Furrier
>> Because, if it's commodity software, it's already being built. I mean, there's software out there that's writing, helping software compile and run. So, there's bug fix software. It's all agents now. So, the software development life cycle that we once knew, that went to agile, is now completely agentic.
Faris Sbahi
>> Correct.
John Furrier
>> If that happens, and you believe it does, what's ultimately the value? We think it's the system.
Faris Sbahi
>> Makes sense.
John Furrier
>> What do you guys think about that and how to... Because if AI is infused in everything, it's not a category. It's a native ingredient.
Faris Sbahi
>> Yes.
John Furrier
>> So, then the value goes to, where's the scale?
Faris Sbahi
>> Correct.
John Furrier
>> What's the system?
Faris Sbahi
>> Yes.
John Furrier
>> What's the use case? How do you process the data? Am I an edge case or am I a big factory case?
Faris Sbahi
>> Yes, I would agree. And I'd add one additional piece of color there, which is, we think it's going to also be the platforms that enable developing those systems. Right? So, we're going to have to develop more and more systems. What are the technologies and the products that enterprises are going to use to build and scale those systems? But in the end, it really is about the systems at the end of the day. And it's making it easier to close the feedback loops between the products we're developing, whether it's software or hardware, and value, and the metrics that these businesses want to optimize for at the end of the day. And actually for us, we consider feedback loops to be a really core part of our thesis. So, we ourselves are developing both software and hardware with Normal EDA and our own Normal ASICs. And it's all about enabling this feedback loop between a software platform you can use to develop radically more efficient silicon, bring it to market faster with less intensive human resourcing, and then also the hardware that you enable in the end. And it creates this virtuous cycle, where ultimately we're going to fully close the loop, and you're going to have what effectively looks like AI workloads designing their own substrate that they run on in the end. This is the radical idea.
John Furrier
>> I mean, the software stack approach is genius because it allows for flexibility.
Craig Churchill
>> Yes.
Faris Sbahi
>> Yes.
John Furrier
>> And so, you don't have to kind of pick, we're going to be an HBM alternative.
Faris Sbahi
>> Right, exactly.
John Furrier
>> I don't need to... So, you can figure out where the demand is. So, I guess my question is-
Faris Sbahi
>> Exactly.
John Furrier
>> With the software stack, is there demand... I guess my question is, what's the demand curve look like for you guys? Is there use cases that are low hanging fruit? How are you guys coming into the market? You got the 50 million. You're going to run hard, you're going to get great partners, Micron, and big investors. What's the plan?
Faris Sbahi
>> Yeah. Great question. On the hardware side, we're starting with multimodal generative AI inference. So, I mentioned Sora earlier, that's a good example. Video generation, image generation, actually materials discovery, a few different areas there, which are using these kinds of models, that we found are super inefficient. And it's one of the biggest areas of growth in GenAI right now, as we move to more multimodal kind of use cases. So, that's kind of where we're starting. But to your point, it's really about enabling us to meet demand where it is and as it evolves and continue to target that disaggregation at scale.
John Furrier
>> But you like disaggregation.
Faris Sbahi
>> That's the best thing for us.
John Furrier
>> That's going to create demand for you guys.
Faris Sbahi
>> Totally.
John Furrier
>> That's going to play out.
Faris Sbahi
>> Yes.
John Furrier
>> Inference is hot, obviously. It's more inference coming in than we've ever seen before.
Craig Churchill
>> Exactly.
Faris Sbahi
>> I think we're at two-thirds of AI compute being inference right now. I think three years ago it was one-third. I mean, that's a big kind of change that we've-
John Furrier
>> Who would have thought the Mac mini would be a box that people would be buying. But that's a tell sign though, then you can see someone coming out saying, "Hey, I'm going to build a purpose-built box." That could be an entrepreneur saying, Hey, you know what? I'm going to code some software. I'll take a...
Faris Sbahi
>> Arm brought to market a CPU, a physical product, for the first time in 40 years. Right? And to us, that fits in with this kind of idea that now it matters to ship custom silicon. Every company is prioritizing and has a strategy around it.
John Furrier
>> And you can write your own custom software to manage security, identity, compliance.
Craig Churchill
>> Exactly.
John Furrier
>> That's a huge issue. And OpenCloud's great, but it's running wild. It's fun. I mean, everyone in their 30s is doing it, it's like drugs. It's like tech drugs. How's your OpenCloud doing? You don't see any enterprises touching it.
Faris Sbahi
>> How is your OpenCloud doing, if you don't mind me asking?
John Furrier
>> It's doing great. It's actually screwed up my calendar this morning. I was running a little bit late. But no, it's fun, because what it does is, it shows what steady-state might look like.
Faris Sbahi
>> Correct.
John Furrier
>> And it's really an illustrated point and the developers are feeding on it. So, that's just the great sign, of where the-
Faris Sbahi
>> It's push vs pull. The agents are working for you, you're checking in on them, you're managing them, but to a large extent, the workforce is going to be partially agentic. We really believe that.
Craig Churchill
>> You said something in your Michael Dell interview, where Michael said... I worked at Dell, huge respect for the company. And he said, "It's going to take about three or four years to reinvent the entire business, embracing AI," which I thought was astounding, that he's taking a swing at that. It's funny, we're a small company. We're only three, four years old. We're reinventing our business on a day-to-day basis. I can absolutely attest to that. And Faris's point about this beautiful recursive loop between our hardware and silicon design team and our software design team, the hardware guys are not shy in terms of telling the software guys what they need to do to support their needs. And so, we've got this beautiful cycle to the top in terms of efficacy of software. So, it's happening everywhere.
John Furrier
>> Craig, that's a great point. I want to talk about that for a minute, because it's my pet peeve and love talking about it, in that, there's a lot of cliches, a lot of frameworks, people talk about agile, I pump up systems thinking, kind of get that out there. It's now out there. But when you think about what's going on, it's a generational revolution.
Faris Sbahi
>> It really is.
John Furrier
>> I mean, and everything you just talked about, even just recently, just in the past couple of weeks, Jensen's comment about token budgets and compensation, you're seeing that already kind of infiltrating. You're hearing companies, if you're not using AI, Amazon's leading this way, Andy Jassy's like, use AI tooling. And if you're not using AI as a tool, it's already kind of happening, even in the older workforces. Nevermind the younger generation. So, I'd love to get you guys to weigh in on this, because you've seen before and after, we've already crossed the threshold in my mind to a cultural... It reminds me of the '60s.
Faris Sbahi
>> Yeah, totally.
John Furrier
>> '60s, the freedom of love, drugs, sex, rock and roll. It's kind of like a tech version happening.
Faris Sbahi
>> It is, absolutely.
John Furrier
>> What are your thoughts? Because it's like a whole other mindset. And companies that are sitting around saying, "Well, can someone do this for me?" And if they're not using AI, they're going to be extinct. So, the word on the street today is, if you are not using AI in your job, you will be replaced. I believe that's going to be true. So, every company is going to start looking at employees and saying, don't be afraid to use AI because, oh, I'm using ChatGPT. No, you're not using AI to do your work. You're using AI to augment your intelligence.
Craig Churchill
>> That's right.
Faris Sbahi
>> Right.
Craig Churchill
>> Indeed.
John Furrier
>> Talk about this generational thing.
Craig Churchill
>> Yeah. So, I've been around a bit, right? So, I've worked at some really interesting companies over a long arc. So, Microsoft, Silicon Graphics back in the day, Sun Microsystems.
John Furrier
>> Were you in the old Google building? Googleplex before it became Googleplex?
Craig Churchill
>> Yes, I was.
John Furrier
>> I'm old, I can remember those days.
Craig Churchill
>> No, they were great.
John Furrier
>> Great workstations. Great graphics.
Craig Churchill
>> Exactly, yeah.
John Furrier
>> Now you're going to-
Craig Churchill
>> And so, the planning cycle of those businesses, we'd do a five-year plan and we'd meet quarterly, maybe on a six-monthly basis. That doesn't exist anymore. I mean, we meet daily, and we meet daily to talk about substantive issues in the business and what we do to fix them. That's the pace at which we move. That's the cultural revolution you're talking about.
John Furrier
>> What are you guys doing, just internally? I'm just curious, that people could learn from, in how you're in your day-to-day, because you guys are running hard and fast. And you guys, essentially 50 million, I won't say seed funding, you're well along, but you're growing.
Faris Sbahi
>> Yes.
John Furrier
>> You have targets. You got to get beachhead. You got to move fast. Great value. How are you guys using AI?
Faris Sbahi
>> We have agentic workflows for everything at this point. I mean, I was telling Craig that my own personal productivity has increased, I think, by almost 10X in the last 6 to 12 months, which is kind of insane when you think about it, right? And so, we have multiple evangelists internally at the company. But just to give you a sort of quantitative example of what AI can enable and has enabled at our company: for a long time, certain kinds of software, especially in EDA, which is the core area that we're building in, have been considered almost impossible to build, right? It's too difficult to build a competitive software product in this area. We had one of our AI leads, who was one of the LLM leads at Meta previously, go on paternity leave at the beginning of this year. And he basically-
John Furrier
>> He got bored. Nap time, let's code.
Faris Sbahi
>> Well, his agents coded, right? So, he wasn't coding. He was on paternity leave, doing I'm sure what he should be doing. But he set up his agents to run for 43 days. They wrote collectively 580,000 lines of code. And it turned out that, in the end, it wasn't bad code. It was really strong software that was effectively able to do this thing at the state-of-the-art, that folks thought was impossible previously.
John Furrier
>> So, essentially, iterate. Didn't sleep, agents don't sleep. They were iterating-
Faris Sbahi
>> That's a big part of the strategy. It's if you don't have a certain number of agents running and working alongside you, pretty much every day, while you're sleeping, while you're doing anything else, it's a missed opportunity and you really have to start now.
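The always-running agent setup Sbahi describes can be sketched as a simple unattended loop. This is a hypothetical illustration only; the function names and task contents are invented, and Normal Computing's actual agent infrastructure is not described in the interview.

```python
# Toy sketch of an unattended agent loop: agents drain a task queue
# continuously until a fixed run window (e.g. 43 days of leave) closes.
# run_agent_task is a placeholder for a real coding-agent call.
import datetime


def run_agent_task(task: str) -> str:
    # Stand-in for an LLM coding agent with tools; here it just
    # labels the work it would have produced.
    return f"patch for {task!r}"


def agent_loop(tasks: list[str], days: int = 43) -> list[str]:
    """Process tasks repeatedly until the run window expires or work runs out."""
    deadline = datetime.datetime.now() + datetime.timedelta(days=days)
    results = []
    while tasks and datetime.datetime.now() < deadline:
        results.append(run_agent_task(tasks.pop(0)))
    return results


patches = agent_loop(["write parser", "add regression tests"], days=43)
print(patches)
# → ["patch for 'write parser'", "patch for 'add regression tests'"]
```

The point of the pattern is the one Sbahi makes: the loop runs while the human sleeps, and the human's job shifts to periodically reviewing `results` rather than producing them.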
John Furrier
>> I love that example, because back in the day when someone would take leave or go on vacation or maternity leave, it's like you had to go to your manager, who's your backup, have a meeting, here's my workbook, here's what I'm working on. Don't break anything. And they would just mail it in, just check things. Maybe it's not 100%. Here, he just replicated a digital twin of himself.
Faris Sbahi
>> Yes, exactly. Yes.
John Furrier
>> I mean, not simulation, but he was actually using digital and technology to essentially be productive.
Faris Sbahi
>> Of course, checking in, but it's like, we have another person on our team, Erik Meijer, who co-founded one of the AI groups at Meta called Probability. He's very well known for his work in programming languages. He actually co-wrote Haskell and C#. So, you can tell he's a coder's coder by training. He has told me that he has not written code in over a year now. So, since this-
John Furrier
>> All that syntax, getting in, doing all that setup work.
Faris Sbahi
>> It's management work now. So, everyone's a manager, everyone's a team lead, but it's a different kind of team that you're working with. And to a large extent, this is what we're kind of enabling for our partners too. So we're going in and working with some of the greatest, most important institutions in the world, because by the way, this is important, right? EDA, in our mind, is a major choke point. Okay? So, right now, China has-
John Furrier
>> Explain EDA, people might not know it. What is it?
Faris Sbahi
>> Electronic Design Automation. It's basically the software that people use to design chips. It's an area that hasn't really been disrupted for about 40 years. The two major category companies there are about 10 years older than Google. And it's an area that clearly is in need of innovation. And SEMICON China is going on right now. We've heard a couple of different comments. The VP of EDA at Siemens was saying that something has to change in EDA for us to keep up. But you can also contextualize it in a broader geopolitical, macro way: right now China is about 10% of the way there in terms of their EDA capabilities, by certain metrics. And that's one of the strategic moats that we have as the West and allies. Many of the companies we work with were investors in the round. And it's really important-
John Furrier
>> This is a great point, because the AI lesson that we've all learned, and people should know if they've come to this conclusion, is that problems that were ungettable in the past are now gettable. Meaning, if you... We can't do that, it's been tried a zillion times. I mean, how many times have you heard that excuse in meetings? Now it's possible. The other thing about your Meta example of your team members that know coding: if you know the domain of something really at a root level, you could actually do the work to understand that language, in this case a programming language, and the underlying principles and primitives, and codify it to scale.
Faris Sbahi
>> Yes.
John Furrier
>> You could apply that to every domain.
Faris Sbahi
>> Yes.
John Furrier
>> So, that brings up the role of a curator. I call it a curator, artisan-
Craig Churchill
>> Orchestrator.
John Furrier
>> So, we're almost going back to the old principles of like, here's a canvas, here's your paintbrushes and colors, go do something. Which is why I think this token budget's an important concept because you got the paints. I'm paying you a million dollars a year, a big salary and that's what you did?
Faris Sbahi
>> Yes. Yes.
John Furrier
>> Like what?
Faris Sbahi
>> Yes.
John Furrier
>> So, now we're back to outcome.
Faris Sbahi
>> Yes.
John Furrier
>> So, there's an artistry and craft coming back.
Faris Sbahi
>> Yes.
Craig Churchill
>> Which is interesting as well.
John Furrier
>> And not just architecture. I mean, system architecture I think is a standard skillset.
Faris Sbahi
>> Correct, yes.
John Furrier
>> That's been more of an '80s thing, not just coding, because that's brick laying. You can just have an agent do that. It's more of, what are we doing? What are the tools?
Faris Sbahi
>> Yes.
John Furrier
>> What am I painting? What am I building? That is a fundamental shift in computer science. Your thoughts.
Craig Churchill
>> Well, I think, just to pick up on that point, one of the interesting pioneering technologies that we're using in chip design, is this thing called auto formalization, where it takes human prose, thousands of pages of human prose and block diagrams in multiple different formats. And really the design engineer's job is to try and figure out from all of this information, what the chip should do and what tests they need to run to make it successful and pass silicon first time. But it's really difficult and it's awfully cumbersome. So, what Faris and the team have done is they've created this auto formalization technique, which takes this human prose and converts it into a formal language. Now our purpose-built AI can understand that. So, it's almost like the reverse of what you were saying to a degree. It's taken the craft, it's taken this artisan flair and it's turning it into something that the LLM, the purpose-built LLM can understand. And so, that's part and parcel of one of our advantages in the-
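The auto-formalization idea Churchill describes, turning prose spec sentences into a formal, machine-checkable language, can be illustrated with a toy translation. In practice Normal Computing uses purpose-built LLMs over thousands of pages of documents; the rule-based pattern below is only an invented sketch of the input/output shape, mapping one kind of English requirement to an SVA-style (SystemVerilog Assertions) property.

```python
# Toy auto-formalization sketch: one hard-coded pattern of English
# hardware requirement is mapped to an SVA-style assertion string.
# The real system handles arbitrary prose and diagrams via LLMs.
import re


def formalize(sentence: str) -> str:
    """Map 'When X is asserted, Y must go high within N cycles' to an assertion."""
    m = re.match(
        r"When (\w+) is asserted, (\w+) must go high within (\d+) cycles?",
        sentence,
    )
    if not m:
        raise ValueError(f"unrecognized requirement: {sentence!r}")
    trigger, target, cycles = m.groups()
    # |-> is SVA implication; ##[1:N] bounds the response delay in cycles.
    return f"assert property (@(posedge clk) {trigger} |-> ##[1:{cycles}] {target});"


spec = "When req is asserted, ack must go high within 3 cycles"
print(formalize(spec))
# → assert property (@(posedge clk) req |-> ##[1:3] ack);
```

Once requirements exist in this formal shape, verification tooling (and the purpose-built AI Churchill mentions) can reason about them mechanically instead of a design engineer re-deriving intent from prose.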
John Furrier
>> And that's the idea of agents, right? If you can have a factory of agents, it scales the intelligence of the human. So, if you do it right, you have more workers reporting to you, not just an intern, as you joke about ChatGPT, it's like an educated intern from Princeton. How Princeton got in there, I don't know. But go throw Yale in there too, whatever. Some Ivy League Princeton, but now you're going to have system designers. That is actually compelling, because to your point, agents with systems of intelligence or these factory nodes, could decide, oh, build me a new box.
Craig Churchill
>> Yes.
Faris Sbahi
>> Yes, exactly.
John Furrier
>> Or, hey, we got a thing at a telco tower, it's got a small cabinet. I need this form factor. I need this power envelope. Go build me a box, lay out an architecture, manufacture it.
Faris Sbahi
>> Correct.
John Furrier
>> I need 10,000 units.
Faris Sbahi
>> And we're not that far from that.
John Furrier
>> Might want that. They got $90 billion upgrade they're looking at right now. Refresh.
Faris Sbahi
>> Yes. That's the next frontier. I mean, you hear folks like Andrej Karpathy talking about auto research, and just this idea that we're going to start to move into actual discovery. A key precursor to that is exactly what Craig described: the main focus right now really should be codifying intent, so that the agents can work on designs where we've formalized the intent in some particular way. But subsequently, after we've done that, we're going to be able to innovate on the designs themselves. So, AI is going to help us discover architectures and methodologies that are far ahead of our time, that we would not be able to discover on our own. So, I think it's just going to keep going. Because we keep thinking we're at the limit of AI, that the intern is the limit, and it just keeps going.
John Furrier
>> Well, certainly it's not Normal at this point, but soon to be Normal.
Faris Sbahi
>> Yes.
John Furrier
>> Normal Computing. Guys, congratulations on the round.
Faris Sbahi
>> Thank you very much.
John Furrier
>> Super exciting. We're pushing the limit on the Lex Fridman podcast here.
Faris Sbahi
>> Awesome.
John Furrier
>> I can go two hours with you guys. It's such a great topic. Very relevant, super cool.
Faris Sbahi
>> Thank you.
John Furrier
>> Thank you for coming in. Appreciate it. And we'll be tracking it.
Faris Sbahi
>> Thanks for having us on.
John Furrier
>> I'm John Furrier with AI Factory Series. Again, this is just another example. There is so much work being done. The discovery, the invention, the builders, the operators and the investors are all coming together, the convergence of technology and capital markets. That's the theme of our program here at theCUBE, the NYSE Wired program. And of course, the demand is off the charts. We're doing our part to bring this content to you. Thanks for watching.