Brian Benedict, Eliza
>> Welcome back to theCUBE, broadcasting here at the New York Stock Exchange. This is our Mixture of Experts series with NYSE Wired. Today, joining me in-studio I have Brian Benedict, co-founder of Eliza. Welcome, Brian.
Brian Benedict
>> Thanks, Gemma. It's great to be here.
Gemma Allen
>> Thanks for coming on. So, tell me, we hear a lot about AI hype, about big, bold bets inside enterprise. You're betting on something smaller, faster and a little bit more practical, right? Unpack that for me. Unpack the Eliza journey.
Brian Benedict
>> Sure. So, we really started Eliza for two reasons. One, empowering people, which right now is becoming a really big issue for a lot of companies and for the employees within them. So, we decided to make Eliza a way to upskill people within organizations by building AI solutions for them and training them on different tools, like ChatGPT and others. Right now we feel like that's a huge gap, because you can build all these great AI systems, but if people don't actually know how to use them in their everyday work, then what are we really doing for these companies? So, that's been number one. And then, number two, it's our ability to actually build practical AI. And as you mentioned, it's not always about the most complex system that you can build. It's about building things that actually deliver SG&A takeouts or put new AI applications inside existing products, versus just building something brand new. These are the things that we're really focused on at Eliza, and I think a lot of the time when you hear the words services and consultancies, it's a company that's been doing this in other areas now moving into AI. So, we wanted to bring an execution path there. Our team is a bunch of executors building AI for companies.
Gemma Allen
>> And tell me, your own background is in go-to-market, right? You come from Hugging Face, which is also on a phenomenal journey.
Brian Benedict
>> Yeah.
Gemma Allen
>> You've clearly seen the belly of the beast in terms of what's working and what's not working inside enterprise.
Brian Benedict
>> Sure.
Gemma Allen
>> Where do you think the low-hanging fruit is? Tell me about some use cases or examples where you're seeing real day-to-day impact. And also, where is this hype narrative really aligned to? Where is the challenge still?
Brian Benedict
>> Yeah, it's a really good question. It's happening right now in a lot of the departments that you wouldn't think of. HR, just having the ability to use AI in recruiting and in their own workflows. It's happening within finance: closing the books, FP&A, all of this type of work. These are great, easy instances in which you can use AI practically to take a lot of stuff off the plates of people who have work they don't really want to do anyway. It's a lot of busy work, and people are like, I don't really want to do this work anyway. I'd rather stick to higher-level tasks, but yet, I'm stuck in manual processing. I'll give you another example. When you actually transfer money from one bank to another, switching bank carriers within financial services, how long does that take? That's a three- to four-week exercise of switching the accounts, routing all the information, signing documents. That bank is losing so much money for every day that money sits in transit. We've built solutions for these types of organizations to help them do that within a day or two, pretty much instantaneously bringing value to the organization. So, there are so many different areas that you can pick up that aren't always a threat to employees' day-to-day work.
Gemma Allen
>> And I think when banks are losing money, consumers are losing patience too. If we could find a way to move money from Ireland to the US faster, maybe you can help me fix that.
Brian Benedict
>> Exactly.
Gemma Allen
>> But talk to me a little bit about that cultural element, right? I'm sure that there's a lot of.... Right now, there were reports announced even just this week about what has happened to the labor force since the introduction of ChatGPT in 2022, and it's kind of scary stuff.
Brian Benedict
>> Absolutely.
Gemma Allen
>> And I'm sure that there's cultural inertia, as well as technical inertia. Talk to me about how you address those conversations with your customers and your soon-to-be customers.
Brian Benedict
>> Yeah, that's a great question. So, when we look at your average employee at a company, we break them down into three different segments. They're either light users, they're right in the middle or they're power users. And our goal is really to take each one of those user bases and bring them to that next level. Let me give you an example. If you're just using a chat interface, writing emails and summarizations and very simplistic-type stuff, we want to take you to the next level of getting you to use APIs or an agentic workflow in your day-to-day. Companies right now are struggling because they're getting all these requests from their employees to their AI department saying, "I want to build a simple workflow to actually help my department or my business." And the AI groups are saying, "This isn't a high-level priority for us." So, they're trying to say, "There should be a DIY component to your day-to-day."
And so, what we're really helping is enabling that DIY team to say, "Okay, you can now do this yourself. We can empower you and train you to get that workflow done yourself, so that those other AI groups can really work on those bigger tasks at hand."
Gemma Allen
>> And talk to me a little bit about size because it feels as though we're in the next monolithic era of technology and size seems like a very important factor around speed, agility, circularity. We see what's happening with OpenAI and the daily or weekly change in partnerships that are announced. You are really betting on something that's smaller and more practical. How do you think about that from a competitive perspective when all of the same infrastructure, costs, challenges, layers will continue to exist?
Brian Benedict
>> Well, I think the reason a lot of these production-level pilots are failing is because people aren't taking into account the cost, people aren't taking into account these different things. And it's not always about size of model as the main mechanism. There are a lot of degrees of difficulty involved in bringing something into production. And then, a lot of times, even when it's in production, you say, "Oh, wow. That was just really too expensive."
So, for example, we just moved the company from one of the agentic frameworks that's very popular out there today to AgentKit from OpenAI, and we saw the token count drop dramatically, by 80%, and we saw the cost drop by about 75%. Same workflow, same everything. So, it's not always about, oh, it has to be a small language model or something. It just has to be well thought through in terms of what the goal is. If the ROI of something is only going to be a few million dollars, you don't want to build a system that generates $6 to $10 million worth of cost. And I think a lot of times right now, people are experimenting with just getting something to work versus getting something fine-tuned to work within the confines of what they're trying to get done. And that's really where we come in and do a lot of legwork.
Gemma Allen
>> It's so interesting because you mentioned large financial services. I was thinking about this last week to myself. Growing up, I had a credit union account. I'm like, "How are credit unions approaching AI?" How are you competing at that scale if you don't have these huge budgets and huge technology advocates behind you on a daily basis? So, talk to me a little bit about the journey you've been on founding this company. I think it was 2025, so you're pretty new-
Brian Benedict
>> Pretty new....
Gemma Allen
>> growing-
Brian Benedict
>> Right. Yes....
Gemma Allen
>> and exciting, in a very competitive landscape.
Brian Benedict
>> Yeah, when I was at Arcee, one of our first investors was a man by the name of Stephen Garden. Stephen had built three different consulting firms, Caylent, phData and Onica, which they actually sold to Rackspace. And Stephen and I were talking all summer about what we could do to actually make an impact right now, because there are so many companies struggling with this. So, we decided to build this brand, but we didn't want to build it with just 5 to 10 forward-deployed engineers. We have some extremely relevant strategic partners with hundreds of developers who work with us on the AI front, so that we can really scale and work with all these organizations. But to your point, when you mentioned a credit union, those are the perfect types of companies we love working with because, one, they are understaffed in this arena. Two, even if they do hire an AI expert, whatever they're going to pay that person, we could probably build all of their systems for the cost of just bringing that one person in. And I think that's really where today there's a fundamental gap between hiring for skill versus building, and there's such a quick way that you can build and get significant value if you have the right team around you.
Gemma Allen
>> And tell me a little bit about the name, Eliza. It's got an interesting origin story that I certainly didn't know.
Brian Benedict
>> Yeah, sure. So, I mean, ELIZA was the first natural language processing program, built at MIT back in the '60s. And so, we actually paid homage to that by using that name as we built Eliza, which we feel is a new type of AI services business.
Gemma Allen
>> A Boston-based company. Staying patriotic?
Brian Benedict
>> Exactly. That's it.
Gemma Allen
>> I love it. Okay, so talk to me a little bit about mission control. We hear a lot about an agentic world, agent-to-agent, a lot of control sprawl, maybe compliance sprawl too, which I'm sure, again, in a world like credit unions, that we just spoke about, is quite important. How do you think about that as a differentiator for large LLMs versus something that is more proprietary and more practical?
Brian Benedict
>> Yeah, I mean, I think what we're seeing out there right now is a lot of companies want to keep that control, to some degree, within their four walls. And I think a lot of the solutions out there are starting to build around that. I think OpenAI is actually doing a great job of building around that, Anthropic as well, putting more controls and guardrails within their systems. So, as we continue to grow, I feel like these giant model providers are going to get better at this. Now, is it good enough for every enterprise? Well, only time will tell, but there are different things that you can easily go and build to safeguard around these types of things. Whether you're using your own data and making sure it stays within those four walls, or you're putting different evals and guardrails around different aspects of permissioning for your employees. And obviously, not everyone is going to get access to every different dataset, because you don't do that today either. So, there are a lot of things being thought about and thought through right now for those types of companies.
Gemma Allen
>> And talk to me a little bit about model control. How do you think about that? Who owns the brain? If you're working with a mid-size enterprise, who owns the nervous system and the brain that's then dictating processes?
Brian Benedict
>> Well, I mean, I think the bigger question is not just who owns the brain, but who owns the context. I think a lot of brands want everything to be done in their context and their voice. So, regardless of what the model is, and new models are coming out all the time, I think the context is super important for these companies to own. So, think about the context of, hey, listen, I'm UBS and I've got hundreds of years of history: my brand, my tonality, the way in which we like to do business, how we like our employees to interact with our customers. We don't want to lose that to being whitewashed into a general-purpose model. So, how do we make sure we keep that context, which we can then switch to any model as new iterations come out? It's like every day right now there's a new model. Owning that context, I think, is super important to these companies.
Gemma Allen
>> And on that point, it does seem like every day there's a new model. It's feeling hard to keep pace. It feels like everything is built on the back of one or two or three LLMs. But what do you say to that when people, I'm sure clients even, in these discussions must say to you, "Too many models, not enough differentiators"? What are the differentiators?
Brian Benedict
>> Yeah, I mean, I think it really depends on what you're trying to solve for. There's a lot of domain adaptation you can do in fine-tuning. You mentioned it earlier, the brain. Having a model that just has that type of brain, whether it can do reasoning tasks, whether you're looking at instruction following, whether you're looking at bringing in tools and function calling, finding the right model that fits that use case is super important. And now you've got a variety. We're entering the supermarket era of models: there's one on the shelf for anything that you're trying to do, whether it's something on an edge device or something that's really powering a research group. So, that's the great news. I think it's going to continually evolve, and I think how we utilize those models and context-switch among them is going to continually evolve over time.
Gemma Allen
>> And do you think verticals will play a role in that? Do you see a role where there will be very specialized models, or at least modular interfaces focused on specific spaces? Is that something you're asked about?
Brian Benedict
>> Absolutely. Yeah. I mean, when you talk to a healthcare company, they have very specific language that they use, very different from what you'd hear talking to a software company. So, I think companies are going to continually try to either inject their own data into these models or use synthetic data to really fine-tune them. But I think what you're seeing right now is these model providers and data providers really starting to dig in on getting proprietary data to actually train on. They're looking for that information right now, so that they can make their models more specific in certain areas.
Gemma Allen
>> Tell me about the founder journey. Are you a first-time founder?
Brian Benedict
>> This is number two.
Gemma Allen
>> Number two?
Brian Benedict
>> Yeah.
Gemma Allen
>> But a lot has changed, I'm sure, since the last time.
Brian Benedict
>> Oh, has it? Absolutely. Yeah.
Gemma Allen
>> It's like the Wild West, right?
Brian Benedict
>> Well, you know what it was? When we started my last company, Arcee, in 2023, we had no idea the pace at which things would actually move. I mean, in the summer of 2023, everybody was talking about RAG. RAG was this new solution that everyone was betting on, and every conversation was RAG, RAG, RAG. And we were coming in talking about small language models. And now, I feel like the pace has just continued to accelerate, with the number of companies out there, the number of different models out there, the number of new techniques and things that companies are trying. So, to your point, for companies that say, "I'm going to wait and hold still until the playing field levels out," you're going to be gone by the time that happens. I don't know if that's happening in the next few years. I think we're going to continue to see massive evolution right now.
Gemma Allen
>> It's dog-eat-dog out there.
Brian Benedict
>> It is. It is. But I thought it was crazy in 2023, it's way more competitive now.
Gemma Allen
>> So, what's on the roadmap? How do you plan to grow Eliza or how do you think about growth and change and... Fill us in.
Brian Benedict
>> It's a great question. A lot of what we're thinking about and doing is really working with private equity firms, companies that have very specific needs and tasks around their portfolios. We're doing a lot of work right now, doing testing and working with them today. We do see ourselves growing vertically. So, we've already hired a domain expert on the insurance side. We have domain experts that we're working with in financial services, and these people bring just that expertise that helps with real-world applications and experiences, coming from really established brands, so they can say, "I've built these solutions other places. I know what we can actually do with your brand." So, we're not coming in with a pie-in-the-sky idea and saying, "Oh, let's try building this and see what happens." It's all practical, real-life use cases that we've implemented or done from our expertise. And I think that's really where I see our practice growing.
Gemma Allen
>> And what keeps you up at night? What's the key concern? Is it about speed to market? Is it about really nailing the product set? What is it that you are obsessive over right now?
Brian Benedict
>> I mean, I think the obsessive part right now is finding the companies that are really ready for this journey. There are a lot of companies that talk a big game about being ready, but they really aren't. Their data's not ready, the leadership isn't ready. I mean, it's an investment, and we've been lucky to go down the path with a bunch of companies that are really ready and willing to make that investment of time and resources, more so than anything else. And the hard part, really, is transitioning the employees, getting them on our side of the fence of saying, "Hey, this is not to replace you. This is to empower you." So, a lot of the work we really do is training the employees, so that when these systems are built, they know how to functionally use them and can see their path with them, and that's slowly starting to come over time.
Gemma Allen
>> And I guess people, process, technology, right? That's where it needs to lead.
Brian Benedict
>> There it is. Hasn't changed. Hasn't changed.
Gemma Allen
>> Brian, thanks so much for coming on. Fascinating to chat with you.
Brian Benedict
>> Yeah, thank you so much.
Gemma Allen
>> Wish you guys all the best with Eliza and the journey ahead. And hopefully, see you back here soon to hear what's happening, what's going down.
Brian Benedict
>> Sounds good. Thank you, Gemma.
Gemma Allen
>> I'm Gemma Allen, here at the New York Stock Exchange with theCUBE and NYSE Wired. This is our Mixture of Experts series. Thanks so much for watching.
>> Welcome back to theCUBE, broadcasting here at the New York Stock Exchange. This is our Mixture of Experts series with NYSE Wired. Today, joining me in-studio I have Brian Benedict, co-founder of Eliza. Welcome, Brian.
Brian Benedict
>> Thanks, Gemma. It's great to be here.
Gemma Allen
>> Thanks for coming on. So, tell me, we hear a lot about AI hype, about big, bold bets inside enterprise. You're betting on something smaller, faster and a little bit more practical, right? Unpack that for me. Unpack the Eliza journey.
Brian Benedict
>> Sure. So, we really started Eliza for two real reasons. One, empowering the people, which right now is becoming a really big issue for a lot of companies and a lot of the people within them as employees. So, we decided to make Eliza really a way to upskill people's skill level within organizations by building AI solutions for them to actually go and be trained on different instances, like ChatGPT and others. So, right now we feel like that's a huge gap because you can build all these great AI systems, but then if the people don't know actually how to use them and upskill them in terms of the everyday day-to-day work, then what are we really doing for these companies? So, that's been number one. And then, number two, it's our ability to actually build practical AI. And as you mentioned, it's not always about the most complex system that you can actually build. It's about actually building things that actually remove SG&A takeouts or actually put different applications inside their existing products, versus just building something brand new within AI. So, these are the things that we're really focused on at Eliza, and I think a lot of the times when you hear the word services and consultancies, it's always being born around a company that's been doing this in other areas now moving into AI. So, we wanted to bring an execution path there. So, our team is a bunch of executors building AI for companies.
Gemma Allen
>> And tell me your own background is in go-to-market, right? You come from Hugging Face and it's also on a phenomenal journey.
Brian Benedict
>> Yeah.
Gemma Allen
>> You've clearly seen the belly of the beast in terms of what's working and what's not working inside enterprise.
Brian Benedict
>> Sure.
Gemma Allen
>> Where do you think the low-hanging fruit is? Tell me about some use cases or examples where you're seeing real day-to-day impact. And also, where is this hype narrative really aligned to? Where is the challenge still?
Brian Benedict
>> Yeah, it's a really good question. It's happening right now in a lot of the departments that you wouldn't think of. So, HR, just having the ability for them to use AI in the recruiting in their own workflow side. It's happening within finance, closing books, FP&A, all of this type of work. So, these are great easy instances in which you can use AI practically to then go out and actually take a lot of stuff off the plate of people that actually have work that they don't really want to do either. It's a lot of busy work and things that people are like, I don't really want to do this work anyways. I'd rather stick to more higher-level tasks, but yet, I am stuck in manual processing. I'll give you another example. When you actually transfer money from one bank to another, actually switching bank carriers as within financial, how long does that take? That's a three to four week exercise of switching the accounts, routing all the information, signing documents. That bank is losing so much money in terms of every day that that money sits in pretty much transit. We've built solutions for these types of organizations to help them do that within a day or two, pretty much instantaneously bringing value to the organization. So, there's so many different areas that you can pick up that aren't always a threat to the employees of their day-to-day work.
Gemma Allen
>> And I think when banks are losing money, consumers are losing patience too. If we could find out a way to move money from Ireland to the US, if you can help me fix that.
Brian Benedict
>> Exactly.
Gemma Allen
>> But talk to me a little bit about that cultural element, right? I'm sure that there's a lot of.... Right now, at a time when there was reports even just this week announced around what has happened to the labor force since the introduction of ChatGPT since 2022, and it's kind of scary stuff.
Brian Benedict
>> Absolutely.
Gemma Allen
>> And I'm sure that there's cultural inertia, as well as technical inertia. Talk to me about how you address those conversations with your customers and your soon-to-be customers.
Brian Benedict
>> Yeah, that's a great question. So, when we look at your average employee at a company, we break them down into three different segments. They're either light users, they're right in the middle or they're power users. And our goal is really to take each one of those user base and bring them to that next level. Let me give you an example. If you're using chat, just using a chat interface, writing different aspects of emails and summarizations and very simplistic type stuff, we want to take you to the next level of getting you to use APIs or getting you to use an agentic workflow within your day-to-day. Companies right now are struggling because they're getting all these requests from their employees to their AI department saying, "I want to build a simple workflow to actually help my department or my business." And the AI groups are saying, "This isn't a high-level priority for us." So, they're trying to say, "There should be a DIY component to your day-to-day."
And so, what we're really helping is enabling that DIY team to say, "Okay, you can now do this yourself. We can empower you and train you to get that workflow done yourself, so that those other AI groups can really work on those bigger tasks at hand."
Gemma Allen
>> And talk to me a little bit about size because it feels as though we're in the next monolithic era of technology and size seems like a very important factor around speed, agility, circularity. We see what's happening with OpenAI and the daily or weekly change in partnerships that are announced. You are really betting on something that's smaller and more practical. How do you think about that from a competitive perspective when all of the same infrastructure, costs, challenges, layers will continue to exist?
Brian Benedict
>> Well, I think why a lot of these production-level pilots have failing is because people aren't taking into account the cost, people aren't taking into account these different things. And it's not always about size of model as that main mechanism. There's a lot of degrees of difficulty that happen to bring something in production. And then, a lot of times, even when it's in production, you say, "Oh, wow. That was just really too expensive."
So, for example, is we just moved the company from one of the agentic frameworks that's very popular out there today to AgentKit with OpenAI, and we saw a token count dramatically drop by 80%, we saw the cost drop by about 75%. Same workflow, same everything. So, it's not always about, oh, it has to be a small language model or something. It just has to be just well thought through in terms of what the goal is. If the ROI of something is only going to be a few million dollars, you don't want build a system that generates $6 to $10 million worth of cost. And I think a lot of times right now, people are experimenting with just getting something to work versus getting something fine-tuned to work within the confines of what they're trying to get done. And that's really where we come in and do a lot of legwork.
Gemma Allen
>> It's so interesting because even you mentioned large financial services. I was thinking about this last week to myself. Growing up, I had a credit union account. I'm like, "How are credit unions approaching AI?" How are you competing on that scale if you don't have these huge budgets and huge technology advocates into you on a daily basis? So, talk to me a little bit about the journey you've been on founding this company, I think it's 2025, so you're pretty new-
Brian Benedict
>> Pretty new....
Gemma Allen
>> growing-
Brian Benedict
>> Right. Yes....
Gemma Allen
>> and exciting, in a very competitive landscape.
Brian Benedict
>> Yeah, when I was at Arcee, one of our first investors was a man by the name of Stephen Garden. Stephen had built three different consulting firms, Caylent, phData as well as Onica, which they actually sold into Rackspace. And Stephen and I were talking all summer about what we could do to actually make an impact right now because there are so many companies struggling with this. And so, we decided to build this brand, but we didn't want to just build it with just 5 to 10 people of forward-deploy engineers. So, we have some extremely relevant strategic partners that have hundreds of developers that work with us on the AI front, so that we can really scale and work with all these organizations. But to your point, when you mentioned a credit union, those are perfect types of companies we love working with because one, they are understaffed in this arena. Two, even if they do hire an AI expert and whatever they're going to pay that person, we could probably build all of their systems for the cost of just bringing that one person in or even touching it. And I think that's really where today there's a fundamental gap between skill and hiring versus building, and there's such a quick way that you can build and get some significant value if you have the right team around you.
Gemma Allen
>> And tell me a little bit about the name, Eliza. It's got an interesting origin story that I certainly didn't know.
Brian Benedict
>> Yeah, sure. So, I mean, Eliza was the first NLP computer program, built back at MIT in the '60s. And so, we actually paid homage to that by using that name as we built Eliza, which we feel is a new type of AI services business.
Gemma Allen
>> A Boston-based company. Staying patriotic?
Brian Benedict
>> Exactly. That's it.
Gemma Allen
>> I love it. Okay, so talk to me a little bit about mission control. We hear a lot about an agentic world, agent-to-agent, a lot of control sprawl, maybe compliance sprawl too, which I'm sure, again, in a world like credit unions, that we just spoke about, is quite important. How do you think about that as a differentiator for large LLMs versus something that is more proprietary and more practical?
Brian Benedict
>> Yeah, I mean, I think what we're seeing out there right now is a lot of companies want to keep that control, to some degree, within their four walls. And I think a lot of the solutions are starting to build around that. I think OpenAI is actually doing a great job of building around that, Anthropic as well, putting more controls and guardrails within their systems. So, as we continue to grow, I feel like these giant model providers are going to get better at this. Now, is it good enough for every enterprise? Well, only time will tell, but there are different things that you can easily build to safeguard around these types of things, whether you're using your own data and making sure it stays within those four walls, or you're putting evals and guardrails around different aspects of permissioning for your employees. And obviously, not every employee is going to get access to every dataset, because they don't today. So, there's a lot that's being thought about and thought through right now for those types of companies.
Gemma Allen
>> And talk to me a little bit about model control. How do you think about that? Who owns the brain? If you're working with a mid-size enterprise, who owns the nervous system and the brain that's then dictating processes?
Brian Benedict
>> Well, I mean, I think the bigger question is not just who owns the brain, but who owns the context. I think a lot of brands want everything to be done in their context and their voice. So, regardless of what the model is, and new models are coming out all the time, I think the context is super important for these companies to own. So, when you think about the context of, hey, listen, a UBS has had hundreds of years of history: their brand, their tonality, the way in which they like to do business, how they like their employees to interact with their customers. They don't want to lose that to being whitewashed away into a general-purpose model. So, how do we make sure they own that context, which they can then carry to any model as new iterations come out, because it's like every day right now there's a new model? Owning that context, I think, is super important to these companies.
Gemma Allen
>> And on that point of every day, it seems like there's a new model, it certainly does. It's feeling hard to keep pace. It feels like everything is built on the back of one or two or three LLMs. But what do you say when people, I'm sure clients even, in these discussions say to you, "Too many models, not enough differentiators"? What are the differentiators?
Brian Benedict
>> Yeah, I mean, I think it really depends on what you're trying to solve for. There's a lot of domain adaptation you can do in fine-tuning. You mentioned it earlier, the brain. Having a model that has just that type of brain, whether it can do reasoning tasks, whether you're actually looking at instruction following, whether you're looking at bringing in tools and function calling, finding the right model that fits that use case is super important. And now you've got a variety. It's like we're now entering the supermarket era of models: there's one on the shelf for anything you're trying to do, whether it's something on an edge device or something that's really powering a research group. So, that's the great news. I think it's going to continually evolve, and I think how we think about utilizing those models and context switching with those models is going to continually evolve over time.
Gemma Allen
>> And do you think verticals will play a role in that? Do you see a role where there will be very specialized models, or at least modular interfaces, focused on specific spaces? Is that something you're asked about?
Brian Benedict
>> Absolutely. Yeah. I mean, when you talk to a healthcare company, they have very specific language that they use, very different from what you'd hear from a software company. So, I think companies are going to continually try to either inject their own data into these models or use synthetic data to really help fine-tune these models. But I think what you're seeing right now is these model providers and these data providers really starting to dig in on getting proprietary data to actually go and train on. They're looking for that information right now so that they can make their models more specific in certain areas.
Gemma Allen
>> Tell me about the founder journey. Are you a first-time founder?
Brian Benedict
>> This is number two.
Gemma Allen
>> Number two?
Brian Benedict
>> Yeah.
Gemma Allen
>> But a lot has changed, I'm sure, since the last time.
Brian Benedict
>> Oh, has it? Absolutely. Yeah.
Gemma Allen
>> It's like the Wild West, right?
Brian Benedict
>> Well, you know what it was? When we started my last company, Arcee, in 2023, we had no idea of the pace at which things would actually move. I mean, in the summer of 2023, everybody was talking about RAG. RAG was this new solution that everyone was betting on, and every conversation was about RAG, RAG, RAG. And we were coming in talking about small language models. And now, I feel like the pace of evolution has just continued to accelerate with the number of companies out there, the number of different models out there, the number of new techniques and things that companies are trying to do. So, to your point, I think for companies that say, "I'm going to wait and hold still until the playing field levels out," you're going to be gone before that happens. I don't know if that's happening in the next few years. I think you're going to continue to see massive evolution right now.
Gemma Allen
>> It's dog-eat-dog out there.
Brian Benedict
>> It is. It is. I thought it was crazy in 2023, but it's way more competitive now.
Gemma Allen
>> So, what's on the roadmap? How do you plan to grow Eliza or how do you think about growth and change and... Fill us in.
Brian Benedict
>> It's a great question. A lot of what we're thinking about and doing is really working with private equity firms, companies that have very specific needs and tasks around their portfolios. We're doing a lot of work right now, doing testing and working with them today. We do see ourselves growing vertically. So, we've already hired a domain expert on the insurance side. We have domain experts that we're working with in financial services, and these people bring just that expertise that helps with real-world applications and experiences, coming from really established brands, so they can say, "I've built these solutions other places. I know what we can actually do with your brand." So, we're not coming in with just, hey, let's put a pie in the sky and say, "Oh, let's try building this and see what happens." It's all practical, real-life use cases that we've implemented or done from our expertise. And I think that's really where I see our practice growing.
Gemma Allen
>> And what keeps you up at night? What's the key concern? Is it about speed to market? Is it about really nailing the product set? What is it that you are obsessive over right now?
Brian Benedict
>> I mean, I think the obsessive part right now is finding the companies that are just really ready for this journey. There are a lot of companies that talk a big game about being ready, but they really aren't. Their data's not ready, the leadership isn't ready. I mean, it's an investment, and we've been lucky to go down the path with a bunch of companies that are really ready and willing to make that investment of time and resources, more so than anything else. And the hard part, really, is transitioning the employees, getting them on our side of the fence by saying, "Hey, this is not to replace you. This is to empower you." So, a lot of the work we really do is training the employees, so when these systems are built, they know how to functionally use them and can see their path with them, and that's slowly starting to come over time.
Gemma Allen
>> And I guess people, process, technology, right? That's where it needs to lead.
Brian Benedict
>> There it is. Hasn't changed. Hasn't changed.
Gemma Allen
>> Brian, thanks so much for coming on. Fascinating to chat with you.
Brian Benedict
>> Yeah, thank you so much.
Gemma Allen
>> Wish you guys all the best with Eliza and the journey ahead. And hopefully, see you back here soon to hear what's happening, what's going down.
Brian Benedict
>> Sounds good. Thank you, Gemma.
Gemma Allen
>> I'm Gemma Allen, here at the New York Stock Exchange with theCUBE and NYSE Wired. This is our Mixture of Experts series. Thanks so much for watching.