Join hosts Rebecca Knight and Dave Vellante, along with esteemed analyst George Gilbert of theCUBE, as they delve into the transformative announcements at the Snowflake Summit 2025 held at the Moscone Center in San Francisco. This session offers a comprehensive exploration of Snowflake's strategic evolution within the data landscape, focusing on its openness initiatives, partnerships and future directions.
Rebecca Knight and Dave Vellante introduce George Gilbert, highlighting their expertise in enterprise technology analysis. They discuss the significance of Snowflake's architectural transformation and the emphasis on integrated simplicity, separating compute from data. The conversation sets the stage for exploring how Snowflake continues to innovate in response to competitive pressures.
The discussion unveils several key insights, including Snowflake’s advancements in artificial intelligence, its evolving pricing models and the importance of open table formats such as Iceberg. Gilbert explains that the movement towards an open governance catalog signifies a shift in Snowflake’s control dynamics as it navigates customer demands for greater data accessibility and interoperability.
Keynote Analysis
Principal Analyst, Data & AI, SiliconANGLE & theCUBE
Snowflake’s opening keynote set a new tone for Snowflake Summit 2025 – and theCUBE’s Rebecca Knight, Dave Vellante and George Gilbert break down every signal that matters. Broadcasting live from Moscone Center, the team dissects why this year’s 15K-strong gathering marks a decisive shift from cloud-native warehousing to an AI Data Cloud built on open table formats, granular scaling and cost-savvy pricing.
Key takeaways include: • Open, but still simple – Snowflake’s embrace of Apache Iceberg plus the open-source Polaris catalog synchronised with i...
>> Hello, Data Nation, and welcome to theCUBE's live coverage of the Snowflake Summit 2025 here at the Moscone Center in San Francisco, California. I'm your host, Rebecca Knight, alongside my co-host and analyst, Dave Vellante. Also joining us is George Gilbert, another analyst at theCUBE who is an esteemed opinion maker. So thank you so much.
Dave Vellante
>> Good to see you, Rebecca. It has been a while.
Rebecca Knight
>> It has been a while. It's lovely to be here.
Dave Vellante
>> Yeah, San Francisco is hopping. I'll tell you, this conference feels like it's much bigger. I don't know what the numbers are, but I would say it's kind of 15,000-ish. We were here last night, you couldn't even walk around this area. We should have done Cube after dark on Monday night. But George, two years ago, actually three years ago here, we started to see this company, and Rebecca, I can't remember if you were here or not, but anyway, we started to see this company in transition, where I remember Benoît Dageville said, "Has anybody out there heard of Iceberg?" And three hands went up, and I was one of them, because I happened to stumble across it. And now if you asked that same question, every hand would go up. But it just was a signal that Snowflake saw what was coming. I don't think at the time they understood the impact. And of course, the Databricks competition accelerated that, but really it's about its customers dragging them into the world of open. And of course, you've done enough of these Cube gigs to know the power of open and the leverage that it brings.
Rebecca Knight
>> Indeed, and that seemed to be a key message of the keynote. All of us are fresh from the keynote. It was audio visually arresting with a beautiful symphony playing in the background and lines of code, and it was very cool. Lots of big wigs from the company. As you said, Benoît Dageville was up there, Christian Kleinerman, they also brought up some top customers. You were also both at Analyst Day. George, why don't you give me your high level thoughts about what was talked about and really picking up on what Dave was saying, that this is a company in transition that has the power to see.
George Gilbert
>> So from both an end user point of view and a developer and an administrator point of view, I saw them reinforcing the simplicity message that's always differentiated them. And we'll get into that, but the way they used AI, the way they used the catalog, the functionality they added, it all added up to that integrated simplicity that differentiated them. But also then underneath, they made a lot of progress on the architectural transition where they were the ones who pioneered cloud native, separate compute from storage. But now, as Dave pointed out, with Iceberg, we're transitioning to separating compute from data where the customer and no vendor owns the data. And that's a huge shift. So every vendor has to compete for every workload, because they don't lock up the data in their DBMS. And they made a lot of progress in that direction this year, which we'll get into.
Dave Vellante
>> I put a tweet out during Benoît's keynote where we were just really discussing not only his announcements, and I would encourage you to go there, but basically that transition that they're in. I mean, you noticed a couple of years ago they were being forced to open up. One of the key things that, George, your research saw is that customers were taking the data engineering work out of Snowflake, putting it into a Spark execution engine like a Databricks or an EMR for cost reasons. And remember, Snowflake, Benoît, he basically said it, "We're doubling down." He didn't use the word doubling down, but he said my words on simplicity and governance, so that it's trusted. But basically that requires you to put everything inside of Snowflake. And as you pointed out, many times, George, it bundles in the hardware from AWS. So whether it was perceived or real, there seemed to be a TCO disadvantage for that data engineering work, low value work. It's like just getting the site ready, it's scraping the paint. And then, so they were doing that outside of Snowflake. They've addressed that with new pricing. We heard some new pricing today. I don't know if they've announced it yet, but we're going to see some capabilities to go after the data science persona. So that was a very high priority, because that data engineering workload is so important, that data scientist persona.
George Gilbert
>> The two of them, and the reason why they needed to get the data engineering workloads, besides the fact that even when they weren't getting all of them, it was like 30, 35% of their compute cycles that was data engineering. And with Databricks, it was... Several years back, Bob Muglia estimated it might've been 80% of their workloads. I mean, that's where they started with doing the pipelines. But there were several things that Snowflake did that really addressed this. One, their Snowpark Java support matured. So with Spark, you could do PySpark, Java, Scala, and data engineers like to work often in those languages in addition to the dbt SQL support. But then they addressed the pricing model problem, which was bundling the hardware for what were very cost-sensitive workloads. So they have two options now. One is essentially a serverless option, which other vendors have to offer too, so everyone has to mark up the hardware and there's no disadvantage there.
Dave Vellante
>> Databricks in particular, let's call them out.
George Gilbert
>> Or bring your own compute, which means you don't have to buy the Snowflake hardware markup. So between adding the languages, adding the compute option, and then adding much more high performance Snowpipe, it's like 50% of the cost. Now, they sped it up and they did data volume-based pricing. So those three issues. Now, the reason that's important, it's not just because of, it's like it was 30% of the workloads and could be more, it's that the lineage graph, the data lineage comes from your data engineering workloads, and the lineage graph is what you hang all the operational metadata off. That's the backbone of your catalog.
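George's point that the lineage graph is the backbone of the catalog can be made concrete with a toy sketch: each data-engineering step records its inputs and outputs, and the operational metadata (row counts, run times) hangs off the resulting graph. Everything below is hypothetical illustration, not any vendor's API.

```python
# Toy lineage graph: data-engineering steps create edges between tables,
# and operational metadata attaches to each step. Hypothetical names only.

class LineageGraph:
    def __init__(self):
        self.edges = []      # (source_table, target_table, step_name)
        self.metadata = {}   # step_name -> operational metadata dict

    def record_step(self, step_name, inputs, output, **ops_metadata):
        for src in inputs:
            self.edges.append((src, output, step_name))
        self.metadata[step_name] = ops_metadata

    def upstream(self, table):
        """All tables that (transitively) feed `table`."""
        found, frontier = set(), {table}
        while frontier:
            nxt = {s for (s, t, _) in self.edges if t in frontier} - found
            found |= nxt
            frontier = nxt
        return found

g = LineageGraph()
g.record_step("ingest_orders", ["raw.orders"], "staging.orders",
              rows=1_200_000, runtime_s=42)
g.record_step("join_customers", ["staging.orders", "raw.customers"],
              "marts.orders_enriched", rows=1_150_000, runtime_s=97)

# The catalog can answer provenance questions from the graph alone.
print(sorted(g.upstream("marts.orders_enriched")))
# ['raw.customers', 'raw.orders', 'staging.orders']
```

The point of the sketch: once every pipeline run is recorded this way, the operational metadata has a natural place to live, which is why losing the data-engineering workload also means losing the catalog's backbone.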
Dave Vellante
>> All right, so let's sort of back up, because you're probably losing a few people here.
George Gilbert
>> Okay.
Dave Vellante
>> But it's good. So let me sort of set the framework. One thing I just want to call out again on my Twitter post, Marriott announced Adaptive Compute. I don't think I've ever seen that before.
Rebecca Knight
>> The customer.
Dave Vellante
>> We might've started at UiPath one time.
Rebecca Knight
>> Well, because it was a very candid comment about, did Snowflake work for you? Yes and no. And so, here's how we came up with it.
Dave Vellante
>> It was a good setup. And so, I've never seen that before, I can't recall a customer announcing something like Snowflake Adaptive Compute. Now, that's kind of cool, because remember, Snowflake for years has had these T-shirt sizes. You had to pick small, medium, and large, and it worked pretty well. It was very simple, but it wasn't perfect, because you couldn't have that granularity. And if you went over, it got kind of complicated. Oracle, by the way, a couple of years ago used to attack Snowflake on this, and so with a more granular sizing option, Snowflake is taking care of that. But the bigger picture here is, as I said before, Benoît doubled down on simplicity. He said, "Enabling data to be connected and shared." And he's bringing this ethos to AI so that it's trusted. I mean, Snowflake frankly was nowhere in AI three or four years ago. They didn't really have a great ML story. Now they've catapulted themselves into a leading company, technology company in AI, and they've got a compelling value prop that's differentiated because everything is in that Snowflake big blanket. So their near-term priorities we talked about were to go after the data science persona. They're dealing with that. They were getting a lot of competition in that space. And so, when you think about it, George, the other thing that's happening is they have to open up. And Rebecca, so by opening up, there's an interesting trend going on. So the point of control for Snowflake has always been the database, and that's migrating to the governance catalog. And the governance catalog is becoming open source in many cases. Unity, Polaris, some proprietary ones as well. So the value keeps going up. So that creates interesting challenges for Snowflake. On the one hand, their differentiation is that they have this walled garden and they can control everything. On the other hand, customers in the market and competition is forcing them to open up.
So as a result, they have to lean into that trend or they're going to end up driftwood.
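Dave's point about T-shirt sizes versus granular sizing comes down to simple arithmetic: with fixed sizes you pay for the whole size even when demand only slightly exceeds the next size down. The sizes and rates below are invented for illustration, not Snowflake's actual pricing.

```python
# Back-of-envelope: fixed T-shirt sizing vs granular (adaptive) sizing.
# Credits-per-hour figures are made up purely for illustration.

TSHIRT_CREDITS_PER_HOUR = {"S": 2, "M": 4, "L": 8, "XL": 16}

def tshirt_cost(demand_credits, hours):
    # Pick the smallest size that covers demand; pay for all of it.
    for size in ["S", "M", "L", "XL"]:
        if TSHIRT_CREDITS_PER_HOUR[size] >= demand_credits:
            return TSHIRT_CREDITS_PER_HOUR[size] * hours
    raise ValueError("demand exceeds largest size")

def granular_cost(demand_credits, hours):
    # Pay (roughly) for what you actually use.
    return demand_credits * hours

# A workload needing just over a Medium: 5 credits/hour for 10 hours.
print(tshirt_cost(5, 10))    # forced up to Large: 8 * 10 = 80
print(granular_cost(5, 10))  # 5 * 10 = 50
```

The gap between the two numbers is exactly the "you went over and it got complicated" problem that more granular sizing is meant to remove.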
George Gilbert
>> But what they did that was really clever, which they did not tell us about last year in terms of a statement of direction, was this. First, their catalog is tied to their DBMS, unlike others where the catalog is a separate product. But what they did that was very clever was they announced Polaris fully open source last year. It's not only an open source product, it's also a spec that others can implement. So now what they're doing is taking more and more of the Horizon Catalog richness, migrating it to Polaris or the open source spec. And so, you get a richer and richer open source catalog to manage the external Iceberg data, but it's synchronized with Horizon.
Dave Vellante
>> Okay, so just coming back to kindergarten for a moment, because there was a really important point that you just made. So the value is moving up the stack, and as they lean into open table formats like Iceberg, to your point, there's more of the proprietary Horizon functionality, that people will pay up for the value, trickling into the open world. So that puts greater pressure on Snowflake to have the absolute very best engine, and which I think they're announcing it today, the next generation engine, so it's going to be bigger, better, faster, and more economic, but it also requires them to bring that capability to the open world. So they not only have to be the best engine, they have to be the best at managing all those open table formats. Go ahead.
George Gilbert
>> But this is where, before the perception on Snowflake was they're really good when they're managing native tables, their own proprietary format and their own managed version of Iceberg, meaning where they control the reads and writes to Iceberg. What has changed now is with the Polaris catalog synchronized with Horizon, they moved the permissions, they synchronize permissions with... I know this sounds a little techy, but the point is now their engine can manage, read-write access, so they can do data engineering, the analytic workloads, data science on external Iceberg tables.
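The synchronization George describes can be sketched abstractly: a proprietary catalog and an open catalog both implement the same minimal spec, and table registrations plus permission grants propagate from one to the other, so any conforming engine sees the same access controls on the same Iceberg tables. Everything below is a hypothetical toy model, not the actual Polaris or Horizon interfaces.

```python
# Toy model of two catalogs implementing one spec, kept in sync.
# Purely illustrative; these are not real Polaris/Horizon APIs.

class CatalogSpec:
    """Minimal catalog interface: tables plus per-table grants."""
    def __init__(self, name):
        self.name = name
        self.tables = {}   # table -> metadata file location
        self.grants = {}   # table -> set of (principal, privilege)

    def register_table(self, table, metadata_location):
        self.tables[table] = metadata_location
        self.grants.setdefault(table, set())

    def grant(self, table, principal, privilege):
        self.grants[table].add((principal, privilege))

def sync(source, target):
    """One-way sync: target mirrors source's tables and grants."""
    for table, loc in source.tables.items():
        target.register_table(table, loc)
        target.grants[table] |= source.grants[table]

horizon = CatalogSpec("horizon")   # stand-in for the native catalog
polaris = CatalogSpec("polaris")   # stand-in for the open catalog

horizon.register_table("sales.orders", "s3://lake/orders/metadata.json")
horizon.grant("sales.orders", "etl_role", "WRITE")
sync(horizon, polaris)

# Any spec-conforming engine consulting the open catalog now sees the
# same table and the same write permission.
print(("etl_role", "WRITE") in polaris.grants["sales.orders"])  # True
```

The design point: once permissions travel with the open catalog rather than living only inside one vendor's DBMS, read-write access to external Iceberg tables stops being a proprietary privilege.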
Dave Vellante
>> So let me translate.
George Gilbert
>> Yeah.
Dave Vellante
>> So what that means is that the external open experience is much more like the Snowflake experience.
George Gilbert
>> Yeah.
Dave Vellante
>> Maybe not as controlled, maybe not fully, but as close, I mean substantially similar. And that's the game now, that's the race. So the question is, coming back to the value stack movement, what can Snowflake and is Snowflake doing to maintain its value advantage? I heard on TV this morning, the four most highly valued, overvalued was the implication, but highly valued on a valuation standpoint, former securities analyst, Palantir, number one, I think CrowdStrike was number two. I think Snow was number three or four. I think Zscaler was up there. Even though their stock price is way down from its previous high, they're still priced at a premium for perfection, because they have such high value. So what can they do to preserve that value? They have to create that open experience, their proprietary experience has to be great, and they've got some decisions to make. Are they just basically data infrastructure that's serving up agentic platforms or are they going to become the agentic platform that goes after the two new layers that we've talked about? Layer one being, let's call it the harmonization layer, taking all that disparate data, the structured data, the unstructured data, the different formats, and bringing it all together so that it all makes sense as a true single version of the truth. And then that's one layer. And then feeding the high value, then feeding that up to the agentic control framework that can interpret the top-down metrics tree, which is again, they've hinted toward metrics trees. Those two new areas of the software stack are emerging, and Snowflake either will go after... Whoever owns that will have a lot of leverage in the future. There's still tons of opportunity even if you don't own that. But whoever owns that is going to be the highest value software company in the world, and it's going to leave the rest of it for a lot of specialty opportunities.
George Gilbert
>> One thing worth mentioning is that there was the perception that they were limited to doing really good workloads when they own the data. But by synchronizing with Polaris, their engine now competes with any other engine working on open Iceberg tables for data engineering, for analytics. In other words, they're getting net new workloads or net new data to apply their workloads to.
Dave Vellante
>> And it's showing up in their financials. I mean, their last quarter was a statement quarter. They grew revenue 26%. They're on track to be a $4 billion company this fiscal year. Their long-term goal is to be a $10 billion company. So they're talking an incremental $6 billion. So Snowflake has always wanted to be the next great software company like ServiceNow became, certainly like Salesforce before it. But as it goes up the stack, same thing is true for Databricks, as it moves up the stack, it's now entering the domain of the application vendors who have the business logic, the process logic all trapped inside their application domain. And the big question is, how does Snowflake get access to that? Can they get access through connectors and how do they harmonize all that data? Do they want to go after that? Does that make sense?
Rebecca Knight
>> Yes, absolutely. And I feel as though to your point before, it's going to be driven by customers in terms of what they're demanding, what they need, and how they are plotting their AI futures.
George Gilbert
>> So that's actually a great point, because when we talked to the head of platform yesterday, he was like, "So for this year, the hard work is synchronizing the native Horizon Catalog with the open Polaris so that they can manage the entire native and Iceberg data estate and apply all their workloads." But following on that, there are some very big, advanced customers like AT&T and others who are working with Snowflake partners like RelationalAI to do really advanced business intelligence and knowledge graphs where they're moving up into the application semantics. And what the head of platform told us is, that's 12 months out where we see, can they start building a model of your enterprise by abstracting all the business logic into this knowledge graph? That's-
Dave Vellante
>> By the way, last night at the keynote, it was Sridhar Ramaswamy, the CEO, it was... Sridhar is solid. He's clearly a technical CEO. It was very scripted. Fine. That's who Sridhar is. That's cool. Lynn Martin, president of NYSE, she was good talking about risk at scale, and then Sam Altman was the headline. And I forget how you say her last name. Guo? Sarah Guo?
George Gilbert
>> Guo.
Dave Vellante
>> Guo, who's great. She was the moderator and Sam Altman was there. He didn't say a ton. I was surprised that they didn't talk about the fact that Snowflake, or at least a lot about, is basically integrating OpenAI tooling. I think o4 and o1. OpenAI is dominating the LLM game, even though some would argue it's a commodity. And Snowflake is offering choice, of course with Anthropic and Mistral and others. But I bring that up, because A, I thought it was a missed opportunity. OpenAI is really doing well in the LLM world. The second is, George and I wrote a piece a while back why Jamie Dimon is Sam Altman's biggest competitor. And the premise of that was that Jamie Dimon and JPMC are never going to put their two exabytes of data out in the internet to be trained by OpenAI. At the same time, if OpenAI models come into the Snowflake world and are applied to enterprise AI data, I presume that data stays in the enterprise. I presume it doesn't leak.
George Gilbert
>> That was the point of having them, those models, the data does not... I believe you have the option so that the data does not leave your boundary and get consumed by or trained on by-
Dave Vellante
>> Yeah, by OpenAI. So then my question is, can Sam Altman have his cake and eat it too and not gain weight? In other words, can he play in both that consumer market at massive scale, whatever, 120 million users a day or whatever the number is?
George Gilbert
>> The API market?
Dave Vellante
>> Yeah, through the API market, can he also win the enterprise? And God knows what he's doing with Jony Ive.
George Gilbert
>> But Dave, I think the bigger issue is, the API market, it's got gargantuan capital requirements, multiple competitors, and the lead of the frontier models is only six to at most 12 months over either another frontier or the open source guys coming from China or Meta. So the value add is, can you model your data and apply the AI to extract value? And that means the value comes to who can present a model of that data to extract value from it, which is why we say Jamie Dimon, meaning the enterprise customer who can model the data.
Dave Vellante
>> And that's where the value is.
George Gilbert
>> Yes.
Dave Vellante
>> So you remember Netscape, they were the hottest company on the planet in whatever it was, 90-whatever, early days of the dot-com? And then when it became clear that the value was shifting to the enterprise-
George Gilbert
>> The search engine.
Dave Vellante
>> Yeah, they said, "Well, the search engine, they just kind of missed that." But then, so what they did is, they pivoted to enterprise and they said, "We'll be the access to the intranet." Well, that didn't work either. And then they got demoed by all the other software vendors who owned the enterprise. I mean, I'm not predicting the same thing happens to OpenAI, because today's management and Sam Altman has seen that before and they're smarter today and they're better capitalized. But to your point, if that business becomes commoditized, the foundation model business and the AGI business, that could present problems for them. I would say this, I think they believe that whoever gets to AGI first, whatever AGI is, will have some significant competitive advantages and that's what they're going after.
George Gilbert
>> But there's also some issues with using the models. One of the things they've talked about probably more than anything else over the last 12 months is, talk to your data for the business user. So talk to your data. Text-to-SQL. Natural language.
Dave Vellante
>> Yeah.
George Gilbert
>> And we've talked about this. They have the semantic views, which are the definitions of the metrics and dimensions. You need that. That's a way of modeling your data, so that an LLM can talk to it and understand the structure.
Dave Vellante
>> This is what you call, I think, taking what humans think of as things and translating it into strings that databases can understand, through the language of databases, which is SQL.
George Gilbert
>> Yes. So the metric and dimension layer is the human language, and that translated into SQL language. So they put a lot of effort into this, but the big question is, when you want to talk to your data, they say it's a feature not to have to use dashboards or SQL, but when it returns the data, how does it present it? How does it visualize that?
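A semantic layer of metrics and dimensions, as George describes it, amounts to a mapping from human terms to SQL fragments; a small compiler can then turn a metric-plus-dimension request into a query an engine will run. The schema and definitions below are invented for illustration, not Snowflake's actual semantic views.

```python
# Toy semantic layer: metrics and dimensions defined once in human terms,
# compiled into SQL so an LLM (or a person) asks by name instead of
# writing the query. Definitions are illustrative only.

METRICS = {
    "revenue": "SUM(order_total)",
    "order_count": "COUNT(*)",
}
DIMENSIONS = {
    "region": "customer_region",
    "month": "DATE_TRUNC('month', order_date)",
}

def compile_query(metric, dimension, table="orders"):
    if metric not in METRICS or dimension not in DIMENSIONS:
        raise KeyError("unknown metric or dimension")
    m, d = METRICS[metric], DIMENSIONS[dimension]
    return (f"SELECT {d} AS {dimension}, {m} AS {metric} "
            f"FROM {table} GROUP BY {d}")

print(compile_query("revenue", "region"))
# SELECT customer_region AS region, SUM(order_total) AS revenue FROM orders GROUP BY customer_region
```

This is the "things to strings" translation in miniature: the LLM only has to pick names from a governed vocabulary, which is much easier to get right than generating raw SQL against a complex schema.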
Dave Vellante
>> Okay, now fusing, Rebecca, the 101 that I was talking about and the 301 that George is talking about, so you've got text-to-SQL, which is things to strings. So what about people? What about places? What about processes? That's the four-dimensional map that you talk about, because ultimately our vision is that organizations want to build a digital twin or a digital representation of their enterprise, so that in real-time, systems of agency or agentic systems can take action and the enterprise can respond much more quickly with far less labor costs. So what is Snowflake's value add in that equation? Very clearly you've got to have strong data foundation to do that. That would be great. And that's a really lucrative market. They could choose to sort of stay there and dominate that, or they could really kind of go after the even higher value ground, which we think Salesforce is going after and maybe Palantir and ServiceNow and some others, which is those layers that we talked about, the system of intelligence and the system of agency. And it remains to be seen. I think they have to be careful, because if they go hard after that and declare that, that's their new North Star, companies like Salesforce are going to say, "Whoa, maybe you're not partners. Maybe we don't want to partner with you guys." So they have to sort of figure out how to enter that market if in fact they even want to get there. And they have some choices to make technically with partners like RelationalAI and Blue Yonder.
George Gilbert
>> Also, just saying you have text-to-SQL, it's not a binary yes, no, it's a very difficult problem to get high accuracy for complex queries on complex schemas.
Dave Vellante
>> Because it's a probabilistic AI.
George Gilbert
>> And one of the things their close partner RelationalAI just announced yesterday was world leading top of the leaderboard text-to-SQL accuracy. And they're doing that with their customer, AT&T. And we talked to the head of platform, he's like, "The companies are getting closer and we know there's a showcase ISV for Snowflake, which is Blue Yonder." And they're building the entire application suite, the supply chain suite on RelationalAI. So we're going to see emerging, what's really a next generation application stack, which is what Snowflake always wanted to be.
Dave Vellante
>> Well, but so Snowflake talks about they have one SKU and that SKU is based on one database. Well, they just bought Crunchy, which is a Postgres database. Do they integrate that into their one SKU or is that a separate database? And how would they integrate a RelationalAI? Let's say they acquired a company like RelationalAI or partnered with them, do they integrate that? Is it still one database?
George Gilbert
>> It could be.
Dave Vellante
>> Like Oracle, although Oracle has MySQL HeatWave, so they actually have a couple of databases, but...
George Gilbert
>> I think it still can be one SKU, because instead of putting the transaction, if you want to execute a transaction to operationalize the decision, normally you've got to go to another application through reverse ETL. So-
Dave Vellante
>> So again, 101 to 301, we're talking here about companies like Snowflake are basically historical systems of analytics, right? I mean, it's what happened and you can apply AI to say what's likely to happen. But when you get into, you've played this point many times, George, what should we do next and with confidence? And allowing agents to actually take that action, that's a whole different ballgame.
Rebecca Knight
>> These are existential questions and we are going to get into all of them over the next two days of our live coverage of the Snowflake Summit. We're going to have a lot of Snowflake executives on, as well as customers, partners, people from Whoop, USA Today, JPMorgan Chase, Affirm, Box, AWS. It's going to be a great show. I look forward to working with both of you.
Dave Vellante
>> Thanks for hanging with us. Look at this rip on that. That was great, George.
Rebecca Knight
>> Exactly. I'm Rebecca Knight for Dave Vellante and George Gilbert. Stay tuned for more of theCUBE's live coverage of Snowflake Summit 2025. You're watching theCUBE, the leader in enterprise tech news and analysis.