Hi, everyone. Welcome to the Confluent Q3 2023 Earnings Conference Call. I'm Shane Xie from Investor Relations, and I'm joined by Jay Kreps, Co-Founder and CEO; and Rohan Sivaram, CFO.
During today's call, management will make forward-looking statements regarding our business, operations, sales strategy, financial performance and future prospects, including statements regarding our financial guidance for the fiscal fourth quarter of 2023, fiscal year 2023 and fiscal year 2024.
These forward-looking statements are subject to risks and uncertainties, which could cause actual results to differ materially from those anticipated by these statements. Further information on risk factors that could cause actual results to differ is included in our most recent Form 10-Q filed with the SEC.
We assume no obligation to update these statements after today's call, except as required by law. Unless stated otherwise, certain financial measures used on today's call are expressed on a non-GAAP basis, and all comparisons are made on a year-over-year basis.
We use these non-GAAP financial measures internally to facilitate analysis of financial and business trends and for internal planning and forecasting purposes. These non-GAAP financial measures have limitations and should not be considered in isolation from or as a substitute for financial information prepared in accordance with GAAP.
A reconciliation between these GAAP and non-GAAP financial measures is included in our earnings press release and supplemental financials, which can be found on our IR website at investors.confluent.io. And with that, I'll hand the call over to Jay..
First, the impact from two large digital native customers. One online gaming company moved workloads back to their own data center, and one of our largest customers ramped slower as they are in the process of being acquired.
We believe consumption from these two customers was impacted by their company-specific events and accounts for roughly 50% of the expected consumption shortfall for Q4. Second, the continuing macro pressure, including the ongoing conflict in the Middle East, where Israel is a top 10 country for us and the possible U.S.
government shutdown, both of which add uncertainty and disruption in particular segments. Specifically, we've seen slower organic consumption resulting from a slower rate of new use case additions in some parts of our customer base. Rohan will provide more details on our resulting guidance later in the call.
I want to spend some time now focusing on a critical change we're making to drive growth. Beyond just the friction in the current market environment, the critical project for Confluent is to capture the massive market opportunity in streaming.
This is a $60 billion market where we are still just scratching the surface of even the existing open source Kafka usage. And we have additional expansion opportunities from Flink, our connectors and data governance, as I outlined on our last earnings call.
Critical to our execution against this opportunity is leveraging our go-to-market engine to rapidly land new customers, expand new workloads and ensure the adoption of our full set of product capabilities. To this end, we'll be completing the transition to orient our cloud business around consumption.
This will make cloud revenue rather than bookings or committed spend the primary goal of the go-to-market organization for Confluent Cloud. This was a planned transition. Indeed, we began changes in this direction this year, but it’s a transition we’ll be significantly accelerating heading into 2024.
To explain what this means, let me start with a little background. In the traditional world of on-premise software, customers would make big upfront commitments. Salespeople worked with the customer to scope these commitments and were paid as a percentage of the resulting bookings.
The marketing organization measured pipeline based on these commitments, and every internal system and process was oriented around measuring and managing the bookings that resulted.
There was some misalignment between customer and vendor because the customer might end up over-purchasing, but this was masked by the fact that the rest of the stack, such as the servers that ran the software, were also fundamentally upfront and inelastic purchases.
With the advent of the cloud and the elasticity and flexibility it offered to customers, this model had to evolve. Cloud has a utility-like model where services are metered as they are used.
However, in the early days, the go-to-market engine for cloud infrastructure software largely remained as it was previously, selling customer commitments or credits that overlaid this dynamic usage. Alignment between customer and vendor improved somewhat, but the vendor still had an incentive to maximally scope customer commitments.
Over the last couple of years, businesses like MongoDB, Snowflake, Datadog and hyperscalers have all transitioned their go-to-market to a fully consumption-based model. In this model, the customer and the go-to-market organization are both oriented around the actual service usage, not the upfront commitment.
This fully aligns the customer value realization with the vendor's revenue. Less obvious from the outside is how this completely upends the sales and marketing model. Pipeline is no longer oriented around maximum customer commitment, but rather new logos and new workloads.
Salespeople aren't compensated for getting an upfront booking, but rather for what a customer actually uses, finding new workloads and driving new product adoption. This is an absolute win for customers and also a huge win for vendors, who are actually able to grow faster by removing much of the uncertainty and risk from customer purchasing.
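To make the mechanics concrete, here is a minimal, illustrative sketch in Python of how a purely metered bill compares with a committed-spend drawdown; the rates, usage figures and credit amounts are hypothetical and are not Confluent's actual pricing.

```python
# Illustrative only: hypothetical rates and usage, not Confluent's actual pricing.

HOURLY_CLUSTER_RATE = 1.50   # $ per cluster-hour (hypothetical)
PER_GB_INGRESS_RATE = 0.10   # $ per GB written (hypothetical)

def metered_bill(cluster_hours: float, gb_written: float) -> float:
    """Pure consumption: the customer pays only for what was actually used."""
    return cluster_hours * HOURLY_CLUSTER_RATE + gb_written * PER_GB_INGRESS_RATE

def draw_down_commitment(prepaid_credits: float, usage_charge: float) -> tuple[float, float]:
    """Committed-spend model: usage draws down prepaid credits bought at a discount;
    anything beyond the commitment is billed as overage."""
    drawn = min(prepaid_credits, usage_charge)
    overage = usage_charge - drawn
    return prepaid_credits - drawn, overage

usage_charge = metered_bill(cluster_hours=720, gb_written=5_000)  # roughly one cluster for a month
remaining, overage = draw_down_commitment(prepaid_credits=2_000.0, usage_charge=usage_charge)
print(f"metered charge: ${usage_charge:,.2f}, credits left: ${remaining:,.2f}, overage: ${overage:,.2f}")
```

In the consumption-oriented model described above, the vendor's revenue tracks the first function; the commitment, when the customer chooses one, simply prepays that usage at a discount rather than being the thing the sales motion is built around.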
With Confluent Cloud now at nearly 50% of our revenue, having NRR over 140% and continuing rapid growth, it's time for Confluent to complete our transition to this fully consumption-based model. We've already made the transition to usage-based pricing that bills for what is used.
But today, our go-to-market is still primarily oriented around booking customer commitments. The final step in our consumption journey is to now fully align our go-to-market operations to the consumption motion. This directly attaches our go-to-market efforts to cloud revenue and to our customers' value realization.
This is one of the most important possible growth levers for Confluent. It is also necessary that we do this now. As we've seen across the industry this last year, economic pressure combined with the changing norm in cloud means customers are increasingly reluctant to make large multiyear commitments ahead of their usage.
This means over the course of this year, the misalignment between our subscription-based go-to-market and the natural buying behavior of customers has increased, creating a drag on our cloud growth.
This has shown up in the dislocation between RPO and cloud revenue, but ultimately affects both, as our interactions with customers are directed in a way that is out of sync with the customers' natural buying behavior. We believe this change will turn what is currently a drag into a tailwind.
This is not a new plan for Confluent; we'd originally planned to make this shift over a three-year period, beginning with steps this year. However, in light of the change we've seen in buying behavior, we'll be accelerating this and completing the transition next year. In practical terms, here's what this means.
First, beginning in Q1 of FY24, we're shifting from our current model, where 10% to 15% of cloud sales compensation is based on consumption, to a model where 100% of cloud sales compensation is based on incremental consumption and new logo acquisition. We will keep the vast majority of our customer revenue under a committed contract, as we do today.
However, we will not be attempting to get commitments ahead of the usage. Rather committed amounts will be customer-driven as customers choose to commit in exchange for greater discounts. Second, we're fully orienting our field-facing teams towards landing new customers and driving new workloads with customers.
Third, we'll be adapting our product and pricing to enable customers to frictionlessly try and adopt new products, enabling easier lands and adoption of new features.
Finally, we'll be undergoing a significant reworking of our systems for planning, growth, building and measuring pipeline, and forecasting performance, as all of this shifts to drive directly off cloud revenue. A number of our cloud-oriented peers have made this transition to very positive effect. So the path and benefits are clear.
Like the transition to the cloud we began several years ago, we expect to emerge stronger on the other side, more aligned with our customers and better positioned to capture the $60 billion opportunity in front of us. In September, we held Current 2023, the only industry event dedicated to the data streaming ecosystem.
It was a high-energy event and a fantastic illustration of the excitement and innovation around data streaming. The event included speakers from BMW, NASA, Nationwide, Snowflake, Uber and many others, as well as thousands of practitioners who gathered to discuss the state of the art in data streaming.
One of the things I'm most proud of is the velocity of product innovation the team has sustained over this last year, and I think our announcements at Current are a great illustration of this.
We announced a new tier of Kafka cluster, Enterprise clusters, which offer many of the advantages of our dedicated clusters, like private networking and enhanced security, but include instant elasticity and are served off the multi-tenant version of our core stack. This allows a better price point for customers and significantly lower serving costs for Confluent.
These lower-cost clusters are perfectly aligned to our consumption transformation, since they lower the entry point price but scale up automatically as you need them. Data Portal is an important new addition to our stream governance suite that brings our vision for discoverable and reusable data streams to the forefront of Confluent Cloud.
The excitement around Flink at Current was palpable. Flink sessions were among the highest rated and most attended sessions at the entire conference, highlighting the hunger Kafka users have for Flink. That's why we're so pleased with the launch of Flink public preview in Confluent Cloud.
Since the announcement, we've seen incredible uptake with hundreds of customers opting into the preview and trying out our cloud-native and serverless Flink offering. The addition of Flink strengthens our position as the only complete data streaming product.
We bring Flink together with the connectors that capture streaming data, the stream itself in Kafka and the governance capabilities to manage streaming data across an organization. Each of these capabilities strengthens the other, and the combination comprises what we believe will be the most important data platform in a modern company.
And finally, we introduced data streaming for AI, a set of new partnerships that span vector databases, CSPs and SIs. It also includes new product capabilities, including the Confluent AI assistant that will turn natural language inputs into helpful suggestions and code.
We expect this initiative to address the demand we're seeing across our customers who are building innovative new AI applications. A great example of this is Notion. Notion is an AI-powered connected workspace where modern teams can create and share documents, take notes, manage projects and organize knowledge all in one place.
Given Notion's strong growth, a new data streaming platform was needed to meet its rapidly expanding needs. However, with a lean engineering team, they required a fully managed service so they could focus on product development rather than managing Kafka.
They began using Confluent Cloud for real-time data flows for internal analytics pipelines and to supply data into data lakes. After realizing the value it provided, Notion expanded the usage of Confluent Cloud to enhance product features, including search automation and Notion AI.
Now Notion’s customers can automate tasks in real time, such as adding summaries, extracting key points and consolidating action items in meeting notes. This is only the beginning as Notion continues to innovate and actively explore how data streaming can power new AI applications.
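For a rough sense of what a real-time pipeline like this looks like in practice, here is a minimal sketch using the open source confluent-kafka Python client to produce application events to a Confluent Cloud topic; the endpoint, credentials, topic name and event shape are placeholders and not details disclosed by Notion or Confluent.

```python
# Minimal sketch: hypothetical topic, credentials and event shape.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "<BOOTSTRAP_SERVER>",   # Confluent Cloud endpoint (placeholder)
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
})

def delivery_report(err, msg):
    # Called once per message to confirm delivery or surface an error.
    if err is not None:
        print(f"delivery failed: {err}")

# A hypothetical product event feeding an internal analytics pipeline.
event = {"user_id": "u-123", "action": "page_edited", "ts": "2023-11-01T12:00:00Z"}
producer.produce("analytics.product-events", key=event["user_id"],
                 value=json.dumps(event), callback=delivery_report)
producer.flush()
```

Downstream consumers, connectors into data lakes, or stream processing jobs can then read the same topic, which is the pattern described for the analytics and AI use cases above.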
Before turning things over to Rohan, I wanted to reiterate a couple of key points. We're incredibly excited about the tailwinds that Flink, data governance, AI and the rest of the data streaming platform components add to our business.
And while it may cause some short-term headwinds, I firmly believe our accelerated transformation to a fully consumption-oriented business will put us in an incredibly strong position to drive the monetization of these new offerings. I've never been more confident in our ability to be the leader in the emerging $60 billion data streaming market.
With that, I'll turn things over to Rohan..
First, we expect to have a full quarter impact from the two large customers mentioned earlier, whose consumption is impacted by their company-specific events around the shift of workloads back to on-prem and an acquisition. We estimate that these two customers account for roughly 50% of the expected consumption shortfall in Q4.
Second, the macro uncertainty, including the ongoing geopolitical tensions will likely persist, impacting new use case deployments into production and driving lower-than-expected consumption.
As Jay discussed earlier, the current macro has increased the misalignment between our subscription-based go-to-market and the new buying behavior of customers, which creates a drag in our consumption growth.
We believe our accelerated move to a fully consumption-oriented comp model for Confluent Cloud will turn this drag into a tailwind for our business. Turning now to guidance. For the fourth quarter of 2023, we expect revenue to be in the range of $204 million to $205 million, representing growth of 21% to 22%.
Cloud revenue to be approximately $97.5 million, a sequential add of $6 million, representing growth of 43% and accounting for approximately 48% of total revenue based on the midpoint of our guide. Non-GAAP operating margin to be in the range of 0% to 1% and non-GAAP net income per share to be approximately $0.05.
Additionally, we expect the free cash flow margin to be in the range of 0% to 1%. For the full year 2023, we expect revenue to be in the range of $768 million to $769 million, representing growth of 31%, non-GAAP operating margin to be approximately negative 9%, and non-GAAP net loss per share in the range of negative $0.01 to $0.00.
Looking ahead, I'd like to provide an early read into our outlook for next year.
Our fiscal year '24 preliminary outlook assumes the impact of a continued volatile macroeconomic and geopolitical environment, the dynamics mentioned earlier for Q4 continuing into 2024, and risk associated with our transformation to a fully consumption-oriented business.
If macro were to improve, we would expect to benefit from it, but it is too early to tell. Given these factors for the full year 2024, we expect revenue to grow approximately 22% year-over-year, non-GAAP operating margin to break even, improving approximately 9 percentage points year-over-year.
This is despite a 2- to 3-point headwind associated with our move to a consumption-based sales commission plan, resulting in higher upfront expense recognition. And we expect free cash flow margin to break even. I'd like to highlight a few things about our consumption transformation.
Despite potential near-term top line impacts on our fiscal year '24 outlook, we expect the transformation will enhance our ability to drive durable and efficient growth over both the midterm and long term and will put us in a stronger position to capture our $60 billion market opportunity.
We believe subscription revenue, which captures ACV from Confluent Platform and consumption from Confluent Cloud, will be the best indicator of our success. Starting Q1 next year, we will include subscription revenue in our key financial metrics and move our quarterly and annual revenue guidance metric to subscription revenue.
And consistent with that, RPO and CRPO will be less relevant as a forward-looking indicator given the greater emphasis on consumption over ACV-based commits for cloud.
We expect our non-GAAP operating margin midterm target of 5% to 10% and long-term target of greater than 25% to be firmly intact and for free cash flow margin to continue to trend roughly in line with operating margin. In closing, we are pleased with delivering a solid Q3 in a challenging environment.
Our net and gross retention rates remain strong, reflecting the durability and resiliency of our growth. We remain committed to driving growth and improving profitability while transforming our cloud business to be fully consumption oriented, and we are excited about capturing our market opportunity ahead. Now Jay and I will take your questions..
Thanks, Rohan. To join the Q&A, please raise your hand. And today, our first question will come from Sanjit Singh with Morgan Stanley followed by Deutsche. Sanjit, please go ahead..
I had two, I guess, one for Jay and one for Rohan. Jay, the spending environment hasn't been particularly great kind of all year.
And so, I wanted to get a sense of outside of those two customers that may be a little bit idiosyncratic, like what's changed in the environment? When did you start to see the change in the quarter? And how much of that would you attribute to macro versus just sales execution?.
Yes. Yes, it's a great question. So obviously, the two customers are kind of a significant impact, but particular circumstances in each. I do think we've felt kind of pressure on the number of net new software projects that are just getting funded throughout the year, and we've talked about that.
Kind of building over time, that's the motivation for this full shift to consumption that I talked about.
I think when I think about our execution, there's always something in one customer or another that could be done better, but the biggest systematic thing is really making sure we're lining up to drive the adoption in new projects, making sure that we're attaching to each thing that's happening.
This is something we've watched in peer companies and it's worked really well for them. And I think just because we're a younger organization, we're maybe a year or two behind in that journey.
And what we saw over the course of the year is definitely customers adapted to this environment, the combination of pressure on IT budgets, along with the switch in the behavior of our peers has really led people to kind of consume and then commit as it were.
And you really want to have then your go-to-market motion focused on the consumption side of things like really driving the adoption and the use cases that becomes extra important. And so, on the execution side, I think that's absolutely the biggest change we can make. And we're just very excited about the impact that can have.
I mean, obviously, there's some adjustment period as you go through switching all your internal systems and a number of the different definitions from pipeline to comp, et cetera. And of course, even adjusting some of the sales motions.
But if you think about why we're doing it, it really is to be able to align that to attaching to the next new project, making sure that that has streaming and Confluent as part of it, making sure that these new components of the data streaming platform, Flink and the connectors and the governance capabilities, making sure that that gets adopted and the customers are really consuming that.
That's far and away the biggest lever we feel like we have to drive additional growth..
Understood. And then Rohan, for you, I guess the question is in terms of your initial view into calendar 2024.
I guess my question is that for Q4 of 2023, you're guiding to 21% to 22%, and then for next year, you're guiding to essentially sustained growth. How did you come up with the 22% number? And do you see any risk to sustained growth going into 2024, given what you're calling out from a sales force transition perspective and from a macro perspective? Would just love a little clarity on how you set up the 2024 guidance..
Great question, Sanjit. Yes, when thinking about our 2024 guide, I kind of put it into three buckets. Starting off, we called out the two large customers that had an impact; although the implications are customer specific, they will have an impact in Q4 as well as in 2024.
The second category is around the macro, which, I'd say, is a combination of the geopolitical exposure to Israel, coupled with the slowdown in new use cases that we've been seeing. And the third category is the consumption transformation.
And as Jay just called out, any time you go through a transformation like this, there's going to be this adjustment factor. And we are trying to prudently bake the impact of that into our guidance. So, taking a step back, these are the three drivers that impacted our guidance.
But when you think about the first half versus the second half, we expect the second half to be slightly better than the first half, clearly because the consumption transformation will have, I would say, a larger impact in the first half of the year.
I mean, I also want to call out, Sanjit, that as we are exiting 2024, there will be a decent amount of tailwinds. First of all, the consumption transformation, which we expect will be behind us and which will reduce the friction between our go-to-market teams and how our customers want to buy our products.
Second, we'll have a decent amount of product tailwind behind us. We'll have Flink, which will have been in GA for about 6 months, and a couple of other unlocks from the data streaming platform perspective, coupled with FedRAMP and AI. So exiting 2024, we feel pretty good about where we are and just in general from a long-term perspective..
We'll take our next question from Brad Zelnick with Deutsche.
Brad?.
Thank you so much, Shane. And it certainly is tough out there. And really appreciate the disclosure and granularity that you've given us. I wanted Jay to hone in on the one customer that you said is moving back into their own data center. Because that's not something we typically hear.
We've heard about this notion of repatriating cloud workloads, but it's hard to actually find it.
So, any context that you can add as to why on earth that might be happening, why that particular customer wouldn't have just used Confluent Platform at that point? And any reason to believe that this is the beginning of a trend?..
Yes, it's a great question. So yes, like you always hear a little bit about this like move out of the cloud. And by and large, we don't see it, maybe in the history of the Company, I could say one or two examples where that's happened. But yes, I don't think it's a broad-based trend. It's definitely motivated by cost.
When you think about the kind of pressure on IT budgets, it's far and away the most significant in the digital native sector. These are companies that, in some cases, are cutting 30% of spend. And so, that means very significant adjustments.
In this particular company, that is coupled with a strategy of moving stuff out of the cloud into their data centers, and Kafka and Confluent kind of get dragged along with that. We're in discussions with them on Confluent Platform, but that kind of overall shift is not something that's Confluent specific; it's part of a broader strategy..
And maybe just a follow-up for both yourself and perhaps Rohan. The new enterprise cluster that you’ve announced at Current, sounds like it's really powerful.
How much might this present a headwind to consumption as it seems you're making customers way more efficient, requiring fewer reserve instances? And is that something that you contemplated in the guide that you've given us?.
Yes. I mean, of course, that and all these other factors are baked into what we've given, but we think that's a tailwind. So when we think about aligning to the consumption transformation, a lot of that is on the go-to-market side, but it's also about the product. We want people to be able to land with the least amount of friction possible.
And then we want to make expansion as easy and automatic as possible.
And so, it's hard to overstate how much better it is when we move things from an instance of Kora that's running dedicated to that customer to something multi-tenant. What happens is the customer experience gets much better because the cluster just expands instantly as you need it. There's no kind of manual invocation you have to make.
And secondly, it's obviously very positive for Confluent in terms of the efficiency. So, it allows us to do something that's a better deal for the customer, but really a much better deal for Confluent as well. So, it's kind of positive on both sides.
And our goal in this is to take a lot of this friction out of the system in adoption and expansion, make it as easy as possible to land the first use case and make it as automatic as possible for customers to expand. So yes, we expect it's a tailwind, even though you may be starting at a lower initial price point..
We'll take our next question from Kash Rangan with Goldman, followed by Wells Fargo.
Kash?.
It's always tough to go after Brad Zelnick who asks such good questions, but I'll try my best. Jay, I think there was a comment made that as you exit this transition, the Company is going to be in a better position.
So can you just recap for us how, structurally, you're able to prosecute a better TAM more easily because of the switch in your go-to-market model? And secondly, just to play devil's advocate, is the cloud platform missing certain things, such that you have to lower the barrier by giving an alternative to go easy on the consumption? Because I wonder what this gaming customer saw that made them feel compelled to put it in the data center and not get the benefits of the cloud.
Is there something missing on the cloud platform side itself, by way of functionality or whatnot, that this customer might have seen and we might be missing here, that you plan to introduce so the functionality set becomes less of a friction point going forward?..
Yes. Let me address both. The first, I think, is the more big-picture one: like, hey, how does this change -- does this change the market or our ability to address the market, how and why? So, I'll start with that big-picture question. So like, yes, I think the opportunity in streaming is as big as ever, probably better even than it was a year ago.
I mean, if you just talk to customers, the level of excitement in this space, the adoption of stream processing, all of that is really at the forefront of people's minds. We just had Current, our conference, had a number of customer conversations there. This is something people are absolutely thinking about.
In the current environment, all of these forward-looking transformations in our customers, they have to moderate the pace, right? So whether it's cloud adoption, the move to real-time streaming, all of this stuff is happening a little slower. But this is absolutely right at the top of their agenda. So, I don't think any aspect of that changes.
And then, yes, exiting '24 what are kind of the tailwinds? I think Rohan did a good job of speaking to this.
So, you want it to be the case that the energy we're devoting in our go-to-market organization is about finding these use cases, making sure they're adopting the full part of the product, really consuming all the aspects of the data streaming platform, Flink, the connectors, our Kafka offering, all of it.
And you're doing that as rapidly as possible across the organization. And so, the -- what's important to get right in that is get all the systems pointing in that direction. And then that really becomes a driver for growth.
What you don't want is something where you're very focused on the amount of the commitment and trying to get customers to commit ahead of these projects. That kind of puts them into too much of an analysis paralysis of trying to scope out every new application you're going to build -- how much is it going to consume exactly, commit exactly to that amount. That ends up being effectively wasted time versus just getting them going and getting them to use the product.
I think you've seen this change in effectively all the peer companies.
For those that start with an on-premise offering as we do, they have to introduce it gradually as they have enough cloud revenue to really make it make sense, as we do now, right? For those who were kind of born with a cloud-only offering, this is perhaps the native thing they started with or perhaps there was some adjustment earlier in the life cycle.
So, that's the change. That's why we're doing it. It's had positive impact for all the peers. We expect it will have that exact same impact for us.
And of course, as we prosecute that, we want to make sure we get the impact as soon as possible, right? So, we've, I think, prudently baked some impact into the first half of the year that, hey, we think as we kind of change all our systems, there's some adjustment period involved in that.
But yes, as we're exiting the year, we think that's absolutely a tailwind. And that combines with and in many ways, accelerates these new product offerings that we're adding, the connectors, governance, Flink offerings, they're really coming to maturity throughout the course of that year.
Some of the security unlocks, like the work that we're doing on FedRAMP, also let us address other kinds of regulated industries. That's a significant unlock. Those things are really powerful in allowing us to take on more use cases, more completely, with less work from customers.
And of course, these are mostly line items that customers purchase directly, so allowing us to increase the spend with customers. And then yes, is there something missing such that we would make changes like this? I don't think so.
I mean, if you look at peer companies, of course, there's always more we can do in our product, and we continue to invest and innovate. But this idea of just aligning to consumption, trying to make that as low friction as possible, that's actually really important for us.
And especially the shift of workloads into multi-tenancy, it's kind of just net positive; even though that entry price can be a little bit lower, the cost to us is so much lower that it's just disproportionately much better. It's exactly what you want to have happen. And that takes a lot of the friction out of the conversation.
And this is really important in tighter environments where customers have to think not just about the TCO and the payoff, but exactly that transition cost in time and whether they have the ability to get from here to there. You want to make that a no-brainer so the customers can just get up and get going.
That to us is key when we look at all the open source usage, we want to go scoop that up as quickly as possible..
And Kash, just to add to what Jay said, the customer we referenced that was not a Confluent specific decision. That was just a broad, I would say, architectural decision that they made, and we happen to be impacted by it. Just wanted to share that color..
Yes, yes. In case that's not clear, like the decision to start moving workloads into your data center is typically something much broader than Confluent itself..
Do you have obstacles set for other customers who might be contemplating this nutty kind of thing to make sure that they don't do it in the future?.
I just don't think it's a -- every year, you hear about 1 or 2 companies that do something like this. And every year, the total spend on cloud as a percentage of IT spend goes up. And so I just don't think it's a broad-based trend. That one I do think is a specific instance in that customer.
If you think about what's the kind of overall pressure that causes stuff like that, it is this kind of pressure on efficiency and spend, and companies do different things to try and get that. This is what they happen to pursue. So, the right adjustment to us is not something where we advocate for customers not to move back into their data centers.
The right adjustment is this consumption transformation: really line up to make it as easy as possible to expand and attach to projects. That's going to be the thing that really drives us forward kind of broadly across the customer base..
We'll take our next question from Michael Turrin with Wells Fargo followed by JPMorgan.
Michael?.
Jay, I want to go back to some of the comments on just the cost side of customer discussions. I appreciate the details you're breaking out. Is there anything new that you're seeing from an optimization standpoint that's impacting the spend profile of existing customers? And you made some reductions to storage costs at Current.
So I'm wondering if that's something that all impacts customer spend and expectations in subsequent periods..
Yes. I mean, customers continually optimize their environments. I would say what we're kind of reacting to more is making sure we get all the new projects, right? So if you think about two forces that are happening in cloud right now, one is more about optimization of existing usage. There's, of course, some of that in our customers at all times.
But the other thing is just how many net new workloads are there. For a production system like ours, I think it's more about the new workloads than it is about optimization of environments. And so that's kind of where our focus is when you think about what we're doing.
This consumption motion is really about making sure you land and attach to everything that's going on in the organization.
And then, on the storage side, yes, similar to the enterprise clusters, these are efforts that are designed to accelerate consumption, particularly when you think about Flink and stream processing, it becomes important to not just process the data once, but if you change your logic, be able to go back and reprocess it again.
So the incentive to want to store data becomes much higher in Confluent and we want to make sure that it doesn't become cost prohibitive as customers go after that. So in part, we're trying to set ourselves up for this preview release of Flink that we just did as that becomes GA.
We want to have a cost structure that makes sense with the stream processing capabilities. And we think the result is going to be net positive in terms of how much data companies are storing..
Okay. That's helpful. Just a follow-on, if I may, on the go-to-market changes you're making around consumption.
Can you just help characterize how big or natural a change that is from the sales side? And maybe, Rohan, if you have a view on whether that impacts visibility into the cloud trends you'd expect going forward, either favorably or less favorably, that's also useful. Thank you..
Yes. It is a big change, right? So if you think about how an enterprise software company works, in this traditional model, everything really is organized around booking of commitments. That's what pipeline is. That's what the sales motion tracks. That's the line item in Salesforce.
And you kind of are changing all that, where the thing you're going after is the use case, the thing that you're driving is consumption, the definition of pipeline is different. So there's a big change. Now it's not like the cloud transition we went through. That was a really big change.
This is an adjustment that we're kind of factoring into the first half of the year. Our desire and the thing we're working towards is hit that as smoothly as possible. Like we're doing this because we think this is absolutely a positive, right? It's a thing that will allow us to grow faster.
But when you make a bunch of changes all at once, you want to factor that in. You want to be prudent about how you do that, and that's why we've made the adjustment. This wasn't a new plan, like we're not doing this purely in a reactive way, where we got halfway through the year and we were like, oh, let's change everything.
We had laid out a three-year plan for how we were going to do that. We were in year one. And we were going to do a little bit more next year and then a little bit more the year after.
But realistically, when we just looked at what had happened to customer buying behavior, like how are customers behaving with regard to these consumption models and then how are we set up to work with them. We just realized, hey, this is not aligned. It does not make sense to slow roll it. We should just do it fast.
Maybe it's a little bit more disruptive at the very beginning of the year, but I think you get to something net positive, much, much faster in that world. And it's -- in some sense, easier than having one leg on each horse for an extended period of time..
And Michael, to your question around visibility, I'd say that when you look at the view of the world in the old scenario where customers are still committing ahead, obviously you have the visibility of RPO and CRPO. But in a world of consumption, it's always about trying to predict what the customer is going to do. It's not ratable revenue.
It always follows a curve; you need to predict what your customer is going to do. And how do companies solve it? They essentially build predictive models with multiple factors that predict that curve.
In the new world, of course, what we are expecting and what we think is going to happen is that the customer commitment is still going to be there, but they're not going to be committing ahead of time. They're just going to be committing to their existing levels.
So how do we get the predictability? As Jay alluded to, like -- we get the predictability by understanding what are the use cases they're going to drive.
So the consumption transformation and all the work that we are going to do here will provide us, I'd say, added visibility into the work streams, which will help us predict revenue in a much better manner.
And that's one of the reasons why I called out in my prepared remarks that going forward, subscription revenue will probably be a better indicator of the health of the business..
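As a loose illustration of the curve-based forecasting Rohan describes, here is a minimal sketch that fits an exponential trend to a few months of consumption and extrapolates it forward; the figures are hypothetical, and a real model would use many more factors than this single trend.

```python
# Illustrative only: a single-factor trend fit; real consumption forecasting
# would use many more signals (new workloads, seasonality, customer cohorts, ...).
import numpy as np

monthly_consumption = np.array([80.0, 84.0, 89.0, 95.0, 101.0, 108.0])  # hypothetical $K per month
months = np.arange(len(monthly_consumption))

# Fit log(consumption) = slope * t + intercept, i.e. an exponential growth curve.
slope, intercept = np.polyfit(months, np.log(monthly_consumption), 1)

next_quarter = np.arange(len(months), len(months) + 3)
forecast = np.exp(intercept + slope * next_quarter)
print("implied monthly growth: {:.1%}".format(np.exp(slope) - 1))
print("next 3 months forecast ($K):", np.round(forecast, 1))
```

The point of the sketch is simply that visibility in a consumption model comes from understanding and forecasting the usage curve and the use cases behind it, rather than from the size of an upfront commitment.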
And you can see exactly why it is, right? Like if we work with a customer, we're trying to scope out every workload over the next two years and commit to lock that in. Of course, there's some predictability like in the two years, we know what the minimum they have to spend is.
But if we're successful, they should be growing their consumption much faster than that curve anyway. And the effort of trying to really think of each application and how it's going to be used and exact usage pattern and all the uncertainty that pushes on to the customer is actually just really high.
And so taking that more out of the equation and just having customers really do it in a customer-driven way, where the customer locks in the right level of commitment based on what they know they're doing, that's effectively what all the peer companies have done. It's worked very successfully for them.
And that allows you to then really orient the go-to-market organization around finding those use cases rather than getting people to kind of pre-commit to them..
We'll take our next question from Pinjalim Bora with JPMorgan, followed by Bank of America..
One clarification on the gaming company that you're talking about.
Can you help me understand: was that company on a committed contract that moved to open source, or a committed contract that moved to Confluent self-managed? What was the transition exactly?..
Yes. So, this was a set of applications running in the cloud on Confluent Cloud. That's moving into their data center, initially on open source Kafka..
Got it. Understood. Okay. That's clear. One question on Flink in general. We have spoken to a lot of your partners.
And it sounds like without Flink, and this is more of a high-level question, people are using Kafka Streams on AWS with a Java-based application which is squaring into your Confluent clusters, right? That's compute that is not being captured by Confluent today. I'm assuming Flink kind of allows you to capture that compute.
But then we're also seeing AWS kind of coming up with their managed Flink service, right? I'm trying to understand, Confluent being able -- now that you have -- you own the data persistence layer and the processing layer as a whole, can you actually provide or offer something which is more than a point-product Flink service that might be in the market?..
Yes, we absolutely can, right? The challenge in the streaming world has been, for a long time, that customers are left kind of piecing together the parts on their own to build a data streaming platform.
So you get Kafka, and then maybe you add on some Flink thing you get somewhere else -- and then the parts don't work together: the security models are different, the governance and catalogs of data are different, it's all different. So it's kind of left as an exercise to the reader to put those things together.
That's not the recipe for success, right? We think in this area, customers want it integrated. So just in what we demonstrated at Current already, I think we've taken amazing steps: a unified security model across all of this, it handles all the connectivity, a unified model of data.
If you go and create a new stream in Kafka, it's available there for you in Flink. You don't have to go and recreate it in Flink again. The ability to actually just seamlessly use these two together and do it without having to preallocate anything. You can go issue a query against your Kafka cluster.
It will scale up and run that processing and scale back down to 0 cost if it finishes. So already, that integration is leaps and bounds further than anything that's been out there, I think, in the streaming world, integrating both the stream and the processing.
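For a rough sense of what such a continuous query looks like, here is a minimal sketch using the open source Apache Flink PyFlink Table API over a Kafka-backed table; the topic, schema and connector settings are hypothetical, and Confluent Cloud's serverless Flink preview exposes this as SQL in a managed workspace rather than through a self-managed environment like this.

```python
# Minimal open source PyFlink sketch; topic, schema and connector settings are hypothetical.
# Running it requires the Flink Kafka SQL connector jar on the classpath.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Register a Kafka topic as a streaming table.
t_env.execute_sql("""
    CREATE TABLE orders (
        order_id STRING,
        amount   DOUBLE,
        ts       TIMESTAMP(3),
        WATERMARK FOR ts AS ts - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'orders',
        'properties.bootstrap.servers' = '<BROKER>',
        'scan.startup.mode' = 'earliest-offset',
        'format' = 'json'
    )
""")

# A continuous windowed aggregation that keeps running as new events arrive on the stream.
result = t_env.sql_query("""
    SELECT window_start, window_end, SUM(amount) AS revenue
    FROM TABLE(TUMBLE(TABLE orders, DESCRIPTOR(ts), INTERVAL '1' MINUTE))
    GROUP BY window_start, window_end
""")
result.execute().print()
```

The key idea from the prepared remarks is that in the integrated offering the table definition step goes away: a stream created in Kafka is already available to Flink with a shared security and data model.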
And so if you think about, hey, what are our advantages in doing this -- stepping back to the question behind the question -- I think there are a few reasons we feel like we have a lot of ability to succeed in this beyond any other competition. First of all, Flink is this platform that has been incredibly successful, that everybody wants.
Second, we have the ability to integrate that stream with the processing. And if you think about the model for data systems, that's just much easier. You could buy a database in parts where the query layer was separate from the storage engine, you can piece it together yourself, but that just doesn't seem like the winning formula.
And it's obviously and very concretely a hassle today. And then finally, one of the things we're seeing is that people are thinking about this processing and streaming together, and that's moving upstream in organizations.
And so, when we think about any competition in this space, most of it tends to come from different destination systems, but increasingly, the processing is kind of happening at the source and going out to all the destinations. So structurally, those are the three factors that we think give Confluent a unique advantage in this.
Our expertise in Flink and just the fact that we have the core people driving that forward, the ability to combine it with Kafka, the kind of de facto standard for the stream and really provide synergy between those things.
And then structurally, the fact that the processing of data is moving upstream to the source where the stream is created, which allows us to bring together that combination at the right time..
We'll take our next question from Brad Sills with Bank of America, followed by William Blair..
Wonderful. Thanks, Shane. I just wanted to ask a question here on this transition here, maybe one for you, Rohan.
Should we think about this headwind, if you will, subsiding at the one-year anniversary of the change, or is there some other way to think about the timing of when that headwind kind of bottoms out, if you will?..
Yes. Brad, that's a great question. And as I shared in the prepared remarks, there are a couple of headwinds heading into 2024, which we spoke about: the couple of customers, macro and the consumption transformation.
And as you think about the shape for next year, we feel that the consumption transformation will have a bigger impact from an adjustment perspective in the first half of the year than the second half of the year. So that's one data point.
And I think Jay and I touched on it: exiting Q4 of next year, we'll have a few tailwinds, and I'll just reiterate a couple of them. The first one is the consumption transformation will be behind us. And what that means is how our customers want to consume Confluent's products and services will be very aligned with how we are going to market.
So that's number one. Number two, with respect to the product unlocks, of course, we'll have Flink, which will be in GA, coupled with other unlocks on the DSP side; there's going to be FedRAMP exiting Q4 '24, and there's going to be tailwinds from AI.
So overall, we feel that there are a decent amount of tailwinds exiting 2024, which gives us confidence that we'll get back to 30%. It's just timing. We just don't want to get into '25 guidance and specific timing, but that's the trajectory that we are thinking through..
And then maybe that leads into my next question here. You mentioned a few potential tailwinds exiting next year.
Maybe to you, Jay, the Flink opportunity, FedRAMP, AI, what are you most excited about when you look at the pipeline and some of these initiatives' potential to accelerate growth?..
Yes. I mean, I think probably what's most exciting, big picture is actually just -- if anything, there's been an acceleration in interest in streaming.
I would think about this in stages where this is a market that went from kind of very early nascent kind of evangelical to something that now just increasingly very senior people and organizations believe in and see happening and are trying to adjust to it and figuring out what does this mean for us and how do we take advantage of it? I mean, I think that's probably the biggest thing overall.
In terms of our products, I love them all. That's why we're putting all the energy into it. That's why even in a year of a lot of efficiency, we really made sure that we had sufficient investment to bring together the rest of this data streaming platform, and we just weren't willing to cut that stuff.
I think probably the biggest thing for us in terms of transformation of what we do is the Flink offering. I mean, that's what brings these application workloads directly into our platform, what makes it so much easier to build some of these things that actually capitalize on streaming to kind of run the logic of the business.
In some sense, that's the most exciting, but they actually all play together. Without the connectors that get the data, you don't have the streams to process. Without the innovation in Kora, you don't have the ability to really handle the streaming data across a big organization.
Without the governance stuff, it's actually just hard to use this and have correct guarantees around data and reason about it. So they all kind of play together to form the platform. So I think they're all kind of a powerful part of that story. And I think that story is why people are so excited about this space..
We'll take our next question from Jason Ader with William Blair, followed by TD Cowen.
Jason?.
Jay, I know you said you haven't seen this in the past, but why wouldn't repatriation be a trend in the digital native space, if you can save 30% on costs? And then, what percentage of your revenue actually comes from digital natives today..
Yes. So, I should clarify the comment I made. What I was saying is if you look at the amount of kind of operating margin savings that companies are targeting, it's often a very large percent. I'm not saying that moving into a data center saves you 30%.
I'm just saying, if you're trying to save 3% in operating margin improvement, you move some things around in little, small ways. At 30%, you're going to do something big, right? For Confluent, we've made really substantial improvements that came out of a combination of growth, structural improvements, optimization of cloud, a whole bunch of things.
But it was a bunch of moving parts to make it happen. If you look at other digital native companies, they're also doing big things. I think for 99.5% of companies, moving into your own data center is going to be a lot more expensive. But that's up to them to model out and figure out. But I'm not saying that you'll see 30% savings from that.
Does that make sense?.
Well, I mean, there's -- I guess, there's just more examples. I know CrowdStrike moved back on-prem, right? And they saved, I think, 2 or 3 points of gross margin. So, it seems material that some of the digital natives are getting significant savings, maybe not 30%, but significant..
Yes. I mean, look, we can speculate either way. I guess, I feel like we hear about this every couple of years, really. We heard about Dropbox doing it. Every year, you hear about some example; companies selling private cloud stuff put it on the front page of everything to make a big deal about how the cloud is over.
That's not really what we see, like broadly, when we talk to our customer base, including the digital natives, I don't see a big movement there. I do think you see a lot more optimization of cloud, the usage of public cloud, that's certainly happening. But I don't see a big move out as being realistic. I just don't think it makes sense.
But we'll see for us. We obviously sell across both cloud and on-premise. So to some extent, we're like a little bit agnostic. I personally as a technologist, just don't see it as a likely trend. I did this analysis internally when I worked at LinkedIn. I didn't think it made any sense. We were using the public cloud.
Public cloud has gotten cheaper since that time. So I don't think it's likely, but obviously, either way, we're happy to work with the customers..
And then what percentage of your revenue -- maybe for Rohan, what percentage of revenue comes from digital natives and then what percentage of your revenue comes from Israel?.
Yes. Israel happens to be a top 10 country, Jason, and so that's baked into the outlook that we shared both for Q4 and 2024. The digital native piece, I think it's an opportunity for us. I feel that where -- we have more opportunity to take share in the digital native. So that's obviously something that's ahead of us versus behind us.
So although we are seeing a couple of digital native customers that have been impacted by slower-than-expected use cases, the bigger opportunity is ahead of us from a digital native perspective..
And if that sounds weird, the thing about Confluent is we kind of came back to tech; open source Kafka started in tech. But in terms of our sales, we actually started with more kind of traditional enterprises and, as our cloud got to large scale, came back to tech. So on one hand, it's a sector that's a little turbulent right now.
On the other hand, we're kind of underpenetrated relative to the Kafka usage. And so there's kind of two countervailing forces. That does make it a little bit of a less predictable area. But like Rohan said, we actually think there's a lot of opportunity there..
Thanks, Jason. We'll take our last two questions from Derrick with TD Cowen and Raimo with Barclays. Derrick, you go first..
Jay, when you go through such big go-to-market and sales comp changes, this is likely to be disruptive from a -- kind of from a sales attrition standpoint.
Are you able to talk about how much proactive change you need to make to the makeup of the sales structure and what that may mean for churn? And then in a kind of consumption go-to-market model, do you feel like you need less sales coverage than in the old go-to-market model?..
Yes. That's a great question. So, yes, I don't think it's going to be that disruptive because we did start the process this year. So we're measuring consumption. There's a portion of compensation that's already dedicated to it. We've kind of taught our team a little bit of this.
So, it's not like we haven't dipped our toes in the water, and that I think takes a lot of the risk out of it. We kind of know that people can be successful with us. So, that's kind of what gives us I guess, the confidence to jump in with both feet.
I think realistically, if we were trying to do this from a cold start last year, I think it would have been quite disruptive and highly risky. I feel pretty good about what we're doing. Now, it's still a big change. It does mean that those primary metrics are all shifting and are kind of all shifting together.
I think -- when I think about it from the team's point of view, I think it's actually exciting. I think people feel this friction today and how they're working with customers. I think doing the thing the customer wants is actually great when you're in a go-to-market organization.
And if you look at the kind of growth in cloud revenue, we've done really well, right? The pressure has been on RPO. It has been on bookings. It has been on these kind of big upfront commitments. So, part of this is about making Confluent grow faster. Part of is also making sure our field team is really successful.
And those two things kind of go together. So, yes, I actually think it's pretty exciting. And for a lot of people in go-to-market organizations, this consumption motion is kind of an opportunity. It's the new thing that's happening across a lot of these companies.
If you kind of know that and have done it, you're kind of in a better position in terms of what you're set up to do..
And Jay, I'll just add to what you said. Derrick, if you look at just the health of our installed base today, the retention rates -- the gross retention rates -- have been consistently above 90%. I think that's one area which we feel pretty -- very good about.
And just looking ahead, of course, with this transformation and the one-off cases that we said, we expect the NRR probably to dip a little bit. It will still be above 120% as we go through next year.
But exiting next year, as we exit the transition, we expect that to get back to the 125-plus percent range that we set as a medium-term goal from an NRR perspective. So the health of the installed base is also very solid. That's something that I wanted to share..
That's helpful. And Rohan, just quick -- a couple of modeling questions.
How much of cloud revenue is consumption based today? And how long until it gets to be the majority? And I guess on the flip side, with the go-to-market change away from on-prem, any color to get a sense for what the bleed-off of platform revenue is going to look like next year?..
Yes. For cloud revenue, 100% of our cloud revenue is consumption. And that's the motion that we've been driving. And to Jay's earlier points, that's why we are excited with respect to the reduction in friction with how our customers are buying and how we are selling. So that's on the cloud side. On your second question….
It's like the pricing is a usage-based model. The go-to-market is still strongly oriented around commitments, credits as other companies would call it..
That's correct. Yes. And on Confluent Platform, of course, right now, I'm not going to double-click and provide guidance for 2024.
But just broad brush, if you look at our platform business over the last 12 months, we've been very happy with the performance, especially we've seen strength in the industries like financial services as well as the federal space. So, we're in a good position as we exit next year -- as we enter next year, rather..
All right. Thanks, Derrick. Raimo, you got the last question..
So at the conference, you could see the excitement around Flink. And I was kind of -- the rooms were full, I didn't get a seat, et cetera.
How do you think about the adoption curve as you go kind of GA next year? Do you think there's going to be like a couple of kind of reference customers and everyone is going to look and see how that's working, or do you think that's going to be more broad-based? Just to get an idea about the excitement for next year for the Flink kind of rollout..
I think it will definitely be more broad-based than a couple of customers. Yes, I mean, this is out there in preview today. We've had great early adoption of people trying it. It takes a little bit more before people really put the kind of mission-critical production workloads on it. That's that kind of ramp up to GA.
But yes, we expect to see great adoption over the course of the next year. We're very excited about it..
All right. That concludes today's earnings call. Thanks again for joining us. Have a good one, everyone. Take care. We'll talk soon..
Thanks, everyone..