Podcast

From hype to value: How AI is reshaping growth in technology and software


Transcript

[00:00:08] Narrator Welcome to the second series of The Growth Blueprint, a podcast from Simon-Kucher, a global consulting firm with 40 years of experience in helping companies unlock smarter, more sustainable growth. As the world's leading pricing and growth specialist, we work with clients to increase both revenue and profits, helping them turn commercial strategy into measurable results. Throughout this series, we will look at the key trends shaping the future of business. We will cover all major sectors, giving you a panoramic view of what's happening in the market and what you can do to stay ahead.

In today's episode, Sara Yamase, Partner and head of Simon-Kucher's technology, media, and telecom sector, speaks with Eddie Hartman, Senior Partner and Board member, to explore how AI is reshaping value creation in the software and tech industry. From the importance of proprietary data to new types of pricing models, they share their points of view on how to turn innovation into growth in 2026 and beyond.

[00:01:04] Sara Yamase Hello, my name is Sara Yamase. I'm a Partner here at Simon-Kucher and Partners. I lead our TMT practice and I'm here with my colleague Eddie Hartman. Eddie?

[00:01:14] Eddie Hartman Hey Sara. I'm a Senior Partner within the TMT practice. I'm also lucky enough to be on the board of Simon-Kucher.

[00:01:21] Sara Yamase So we've seen a lot of change happening, especially over the last couple of years, but I've seen a lot of headlines saying is SaaS dead? Is user-based pricing dead with the advent of AI? We've seen AI become embedded in everything. It blurs the line between software, data, and AI moving forward. So we're going to talk about what does it really take to win the market in the era of these artificially intelligent products. The first question I have for you, Eddie, is what is exciting about AI right now?

[00:02:01] Eddie Hartman I think the most exciting thing about AI is how AI is slowly starting to become unexciting. By which I mean it's no longer a remarkable thing. What looks like advanced knowledge from technology will never really be the same thing as a human mind, but I think increasingly we just don't care. I don't hear people debating, can ChatGPT actually think? Does it actually have consciousness? Is it self-aware? Because we've realized maybe those are just artifacts of how we used to think about intelligence. Maybe those don't really matter. As for the ongoing debate about whether artificial intelligence is actually the same thing as what we mean by human intelligence, I think increasingly people are just going to throw up their hands and say, I don't care anymore. And that is, I think, what's exciting. We're beginning to accept it as just part of our lives instead of as a novel thing that needs to be challenged.

[00:02:58] Sara Yamase Do you think, Eddie, that there is a change in behavior because of that transition to indifference?

[00:03:09] Eddie Hartman Yeah, I think so. And it's dangerous in some ways because I think that people will accept what comes out of ChatGPT almost at the same level as they'll accept what comes out of a human being. I actually just had this yesterday. My wife had a fairly serious health complication that took us to the hospital. And there we were both on ChatGPT. She suddenly looked very worried. And she said, Oh, ChatGPT says that this could be this and it could be that. And I said, honey, we're at a hospital. We're about to talk to actual doctors. So I think what it means is that we blur the lines, we no longer consider the source in the same way that we would with a human being, for better or for worse.

[00:03:48] Sara Yamase That's super interesting. And especially given your example, with increasing amounts of background data and information, do you see that data being an advantage for folks who are providing AI capabilities or LLMs?

[00:04:11] Eddie Hartman Are they getting better? Sure. Do they need to get better? That, I think, is a good question. What matters much more is, what data do they have to work on? Not the volume, but the specificity, the uniqueness of the data, potentially. And I'd say the other dimension is how we use it. What are the questions that we can ask it? There's this old paradigm to teach people about infinity and how big infinity is. They say, well, if you had a room full of monkeys, an infinite number of monkeys banging away on typewriters, one of them would produce Hamlet. Because if you think about it, what are the odds that a monkey would hit the first character in the play Hamlet? It's some odds. Well, how about the second one? Okay, now we're talking about compounding odds, and the numbers get bigger and bigger. It would become extremely unlikely that any one of them would type, start to finish, the entirety of Hamlet. But with infinite monkeys, it would be certain that one does. The metaphor here is the volume of data. You have enough data out there; somewhere is your answer. When I heard about the infinite monkeys paradigm, the question that I came up with was this: but how would you find the monkey that had the copy of Hamlet? You would need either a copy of Hamlet to start with, in order to compare it against whatever the monkey was typing, or some way to find the monkey who'd done the trick. And I think this is why the second most important thing is how we interact with the AI, how we use it, how we prompt it, for example. But Sara, I think those are the two most important things. Can you ask the right questions? And do you have, not volume of data, but specific data, unique data that allows you to power your model well? How do you feel about it?
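Eddie's compounding-odds point can be put into numbers. A small illustrative sketch (not from the episode), assuming a typewriter with 27 keys, so each additional character a monkey must match multiplies the odds by 27:

```python
# Illustrative sketch of the compounding odds Eddie describes.
# Assumption: a monkey hits one of 27 keys (26 letters plus space)
# uniformly at random, so matching n characters of a fixed text
# has probability (1/27)**n, which collapses almost immediately.

def monkey_odds(n_chars: int, n_keys: int = 27) -> float:
    """Probability a random typist matches a fixed n-character string."""
    return (1 / n_keys) ** n_chars

for n in (1, 5, 20):
    # Express the probability as "1 in N" for readability.
    print(f"{n} characters: 1 in {round(1 / monkey_odds(n)):,}")
```

Even 20 characters of Hamlet is already a one-in-10^28 event for any single monkey, which is why the volume of data matters less than being able to find, and verify, the answer inside it.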

[00:06:03] Sara Yamase Yeah, I 100% agree. I think what I've seen at least a lot of companies starting to do, though obviously with varying levels of success, is to say, hey, we have access to proprietary information, either information we're generating based off of interactions with people and with companies, or information they're building themselves in the process of learning what they do in a specific sub-industry vertical. And there's a lot of value in asking, can you use a RAG model to find the Hamlet monkey faster? That is really, really interesting. From what I've seen, I'm hoping that companies start to get to the answer in 2026. But even companies who have inherently sold data, I don't think, are doing a great job yet of figuring out how to unlock the value that is in their data.

[00:07:10] Eddie Hartman So interesting you say that. And most companies, of course, don't sell data, that's not their business.

[00:07:14] Sara Yamase Mm-hmm.

[00:07:15] Eddie Hartman But I think that companies are going to start to realize that all of them have data as an asset. Even if they don't sell it directly, they're going to start selling it indirectly because it's what they need to power their models. And Sara, what advice do you give to companies that sort of haven't made that turn yet and don't realize, hey, the data that you have is going to be fundamental to your future?

[00:07:40] Sara Yamase I think one of the main things that we've seen happen over the last few years is that people thought generative AI, especially, was valuable because it was at that point very exciting. The hype cycle was off the charts. But to your point, as AI gets less exciting, people realize, hey, I can't just add a chatbot onto my software and call that differentiated value, because it's not. It's both the cutting edge of technology and a commodity, because those chatbots are driven by the same underlying LLMs. So you have to ask, what is going to give me the competitive edge? And I think the one way you're able to find the Hamlet monkey faster is with data, with being able to say, okay, how do I get to not just a better answer, but an increasingly better answer over time, because I have the neural networks of data pointing me in the right direction as quickly as possible.

[00:08:50] Eddie Hartman Yeah. First of all, I agree with every word you said. I don't think one in twenty companies has made that turn yet. Everyone is focused on sort of the power of the drivetrain. Very few companies are focused on the power of their inherent data.

[00:09:06] Sara Yamase Yeah, Eddie, I want to pivot the conversation a little bit, but stay on this AI value trend. I mean, we wouldn't be Simon-Kucher if we didn't talk about how does this value turn into monetization? How are price models evolving? And I think there's some folks that would say they're evolving faster than customers can keep up. What are you seeing in the market right now?

[00:09:33] Eddie Hartman It's almost a tired saying that change happens slowly, then all at once. So when we speak to investors, and when we speak to people who are sort of deep in it, their worldview is, we are already living in the year 2028, everyone has already shifted over, and user-based, seat-based, license-based is dead. And I think they're right if you're looking at the future, but they're not right if you're looking at today. So many people still want to buy based on a user-based model, and you need to respect that right now. Okay, sure, the future absolutely is different. But today, you need to think about whether or not your customer base is willing to move away from the way that they've traditionally bought, or if you need to execute some sort of a stair step. What do you think?

[00:10:30] Sara Yamase Yeah, I a hundred percent agree with that. I do think that a lot of investors are pushing companies to at least start to think about that transition, or to make the change very, very quickly, like faster than I expected. The one distinction I would make, however, is between those two phrases that I just said. I think there's a difference between starting to think about what changes need to be made and actually making those changes, because I do think that the transition will need to happen, to your point, Eddie, at the same pace that customers are willing to adopt that change.

[00:11:16] Eddie Hartman You're so right. I think technology changes at a technology pace, but human behavior (and sales is human behavior) changes at a human pace. So I think companies that are saying, hey, look, we're just going to go straight into usage or outcome, may be getting a little over their skis. They need to think about, yeah, but is your market ready to accept that now? Other markets, man, they are absolutely waking up to this and saying, well, this is what we always wanted. We always wanted to pay based on the outcome, right? And I'm sure you're seeing that as well.

[00:11:50] Sara Yamase Yeah, 100%. It's really interesting to me that different roles within an organization also have an impact. If I talk to any CFO, they're like, no, no, no. We want predictability, and we want the ROI to be demonstrated, but also something that I can budget for. And on the software vendor side, something that I can actually put against my accounting rules and come out square on the other end. How do you think, Eddie, about what types of measures make more sense for one company versus another?

[00:12:43] Eddie Hartman Yeah. Well, we have this paradigm of thinking about, are you replacing the workforce or are you augmenting the workforce? If you think about an airplane, are you creating an autopilot that's gonna make the pilot more productive, able to put more planes in the air, lower fatigue, that kind of thing? Or are you making a drone where you don't need a pilot? So I think that's gonna have a big impact on that tension and on how you meter things. I guess the other metric is, like you said, Sara, how quantifiable is the output going to be? If it's highly quantifiable and you're reducing the workforce, then you can think about some really cutting-edge models. If on the other hand you're not, then I'd say more of a stair step, more of a crawl-walk-run approach, tends to be the way we go.

[00:13:29] Sara Yamase Yeah, agreed. The other dimension I've seen our colleagues talk about, as different ways to consider usage-based pricing, is: is it based off of compute, like Snowflake or Databricks? Is it something that is more input-based, meaning what are the bits and bobs going into a solution that the provider or the vendor is refining, versus what are the outputs? And I feel like before you get to outcomes, outputs is probably where a lot of folks are trying to get to. And I think that's really interesting, because it closely aligns the value that you're getting from a solution with how much you're paying for it.
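To make the output-based idea Sara mentions concrete, here is a hypothetical metering sketch. The tiers and rates are invented for illustration; real vendors define their own units and price curves:

```python
# Hypothetical sketch of an output-based meter: bill on units produced
# (e.g., documents processed) with tiered rates instead of per-seat fees.
# Tiers and rates below are invented for this example.

def output_bill(units: int, tiers=((1000, 0.10), (10000, 0.07))) -> float:
    """Tiered metering: first 1,000 units at $0.10 each, the next 9,000
    at $0.07, and any remainder at a flat $0.05 overage rate."""
    bill, prev_cap = 0.0, 0
    for cap, rate in tiers:           # caps are cumulative unit counts
        billable = min(units, cap) - prev_cap
        if billable <= 0:
            break
        bill += billable * rate
        prev_cap = cap
    bill += max(0, units - tiers[-1][0]) * 0.05  # overage beyond last tier
    return round(bill, 2)

print(output_bill(12500))  # 1000*0.10 + 9000*0.07 + 2500*0.05 = 855.0
```

The declining per-unit rate is one common way to give the CFO-style buyer some predictability while still tying the bill to output rather than seats.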

[00:14:17] Eddie Hartman Intercom famously went from charging based on interactions on customer care to, did the customer rate the agent a four or five? Once you put part of the revenue in the hands of the customer, or really put it outside of your area of control, I can only imagine it's gonna make some people very uncomfortable. What do you think, Sara?

[00:14:42] Sara Yamase I agree. I actually have had conversations about outcome-based pricing with clients where they're using the discussion, or the consideration of it, as a way to one-up their customer. It's like, hey, we're gonna do better, therefore we can charge more. And their customer is over there thinking, oh, if they do poorly, we can get charged less. And to me, that kind of defeats the purpose of outcomes, if you're gonna consider anything on outcomes, because ideally you're partnering to have a win-win. You're doing these things more efficiently, faster. And that creates a shared incentive to drive growth. But that is really hard to do in a historical cloud-based SaaS offering, when you know you have very little influence on how your customer is actually using the product.

The one thing I will add, however, is that knowing how much additional value a solution provides, to me, that's critical. You don't have to charge on it, but you have to know how to measure and articulate that value so that it's a no-brainer for someone to adopt an AI-driven solution that should get them from this level of productivity to that level of productivity, or from this level of sales conversion to a higher level. Because without understanding that, people are still skeptical about the efficacy of AI and what it can do. And they need to be convinced about it.

[00:16:25] Eddie Hartman Sara, that is just such an interesting topic. I wonder if we could go one level deeper on that. Because I sometimes tell people, if you're selling nails, it does not matter whether it's being used for an outhouse or a townhouse. It's the same nail, and people don't expect to pay more. You could, of course, use smart packaging; you could create different grades of nails. One of them could be a high-end nail with a nice finish, or, hey, maybe it has a better rating from some engineering standard, so you can be sure that these nails are gonna stand up to gale-force winds or what have you. But ultimately, to talk about your specific example: you have two sales outcomes, or, in the thing I was talking about, you've got two customer accounts, and Intercom comes in, and the AI agent, without a human being involved, talks to two different customer accounts, and they both give it a four or five. So the outcome threshold is met. Can Intercom charge more if one of them was a $10 million account and one was a $10,000 account? Is there some way for people to structure an outcome metric so that, as you're saying, Sara, it incorporates what it's worth, in order to make sure that both sides feel good about things? I very rarely see a model where that's even talked about, let alone possible. Are you seeing anything out there?

[00:17:50] Sara Yamase I think that a lot of our clients historically have kind of generalized, right? Between different industries, different industry verticals, how big, how many users are effectively going to be touched by that thing. But everything has been a proxy, right? It's either a proxy on industry, a proxy on buyer, a proxy on size; it's inferring it. A lot of what we do, right, is try to infer people's willingness to pay based on feature differentiation and how sophisticated this thing you're providing is relative to other versions. But all of those drive differences in choice, not an actual, hey, you have a higher ROI for one situation versus another.

[00:18:43] Eddie Hartman Yeah, I think that the best that we can do is an approximation, like you're saying, through the metrics that we can use as a proxy, and then potentially using packaging to self-select. So back when I was at LegalZoom, we had this issue of, we would pay for clicks on Google. Let's imagine for divorce. Law firms wanted those clicks and we wanted those clicks. And a law firm oftentimes will take a divorce matter on contingency. What that means is that the person getting divorced doesn't pay anything, but then the lawyer takes a portion of the judgment. And that's a really interesting metric. For LegalZoom, we couldn't do that. We didn't know what the judgment was going to be. We just really were helping to facilitate the person getting divorced. And I think that many companies who are in AI now are in a similar situation to where I was. They don't know what the outcome is going to be. There's no way for them to meter that or use it as part of their price model. What they can do, however, is smart packaging, creating different versions so that somebody who's getting divorced with $10 million on the line chooses a different package than someone who's getting divorced with $10,000 on the line, right? But at least I have not seen metrics where you get a rake on the actual outcome. I don't see things trending there.

[00:20:08] Sara Yamase Yeah. And to the point that you had made earlier, the amount of predictability on that is difficult until you see a lot of usage, a lot of trends. And even then it's not going to be precise. It's just going to be directional, in order to give customers some level of comfort that, hey, this is the order of magnitude of usage you will likely be at, based on certain demographic factors.
We've been talking a lot, Eddie, about super engaging and interesting topics. But I imagine a lot of people listening to this say, well, how do I actually make this work? How do I commercialize these types of models? What does my commercial team look like in the age of AI? What have you seen changing recently?

[00:21:05] Eddie Hartman You might say, well, in that world, what are we going to do with our sales teams? I'm seeing people wanting to use AI to make people more efficient. And they realize that what that means is that maybe we won't need as many people. But it's more like a 10-15% cut. It's not like wipe out the entire sales team. I think what people really want to do is use AI to make their internal people, their sales organization, their commercial operations more powerful. They want to take a newbie and make them sound like a veteran. The average tenure of a salesperson in America is 18 months. So if you have somebody who's only been on the job for 18 months, or much less than that, how do you make them sound super experienced and give them the tools and support to really give them superpowers? And how do you take your very best people and turn them into real superheroes, make them far more powerful than they ever were in an organization? There are a lot of tools out there that'll help you with that effort. But Sara, how do you feel about those statements?

[00:22:10] Sara Yamase Yeah, I think that makes a lot of sense. I've also worked with a lot of companies where we'll look at things like sales tenure against price realization and discount depth to see, hey, are more experienced salespeople better able to achieve outcomes that are higher, all things equal? And the answer is 100% yes. You learn what works, you learn what doesn't. And there's the ability to onboard folks in a way that allows them to get up to speed as quickly as possible, because either they have knowledge of the account, because account history has been captured well, or they're trained on specific topics, or they have chatbots telling them how to engage, how to answer a question, or what the needs of this customer are. So I think there's a ton there that can, to your point, help, not necessarily replace, a whole sales force.

[00:23:17] Eddie Hartman I think this stuff is so powerful. At Simon-Kucher, forever, we've done something called peer pricing. And peer pricing, for those who don't know, is basically: let's use math models and machine learning to look at thousands of deals that have gone through the system in order to say, what is the right price? At what price were people able to achieve a win, a yes? At what price did the client walk away? And use that to inform sellers: hey, based on what we know about this particular deal, looking back on other deals that were like this one, this is a good range. And Sara, I'm working on this right now with a client, and I find it super exciting: instead of just saying, this is the right price, saying, this is why this is the right price. So using AI to go one level deeper, and instead of looking at deal paper in the past only as, what was the price you were trying, what was the going-in, the walk-away, and everything else, also asking, what were the arguments? What did they respond to? Getting as much information on the won customers and lost customers in order to power the model and say to a seller, not only is this a good price to go for, but here's why this is the right price to go for. Here's how to talk about it with your client. I find this really, really exciting stuff, if I'm honest.
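The core of the peer-pricing idea Eddie describes (which deal prices won, which lost) can be sketched in a few lines. This is a heavily simplified illustration with invented deal data; real peer pricing uses far richer deal attributes and models:

```python
# Simplified sketch of peer pricing: bucket historical deals by price
# and compute the win rate per bucket, to suggest a defensible range.
# The deal data below is invented for illustration.

from collections import defaultdict

def win_rate_by_bucket(deals, bucket_size=100):
    """deals: iterable of (price, won) pairs -> {bucket_start: win_rate}."""
    counts = defaultdict(lambda: [0, 0])  # bucket -> [wins, total]
    for price, won in deals:
        bucket = (price // bucket_size) * bucket_size
        counts[bucket][0] += int(won)
        counts[bucket][1] += 1
    return {b: wins / total for b, (wins, total) in sorted(counts.items())}

deals = [(950, True), (980, True), (1050, True), (1120, False),
         (1190, False), (1010, True), (1080, False), (960, True)]
print(win_rate_by_bucket(deals))  # win rate falls as price rises
```

In this toy data the win rate drops from 100% in the $900s to zero above $1,100, which is the kind of curve a seller-facing tool would translate into "this is a good range", and, as Eddie says, the next step is attaching the *why* from the deal narratives.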

If I can turn the tables and ask you a question here: I think what we're coming to, therefore, is that prior data is becoming one of the most important assets a company has. This kind of takes us full circle to what we started off with. How do you see companies trying to ensure that they have that data set, or do you?

[00:25:06] Sara Yamase I feel like a lot of companies inherently have that data. And I think some more than others, right? There's a lot of clients that we've been talking to, especially recently that talk about, hey, we're a system of record. And because we're a system of record, we have access to data sets that no one else does. Very fundamental things about how a business is run, how many customers it has, what price points, what products, how much are they engaging? Are they increasing usage over time? Are they decreasing usage over time? Like all of those things. And I think we're going back to the topic we talked about earlier of okay, then what do I do? What do I do with it? But I think your question is more about how do you become someone who has access to that data?

[00:26:01] Eddie Hartman I don't hear that that often. I don't hear people saying, oh, we're a system of record. I hear people instead saying things like, I'm embarrassed to tell you that most of our sales tracking data is probably BS or there are big holes, like, oh yeah, our loss codes are totally unreliable and people don't really enter them properly.

[00:26:20] Sara Yamase I think there's a difference between what is our solution doing for customers versus how are we using other solutions to be able to run our business? And I think in the latter you're right. Like even those system record companies may not have the cleanest data in the world. It's hard. And it's increasingly hard in a world where change is happening constantly.

[00:26:47] Eddie Hartman If you are listening out there and you're thinking, I don't have clean data, I don't have clean data, don't worry. You're in the majority, I think.

[00:26:54] Sara Yamase I would say if you are a company out there saying, yes, I have clean data, yes, I have clean data, I would highly question that. Anyway, I think the number of interaction points you have with your customers inherently matters. As you move from, hey, I have no control over my data because my solution is an on-prem solution, to, hey, I can now start to track how my customer engages with my software because I have Pendo or another usage-tracking solution, even that is increasingly valuable in a world where we wanna be able to understand what our customers are doing. We want to be able to find out how customers are engaging in the sales cycle, how they're engaging while they're using the product, how they're engaging as they're offboarding, and to derive some amount of patterns and insights from that. And in order to be able to find the monkey with Hamlet, you need to start off with having an infinite number of monkeys in some way. I do think that it is something that a lot of companies haven't started yet, haven't thought through yet, especially if you're taking it one step further to say, I'm not just gonna look at trends on this data, but I actually want to charge on this data. That's a whole other level of sophistication.

[00:28:24] Eddie Hartman Way to bring it back to the Hamlet monkeys, by the way. But Sara, would you advise a company to even potentially invest in this, by doing things like telling customers who are only buying a point solution, we'll offer a bigger discount or a bigger break if you move to a platform, a platformization play, where you can then see their data across multiple interactions?

[00:28:53] Sara Yamase Yeah, for sure. I mean, I think platform adoption is not just a, hey, can I collect more data points, play. It's, how do I get more for less? How do I buy something that meets more of my needs, instead of saying, hey, I have tech sprawl and I have best-of-breed point solutions everywhere? I always give the example of, we use Microsoft Teams because the level of integration with other Microsoft products is high. We're already paying for it, so why pay for another additional point solution? An increasing amount of platformization, and encouraging customers to buy more, is a huge, huge trend right now. You have to have a packaging construct that allows and encourages this. And it will do multiple great things. One, it will drive more cross-sell, more adoption within existing customers. Two, Eddie, to your point, it will allow you to capture more usage and data and information about how your customers are engaging. And three, it's a feedback loop, right? It makes folks stickier, because it's more embedded in their solution set and therefore harder to subsequently replace.

[00:30:12] Eddie Hartman That's exactly what I was trying to get to. Somebody introduced me to this idea of, three minutes before, three minutes after a meeting, are you still engaged? Three minutes before you and I chat with each other on Teams, I might be in Outlook looking at my mail. And three minutes after, I might be in PowerPoint, all within the same platform. Right. And it's true elsewhere; this is not just a Microsoft hype session. Other platforms do the same thing too. So I think that increasingly it's not just a question of best of breed versus one pane of glass as a revenue question, which is what you said. It's actually a full loop. You then have the data to drive the future engagement, to understand what the customer is trying to do, to use the better AI models. It's really all part of one, I hate the word, ecosystem, but it's all part of one whole that creates quite a bit more than the sum of its parts.

[00:31:05] Sara Yamase Mm-hmm. Yeah. What that reminds me of, Eddie, is the number of companies we've had that talked about customer experience, right? Customer engagement. That could look like contact centers, that could look like email marketing campaigns, that could look like survey tools. All of them are in the same kind of general space of customer engagement. I'd be super interested to see which companies do more across that engagement platform and win more share, and how. Is it a segment thing? Is it a customer-size thing? Is it a level-of-integration-with-other-tools thing? That is also gonna be a very huge trend for 2026.

[00:31:52] Eddie Hartman Look, I think we're coming on some real truths here. The winners of the future will be the ones that understand that data is becoming not just any asset, but maybe the asset, and the ones that take a more integrated approach to their customers, thinking about their needs. That drives a lot of the model, and therefore the way that they'll charge. Will it move away from traditional users and seats? Eventually, maybe. But maybe not quite yet. We're still doing that crawl-walk-run into the future.

[00:32:23] Sara Yamase It has been great to chat with you, Eddie. I think we always have so many things to talk about, whether it's highly cerebral and conceptual down to very, very tactical and interesting. But really appreciate the time and look forward to doing this again with you soon.

[00:32:42] Eddie Hartman Awesome. Well, thank you for setting this up.

[00:32:46] Narrator Thank you for listening. If you enjoyed this episode, please consider leaving a rating or review. For more insights, visit www.Simon-Kucher.com.

