
Genius Myopia - Why Smarter Models Aren't Enough

Episode #276 12.29.2025

The importance of building models that use consumer and enterprise data is discussed, as well as the need for privacy and trust in AI platforms. The model context protocol is emphasized as a key unlock, along with the potential for AI to be ad-funded and monetized.

The need for a value exchange around privacy is also emphasized, as is the opportunity to provide actionable insights that guide investment in AI. Investing in privacy and creating a trusted data infrastructure will be necessary for success.

Full Transcript

Don Kellogg 0m10s

Hello, and welcome to the two hundred and seventy-sixth episode of The Week with Roger, a conversation between analysts about all things telecom, media, and technology by Recon Analytics. I'm Don Kellogg, and with me as always is Roger Entner.

Roger Entner 0m22s

How are you doing, Roger? I'm great. How are you?

Don Kellogg 0m25s

I'm good. So, Roger, usually we spend this time talking about telecom. But as our listeners know, we're also doing a lot of work measuring the AI industry. We have our friend and colleague Joe Saleski here today, our CEO of AI, to talk about some new research we just put out. Joe, welcome to the podcast.

Guest Speaker 0m42s

Thank you. It's great to be here.

Roger Entner 0m44s

So, Joe, we've talked now to almost 150,000 people about why or why not they're using AI. You wrote a really interesting report about why better models are not really as important as a lot of people think or thought they would be. Can you give us an overview of what you found?

Guest Speaker 1m7s

Absolutely, Roger. Well, given that we're at the end of 2025, it's always helpful to look back. There's been enormous investment in models. When you really look at it, there have been eight breakthrough models, you know, being conservative. Yet we've only seen an increase of about 4% in daily usage. So models alone are really not unlocking the next stage of adoption, and it really makes sense. You know, the bottleneck isn't the model intelligence.

Guest Speaker 1m35s

It's now really trust and context. We have a negative 29.7 NPS on trust in the models. You know, trust is the lowest attribute across all models, and we need trust and facilitation so that these models can do more than just smart search against general-purpose data and can actually go at first-party data, unlocking the power of the model with context from consumer and enterprise data.
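As a quick reminder for listeners, NPS (net promoter score) is simply the share of promoters minus the share of detractors, so a score of negative 29.7 on trust means detractors outnumber promoters by roughly 30 percentage points:

\[
\mathrm{NPS} = \%\ \text{Promoters} - \%\ \text{Detractors} = 100 \times \frac{n_{\text{promoters}} - n_{\text{detractors}}}{n_{\text{total}}}
\]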

Roger Entner 2m7s

Yeah. It's like now you can have more of the things you don't trust. Now even smarter. Right? Really?

Roger Entner 2m15s

Right?

Guest Speaker 2m15s

It's really true. And you know what? Given that we're also, you know, talking to the telco audience, communications speed has a lot to do with the AI experience. So we've also seen some overlap with premium subscribers having, you know, a higher propensity. But I think the biggest takeaway is that consumers, quite frankly, and enterprises too, but we'll just talk consumers here, are 3.4 times more likely to convert to paid usage when they're using AI with first-party data.

Guest Speaker 2m48s

So there's no question that they need trust to entrust their data to AI, but AI talking to our data is really the huge unlock we've been looking for, the one that should drive the return on investment for the model providers.

Roger Entner 3m5s

So what's some of the first party data that makes things really powerful?

Guest Speaker 3m10s

Right now, it's everything from financial data, you know, locked up in people's bank accounts, to device data, like images and other information that you've got on your handset. When you really look at OpenAI, it doesn't connect well to device data. You know, the same is true for Gemini. At least Gemini and Microsoft have sort of a trust context and a bolus of data to utilize, but their use of MCP and, you know, really facilitating that access has been inadequate to drive adoption. You've seen a much higher incidence among users of Claude, given Claude's better, or at least easier, MCP connectivity.

Guest Speaker 3m54s

And MCP, for those in the audience that don't know, is the model context protocol.

Don Kellogg 3m58s

Tell us a little bit more about model context protocol because I think this is really the unlock here.

Guest Speaker 4m2s

You know, it makes a lot of sense. The App Store created app delivery on the iOS platform, but it didn't necessarily facilitate that much access between applications. So as we really ramp up AI, we need the model context protocol, which, you know, allows the model (most people have heard of RAG) to use external data and use that data for context on your needs. So coming back to Roger's question, I mean, if we're gonna book something as simple as travel, it needs access to your calendar. It needs access to your travel preferences, you know, what different loyalty programs you're using.

Guest Speaker 4m45s

So, you know, for it to do things on our behalf beyond image generation and the like, it can use the model context protocol to access that data. And what's exciting about that access is that, if done well, it doesn't absorb too much of the context window or memory that the AI is using.
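For listeners who want to see what this looks like in practice, here is a minimal sketch of an MCP server exposing a calendar lookup as a tool, assuming the official Python SDK's FastMCP helper; the calendar data itself is a hypothetical stub, purely for illustration. An MCP-capable client can discover this tool and call it whenever the model needs that first-party context, rather than stuffing the whole calendar into the prompt.

```python
# Minimal MCP server sketch: exposes the user's calendar as a tool the model
# can call on demand. Assumes the official Python SDK ("mcp" package) and its
# FastMCP helper; the calendar contents below are hypothetical placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("calendar-demo")

@mcp.tool()
def list_events(date: str) -> str:
    """Return the user's calendar events for a given ISO date (stubbed)."""
    # A real integration would query the user's actual calendar provider here;
    # canned data keeps the example self-contained.
    fake_events = {
        "2026-01-15": ["09:00 flight BOS -> SFO", "14:00 hotel check-in"],
    }
    return "\n".join(fake_events.get(date, ["no events found"]))

if __name__ == "__main__":
    # Serve over stdio so an MCP client can launch the server, list its tools,
    # and pass only the relevant results into the model's context window.
    mcp.run()
```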

Roger Entner 5m7s

Yeah. Google is trying to basically do that, tie AI to the entire ecosphere so that you never leave it.

Guest Speaker 5m17s

They definitely have an advantage. And as their model performance with Gemini 3 has been very, very competitive with the other models, you really see the potential for an enhancement in their MCP capabilities to unlock a lot more value. And I think most of the AI platform vendors have underinvested in the connectivity of their client to the data sources. They've really required anybody who wants to do anything sophisticated to go through the creation of agents. And that will come, but, you know, giving people direct access is going to unlock a lot of utilization quickly.

Don Kellogg 5m55s

As you mentioned earlier, in a privacy-friendly way that doesn't freak everybody out. I mean, like, the over-under here is, like, AI models get better with more information, but people don't trust AI. Right? So, like, how do we get to the next level where folks trust AI and they're willing to give it the data that it needs to be more useful?

Roger Entner 6m13s

The thing is, and Joe and I talked about this beforehand, people give up so much privacy for so little. Whenever they have the opportunity to get something for free in exchange for their privacy, they sell out.

Don Kellogg 6m29s

Well, that's partially true, but we also have an ecosystem where your weather app is watching you, and, you know, most people don't understand that. Right? So I mean, I think the gun shoots both ways. Yes. People are willing to kind of engage in a value exchange with privacy.

Don Kellogg 6m43s

And I think a lot of people have lower standards than I think a lot of us would in terms of what they're willing to give away, but there needs to be a value exchange there. I think one of the things that people have had some issues with in AI is, you know, it's going through and scraping everything. And the question is, is AI gonna respect my personal data if it's already gone and, like, you know, scraped the entire Internet?

Guest Speaker 7m4s

Well, when you really look at Apple and the challenges they've had, they still have a trust advantage with data. You know, you really look at their brand value with iCloud and the amount of data that consumers trust Apple with. So the question now is going to be, do they get the connectivity from AI to that data to transform it into value, which is really what technology exists to do? You know, we've got 75% of Americans that have tried AI, but only 25% use it daily. And we're barely at 14% of users now with paid subscriptions. So, you know, there is this need for trust and data to get that unlock to occur.

Guest Speaker 7m47s

And you're both saying there is definitely hesitancy. Right? You know, there's opportunity for AI to be ad-funded, but, you know, to do that...

Roger Entner 7m58s

I think it's inevitable.

Guest Speaker 7m59s

I definitely think it's inevitable. And it's

Roger Entner 8m1s

I think everybody who is getting it now for free will get it for free with ads, in exchange for the data. That's, like, inevitable. Right?

Guest Speaker 8m10s

Right.

Roger Entner 8m11s

It's the monetization engine that has worked, now on steroids.

Guest Speaker 8m14s

It definitely is the monetization that's worked on steroids. And Google has definitely done a great job of sort of navigating the privacy versus advertising chasm, I guess we could call it. The question really is going to be, does Facebook in 2026 show up as a more potent provider of AI connected to the bolus of consumer data that they have?

Roger Entner 8m39s

Well, they've tried with open models, and now I think they've just closed them. You know, I use Meta AI with my glasses. They're pretty good. But then I'm not a participant in the Facebook ecosphere, or at least not Facebook and Instagram. What I use from them is the glasses, and I use WhatsApp.

Roger Entner 9m2s

And there, it hasn't really made a big difference.

Guest Speaker 9m5s

It really hasn't. And I think, you know, some of that's the quality of the AI. Some of that is really what context it's drawing on. And, you know, I think that 2026 is really the year of unlocking context. You know, trusted data infrastructure is really gonna turn AI from basically smarter search and some generative capabilities into a utility that people use more frequently, because it's got the context to get things done for them instead of just summarizing insights or data, or helping, you know, be a little bit more creative.

Roger Entner 9m43s

It has to get out of this petri dish, and it has to be let loose much more on the world. Right? And then combine it with your own data. That's so critical.

Guest Speaker 9m54s

Yeah. Right now, so little investment has been put into using AI to load a personal data pond. You know, so little investment has been put into making these client experiences, the user experience within OpenAI or within Gemini, work more like Claude, where you're able to use it as a utility to really investigate data. But there's a huge unlock, and I think this is a place where investment will be made, because it yields much, much greater adoption, 3.4x and growing.

Roger Entner 10m30s

Yeah. Absolutely.

Don Kellogg 10m31s

Awesome. Well, so, Joe, you've written this report. I know it's available on our website. If folks are interested in learning more, they can check it out there.

Roger Entner 10m38s

Well, with that, happy New Year. Right?

Guest Speaker 10m40s

Yeah. Happy New Year. And, no, I think this is a good report. I think everybody will get something from it. It's really there to help service providers, you know, telcos, OEMs, all look at their potential role in creating the trusted data infrastructure that unlocks the potential of AI.

Roger Entner 10m58s

Yeah. And we're gonna do a lot more with our AI survey, with our service. It interlocks with our telecom data. We know, for every one of these respondents, who their mobile and home internet provider is, as well as their business provider, if they have one and are aware of it. And so the opportunity of locking this together is tremendous, especially because it's a forward-looking indicator of future demand.

Roger Entner 11m27s

So we're really excited about that.

Guest Speaker 11m29s

Absolutely, Roger. I mean, when you really look at this, we're taking the platform that Recon has been building and applying it to providing the insight that guides the investment for the trillions that are going into AI. You know, the 6,000 surveys a week really provide the largest longitudinal dataset with granular detail. And just like we do in telecommunications, making actionable insights and giving people fact-based responses to questions is part of that core Recon DNA that it's great to be a part of.

Don Kellogg 12m3s

Thank you, gentlemen.

Guest Speaker 12m5s

Thank you. Happy holidays to all.