Sam Cummings [00:00:00]:
Large language models are talking models. So when you work with one, it has to essentially speak in its own way, whether that’s writing text or creating, you know, your next email. It doesn’t have an understanding. And so you get these hyperbolic performance costs as it scales. And that’s the big thing that really, from a physics perspective, is underwriting this problem. If I want to do reasoning today, the more complex it’s going to be, the more I want it to think, the more I’m stuffing into that text box. That’s the reality of this problem, because the solution of talking your way to understanding has its functional limits.
Karissa Breen [00:00:54]:
Joining me now is Sam Cummings, Director of Education at GenAIWorks, and today we’re discussing if we will see current LLM technology reach its limits in 2026, and if so, what’s next? So Sam, thanks for joining me, man, and welcome.
Sam Cummings [00:01:17]:
Honored to be here. Shout out to the entire audience. I’ve seen so many great videos and conversations on the KBI Media channel, so I’m excited to be here.
Karissa Breen [00:01:27]:
And you know, just for quick context, I met Sam at Oracle AI World and I thought, this dude is such a high-energy dude, we’re going to be friends. So here we are doing the podcast. So I really want to start now. You’ve got a really cool background, you’ve had a lot going on, and I wanted to bring you on the show because you have a bit of a different perspective on certain things and I like your approach and your thinking. It’s very modern. Okay.
Karissa Breen [00:01:52]:
So Sam, I just need to ask straight off the bat: do you think that we will see LLMs reach their limit?
Sam Cummings [00:01:59]:
The question of the decade. Well, I’ll give a little bit of background. My experience coming into this space actually starts from a space we might all be a part of without knowing we’ve participated in it, and that is the industry called customer success. Why is this important? We are all customers of products and services throughout our lives, and the way companies engage people and drive that experience has evolved a ton over the last 40 years. The idea of selling software is, you know, timeless at this point. We’ve all been in the era of software for decades, but being able to do it on a subscription has really changed how we do commerce across the globe, whether it’s your cell phone or, I’m not sure if you all are watching, listening, and use Netflix, or any other types of services, so many of the services we use today are subscriptions.
Sam Cummings [00:02:55]:
That created this boom in an industry called customer success, which is: since you have a subscription that you’re gonna be paying every month or every year, I have to make sure you’re happy. I have to make sure you’re gonna continue to do business with us. And that’s created this demand and this pressure on businesses to create automation and engage people in personal ways using data. That initial impetus, that initial goal, has transformed how we do business: companies monitor how you engage with their products, how you speak about their brands, and they incorporate that into how they communicate. Where that ties to LLMs is that before the boom of 2022, when ChatGPT came out, much of the research and technology around reasoning and how we store memory had been tackled by the marketing and customer success spaces for years prior. So I have a great perspective to share with you all. I watched an industry completely shift multiple times, and the good news is we’re right in front of one of those now.
Karissa Breen [00:04:02]:
Yeah. Okay. This is really interesting. So you mentioned before, and you’re right, back in the day we didn’t have subscriptions. And now these businesses, Netflix and friends, obviously they’ve gotta keep stuff coming out to keep the subscriptions coming in, the money coming in, right? So would you say that because of the automation, people got to ship stuff faster than ever before? Do you think that mentality of, we’ve got to keep the customers who are paying us each month because it is month to month, has created an interesting time where we’re skirting around security, skirting around doing things perhaps the right way, in order to get stuff out faster? Because if we don’t, and we’re already seeing this now with AI and all this other stuff coming into the fold, people are dislodging their competition and just leaving them in the dust. Seeing that happen, what are your thoughts on this?
Sam Cummings [00:04:58]:
Yeah, this is an opportunity for everyone, specifically in security. Now, going from what I shared about reasoning models: the idea of being able to process data and make decisions had been brewing since before ChatGPT released. What ChatGPT and tools like it, the large language models, really opened up is the fluidity of those use cases. I can now process a large corpus of information and turn it into additional insights I can use, without having to rely on what in the past would be heuristic rules, meaning: if they use the word “happy,” mark their sentiment as positive; if they say “I’m frustrated,” mark it negative. These were fragile systems. Large language models are more flexible in how they’re able to label, how they’re able to find insight and perspective, but they hallucinate. And so what this has meant, I think, from a security perspective, is there’s a whole suite of security use cases that were once hard, once fragile, once heuristic, that now have more fluid LLM use cases.
Sam Cummings [00:06:08]:
We’ve seen it in security footage labeling, where I can have a security camera monitoring and have the system label the things in the image. Before, that would’ve taken a lot of engineering. I would’ve had to create models and structures for that. But today I might just use a reasoning model. Here’s where, you know, it gets interesting, though: cost is the main barrier to applying modern AI in security. Because if I wanted to build more monitoring security services, the amount of data I have to pass through a large language model makes most use cases not viable cost-wise. And what I’m excited to share with you, and really bring you into, is: what does that mean in a world where that cost doesn’t exist? Security use cases that were not viable or not feasible are now in the game.
Karissa Breen [00:06:59]:
Okay. So hang on, just before we move on, what do you believe the limit is? Have we hit the limit? Like what, where’s the limit? What are we dealing with here?
Sam Cummings [00:07:08]:
So the problem has layers to it. There’s lots to dig into, and I’d love to bring the audience into the initial shore of that broader ocean of conversation. The key is that a model’s ability to reason and hold memory dictates its capabilities. And when we talk about AGI as a concept, artificial general intelligence, it’s the ability to tackle a lot of different scenarios, to reason and hold memory so that you can solve problems. Well, in modeling today, there’s a cap on how much context a model can reasonably manage at one time. Couple that with the fact that it costs you money to reason. As these models generate what they think and process things, there’s a token cost, where I’m paying a model to process a job, whether that’s reviewing a picture or reviewing text. You know, we’ve all used an email rewrite in ChatGPT.
Sam Cummings [00:08:11]:
It’s those tasks that cost today. So if I’m doing a security use case, let’s just say I want to monitor my servers for traffic, consistently, every day. Do I run a ChatGPT model every day across all of my server traffic and behavior? That would be so costly you couldn’t afford to run it all the time. So there are certain limits that, when we think about cost, are really associated with that problem of reasoning. And LLMs being at their limits means there’s not much more we can do to make them less costly in a general way, even though the broader market is building data centers all over the world and looking to make it so that, as a consumer, you can just send more information. Here’s the catch: more context doesn’t mean better performance, because as the context gets larger and larger, the models hallucinate more. They get confused in the middle of the job more.
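To make the cost barrier Sam is describing concrete, here is a back-of-envelope sketch. All the numbers are made up for illustration, not any provider’s actual pricing, and real bills depend on the model, input-versus-output token rates, and log volume:

```python
# Back-of-envelope sketch with made-up numbers (real per-token prices vary
# widely by provider and model): why always-on LLM monitoring adds up fast.

def monthly_monitoring_cost(tokens_per_scan, scans_per_day,
                            usd_per_million_tokens=5.0, days=30):
    """Rough monthly bill for feeding logs through a hosted LLM."""
    total_tokens = tokens_per_scan * scans_per_day * days
    return total_tokens / 1_000_000 * usd_per_million_tokens

# e.g. 200k tokens of server logs, scanned hourly, at a hypothetical
# $5 per million tokens:
print(monthly_monitoring_cost(200_000, 24))  # 720.0 USD per month, per server
```

Multiply that by a fleet of servers, or by richer inputs like video frames, and the point of the passage follows: always-on monitoring through a hosted LLM is priced out of most budgets today.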
Sam Cummings [00:09:10]:
And so these are real pressures: we can’t just put more data into this. ChatGPT can’t get 2 times more internet data, so it’s not like we can data our way out of it. The way most companies are tackling it, what you’re gonna see, is specific use-case models. So we have a model that can code better. We might have a model that’s better at creating, you know, processes and system design specs. These are specific jobs. But for the universal ability of these models to perform better, we’re at a point where the current architecture of LLMs is not the way we’re going to get magnified gains.
Sam Cummings [00:09:50]:
And a little bit shortly here, I’ll share some of the other ways we will.
Karissa Breen [00:09:54]:
So what someone described to me recently was what you’re saying, just to paint a bit of a picture. It’s like if you take an image and photocopy it, and then, I can’t believe I’m saying photocopy, but you photocopy the photocopy, and eventually it just becomes really bad. And that’s sort of a parallel to what you said before around hallucinations with LLMs, right? So don’t people assume that if there’s more data being ingested, we’re going to get better results? But then when you look at even the media stuff that I’m doing, I’ve been reading a lot more, it’s like, well, actually ChatGPT and friends need companies like ours to keep going, because we’re the ones actually manually doing the fact-checking, right? Because if you’re trying to create something that’s fabricated or doesn’t make sense, like the photocopy example, it’s just going to get worse. So help me understand this a little bit more, because perhaps people are confused: we’ve got more data, therefore more data means we’re going to get better answers?
Sam Cummings [00:11:00]:
An age-old problem. Now, where we can move from the core framing of this is to think about it a little differently. Large language models are talking models. They talk things out. So when you work with one, it has to essentially speak in its own way, whether that’s writing text or creating, you know, your next email. It doesn’t have an understanding. By its nature, it’s a blind, what they call stateless, approach, meaning when I post something to ChatGPT, it takes whatever it has there and then, you know, answers me from that one moment. When I use tools like ChatGPT and toggle something like deep reasoning mode, what it’s doing is thinking, then taking the output from that thought and re-importing it back in with the new step.
Sam Cummings [00:11:54]:
And so you get this hyperbolic scaling, where to do 2 steps of work, it’s about, you know, 4 times the effort. But then when you go to 6 steps, it’s 8 times. When you go to 8 steps, it’s 16. So it scales. And that’s the big thing that really, from a physics perspective, is underwriting this problem. If I wanna do reasoning today, the more complex it’s gonna be, the more I want it to think, the more I’m stuffing into that text box. So just imagine if you posted a question to ChatGPT, took everything it said to you plus what you wanna say next, and reposted all of that back in. How long does that have to get before it’s too long for the model to process? That’s the reality of this problem, because the solution of talking your way to understanding has its functional limits.
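The “stuffing into that text box” effect can be sketched numerically. This is a toy model, not any vendor’s actual billing or inference logic: if each reasoning step re-reads the full transcript and then appends a fixed-size thought, the context per step grows linearly, so the total tokens processed grow roughly quadratically with the number of steps:

```python
# Toy model of chat-style reasoning where every step re-reads the whole
# history: per-step context grows linearly, total tokens ~quadratically.

def naive_reasoning_tokens(steps, prompt_tokens=500, thought_tokens=200):
    """Total tokens processed when each step re-reads all prior text."""
    context = prompt_tokens
    total = 0
    for _ in range(steps):
        total += context           # the model re-reads everything so far
        context += thought_tokens  # its new thought is appended for next step
    return total

for n in (2, 4, 8, 16):
    print(n, naive_reasoning_tokens(n))
# doubling the number of steps much more than doubles the tokens processed
```

The exact numbers are illustrative, but the shape is the point: long chains of talked-out reasoning get superlinearly expensive, and eventually the accumulated transcript outgrows the context window entirely.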
Sam Cummings [00:12:49]:
What you need instead is a world building model. These are models that work fundamentally different in that they’re not just talking, they’re trying to understand that world and visualize it. So they’re thinking about what this should be or what could come next. And that opens a whole new paradigm in reasoning that again, we’re gonna see whether it’s across manufacturing, whether it’s robotics, whether it’s self-driving cars, it’s these models that will have the ability for us to go further and directly impact security and what’s possible.
Karissa Breen [00:13:24]:
You raise a good point around how long it has to get. So how long would it have to get? Because I’ve noticed, and I’ve got the premium version of ChatGPT, it doesn’t say this explicitly, but it contextualizes: like 4 weeks ago I asked it something and it brings that forward into the answer, right? So will that get to a point where it just can’t handle any more, where there’s no infinite scale there? What’s going on?
Sam Cummings [00:13:51]:
Yeah, we’re currently at that point. And there has been a lot of breakthrough already. We have done some amazing things to make it so you can upload entire documents into something like ChatGPT. Those gains mean it can understand a lot of things, but its ability to know which of the things it has in memory are relevant, that’s its own burgeoning orchestration layer that’s been evolving. Think about it as layers of a cake. There’s the underlying model itself. Today’s mindset is that the underlying model manages context and manages its memory. When you separate those out, when you separate reasoning and memory from the underlying model, that’s where today you’re able to really move the needle, because the model having to manage all those things in context is the real root of hallucination.
Sam Cummings [00:14:52]:
Hallucination is not a bug, it’s a feature, because in the creation of language, it’s a mix of randomness, order, and callbacks that makes text or any kind of prose relevant. You know, the colloquialisms when you read Shakespeare: if you don’t have that time context, some of the jokes they’re making don’t have any relevancy to you. So there’s that ability for a model to do callbacks and also to have an understanding of what’s going on, such that if the model itself is the place where you do reasoning and memory management, you’re always gonna run into a limit. And this is the same thing we’ve seen in computer graphics, where there’s the graphics card and then there’s your CPU. That separation is the fundamental revolution I see happening. When I say the limits, I mean the limits of running your computer without a separate graphics card, or in this scenario, doing reasoning without a separate orchestration layer for reasoning and memory.
Karissa Breen [00:15:56]:
Okay. So then what happens now, as of today, or this interview, with ChatGPT, or OpenAI more specifically? Like, what is the go then with these guys?
Sam Cummings [00:16:07]:
Yeah, it’s just fun. It’s fun. So the money burn is gonna go on. By nature, most of these approaches lead to one thing: buy bigger data centers. And so we have bigger model optimization coming. But if you look at what’s happened even recently, do you think that if ChatGPT had something that fundamentally changed the game, they would be holding onto it right now? I think not. If you look at what they did with ChatGPT 5.2, it was an optimization release, meaning the major feature was that, based on what you type in, it can figure out which model it should use. Should it do heavy thinking? Should it think a small amount? If they had the ability to blow us out of the water, they would.
Sam Cummings [00:16:50]:
And I’m not saying there’s no ammo left in the tank. There are some upgrades and updates we’re going to see that are still, compared to 5 years ago, sci-fi. But when we look at what’s possible, they’re going to keep chasing a profitability they will never hit. There’s not enough revenue in the market to make up the costs ChatGPT has today under the current architecture. But here’s where it shifts: if the cost of reasoning goes down, people use these things differently. And this is where we’re sitting, where the innovation happens. It’s similar to electricity: right now we have direct current, meaning when the model reasons, it reasons in one direction. It’s just thinking, then taking that result, putting it back in, thinking, thinking.
Sam Cummings [00:17:43]:
This is the direct-current equivalent in electricity. What changed our universe, changed our world, gave us the ability to have the modern world, was alternating current. And the way the reasoning models I’ve been working with at the forefront work is that they reason, then compress, then re-inject. Think of it like a piston: it reasons a little bit, compresses that idea and understanding, then injects callbacks and memory as needed for the next step. This is called a stateful engine, meaning every state is an individual moment where it knows where it is in the process. What this unlocks, just like how alternating current made it possible to send electricity from one part of the state all the way down to someone’s home, is that I can now reason for long periods of time at lower cost. That’s gonna explode the security industry, ’cause use cases like having a large language model consistently read footage from a livestream and annotate what it sees would’ve been too costly.
Sam Cummings [00:18:52]:
Now you can do it. Having bug review tools that can review your entire code base every day, you know, 5 to 7 times a day, would have cost way too much. Now, with that burden gone, that’s where these companies become profitable. So there’s a battle to be had. Will we get there without some pain? I think not. But the place we arrive at is the same place we arrived at with electricity: cheap, long reasoning capability that unlocks a whole new world of functionality.
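The reason–compress–re-inject “piston” Sam describes might be sketched like this. Everything here is an editor’s illustration, not his actual architecture: `compress` is a crude stand-in (a real system would summarize with a model or a learned state), and the token numbers are invented. The point is that carrying only a bounded compressed state keeps the per-step context flat instead of growing with the full transcript:

```python
# Hedged sketch of a stateful reason-compress-re-inject loop: instead of
# re-feeding the full transcript each step, carry only a bounded compressed
# state, so per-step context stays flat and long runs stay cheap.

STATE_BUDGET = 300  # max tokens of carried state (illustrative)

def compress(state_tokens, budget=STATE_BUDGET):
    """Stand-in for summarization: cap the carried state at a token budget."""
    return min(state_tokens, budget)

def stateful_reasoning_tokens(steps, prompt_tokens=500, thought_tokens=200):
    """Total tokens processed when each step sees only prompt + compressed state."""
    state = 0
    total = 0
    for _ in range(steps):
        total += prompt_tokens + state            # bounded per-step context
        state = compress(state + thought_tokens)  # fold the new thought in
    return total

print(stateful_reasoning_tokens(16))  # grows linearly in steps, not quadratically
```

Compared with a loop that re-reads the whole history every step, total cost here grows linearly with the number of steps, which is what makes the always-on monitoring use cases in the passage start to pencil out.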
Karissa Breen [00:19:24]:
So with OpenAI, do you have any numbers around how much it’s costing them to run this capability? And the other thing is water, right? Apparently it takes a lot of water as well, which I don’t think people factor in. So I’m just really curious. And then, to your point before about the cost of reasoning, are they now heavily focused on getting that down as well?
Sam Cummings [00:19:50]:
So there are no market factors to lower the cost of reasoning today. Primarily, it’s the typical pattern we’ve had over the last series of innovations in software, where profitability is reached after the fact. Market share is the game. So the way the market’s gonna play is the big players are gonna stick around ’cause they can just burn cash longer. The profitability play today looks like: I’m gonna essentially build all the data centers so I have the monopoly, and from that monopoly I can focus on additional optimizations. And that is not good for consumers generally. It’s not good for the environment.
Sam Cummings [00:20:36]:
Because huge data centers that are very lossy, meaning they burn a lot of energy and consume a lot of water, are not a good recipe. But the undercurrent to that is you’re gonna see breakthroughs happening, like the ones I’ve been able to see in real time. I recently announced my work on reasoning with Google and another company called Fetch.ai. And like I shared with you, the ability to have cheap, long, deep reasoning is not gonna come from those big players. It’s gonna come from smaller players, and from the optimization of individual tasks. There will never be a moment where the money is available for ChatGPT to be profitable in its current state. But once these innovations take place, and those pressures of smaller projects making it cheaper and cheaper take hold, that’s gonna open the opportunity for many people to create and develop. And again, would I invest in ChatGPT today? Should I have stocks in it? Of course. The ship is huge, and it’s too big to fail in a lot of ways, because so much of what we use today runs on it, the same way Amazon is for infrastructure: everybody who has a ChatGPT-based tool, whether it’s reading a video, taking a picture, or reading text.
Sam Cummings [00:21:54]:
It all goes back to their credits, their token system tracking how many tokens have gone through. But that is not going to bring in enough money. They would have to do multiple trillions in revenue to make up for those costs. And imagine, this is just one company of many. The companies that can burn money like that, for now, are the Googles and the Amazons, and that’s the dynamic we’re seeing with OpenAI. That is not gonna change. So I expect more of the same: market share being the core goal to capture, over environmental impact or even making it a more efficient process.
Karissa Breen [00:22:36]:
Okay. I wanna get into this a little bit more and I don’t wanna miss anything. So will this displace OpenAI, or are they just too big now? They’ve got the market share, they don’t care if they burn heaps of money, who cares, because eventually they’ll figure out a plan. Do you see OpenAI as the shark, and these smaller players coming up, smaller fish that swim beside OpenAI, perhaps building their own little profit centers but still feeding off OpenAI? Will they become displaced? And do you think Sam Altman thought, I’m going to get to a point where we can’t go beyond this? I’m curious to hear your thoughts on this one, Sam.
Sam Cummings [00:23:22]:
Yeah, so there’s multiple layers to this one. The acquisition marketplace is what we’ve already seen. So ChatGPT is not just OpenAI; under the hood, it’s Microsoft. And when you think about what happened with OpenAI, it’s the same thing that happened with LinkedIn. Is LinkedIn still its own entity? As a property, yes, but it’s Microsoft under the hood. And that is, I think, so instrumental for us to understand. We have a consolidation.
Sam Cummings [00:23:50]:
We are in the middle of a monopolization era in civilization that’s been unmatched in hundreds of years. And that’s important to understand across all industries. Whether it’s our food, where every restaurant buys from the same food providers, or media, where a few top conglomerates run most media companies, this is not unique to this space. So even if the players on the field change, the owners are gonna be the same. And we’re gonna see consolidation of cost: there’s such a backlog today that there are more data centers earmarked than we can build fast enough.
Sam Cummings [00:24:33]:
And so if you look at what that trend means, there’s always gonna be pressure on the market: the big players are just gonna acquire whoever comes up. But here’s the big thing, and a real chart anyone here can look to for guidance: what is the market share of token consumption that is private versus going through these public pipes? What I mean is, when you work with ChatGPT, it takes tokens, meaning context of the actual text, to do the job. So let’s say you wanted it to write an email for you, and say that costs 5,000 tokens. Of all the tokens in the world spent on that kind of work, how much goes through the pipes of ChatGPT, Google, Amazon, and Claude, versus through local models or models outside that big architecture? The market share plan all the bigger cloud companies are banking on is that most traffic will go through the cloud. What I argue will happen is that as we get the innovations we’re talking about, models can run cheaper, they can run locally, they can perform specific tasks better, and the global amount of token consumption is gonna be more local than cloud.
Sam Cummings [00:25:50]:
And that’s where the market does feel pain. This is the anti-outcome to the business plans of the OpenAIs. They want the world’s token consumption to exist within their cloud architecture. But here’s the extra hurdle: the LLM era is now at its peak, past its middle. We’re seeing the rise of the world models era. NVIDIA just announced Cosmos at CES.
Sam Cummings [00:26:19]:
This is a world foundation model. The same way we have language models, these world models will be the substrate, the core engine, that powers smart cars, robots, and mechanized manufacturing at a new scale. So if you think of that whole surface area, that’s where we’re gonna see the next big emergence. Now, is it gonna be Microsoft that acquires the companies that make that? Is it Amazon? The owners will be the same, but we’ve got some players on the field that are gonna come up and be pretty epic over the coming years.
Karissa Breen [00:26:54]:
What do you think is going to happen now? I’ve been hearing this a bit in the security space as well: big players, and some quieter ones coming up. I mean, is it going to get to a point where there are like 10 big players, and all the other ones underneath have their subset but run off the back of them? What do you realistically think is happening with these businesses? And the other thing that’s bothered me a little over the years is people saying, oh, I just want to build a company and then get acquired. Yes, I understand eventually there has to be some exit. But I do believe that model also creates a lack of innovation, ’cause if you’re just gonna build a company to feed it to the big players, then you’re not thinking differently and you’re not really innovating as a result. So what are your thoughts on this?
Sam Cummings [00:27:44]:
It’s unfortunate, but that’s where we’re gonna be for the foreseeable future. The good news, for someone who wants to invest, or who’s wondering where to put their energy and time to be successful at this, is you will not miss this coming. An example: we are exactly where we were in the internet architecture era. Right now there’s a million apps, but only 3 architecture providers. You’re either a Google stack, a Microsoft or, in some cases, a Linux stack, or an Amazon stack. Your app is built on one of those architectures. So the same way the whole internet today has like 3 or 4 providers, with all of the casino of fighting for views and fighting for logins, we saw the same thing with social media, where sites like YouTube are the bulk of media consumption.
Sam Cummings [00:28:36]:
Yeah, there are some sites here and there with their own stuff, professionalized industries, but when you break it all the way down, a handful of platforms carry the bulk of the internet. That’s what we mean by the owners will be the same. There will always be this calculation: it’s better for me as a startup to build something I can sell to another company than to try to go to IPO, to swing for the home run of home runs and build a company from scratch for the stock market, just so I can personally become rich. So you’re not gonna change human nature. And the good news is you don’t have to. The more important factor here is that the commodification of reasoning is a really valuable moment in human history. Think about it: the cost to write text has fundamentally plummeted with LLMs, meaning anybody can write something.
Sam Cummings [00:29:38]:
It’s in a whole new place. We haven’t fully grappled with that reality. From the knowledge economy to jobs, across the board, the cost of text production, the dollars per word of prose, has dropped to pennies. Now imagine what that means for thinking. The way things are set up now, if you want complex reasoning, where a system can read a scenario, think, and make decisions, you’re talking $20 in minutes. Seconds of thinking, processing a scene, images, et cetera, adds up really fast. When that cost goes down and reasoning is cheap, it’s like I mentioned before: the same as when access to electricity became cheap.
Sam Cummings [00:30:26]:
This is the thing to watch for: the cost of reasoning and memory, and the ability to apply it, is what’s gonna matter for us overall. So will it be a monopoly like it’s always been? Will we have over-monopolization in a really powerful way that makes a few people very, very rich and most people serfs in that world? Yes. We’ll have 1,000 apps you can use, but only 3 providers.
Karissa Breen [00:30:54]:
So do you think this is a good or a bad thing?
Sam Cummings [00:30:56]:
So, good and bad are, you know, great conversations for a philosopher. I’ll leave that to them. But what I will say is the outcome is pretty assured: this is how we innovate today, and unless we culturally, fundamentally change how we approach monopolization, not just in this space but across the board, it’s the only way it’s gonna work. Because of the cost to do this. Imagine a NASA budget to go to the moon. Putting energy into a project like that for these aims doesn’t have the geopolitical pressure that kind of task requires. Going to the moon, for example, was: we don’t want to be the ones left behind in the global race of military dominance. We don’t want to be the ones incapable of managing that arena, that sphere of influence.
Sam Cummings [00:31:53]:
Well, in the game of reasoning, where is that powerful pressure for governments to have their own reasoning models? We’re seeing it. We’ve already seen it. There’s a really interesting project out of Greece called Sophia AI, their first version of a country-level model: it has the history of Greece, the storylines, in the Greek language, and it can be a resource for the civilians. But at the grand scope of solving this problem, the main players are going to be cloud companies that have the money to burn to build the architecture and systems to drive this. Bigger is not better, though. And that’s where the green light is: there will be teams that solve specific tasks with smaller models.
Sam Cummings [00:32:42]:
There’ll be teams and individual innovators like myself who create smarter, cheaper reasoning architectures that allow for better facilitation. So even though it’s gonna be monopolized, we’ll still be in a place like we are today with the music industry. Anybody can make music. You can find music anywhere, download it anywhere; there’s 70,000 songs made a day. Imagine 20,000 apps being produced a day. There’s gonna be so much innovation from this that, from the consumer perspective, you won’t really care.
Karissa Breen [00:33:12]:
So where does that leave sort of entrepreneurs or people that are, you know, creating these tech startups? Do you just see, like I mentioned before, that they’ll create something and eventually be bought by, I don’t know, Oracle or Cisco or whoever, and then the same old thing just keeps happening like a conveyor belt? Let’s buy the next one, the next one, the next one. Is that what the future of tech startups will look like in your eyes?
Sam Cummings [00:33:37]:
That’s what it looks like today, I would argue. And that’s what it’s been looking like. Salesforce bought Slack, Microsoft bought LinkedIn. If you go down the list, we’re already there. And so what I’m describing is that pattern continues and the arenas shift. Right now the arena is language models and language processing. That shifts into world building. In that space, the question is more so: how much do the industries that benefit from or leverage that technology become world adopted? So for example, how adopted does robotics become? Does it stay limited to the arena of manufacturing, or do we have robots in our homes? Do people leverage self-driving cars like we do today, where it’s kind of a novelty, or is our entire transportation system mostly self-driving? Those decisions are the factors in whether these become features of your life, or they become the core fundamental substrate of our life, like the internet has.
Sam Cummings [00:34:41]:
But the thing is gonna stay the same. Core players are gonna be the main owners. And then it’s almost gonna be like an ice cream shop where you can get 150 flavors, but it’s the same base.
Karissa Breen [00:34:53]:
Okay. So speaking of shifting arenas, I know we sort of touched on it before, but I wanna talk through the boom in the cyber sector, considering it’s a cyber podcast. What are we realistically looking down the barrel at? Like, what do you believe? If you and I do another podcast interview at the end of the year, what do you think’s going to have come true during 2026?
Sam Cummings [00:35:16]:
Can I say something to your audience directly? Mm-hmm. Y’all my people, cybersecurity folks. Y’all are going to be doing really well. Matter of fact, there’s a chance that one of you on this call is going to be a millionaire, billionaire coming soon. Here’s why. The arms race acceleration that language models have driven is already known. We’ve seen stories about how different nations have now used things like Claude to do cyber attacks. We’ve already seen an order-of-magnitude increase in the potential of deepfakes, just unprecedented capability.
Sam Cummings [00:35:54]:
So from a cybersecurity perspective, there’s pressure on the arms side, meaning bad actors using these technologies is gonna be at a flurry rate we’ve never seen before. Also on the shield side, our ability to protect, monitor, and evaluate threats and risk faster is gonna be unprecedented. So there’s no way this market doesn’t boom. There’s no way you don’t benefit from a windfall of new companies that come up and get acquired. Literally, you could work at 6 or 7 companies over the next 5 years and retire with huge, you know, stock portfolios. Because even if you just play the law of averages, not all of them need to boom, but there will be acquisitions all along the way. And that leads to one key thing we’ve already seen in America, here in the States: a large portion of actual stock market performance, something like 90% of it, has been AI-driven.
Sam Cummings [00:36:55]:
About 7 companies, maybe 8 max, are the driver of the entire market. Otherwise we would be in a technological depression and, you know, further recession. So why that’s important is, when both sides of the pressure, arms and shields, in security are on an upward threshold, there’s no, you know, opportunity that’s not gonna see an impact from this. We’re seeing the same thing in the drone industry. For example, the evolution of the modern war machine, leveraging drones and technology, has totally changed how the theater of war works. That’s driving a ton of pressure for better drone technology as well as better drone defenses. Imagine that same thing when we have world models. These tools can analyze your entire code database every day, multiple times, and look for threats and holes to come through.
Sam Cummings [00:37:56]:
When people can deepfake a video of themselves calling in, any of those old softwares that say, hey, show me your face to get access, those aren’t gonna work anymore. So this is the arms race opportunity that I see for cybersecurity. You guys are gonna have a great time. You already have been winning, I think, over these last years, as we’ve seen with all the deepfakes. That’s only going to continue to go up.
Karissa Breen [00:38:18]:
So then I want to ask a little bit of a left field question, because I think this is important. It’s something that younger folks often ask me about: the job market, right? Like, it’s hard now. I’m hearing people saying, it’s hard for me to get a job because I don’t have any experience, and people want at least 1 or 2 years. I don’t even have that. I can’t get that. And I’m just looking at it neutrally. I know people are saying, yes, but AI creates other jobs and all of that. I’m not negating that.
Karissa Breen [00:38:42]:
But the part I’m curious about is your thoughts on what realistically we’re going to see, because I’m still seeing lots of businesses hiring people. And it depends on what companies you’re looking at. They’re like, oh, but Salesforce laid off all these people, and so did Verizon. Yes, but then they also invested in other areas. So let’s talk a little bit more about this, because I think it’s important. We’re going to need the next generation, and whoever’s underneath them as well, to take us through to the end, right? It won’t just be our generation. So I’m really curious to hear your thoughts on this, because I know that you’re obviously out here speaking to people all the time.
Sam Cummings [00:39:17]:
Yeah, there’s a lot to this, and I’ll shout out to my economists. There’s so much work coming out of institutions. Shout out to Tufts University, Stanford. Across our entire nation, there’s a lot of research going on here. And so I can’t do justice to that profession and substantiate in detail the thoughts I’m gonna share here, but I know there’s a ton of content and work out there that will substantiate, and also give more perspective into, what I’m about to describe. In the words of millennials, and that’s my age group, shout out to everyone that’s maybe in my bracket that might be listening. We are feeling like we are for the first time hearing slang that didn’t come from us, in a way.
Sam Cummings [00:40:00]:
Like, you know, we got new terms out here. Well, a term that people are using is cooked. When someone says you’re cooked, that means it’s not good. Now, I mean that in a very positive way for a lot of things. One is the ability for you to make the world bend around your ability to reason and think is completely unmatched. You don’t have to go to a college to get the skills you need to be successful in life. Now, shout out to college. I’m a big believer in my alma mater.
Sam Cummings [00:40:28]:
I went to St. Louis University. I’m a big descendant of the Jesuit faith, and, you know, being able to go to school and learn and grow with your community, it’s more than just learning the subject. But what I really wanna emphasize with this community is that the same thing that happened in the music industry is happening in the knowledge economy. And that is, it’s gonna be easier than ever for anybody to make anything. So your moat is not what you can do, it’s who you know, and your ability to leverage that to an audience. And we’ve seen this already occur, where some of the biggest influence today comes from social influencers.
Sam Cummings [00:41:12]:
Distribution, community, that is the moat. If you literally take a class of students, and we can just make this a simple example, if all those students in the class decide to make being smart cool, and they all work together, they all study together, that whole class can have high performance. Culture and community and distribution are where, if I was young and coming out, I would put my focus. And through that, I would focus on skill building. And what I mean is working with the ethers of your time. If I was in the ’90s, it might have been the internet. If I was in the ’80s, it might have been mainframes and databases. In the early 2000s, kind of where we’ve been, it might have been influencers, and today I might work with reasoning models.
Sam Cummings [00:41:59]:
So there’s always gonna be a substrate of the time, but what’s uniquely different is the cost of learning and doing is cheaper than ever in the knowledge space, in the knowledge economy. That’s an opportunity for you to really focus on distribution and community. Those are the superpowers that let you bring products, bring partnerships, bring brand deals. But listen to this part. What happens when 90% of the internet is bots? We’re at 50% now. Does that hold the same? The whole way that branding works is I do these things like podcasts and promotions because I know a human’s gonna see it. For decades we’ve been optimizing for SEO, what the search engine sees.
Sam Cummings [00:42:48]:
We’re now entering a world where we’re optimizing for what the language model sees. So positioning your products, branding yourself, being seen, to where people make purchases without even thinking. ChatGPT told me. It’s almost like Zeus said so, I gotta do it. I mean, it’s Zeus. How am I to ju— I can’t judge ChatGPT. It’s all AGI. So that mindset means we’re cooked, meaning the people that are on the trajectory of the whole way that we’ve done things, you’re cooked. But there’s a whole other lifestyle of opportunity that’s possible, and it’s a panacea of opportunity.
Karissa Breen [00:43:23]:
This is really interesting, and you’re right on the SEO stuff. Someone asked me about web traffic the other day, and I’m like, it doesn’t really matter anymore. They’re like, what do you mean by that, KB? And I’m like, well, what I mean is people at a high level are using ChatGPT or whatever they’re using to do high-level discovery, then KBI Media will pop up, then they’ll go on the site and start to do their deep-dive reconnaissance. So dwell time is more important nowadays than high-level traffic. So are you starting to see, let’s just focus on media for a moment, and you spoke a lot about distribution. There’s a lot of lack of trust in mainstream media, and you’ve seen in recent times the White House wanting to open up to independent media and podcasters, etc., to give a bit of variety and not have a lot of their views predicated on these large players. So do you think there are opportunities for independent media folks to get into that sort of space as well? Or what are your thoughts? Because a lot of these big players have entrenched themselves over the years with SEO, but you said before, being big isn’t necessarily a good thing, because then they’ve got to have the nous, the knowledge, the velocity to think, we need to start looking at the GEO stuff now.
Karissa Breen [00:44:45]:
So what are your thoughts on that, Sam?
Sam Cummings [00:44:47]:
Oh, this is the fun part that I see in media. We are an industry of waves. We came out of a previous wave and we’re now in a different trough. What I mean by that is there was a time when you wanted the authority of the mainstream. The mainstream was a good thing because it meant it was professional, polished, quality. We saw that all the way down to American politics, in the type of features and traits we wanted in our president. Now, not to go deep into politics, because that’s a different angle of it, but it’s the same thing: our preferences evolve from our tolerance over time, meaning we’ll like something until we don’t.
Sam Cummings [00:45:34]:
We’ll enjoy that process and that experience till it reaches the law of diminishing returns, and then we’re actually repulsed by it. We see this at the most base level in fashion. Baggy jeans were cool for a while when I was growing up, then it became skinny jeans. Guess what most of the young kids in the United States are wearing today? Baggy jeans again. So that same cyclical behavior is not a unique phenomenon. It’s how complex systems work. It’s at the root, between randomness and order.
Sam Cummings [00:46:09]:
Things that have an ability to oscillate between states, those are chaotic systems. And when you think of reasoning, consciousness, society, an ant colony, these are all chaotic systems. So there’s a universal substrate that’s driving this. Shout out to all of my philosophers and particle physicists. There’s so many people that have a depth of understanding here and can explain more of why we see it. But what you’re gonna see emerge from this is that the micro-influencer is gonna be more valuable. People are gonna be seeing so much AI-generated stuff, so much, in a way, over-branded production, that they’re gonna want something that seems like, oh, I can trust this. Oh, this person’s not a shill.
Sam Cummings [00:46:57]:
They’re actually telling me what they think. And so I see the micro-influencer, and what I mean by that, since it’s a term that has a lot of meanings, is people that have smaller pocket communities with high engagement. If I’m, you know, Microsoft, I might want to work with 4 or 5 micro-influencers, where me working with them will light up their channels. For all my rap fans out there, I gotta give a shout out to anybody who listens to rap music. You might have known this as the Drake stimulus package. For anyone that knows Drake or has listened to his music, there was a time before the, you know, Kendrick Lamar battle, shout out to all my people that know about that, when he would get on a song of a smaller artist and the whole thing would boom. It was like, yo, Drake’s on this person’s music.
Sam Cummings [00:47:45]:
So that person would get the stimulus, being thrust into celebrity. We have that same thing happening in the media marketplace, where instead of just paying, you know, shout out to Gartner, or some traditional media like Newswire, big companies will bring in a bunch of micro-influencers in different niches to augment that. Today it’s augmentation, but there’s a rolling trough to this. We recently saw presidential candidates going on podcasts instead of traditional media. This is a proving point of where we are. So the reason why I wanna say this for this audience, cybersecurity, you are phenomenal people. You can put together 3 or 4 dots.
Sam Cummings [00:48:29]:
And make a picture. And it’s important in this understanding to see how this is not something limited to our space. These are macro factors that are driving across different arenas and spheres that all have the same drivers.
Karissa Breen [00:48:41]:
And I wanted to illuminate that, because it’s important to see that yes, it’s tech, but it’s other sectors as well. And even to your point, look at what happened to Saks, right? No one probably thought that was gonna happen, and it did. So I just think it’s very interesting times, and it’s important for people to listen to podcasts and get their sources from everywhere, listen to people like yourself, Sam. So is there anything specific you’d like to leave our audience with today? Because this is such a fascinating conversation. We went around the world talking about things, but I think this is important, because you’re really at the coalface, Sam, of doing this type of work. And yes, it’s a cyber podcast, but I also wanted to zoom out a bit and talk about the game that’s happening with AI and what it actually means. So please, what are your final thoughts?
Sam Cummings [00:49:29]:
You know, Shiloh, hopefully you bring me back at some point so we can come back on these. I’d really love to give you guys a part 2 with more details. But for this moment: the ability to be in this seat, you are at the right time in the right place. The ability for you to work at more core companies that are working in cybersecurity is a privilege, because as those companies get acquired, as new technologies come out, you have an opportunity over the next maybe 7 to 8 years, let’s just say the average tenure is 2 years to get half the stock from your package, to work at multiple companies. So for anyone that’s in the working space, I recommend you take advantage of this opportunity, work at 4, maybe 5 companies over the next 5, you know, 10 years and get those stock options. That’s number 1 for the average person. And again, I’m not an advisor, shout out to any financial advisors.
Sam Cummings [00:50:25]:
I’m just a guy. I’m just a guy. But one pressure that you cannot deny is that the AI boom is gonna put pressure on electricity and water. So if you invest in electricity or energy companies and energy portfolios, you’re most likely to be okay. That’s just a general take, not, you know, smart-guy advice, all the disclaimers apply, tax, all of that. If you’re a younger person, your ability to have a malleable mind that’s not anchored on the way we’ve done things is a superpower. Leverage that, explore these technologies.
Sam Cummings [00:50:57]:
The area that I think is of the now, that’s very valuable specifically for cybersecurity, is advanced reasoning and memory architecture, because the biggest booms in security will come from those areas: the ability for us to have security systems that better reason and better manage memory. So there’s a plethora of things you can do to put yourself in these areas and benefit. For this audience specifically, hopefully you’ll look back on this call, and when we chat again at some point you’ll be like, yo, that guy Sam helped me out a little bit. You can share a little bit of your winnings with my son’s college fund.