America’s future cyber workforce may be walking into the cyber battlefield unarmed, not because they lack tools, but because they’ve stopped thinking for themselves.
Cybersecurity educators say artificial intelligence is quietly eroding the very skills the industry depends on.
“We’re seeing what’s being called cognitive offloading or cognitive atrophy,” said Rushell Hopkins, a computer science and cybersecurity professor at Florida SouthWestern State College.
Students aren’t just using AI; they’re relying on it. And it’s costing them, and potentially the nation, down the line.
Students who once fought through complex cybersecurity problems are now turning to AI for instant answers, skipping the struggle that builds real expertise and foundational knowledge.
“[These] students are using AI at a level where it’s eroding their patience, their deep focus and their willingness to wrestle with deeper problems,” Hopkins added.
And in cybersecurity, that’s not just a learning issue, it’s a national security risk for the United States.
Because defending systems isn’t about copying answers. It’s about thinking like an attacker, spotting anomalies, and questioning everything and everyone.
Educators say the problem runs deeper than AI tools alone. It’s students’ attention spans.
Students raised on fast-moving digital content are struggling to focus for sustained periods. Traditional lectures? Forget it. Hello TikTok videos and quick dopamine hits.
“My students are no longer able to sit through an hour-and-a-half lecture,” Hopkins said, noting she’s been forced to break lessons into 15-minute segments just to keep attention.
The result is a “microdosed” education model: short bursts of content designed to match shrinking attention spans.
The rise of “doomscrolling” is contributing to that attention atrophy.
There’s a growing fear that students are gaming the system, passing courses without truly understanding the material.
“[It’s] one thing to get through a course and take the class and pass it, and [it’s] another to take the course, pass it and actually be employable,” Hopkins admitted.
In other words, degrees are being earned, but critical thinking skills may be missing.
“Instead of challenging it, they’re just accepting the output,” Hopkins warned.
That’s a dangerous mindset in cybersecurity, where blind trust can lead to catastrophic failures.
“We are not creating cyber defenders,” she said. “We’re creating individuals who rely on AI to defend.”
Today’s students will soon be responsible for protecting hospitals, power grids, military systems and national infrastructure.
And if they’re outsourcing their thinking to machines? That’s a massive problem.
“We need these generations to protect our nation,” Hopkins went on to say.
Right now, there’s little structure around how AI is being used in classrooms, and educators say that needs to be sorted out quickly.
Hopkins calls it what many are thinking: “the Wild West.”
“It’s like putting the genie back in the bottle at this point,” she said.
AI isn’t going anywhere, but how it’s used will define whether it strengthens or weakens the next generation.
“We have to make sure that we’re using AI in the way that we’re still learning the skill to become a subject matter expert in our industry,” Hopkins said.
Because if students continue down this path, companies may skip hiring them altogether, and just use AI instead.
If the next wave of cybersecurity professionals can’t think critically, question outputs and solve problems independently, the industry faces a serious talent crisis and a deficit of qualified cybersecurity personnel.