Commentary

When Students Ask the Machine

March 23, 2026

Every commander learns to watch for indicators before a crisis fully develops. In America’s classrooms, those indicators are already flashing.

A recent RAND Corporation study found that the share of students using artificial intelligence (AI) for homework jumped from 48% to 62% in just seven months. More telling: 67% of those same students believe the technology is weakening their critical thinking. The generation most dependent on AI is also the generation most worried about what it is doing to them. That combination should give every parent and pastor serious pause.

I recently spoke with two Christian students about their experience. One, a 13-year-old junior high student, described what is becoming unremarkable among his peers: most of them see AI not as a tool but as an exit ramp. Students routinely have it draft essays, then tell it to “make it sound less intelligent” to dodge detection. He acknowledged the pull himself: “AI can do it for me in 20 minutes instead of me struggling for three hours.”

That is an honest answer. It is also a warning.

The older student, a 19-year-old in college, put it differently: “The brain is a muscle. AI spoon-feeds you rather than making you do the work.” She is right — and she knows it. The RAND data confirms what she described from experience: students who rely on AI for writing, brainstorming, and analysis are systematically bypassing the very effort that builds judgment. The struggle is not incidental to learning. The struggle is the learning.

This matters for reasons that go well beyond test scores. In “The New AI Cold War,” I argue that America’s strategic position in the coming decades will rest on human capital as much as computing power. A workforce conditioned to accept AI-generated answers without independent verification is not just academically weaker — it is a national security liability. The nation that builds the most disciplined, adaptive, and ethically grounded people will hold the advantage. Right now, we are not building that.

The Ethics Problem Nobody Is Resolving

The students I spoke with both struggled to draw the line between using AI and abusing it. The 13-year-old couldn’t define exactly where “help” becomes “cheating,” though he knew it when he crossed it. The college student was clearer: “If you have AI do your work, you learn nothing. It’s not genuine.”

The RAND study found that most students don’t consider common AI use to be cheating — except when the machine provides answers directly. That leaves an enormous gray zone, and students are navigating it without much adult guidance.

Scripture does not give us ambiguity about this. “Whoever walks in integrity walks securely” (Proverbs 10:9). Integrity is not situational, and it is not something a student redefines for himself after the fact. Adults — parents, teachers, pastors — need to set the standard before the habit forms, not after.

The Authority That Is Quietly Shifting

The college student offered the observation that stayed with me longest: “I’ve seen people consult AI like a pastor.”

That single sentence captures a cultural shift that most Christian families have not yet registered. Students are not merely using AI to finish assignments. They are turning to it for guidance on identity, meaning, and how to endure difficulty. A tool that produces confident, detailed responses in seconds is filling a role that God designed for parents, mentors, and the church.

AI can simulate wisdom. It cannot possess it. It can produce persuasive answers on any subject — including matters of faith — without a conscience, a soul, or any accountability before God or man. As I address in “AI for Mankind’s Future,” unchecked reliance on algorithmic systems erodes the human judgment it was meant to supplement. A child who habitually outsources moral reasoning to a machine is being formed — just not in the way her parents intended. Jesus warned of confident, persuasive deception (Matthew 24:24). That warning did not require foreknowledge of large language models to remain relevant.

Forming Minds before the Machine Does

Neither reflexive rejection nor uncritical embrace is an adequate response. AI is a tool, and tools require governance — not panic, but deliberate parental authority.

Parents should establish a clear household principle: wrestle with the problem first, use AI afterward for feedback or research, and always disclose its use. The questions that matter most — identity, faith, how to survive difficulty — stay with people who know and love the child. Regular, honest family conversations about how AI was used will build awareness before quiet dependency does.

Schools need assessments the machine cannot simply complete on a student’s behalf — in-class writing, oral explanations of work, process-based grading that shows how a student reasoned, not just what was submitted. Clear disclosure standards reinforce the integrity that makes an education worth having. Teachers need preparation as much as students do; whether AI narrows or widens inequality will depend on whether under-resourced schools receive adequate support or are simply left to absorb the consequences.

The college student offered the best analogy for what is at stake: using your brain is like going to the gym. The effort is inconvenient. The results are yours. AI offers immediate output — but what it produces is not what you built, and it cannot substitute for what the struggle was developing in you.

“The fear of the Lord is the beginning of wisdom” (Proverbs 9:10). Information processing is not wisdom, and a generation that has never been taught the difference will not discover it on its own.

Parents and pastors must enter this conversation now — not because the issue is new, but because the window for influence is narrowing. The machine is already in the room. The question is whether the people who love these students are as well.

Robert Maginnis is a retired U.S. Army lieutenant colonel, senior fellow for National Security at Family Research Council, and the author of 14 books. His latest, "The New AI Cold War," releases in April 2026.
