". . . and having done all . . . stand firm." Eph. 6:13

Commentary

Who Is Raising Your Children - the Parent, or the Algorithm?

April 28, 2026

On April 26, I spoke at Hickory Hammock Baptist Church in Milton, Florida, about AI’s impact on children and families. After the service, parents and grandparents lingered with questions — not about geopolitics or corporate boardrooms, but about what was already happening inside their own households. They wanted practical steps to protect their children. Their concern is well-founded.

Picture the moment: a child sits at the kitchen table, struggling with homework. He doesn’t ask a parent — he opens an AI app and types the question. Within seconds, a clear, confident answer appears. No friction. No conversation. No one who loves him is involved at all. Across the room, his mother consults her own parenting app for guidance on how to handle his behavior. The moment looks utterly ordinary, and that is the problem.

The question those parents in Milton were asking is the right one: who is raising our children — the parent or the algorithm?

A Pew Research Center survey of 1,458 U.S. teenagers found that 64% now use AI chatbots — including 12% who have sought emotional support from these tools and more than half who turn to them regularly for schoolwork. A companion Pew report found that only 51% of parents believe their teenager uses AI regularly, while 30% have no idea. What parents don’t see, they cannot shape.

The Brookings Institution, drawing on input from more than 500 participants across 50 countries, concluded in January 2026 that the risks of AI in children’s education “overshadow its benefits” — because those risks strike directly at foundational development: attention, reasoning, social relationships, and independent judgment. Children often cannot recognize, question, or even see the technologies quietly shaping their earliest experiences. This is not simply a technology problem. It is an authority problem.

For generations, parents controlled which outside voices entered the home. A television could be turned off. A book could be closed. A teacher could be called. AI operates differently. It is embedded in the devices children already carry, available at any hour, and patient in ways no human being can sustain. It does not raise its voice or express disappointment. It does not ask what the child thinks before delivering an answer. Those qualities feel reassuring to a child — which is precisely what makes them quietly formative.

A RAND Corporation study found that student use of AI for schoolwork jumped from 48% to 62% in just seven months during 2025, with 67% of students acknowledging the practice weakens their critical thinking. In one conversation I had recently, a college student told me she has watched her Christian peers consult AI the way they would a pastor. That is not a metaphor any parent or pastor should let pass without reflection.

There is a relational cost embedded in all of this that rarely gets named. Real formation — the kind that produces character, judgment, and wisdom — happens through friction. When a child shares a tough question with a parent, they gain more than any AI can offer: the parent’s wisdom, a strong relationship, and an appreciation for patience. AI systems are engineered to be responsive, affirming, and conflict-free — optimized for engagement, not formation. Engagement sustained over years becomes its own kind of formation, only one running in a vastly different direction.

Scripture understood this before algorithms existed. “Train up a child in the way he should go; even when he is old he will not depart from it” (Proverbs 22:6). That charge was given to parents — not to AI platforms. The Hebrew verb for “train” — chanak — carries the sense of dedication, of establishing a direction through habitual influence. Formation is cumulative. Every time a child turns to an algorithm instead of a parent — and every time a parent turns to AI for guidance on how to respond — that cumulative process is quietly redirected.

Artificial intelligence has no conscience. It is not accountable to God. It cannot love your child, discern his heart, or distinguish between what he wants to hear and what he needs to know. As I examine at length in “AI for Mankind’s Future,” unchecked reliance on algorithmic systems erodes the very human judgment those systems were meant to supplement. The voice is confident, the answer is instant, and children are not equipped to evaluate what they are being handed. “Trust in the Lord with all your heart, and do not lean on your own understanding” (Proverbs 3:5). A child trained by habit to lean on an algorithm rather than a parent is being pointed in a fundamentally wrong direction — not by malice, but by the steady drift of convenience.

Parents who think they are managing this problem by monitoring screen time are already behind it. Treating AI like a hazard to be filtered addresses the symptom while missing the cause. A more effective response means being present in the conversation — asking the question before the AI app gets to it, discussing what the app provided, modeling the slower and more honest work of thinking through a problem. It means teaching children that truth is different from a confident answer delivered in two seconds by a machine. Moses understood the principle: “You shall teach them diligently to your children, and shall talk of them when you sit in your house, and when you walk by the way” (Deuteronomy 6:7). The home was always the first classroom. Parents have always been the first teachers. AI has not changed that assignment — it has only made it more urgent.

Pastors need to address this with the same directness they bring to any other threat to spiritual formation. AI is shaping how young people think, relate to authority, and understand where truth comes from — and that is not a secondary concern. Policymakers need to move beyond phone bans — a political band-aid on a deeper wound — and confront the design incentives that make these systems so compelling, because removing a phone from a classroom does not fix a platform engineered to capture students’ attention the moment school ends.

In “The New AI Cold War,” I argue that the future security of this nation depends as much on the character and discipline of its people as on its technology. That argument starts in the home. A generation shaped more by algorithms than by parents will not have the judgment, resilience, or relational depth to defend what they have inherited.

The AI is already in your home. It is neither neutral nor passive, and it is not going away. The parents who understand that clearly will still have a chance to answer the question those families in Milton were asking. The ones who are still waiting to take it seriously may find the answer has already been made for them.

Robert Maginnis is a retired U.S. Army lieutenant colonel, senior fellow for National Security at Family Research Council, and the author of 14 books. His latest, "The New AI Cold War," releases in April 2026.
