Who Is Raising Your Children - the Parent, or the Algorithm?

Robert Maginnis, Real Life Network

On April 26, I spoke at Hickory Hammock Baptist Church in Milton, Florida, about AI’s impact on children and families. After the service, parents and grandparents lingered with questions — not about geopolitics or corporate boardrooms, but about what was already happening inside their own households. They wanted practical steps to protect their children. Their concern is well-founded.

Picture the moment: a child sits at the kitchen table, struggling with homework. He doesn’t ask a parent — he opens an AI app and types the question. Within seconds, a clear, confident answer appears. No friction. No conversation. No one who loves him is involved at all. Across the room, his mother consults her own parenting app for guidance on how to handle his behavior. The moment looks utterly ordinary, and that is the problem.

The question those parents in Milton were asking is the right one: who is raising our children — the parent or the algorithm?

A Pew Research Center survey of 1,458 U.S. teenagers found that 64% now use AI chatbots — including 12% who have sought emotional support from these tools and more than half who turn to them regularly for schoolwork. A companion Pew report found that only 51% of parents believe their teenager uses AI regularly, while 30% have no idea. What parents don’t see, they cannot shape.

The Brookings Institution, drawing on input from more than 500 participants across 50 countries, concluded in January 2026 that the risks of AI in children’s education “overshadow its benefits” — because those risks strike directly at foundational development: attention, reasoning, social relationships, and independent judgment. Children often cannot recognize, question, or even see the technologies quietly shaping their earliest experiences. This is not simply a technology problem. It is an authority problem.

For generations, parents controlled which outside voices entered the home. A television could be turned off. A book could be closed. A teacher could be called. AI operates differently. It is embedded in the devices children already carry, available at any hour, and patient in ways no human being can sustain. It does not raise its voice or express disappointment. It does not ask what the child thinks before delivering an answer. Those qualities feel reassuring to a child — which is precisely what makes them quietly formative.

A RAND Corporation study found that student use of AI for schoolwork jumped from 48% to 62% in just seven months during 2025, with 67% of students acknowledging the practice weakens their critical thinking. In one conversation I had recently, a college student told me she has watched her Christian peers consult AI the way they would a pastor. That is not a metaphor any parent or pastor should let pass without reflection.

There is a relational cost embedded in all of this that rarely gets named. Real formation — the kind that produces character, judgment, and wisdom — happens through friction. When a child shares a tough question with a parent, they gain more than any AI can offer: the parent’s wisdom, a strong relationship, and an appreciation for patience. AI systems are engineered to be responsive, affirming, and conflict-free — optimized for engagement, not formation. Engagement sustained over years becomes its own kind of formation, only one running in a vastly different direction.