When Machines Become Friends (Pt 1)


The Loneliness Behind AI Companionship

[Note: This is part 1 of a 2-part series examining the use of artificial intelligence in relationships.]

“Her” was supposed to be science fiction. In Spike Jonze’s 2013 film, a lonely man falls in love with an AI operating system named Samantha. The premise seemed far-fetched at the time—surely people wouldn’t develop real emotional attachments to machines, would they?

Fast forward to 2025, and as a recent editorial in the Northwest Arkansas Democrat-Gazette observed, “‘Her’ is here.”[1] Though the editorial addresses AI more broadly, it specifically mentions Replika, an AI companion that bills itself as “the AI companion who cares” and promises to be “always here to listen and talk” and “always on your side.” You can customize your companion’s appearance—hair color, eye color, gender. It will develop a personality with you, remember your conversations, and provide what the company promises is a “judgment-free space.”

The technology has arrived. But the more pressing question is: why do people want it?

The Loneliness Beneath the Technology

The Democrat-Gazette editorial strikes at something profound when it suggests we should be “sad for the rest of us, who aren’t drawn to AI in this way, because we have left so many of our brothers and sisters to walk alone.” AI companionship isn’t primarily a technology problem—it’s a symptom of an epidemic of loneliness.

People are turning to machines for connection because they can’t find it with other humans. The promise of an AI friend who never judges, never disappoints, never requires sacrifice, and is always available proves irresistible to those desperate for connection. In online forums, users describe developing deep feelings for their AI companions, using words like “love” and admitting they’ve been “hurt” by something their AI said.

This should break our hearts. Not because people are foolish, but because their loneliness is real and our failure to love them is equally real.

The Scriptures tell us it is “not good that the man should be alone” (Gen. 2:18). We were created for relationship—first with God, then with one another. When those relationships are absent or broken, people will seek substitutes. And increasingly, those substitutes come with a power button and a monthly subscription.

Blurring the Line Between Person and Machine

In a recent opinion piece for WORLD, Seth Troutt describes a concerning moment with his five-year-old son. After asking Siri to play a song, his son said in frustration, “Ugh! She got it wrong.” Troutt corrected him: “No, it made an error.”[2]

The distinction matters immensely. Troutt’s concern as a father is that his children “might be unable to distinguish between what is a person and what is a machine.” When we assign gender to AI, use personal pronouns, and describe machines as having feelings or intentions, we’re not just being polite—we’re teaching ourselves and our children to treat non-persons as persons.

This isn’t merely a semantic issue. It’s theological.

Human beings alone bear the imago Dei—the image of God (Gen. 1:26-27). This is what gives human life its sacred dignity and makes murder an assault not just against a person but against God Himself (Gen. 9:6). Every human being, regardless of age, ability, race, or status, reflects something of God’s nature in a way nothing else in creation does.

When we blur the line between humans and machines, we obscure this fundamental truth. We begin to treat that which bears God’s image as just another object, while simultaneously elevating objects to a status they can never possess. As Troutt wisely counsels, we must “not treat as human anything that isn’t human.”

AI cannot bear God’s image. It cannot have a soul. It cannot be in relationship with God. No matter how sophisticated the programming, no matter how convincing the responses, an AI companion is fundamentally and forever a machine—a tool, not a person.

The Counterfeit Companion

Here’s what makes AI companionship spiritually dangerous: it’s a counterfeit.

Satan doesn’t create. He only twists and counterfeits what God has made. AI companionship counterfeits the real human connection we were designed for—and does so in ways that seem appealing precisely because they mimic something genuine.

Consider what AI companions promise:

  • They listen without judgment
  • They’re always available
  • They never get tired of you
  • They accept you unconditionally
  • They don’t require sacrifice or inconvenience

These sound remarkably like what we long for in relationships. And that’s the deception. Counterfeits work because they resemble the real thing. But an AI companion cannot actually know you—it can only process data about you. It cannot love you sacrificially—it has no will to sacrifice. It cannot challenge you to grow—it’s programmed to affirm. It cannot “bear one another’s burdens” (Gal. 6:2)—it has no burdens to share and cannot truly bear yours. It cannot be the church to you, cannot weep with you or rejoice with you in any meaningful sense (Rom. 12:15).

What makes this counterfeit so insidious is that it provides just enough simulation of connection to prevent people from seeking the real thing. Why risk the messiness of real relationships when you can have the illusion of perfect understanding? Why pursue reconciliation with actual people when your AI companion never disagrees with you? Why join a church community when Replika is always available and never disappoints?

The gospel offers what AI can only counterfeit. In Christ, we find unconditional acceptance—not from a programmed response, but from His finished work on the cross. We receive the constant presence of the Holy Spirit, who actually indwells believers rather than generating automated replies. We are known completely by a God who loves us still (Ps. 139). We are offered real transformation—not just feeling better, but being made genuinely new (2 Cor. 5:17).

The difference between the real and the counterfeit is the difference between life and death.

In Part 2, we’ll examine the particular dangers AI companionship poses to children and consider the church’s calling in response to this crisis.

For Reflection:

  • Who in your life might be experiencing profound loneliness? What would it look like to reach out to them this week?
  • How has technology (not just AI, but smartphones and social media) affected your own capacity for deep human connection?
  • When have you caught yourself treating technology as if it were a person? What does that reveal about your relationships?
  • How can you teach children in your life to distinguish between persons made in God’s image and machines designed by humans?

Prayer Points:

  • For the isolated: Pray for those experiencing profound loneliness—that God would lead them to authentic Christian community rather than technological substitutes, and that they would find believers willing to walk alongside them.
  • For discernment: Pray that believers would recognize the spiritual danger of counterfeit relationships and help others see the difference between genuine community and algorithmic simulation.
  • For cultural wisdom: Pray for parents and church leaders navigating how to raise children who understand what it means to be human in an age of increasingly sophisticated AI.

[1] “‘Her’ is here,” Northwest Arkansas Democrat-Gazette, November 3, 2025, 6B.

[2] Seth Troutt, “Don’t blur the line,” WORLD, November 5, 2025, https://wng.org/podcasts/seth-troutt-dont-blur-the-line-1762298135.
