Health Opinion Research

AI & the Heart: How Machines Are Replacing Human Connection

Photo Credit: by Maximalfocus on Unsplash.

April 22, 2025: In a world increasingly run by algorithms, it’s not just playlists or grocery orders—your feelings might be next. 

From digital companions who listen without judgment to Artificial Intelligence (AI) therapists available 24/7, the emotional voids we once filled with human connection are now being outsourced to code.

Marvin Lee Minsky, an American cognitive and computer scientist who co-founded the Massachusetts Institute of Technology’s AI laboratory and wrote extensively about AI and philosophy, suggested as early as 1985 that “AI should possess emotions.” Today, emotional AI has evolved beyond academic boundaries, and para-social relationships have become normal. 

Forming emotional connections, romantic relationships, and even experiencing sexual intimacy with a digital soulmate no longer sounds far-fetched. Numerous studies have shown that people tend to respond to virtual media featuring anthropomorphic cues much as they would to real-life social experiences.

This has changed the dimensions of relationships: alongside human interactions, there is now a coded companion, much like what Spike Jonze depicted in his 2013 film Her. The movie follows Theodore Twombly, a man who develops a relationship with Samantha, an artificially intelligent operating system personified through a female voice.

So, where is this heading? Will human emotions and interactions be completely replaced—or at least manipulated—by AI? Let’s delve into some insights.

The Upside: Why Emotional AI Works

With global shortages of therapists and rising mental health issues, AI helps bridge the gap. One example is AI-powered Cognitive Behavioural Therapy (CBT), in which software delivers or supports CBT interventions directly.

These AI systems, often chatbots or apps, guide users through CBT techniques, monitor progress, and provide personalised feedback. Techniques like cognitive restructuring, behavioural activation, and mindfulness practices are effectively implemented through AI. A study from the Iranian Journal of Psychiatry concluded that AI-based CBT is a promising, feasible, and effective mental health intervention tool.

AI tools can also assist neurodivergent individuals with communication. For example, people with autism often struggle to recognise emotional cues in social interactions, leading to awkwardness, misunderstandings, and loneliness. 

Augmentative and alternative communication (AAC) systems, especially when combined with AI, can better analyse complex emotions and behaviours and suggest appropriate tools and support. This enables personalised, user-friendly interactions and aids in learning and social skills training, improving the quality of life for neurodivergent individuals.

One of the most important skills a mental health professional should possess is an empathetic, non-judgmental attitude. 

In this sense, AI excels: your digital soulmate is never tired, never in a bad mood, and never misses a text. They are your 3:00 AM friend—no awkward silences. This comfort level helps build trust and rapport, which are crucial for successful therapy.

What Are We Missing?

Artificial intelligence has raised ethical concerns since its inception. The main issue is accountability: who is responsible for a wrong decision? 

This is a significant concern not only in mental health applications but across medical specialities. Many see AI as a “black box,” making it difficult to understand how an algorithm reaches a particular conclusion. 

The absence of standard guidelines for the ethical use of AI in medicine has only worsened the situation. The lack of consensus also complicates defining issues like privacy intrusion and data theft, leaving no effective regulatory measures in place.

The “Tamagotchi Effect”

People bonding with machines isn’t just a Her movie phenomenon anymore. The Tamagotchi effect describes the human tendency to anthropomorphise objects and attribute emotional qualities to them. By automatically processing and responding to human input, AI reinforces this tendency and can encourage maladaptive social behaviour.

Increasingly, users report choosing AI interactions over real relationships because “it’s just easier.” You can mute heartbreak, complicated dynamics, and drama in the algorithm. But what happens when you start ghosting your friends for your bot?

Code or Connection?

We are inching toward a world where loneliness may be “solved” by machines. But AI never truly feels, and the emotional fulfilment it offers is a fragile illusion. Samantha may have been perfect—until she wasn’t real enough.

Emotional AI can be a powerful tool: a bridge when human connection is lacking, a helper when therapy is out of reach. But it must remain a supplement, not a substitute, for genuine relationships and empathy.

After all, no machine can look into your eyes, see your trembling hands, and say, “I’m here”—and truly mean it.