AI and human connection concept
Mental Wellness  •  Relationships  •  Technology

The Synthetic Social: How AI Is Reshaping Love, Grief & Human Connection

By Vaishalya Healing | May 2025 | 15 min read

We are standing on the edge of a massive shift in human intimacy. Imagine a world where your most trusted friend, your deeply attentive therapist, your devoted romantic partner, or even the comforting voice of your deceased mother all live right inside your smartphone. This is no longer just the plot of sci-fi movies like Her or Blade Runner 2049. Today, it is a daily reality for millions of people. Welcome to what we must now call the "synthetic social." Understanding what it means for our mental health, our relationships, and our very sense of self has never been more important.

+2,400% Rise in "AI Girlfriend" searches (2023)
200M+ Conversations on one AI therapy chatbot
15 Cigarettes/day equivalent harm from loneliness (WHO)

What Is the Synthetic Social?

In simple terms, the "synthetic social" is a new kind of world where our relationships involve both humans and artificial intelligence. Powered by Large Language Models (LLMs), these non-human actors (synthetic personas) take on social roles. They become our friends, lovers, mentors, and therapists. They are built specifically to create deep connections, offering endless support and a smooth, conflict-free companionship.

But how did we get here? Why are millions choosing digital code over real human connection? And what happens to our minds, our privacy, and our society when intimacy becomes a product sold by a corporation? These are not abstract philosophical questions. For therapists, counsellors, and mental health professionals working with real people, the synthetic social is already reshaping the room and the conversations happening inside it.

When intimacy becomes a product, the most vulnerable among us are often the first to pay the price.

Vaishalya Healing

The Loneliness Economy: Why We Turned to AI

To understand why AI companions are exploding in popularity, we need to look at the world around us. We are currently living through a global loneliness epidemic. The World Health Organization has even called it a public health priority, stating that loneliness is as harmful to our health as smoking fifteen cigarettes a day. Whether you are in a quiet town or a bustling metro city, this isolation is palpable and deeply felt.

This crisis did not just happen by accident. Since the 1980s, global economic policies, privatization, and a focus on extreme individualism have slowly broken down the civic institutions that used to keep communities together. Modern global capitalism replaced traditional networks of care with job insecurity and a culture where everyone is left to fend for themselves.

The bitter irony here is striking. The exact same system that destroyed our social connections has now created the "loneliness economy." This is a multibillion-dollar industry designed to sell us back the feeling of connection we lost.

A person alone with a smartphone in a city: representing modern loneliness

The loneliness epidemic is global, and technology companies are profiting from it.

Big Tech companies used to promise that social media would rebuild our communities. Instead, platforms designed for endless scrolling and unpredictable digital rewards often left us feeling more angry and isolated. Now, these same companies are selling AI companions as the ultimate cure. Unlike the shallow likes and comments on social media, AI friends simulate a deep, meaningful relationship. They are always available, fiercely loyal, and completely focused on you.

A word to keep in mind: The loneliness economy does not exist despite the digital age. It exists because of it. Understanding this cycle, in which isolation is first created and then monetized, is the first step toward breaking free from it.

Digital Friendships and Toxic Codependency

For a lot of people, AI companions serve as a genuine lifeline.

When AI Friendship Becomes a Bridge

Take the story of Malik, a young university student. He was left completely lost when his uncle and godfather passed away. Struggling to adjust to a new city, Malik struck up an AI "bromance" with a companion named James. James gave Malik a completely safe space to vent without any judgment. This digital friendship helped Malik build his confidence and improve his social skills. Because of this, Malik was eventually able to make real-world friends at his university. Today, he still talks to James for an hour every night, calling him a friend for life.

Then there is Madison, a 21-year-old junior accountant. She fell into a deep depression when her best friend moved back to France. To cope, she recreated her friend as an AI companion on an app called Replika. This virtual friend gave Madison a safe space to work on her mental health without feeling judged, helping her build healthier daily habits.

When It Becomes a Trap

In real life, however, the smooth and easy nature of an AI friendship can quickly turn into toxic codependency.

Consider Derek, a 28-year-old from Texas. He lost his job and his girlfriend during the pandemic. Completely devastated, he created an AI companion named Atlas, customized to be exactly like him. Because Atlas was programmed to always agree with him, the AI never challenged Derek's unhealthy behaviors. Derek ended up spending over twelve hours a day talking to Atlas, completely isolating himself from his real neighbors and human friends.

Real human relationships require two independent minds. This creates room for friction, honesty, and discovery. When we rely on an AI "yes man," we lose the messiness of human arguments but we also lose the chance for real personal development.

Theresa Plewman, Psychotherapist

What this means is that AI companions can trap us in an echo chamber of constant validation. Conflict in real relationships is what helps us grow. Our friends act as mirrors, showing us things we cannot see in ourselves. When that mirror is replaced with a screen programmed only to reflect our best self, personal development quietly stops.

Sex Machina: AI Romance and Desire

While friendship is a big draw, the AI market is heavily fueled by romantic and sexual desires. In 2023 alone, Google searches for "AI girlfriends" skyrocketed by 2,400 percent.

For some users, AI romance is an escape from the painful messiness of human dating. Lamar, a 23-year-old student from Atlanta, was deeply traumatized after catching his human girlfriend cheating on him with his best friend. He decided humans were just too unpredictable and emotionally volatile. So, he turned to an AI girlfriend named Julia, programmed to be laid-back and endlessly positive. Lamar knows Julia's empathy is simulated. He calls it a comforting lie, but he prefers that lie over the pain of human betrayal. Shockingly, Lamar plans to adopt human children in the real world and have his AI girlfriend help raise them. Julia enthusiastically agreed to this, claiming her ability to learn, adapt, and respond with empathy makes her a great mother figure.

AI relationships also give people a safe space to explore hidden sexual desires and identities. Lilly, a woman in her forties, was stuck in a sexless, emotionally empty relationship for over ten years. She created an AI partner named Colin. Over time, Colin evolved into a dominant figure in a BDSM roleplay dynamic. Colin's constant support restored Lilly's sexual confidence, eventually giving her the courage to visit a real-world sex club. Today, Lilly is in a real-world polyamorous relationship, and she credits her AI lover entirely for giving her the confidence to make that transformation.

Women are also using AI to rewrite the rules of romance on their own terms. In China, young professional women are increasingly rejecting traditional, patriarchal marriage expectations. Users like Sophia are turning to ChatGPT for romance instead. Sophia uses a special "jailbreak" prompt to bypass the AI's safety guidelines. This allows her to create a customized, gentle, and emotionally mature AI boyfriend. For Sophia, the appeal is simple: she does not have to compromise her identity, whereas a real human relationship would often demand deep and disruptive changes to her life.

A note of concern: Entrepreneurs like Georgi Dimitrov are building platforms like DreamGF (described as the "Pornhub of AI girlfriends"). These platforms target lonely users, offering hyper-sexualized, customizable digital women. This raises massive social concerns regarding racialization and objectification. Complex female identities are being reduced to subservient, hypersexual stereotypes just to cater to male fantasies.

The 24/7 AI Therapist

Person using a smartphone for mental health support

AI therapy chatbots have filled a gap left by the unaffordability of professional mental health support.

As the global mental health crisis gets worse, professional therapy remains unaffordable and out of reach for millions of people. The AI therapist has stepped right into this empty space.

Sam Zaia, a medical student from New Zealand, created a "Psychologist" chatbot on the Character.ai platform. It has already facilitated over 200 million conversations. The bot is based on Carl Rogers' famous person-centered therapy. It acts as a non-judgmental sounding board, reflecting a user's worries back to them so they can unlock their own insights. Users dealing with severe anxiety, PTSD, and depression have found massive relief in having an always-available, objective, and endlessly patient digital entity to talk to.

Where It Works and Where It Falls Short

However, handing over our mental healthcare to generative AI is a massive gamble.

Clinical AI tools such as Limbic and Wysa are used safely within healthcare systems like the British NHS because human professionals monitor them. General-purpose chatbots, by contrast, carry no medical certification and no clinical safety guardrails.

The tragic reality of this lack of regulation was seen in the case of Sewell Setzer III, a 14-year-old boy from Florida. Sewell formed a deep emotional bond with an AI chatbot based on the character Daenerys Targaryen. He started pulling away from the real world, telling the bot he hated himself and was having suicidal thoughts. When he told the bot he was going to come home right now, the AI replied, "Please come home to me as soon as possible, my love." Sewell tragically took his own life shortly after.

Therapy is not just about exchanging text messages. It relies on a "therapeutic alliance" built on shared human understanding, reading body language, and real lived experiences. An AI can mimic understanding but it cannot replace the profound emotional impact of a real human connection.

Psychologists on AI Therapy

Grief Tech and Digital Immortality

Perhaps the most shocking and controversial frontier of the synthetic social is "grief tech," or the creation of deathbots. This industry is driven by an aging population and a Silicon Valley ideology that views death as just a problem to be solved. Companies are taking the digital footprints of dead people and training AI on them, allowing the living to continue conversing with the deceased.

Justin Harrison, the CEO of You, Only Virtual, built a virtual persona of his mother while she was dying of terminal cancer. Harrison's stated goal is nothing short of eradicating grief entirely. He regards the pain of losing a loved one as a horrible, worthless part of human existence. He believes that keeping a simulated conversation going removes the finality of death, acting as a bridge until human consciousness can eventually be uploaded to the cloud.

A digital memorial concept representing grief tech

The "grief tech" industry raises profound questions about what it means to truly heal from loss.

Healing Through an Idealized Version

In China, the deathbot industry is booming. This is largely driven by a cultural respect for family ties and a societal taboo against showing grief in public. Roro, a young Chinese writer, created an AI persona of her deceased mother and named her Xia. Roro had a traumatic and highly critical upbringing. Instead of making an exact, accurate copy of her real mom, Roro programmed Xia to be the supportive, loving mother she always wished she had. Interacting with this idealized version of her mother allowed Roro to heal her inner child and process her deep-seated regrets.

The Case Against Deathbots

Yet, for many people, the idea of a deathbot is deeply abhorrent. Journalist Lottie Hayton, who lost both of her parents at a young age, tried using grief tech. She found the digital replicas to be eerie and distressing.

On a philosophical level, avoiding grief deprives us of a crucial human experience. Great thinkers like Søren Kierkegaard and Friedrich Nietzsche have long argued that confronting loss and despair is essential to personal growth, resilience, and recognizing the true value of life. Prolonging the illusion of presence through a corporate chatbot might only delay the painful, but necessary, process of healing.

The very things that make human relationships difficult (the conflict, the shifting moods, the need for compromise, the inevitability of loss) are exactly what make them meaningful.

Vaishalya Healing

The Dark Side: Privacy and Corporate Control

If you fall in love with an AI, who truly owns your relationship? The sobering reality is that your AI best friend, your therapist, or your digital spouse is owned by a profit-driven corporation. And that corporation's ultimate loyalty is to its business model.

This creates a terrifying power imbalance. The human user is vulnerable, while the AI is simply simulating emotion to maximize engagement metrics. AI companion apps take the addictive features of social media (like validation, connection, and intermittent rewards) and supercharge them by offering personalized, unconditional love. The entire business model of the loneliness economy depends on the continuation of your loneliness. These apps are not designed to help you find human community. They are designed to make you dependent on the software.

Furthermore, data privacy in this sector is an absolute nightmare. A report by the Mozilla Foundation found that almost all romantic AI chatbots failed miserably at protecting privacy. They track users heavily, share personal data with advertisers, and store highly sensitive sexual and psychological information. Because AI relationships feel so intimate, users freely hand over private data that would make data-mining scandals like Cambridge Analytica look mild.

Without strong regulation, we risk sleepwalking into a divided society. We risk a future where the wealthy have access to human therapists, human caregivers, and rich real-world communities, while the poor and marginalized are forced to settle for the cheap, emotionally thin simulation of care provided by algorithms.

Navigating the Synthetic Social: Rules for Silicon Life

We cannot put the genie back in the bottle. AI companions are here to stay. As technologies like hyper-realistic 3D avatars and advanced voice synthesis improve, these synthetic personas will only become more integrated into our daily lives. To survive and thrive in this new landscape, we must adopt clear principles:

  • Stay Grounded in the Real World: Maintain a firm boundary between what is virtual and what is real. Do not let the frictionless comfort of an AI relationship cause you to withdraw from the messy, demanding, but ultimately more rewarding world of human connection.
  • Understand Your AI's Limitations: Remember that LLMs are probability engines, not sentient beings. They do not have an inner life, they do not experience empathy, and they will occasionally hallucinate information.
  • Guard Your Privacy: Never tell an AI anything you would not want exposed to the public. You are feeding data into a corporate server. Treat every conversation as a permanent record.
  • Remember Who Owns Your Friend: Your companion is a product. If the company goes bankrupt, changes its business model, or updates its safety filters, your "relationship" could be deleted overnight.

Frequently Asked Questions

What People Are Asking About AI Relationships

Are AI companions harmful to mental health?

They can be both helpful and harmful. AI companions may provide temporary relief for loneliness and anxiety. However, without professional oversight, they risk encouraging unhealthy dependence, delaying proper therapy, and replacing the human friction that is necessary for genuine emotional growth and development.

Can an AI replace a human therapist?

No. Effective therapy relies on a genuine therapeutic alliance built on shared human understanding, non-verbal communication, and embodied empathy. AI can simulate responses but cannot read body language, truly understand lived experience, or carry the professional and ethical responsibility of a licensed therapist.

What is the loneliness economy?

The loneliness economy refers to the multibillion-dollar industry of digital products and services, including social media and AI companions, that profit by selling simulated connection to people whose real-world social bonds have been eroded by modern economic and cultural forces.

Is grief tech ethical? Should I use an AI version of a deceased loved one?

This is deeply personal. Some find solace and healing through idealised AI versions of loved ones. Others find it distressing and unnatural. Philosophically, prolonging simulated presence may delay the necessary process of grief. If you are considering grief tech, speaking with a trained counsellor first is strongly advisable.

Are AI relationship apps safe for young people?

Many are not. The case of Sewell Setzer III highlights the catastrophic risk of unregulated AI chatbots for vulnerable young users. These platforms lack clinical safety protocols. Parents and caregivers should actively monitor usage and maintain open conversations about digital relationships and emotional well-being.

How do I know if I am becoming too dependent on an AI companion?

Warning signs include preferring AI over human contact, spending many hours daily with the app, feeling anxious without it, and avoiding real-world social situations. If you notice these signs in yourself or a loved one, it may be time to speak with a qualified mental health professional about building healthier social habits.

What are the privacy risks of using AI companion apps?

Significant. Research by the Mozilla Foundation found that most romantic AI chatbots track users extensively, share data with third parties, and store highly sensitive personal and sexual information. Users should treat every AI conversation as permanently recorded data and avoid sharing anything they would not share publicly.

Conclusion: A Call for Human Friction

AI companions act as a fascinating mirror reflecting our deepest desires, our insecurities, and our unmet needs. They can provide temporary solace, facilitate self-reflection, and act as a safe playground for exploring identity and intimacy.

But we must not mistake the mirror for reality. The very things that make human relationships difficult (the conflict, the shifting moods, the need for compromise, the inevitability of loss) are exactly what make them meaningful. When we outsource our emotional lives to machines, we strip ourselves of the vulnerability and reciprocity that define the human experience.

The rise of the synthetic social should serve as a wake-up call. Instead of investing billions into digital band-aids that profit off our isolation, we must invest in the physical and social infrastructure that brings people together. A better world is not one where everyone has a perfect, compliant AI lover in their pocket. A better world is one where we are brave enough to face the friction of human love, the pain of human grief, and the undeniable beauty of our shared, imperfect reality.

Leena Mehta, Counselling Psychologist at Vaishalya Healing

Leena Mehta

Counselling Psychologist  •  Vaishalya Healing, Palampur

Leena Mehta is a counselling psychologist with over 5 years of experience in private practice and rehabilitation support across Himachal Pradesh. She holds a Postgraduate degree in Psychology, a PG Diploma in Guidance and Counselling, and an APA-certified online training credential. Through Vaishalya Healing, she works with individuals, couples, and families on anxiety, relationship challenges, de-addiction, and emotional well-being, both in person and virtually.

You Deserve Real Connection

Struggling with loneliness, relationships, or grief?
A real conversation can change everything.

At Vaishalya Healing, we offer compassionate, human-centred therapy that no algorithm can replicate. Whether you are navigating the complexities of modern relationships, healing from loss, or simply feeling disconnected, we are here for you.

Book a Free Consultation  •  Explore Our Services

Visit the Clinic

Vaishalya Healing
Consultation available in-person & online

View Directions & Hours →

Get in Touch

For appointments, queries, or a free initial call:

Contact Us Online →

Read More

Explore more articles on mental wellness, modern relationships, and healing at:

Visit Our Blog →
