Trigger Warning: This article contains discussions of mental health challenges, including suicide.
BRIANNA: This article is difficult to write. On one hand, I’m part of an Artificial Intelligence community that’s on the cutting edge of technology, where we explore the latest advancements and trends in AI. We bring together Fortune 500 leaders across the globe to discuss AI’s incredible potential: from groundbreaking cancer treatments to enhancing cybersecurity and creating safer environments for both organizations and individuals alike. I am a firm advocate of AI for these very reasons. However, another part of me holds a degree in psychology and maintains a deep interest in mental health and holistic wellness. I’m keenly aware of the growing concerns around AI, particularly its implications for mental health and its role in the loneliness epidemic that’s been widely discussed. Not only am I aware of these concerns, but I also find myself agreeing with many of the valid points raised.
NICK: For the ONUG community, you’re masters of AI’s technical side—optimizing systems and securing networks. But this blog is personal too. We’re writing it together for folks like us with families and friends who might feel lonely—aging parents, kids facing social hurdles, or isolated loved ones. I’ll weave in how AI can help, like offering companionship through chatbots such as Replika, which act as virtual friends, or social robots like ElliQ, designed to keep older adults company. While one of us raises valid concerns, I’ll also expand on how AI can support our loved ones, drawing from real tools and possibilities to bridge the gap between tech and human connection.
So here goes…
BRIANNA: First, what is loneliness? Loneliness is an interesting word that is often confused with simply being alone. Let’s start with a definition. The American Psychological Association (APA) defines it as “affective and cognitive discomfort or uneasiness from being or perceiving oneself to be alone or otherwise solitary.” Summarized by our friend AI: an aversive psychological experience that emerges when people perceive a lack of intimacy or connection in social relationships.
This refers mostly to intimacy and the quality of relationships: the common example is being in a crowd or group of people, even family or friends, yet feeling completely alone because of a perceived lack of connection or intimacy.
Social media (yes, not AI itself, but an entity that relies heavily on AI) has created incredible access to large crowds and communities. However, it has also often indirectly contributed to a lack of true intimacy and connection with those very same people. This is a common argument, and one I personally agree with, though with caveats.
It’s the experience of ordering coffee and seeing everyone in line glued to their phones instead of looking up, open to conversation. Or boarding a train to work, where instead of chatting with fellow passengers, everyone’s once again face-down on their screens.
NICK: She’s right—AI-driven platforms can pull us from real interactions, and it’s a challenge we see daily. But here’s where I see hope: AI can also rebuild those lost skills. For our kids or friends who feel awkward in social settings, AI tools can simulate conversations, offering a safe space to practice without judgment. Research from the University of Sheffield suggests AI can help people on the brink of isolation by providing exercises to hone social skills, like role-playing dialogues, which could boost their confidence for face-to-face talks. This isn’t about sidelining human contact but giving our loved ones a stepping stone back to it, countering that physical and emotional distance she describes.
BRIANNA: The AI-driven discovery pages on Instagram, entertainment platforms, video games, and the like operate with such a high level of intelligence that they capture our attention through granular, hyper-targeted content, all based on personal usage history. It’s a “battle” the human brain can’t realistically win.
The physical distance, and the consequently less frequent interaction with our peers, results in greater discrepancies in prosocial behavior and peer relationships. Younger generations are adapting to the AI age with fewer peer-to-peer connections, gradually losing the natural social skills past generations developed more organically. Yes, this is a generalization; however, it is a notable and observable shift nonetheless.
NICK: Loneliness touches us all—maybe a parent living alone or a friend feeling adrift. While Brianna questions AI’s impact, I see it as a potential ally. Beyond companionship, AI can offer practical support for our families. Imagine a tool that not only chats with your isolated relative but also helps them process emotions or practice social skills, easing their way back to real-world connections. It’s not about replacing humans but filling gaps when they’re not there.
BRIANNA: The growing concern with AI is that, if technology has already contributed to this social regression, AI could take it further—encouraging us to connect more with machines than with each other. We’re already seeing it in customer service (shoutout to CVS—you’ve tested my patience at an extraordinary level), and now at restaurants and airports where kiosks are replacing people. On a more personal level, people are turning to ChatGPT for therapy and companionship, organizing life, and finding purpose.
Now, if we’re investing our time and energy into interactions that once belonged to human connection, it naturally means we’re spending less of both on peer-to-peer interaction, practicing social skills, and cultivating emotional intelligence. The less time we spend engaging with our peers and building emotional intelligence, the more likely we are to experience social discomfort and resistance to emotional intimacy. And from there, isolation and loneliness are born.
NICK: Her concern about AI replacing human ties is fair—it’s a risk we can’t ignore. But let’s look deeper. For someone isolated—like a rural relative or a veteran hesitant to open up—AI can be a lifeline when human connection isn’t an option. A Harvard Business School study found AI companions can reduce loneliness as effectively as human interaction, more so than passive activities like watching videos. This suggests that for our family members who lack access to therapy (3 in 4 rural counties in the US face this issue), AI can step in with 24/7 support—think of it as a backup, not a substitute. It’s about meeting people where they are, easing isolation without pretending to be the whole solution.
BRIANNA: Further (and this could easily be an entirely separate blog post), I haven’t even begun to discuss the rising rates of depression and suicide, particularly in the U.S. This is of course not an exhaustive list, but some contributing factors include the heightened use of photo editing, which reinforces body-image scrutiny and decreases self-esteem, and media content that, through generative AI, can reinforce personal beliefs, thoughts, and fears, creating virtual echo chambers for individuals, communities, and entire populations.
Rather than engaging with our environment, we’re often exposed to highly curated versions of people and their experiences, thanks to AI’s contribution to mainstream media’s idealized aesthetic or “look.” This creates a breeding ground for unhealthy comparisons and can reinforce self-esteem challenges and unrealistic beauty standards. These edited versions of reality can further feed into virtual echo chambers, where hate speech and harassment thrive. This shift has contributed to more divisive and polarized spaces, in both public and private dialogue. As a correlated result, meaningful communication with one another has become increasingly challenging, and people can become more isolated from each other. The consequence? Fewer opportunities for critical thinking and thoughtful discourse, and a growing blur between real people and edited personas. Again, this is not an exhaustive list, but it highlights how isolation, comparison, and unrealistic ideals stemming from AI’s use contribute to the loneliness epidemic.
NICK: She highlights real dangers—AI amplifying division, depression, and unrealistic standards is a serious issue. Yet, AI can also bridge those gaps. Beyond companionship, it can offer mental health tools to tackle her concerns head-on. For our kids or friends caught in echo chambers, AI could provide a safe space to process emotions and challenge rigid thoughts, fostering self-awareness that breaks the cycle of polarization. It’s not a cure-all, but a way to support real dialogue when in-person options falter.
Here’s where we align—she sees AI’s potential, and I’ll expand on it with the tools available today. For mental wellness, Woebot, a therapy chatbot, uses cognitive behavioral therapy (CBT) to help our friends or kids manage emotions when therapy’s out of reach. It’s not just a chatbot—it’s a structured program that guides users through exercises to process depression or anxiety, directly addressing her worry about rising mental health issues. Studies show it reduces stress by offering a non-judgmental space, available anytime, which is crucial when healthcare access is limited. Then there’s Youper, an AI mood tracker that personalizes coping strategies based on daily check-ins. For our parents or teens, it can pinpoint emotional triggers—like loneliness from social media echo chambers—and suggest actionable steps, like breathing exercises or journaling, to improve well-being. This aligns with her point about building intimacy and emotional intelligence, freeing up time for real connection by handling the emotional load. These tools don’t replace humans but support them, giving our loved ones the strength to reconnect.
BRIANNA: Now, on the contrary, one might ask: is it truly technology at fault here? Or could social media be merely the largest scapegoat of our era (or one of them)? Could this be more a reflection of human nature and an inherent flaw in our character? While we can harness AI for its potential benefits, it can also bring out some of our darker tendencies. Take social media, for example. In theory, it’s a powerful tool to bring people together, to share ideologies and experiences, and to keep family and friends updated on one another’s lives in ways we couldn’t have imagined before. It connects individuals across the globe, breaking barriers of location and time zone. And yet, as noted above, we see staggering amounts of hate speech and online trolling, tearing people apart for their appearances, opinions, or whatever else they choose to share. Is social media truly to blame for this? Is it AI’s fault when people use it to edit their appearances? Observations like these reflect how nuanced this topic is, and why, at this point in time, we’re still just at the beginning of a wider conversation, one that I’d like to continue to reflect on.
NICK: She’s spot-on—tech reflects our flaws and strengths. AI can fuel division or fakery, but it can also empower us. Woebot and Youper tackle her mental health and intimacy concerns with precision—Woebot’s CBT approach helps our families manage the emotional toll of online hate, while Youper’s mood tracking can counteract body image pressures by promoting self-awareness over filtered ideals. Back in the ‘90s, we built the internet without security, leaving costly gaps; with AI, we can steer it thoughtfully. For ONUG, this means using our expertise to deploy tools that lift our families—streamlining emotional support so they have energy for real bonds, not just virtual ones.
Wrapping Up
Brianna’s take on AI’s risks in the loneliness epidemic is sharp and valid—she sees the disconnect it can foster, and I respect that. But, as someone who is both part of the ONUG community and deeply involved in technology, I see a broader picture. Tools like Woebot and Youper, beyond initial companionship, offer structured support and skills for our lonely loved ones, backed by research showing real impact. If we guide AI with the care we learned to apply after the internet’s early flaws, it can strengthen our families’ connections, not their isolation. For us tech folks, it’s about leveraging our skills to make that happen.
To be continued.