A few months ago, I found myself watching my daughter sleep. She was maybe three months old at the time, and I was exhausted in a way I hadn’t known was possible. My wife was finally resting, the apartment was quiet, and I had this strange, disorienting thought: I could just talk to an AI right now.
Not because I needed information. Not because I had a problem to solve. But because I was lonely in that specific way new parents are lonely, where you’re constantly surrounded by people who need you, yet somehow feel entirely unseen.
I didn’t open the app. Instead, I texted my brother something meaningless about football. He responded with an equally meaningless comment, and somehow that was enough.
The loneliness lifted a little.
That moment stuck with me because it raised a question I’ve been thinking about ever since: What actually helps a person feel less alone? And can an AI provide it?
The loneliness we’re actually talking about
First, we need to be precise about what loneliness is. It’s not the same as being alone. I spend hours alone, writing in the early morning before my family wakes up, and those are some of my best moments. Loneliness, as researchers define it, is the gap between the connection you want and the connection you perceive you have.
That gap explains why you can feel desperately lonely at a crowded party and completely content on a solo walk through Saigon. It’s subjective. It’s about perception, not census data.
Psychologist John Cacioppo, who spent decades studying loneliness, compared it to hunger or thirst: a biological signal telling you something essential is missing. When that signal fires briefly, it works. It pushes you to reconnect. But when it becomes chronic (and for 15 to 30 percent of people, it does), it stops being helpful and starts breaking things: cognition, sleep, immune function, even gene expression.
So the question isn’t whether loneliness is serious. It is. The question is what actually resolves it.
The four things that make us feel less alone
When you dig into the research on connection and loneliness, a few core elements keep appearing. Think of these as the active ingredients in human connection:
1. Being heard. Not just having someone listen, but feeling like they actually understood what you said. Research from the University of Groningen found that feeling heard depends on receiving attention, empathy, and respect, while also sensing some common ground with the other person.
2. Being seen. This goes beyond hearing. It’s about someone perceiving you accurately, including the parts you don’t say out loud. When my wife notices I’m stressed before I’ve said anything, that’s being seen.
3. Reciprocity. Real connection isn’t a monologue. It involves back and forth, where both people are changed by the exchange. You share something vulnerable, the other person responds with something genuine, and suddenly you’re both in new territory together.
4. Embodied presence. There’s something about being physically with another person (or even just knowing they exist in real time) that affects us differently than asynchronous communication. Our nervous systems evolved to co-regulate with other nervous systems. When someone’s actually there, our body knows it.
These four things explain why that brief, almost meaningless exchange about football helped more than a perfectly crafted chatbot response would have. My brother was real, he was there, and we had decades of shared context.
What the AI research actually shows
Here’s where things get complicated. The research on AI mental health tools is both promising and sobering.
A 2025 study in the Journal of Medical Internet Research found that social chatbots may help reduce feelings of loneliness and social anxiety, particularly when they’re accessible and provide empathetic responses. Users appreciated having something available 24/7 that didn’t judge them.
But a four-week study by MIT and OpenAI found something concerning: while some chatbot features (like voice interaction) modestly reduced loneliness, heavy daily use was associated with greater loneliness, emotional dependence on AI, and reduced real-world socializing. The pattern suggests that light use might supplement human connection, while heavy use might replace it.
A systematic review covering 160 studies from 2020 to 2024 revealed that only 16 percent of AI chatbot studies underwent rigorous clinical testing. Most are still in early validation. We’re deploying these tools at scale while still figuring out whether they work.
The honest answer is: we don’t fully know yet. But the early patterns suggest AI can provide temporary relief without addressing the underlying need for genuine human connection.
Why “feeling heard” by AI isn’t quite the same
An AI chatbot can say “I understand how hard that must be for you.” It can generate responses that sound empathetic. But here’s the uncomfortable truth: it doesn’t actually understand. It doesn’t have experiences to draw from. It doesn’t carry the weight of what you’ve told it into a future interaction.
Neuroscience research has shown that feeling understood activates reward centers in the brain (the ventral striatum) and areas associated with social connection. Feeling misunderstood activates regions linked to negative affect. Our brains care deeply about whether we’re actually understood, not just whether someone says the right words.
When I practice Vietnamese with my wife and she patiently corrects my tones, the connection isn’t just about the conversation. It’s about her choosing to spend time helping me, knowing how much it matters for my relationship with her family. An AI could correct my tones more efficiently. But it couldn’t carry that meaning.
The counterargument: AI can still help
I don’t want to be too dismissive here. There are situations where AI mental health tools might genuinely help:
Accessibility. Not everyone has access to a therapist, or a trusted friend they can call at 2am. For people in isolated situations, something is better than nothing.
Practice ground. Some research suggests AI chatbots can help people rehearse social skills or process emotions before engaging with humans. It’s like stretching before exercise.
Consistency. AI doesn’t have bad days. It doesn’t get tired of your problems. For people whose human relationships have been unreliable, that predictability might be healing.
Low-stakes entry. It can feel easier to admit loneliness to a chatbot than to a friend. If AI helps someone recognize they need connection, that’s valuable, even if the AI isn’t the ultimate solution.
The danger isn’t that AI is entirely useless for loneliness. It’s that it might be just useful enough to prevent people from pursuing what they actually need.
What people get wrong about solving loneliness
Here’s a pattern I’ve noticed, both in the research and in my own life: we tend to assume loneliness is about quantity of contact. More friends, more messages, more interactions. But the research suggests it’s about quality and fit.
One study found that being with others while feeling lonely was actually associated with worse well-being than being alone. The researchers called this the “amplifying effect”: when you’re lonely and forced into social situations that don’t meet your needs, the loneliness feels more acute, not less.
This might explain why scrolling social media often makes people feel lonelier. You’re technically “connected” to hundreds of people, but the connections are thin. You’re seeing highlight reels, not sharing vulnerabilities.
The solution to loneliness isn’t just “more connection.” It’s the right kind of connection: deep, reciprocal, and real enough to close the gap between what you want and what you have.
A Buddhist perspective on why this matters
When I was in my mid-twenties, working a warehouse job in Melbourne and feeling profoundly lost, I started reading about Buddhism on my phone during breaks. One idea that stuck with me was the concept of interdependence: the recognition that we exist only in relationship to everything else.
Western self-help often emphasizes independence. Stand on your own two feet. Don’t need anyone. But Buddhist philosophy suggests the opposite is true: we’re fundamentally interconnected, and pretending otherwise is a source of suffering.
Loneliness, from this view, isn’t just an emotion. It’s a misperception of reality. We feel alone because we’ve forgotten (or never learned) that we’re always embedded in relationships, even when we can’t see them.
This doesn’t mean AI can’t play a role. But it suggests that the deepest cure for loneliness isn’t finding a substitute for human connection. It’s remembering what we already are: beings who exist through and with each other.
A 2-minute practice
The next time you feel that familiar pang of loneliness, try this instead of reaching for a screen:
Pause. Notice where the loneliness lives in your body. Is it in your chest? Your stomach? Don’t try to fix it yet.
Ask yourself: What kind of connection am I actually craving right now? Is it to feel heard? To feel seen? To feel like I matter to someone?
Then, reach out to one specific person with something real. Not a “hey, what’s up?” but something that reflects what you’re actually feeling. “I’ve been thinking about you.” “I’m having a hard day and wanted to hear your voice.” “Remember that conversation we had about [specific thing]? It stuck with me.”
That’s it. One real message to one real person. Notice what happens in your body afterward, even if they don’t respond right away.
Common traps
- Mistaking volume for depth. Sending twenty quick texts to various people is not the same as one vulnerable conversation.
- Using AI as a permanent substitute. There’s nothing wrong with talking to a chatbot occasionally, but if it’s replacing all your human attempts at connection, that’s a warning sign.
- Believing you’re uniquely unlovable. Loneliness tells you stories about yourself. Those stories aren’t true. They’re symptoms, not facts.
- Waiting until you “feel better” to connect. The connection itself is often what helps you feel better. You don’t have to be in a good mood to reach out.
- Comparing your interior to others’ exterior. Social media shows you curated lives. Your loneliness is comparing your raw experience to their edited version.
A simple takeaway
- Loneliness is the gap between the connection you want and the connection you perceive. Closing that gap requires real human contact, not just any contact.
- AI tools may offer temporary relief and can be helpful as supplements, but they don’t provide what humans fundamentally need: to be heard, seen, and genuinely known by another person.
- Heavy use of AI companions is associated with increased loneliness and reduced real-world socializing. Light, intentional use might be different.
- The quality of connection matters more than the quantity. One deep conversation outweighs a hundred superficial ones.
- Feeling lonely while surrounded by people is real and common. The solution isn’t more socializing; it’s more authentic socializing.
- When loneliness hits, reach out to one specific person with something true. The practice of genuine connection is itself the medicine.
- We are interdependent by nature. Loneliness often stems from forgetting this, not from any personal defect.