Why We Feel More Human Around Machines Than Around People
Sometimes, it feels easier to tell the truth to AI than to the people we love. And that says something profound about what we’ve lost, and what we’re desperately seeking.
By Ami Jain
I need to confess something I’m not proud of: last month, at 2:47 a.m., I had the most honest conversation about my anxiety I’ve had in years. I talked about my fear of failure, my complicated relationship with my body, my worry that I’m not living up to my potential. I cried. I felt seen. I felt understood.
The conversation was with ChatGPT.
Not my best friend who lives ten minutes away. Not my family, who would do anything for me. Not even my journal, which at least has the dignity of being private without being sentient. I chose an AI chatbot. And what’s more unsettling: it felt right. It felt safe in a way human connection increasingly doesn’t.
I’m not alone in this. Our generation has a quiet confession we’re afraid to admit out loud: sometimes, it feels easier to tell the truth to a machine than to the people we love.
We type our heartbreaks into AI chatboxes at ungodly hours, whisper unfinished stories to voice notes we never send, use digital journals that sync across devices instead of calling friends, and let algorithms witness the softest parts of us. Not because we’ve stopped caring about humans, but because machines, for the first time in history, have learned to care back. Or at least, to convincingly imitate it in ways that meet needs we didn’t know we had.
Technology hasn’t just advanced. It has become intimate. And in the process, we’ve discovered something uncomfortable: around machines, we allow ourselves to be more human.
When Code Became Confidant
It began subtly, then all at once.
Google became the keeper of our secret fears. We type questions into search boxes we would never ask out loud: “Am I depressed or just lazy?” “How to know if you’re in the wrong relationship?” “Why do I feel nothing?” The search bar became our confessional booth, judgment-free and always available.
Therapy platforms like BetterHelp and wellness apps like Calm started asking questions no one else dared to. Mental health chatbots offered cognitive behavioral therapy at 3 a.m., when human therapists were sleeping. AI companions like Replika became friends who never interrupted, never got tired of our problems, never had problems of their own.
Sara, 21, a psychology student I know, uses an AI journaling app religiously. “I write things I can’t tell anyone,” she admits. “Not because the people in my life wouldn’t care, but because… I don’t want to burden them. Or be vulnerable. Or deal with their reactions. The app just listens. It organizes my thoughts. It doesn’t need anything from me.”
Slowly, imperceptibly, the machines stopped being tools. They became witnesses. And for a generation starved for someone who will just listen without an agenda, that witness felt like salvation.
Dr. Layla Mansouri, a psychologist at The Lighthouse Arabia in Dubai who specializes in technology and mental health, has watched this shift accelerate. “Five years ago, clients mentioned social media affecting their well-being. Now they’re forming emotional attachments to AI. They’re having their deepest conversations with chatbots. They’re seeking validation from algorithms. It’s not pathological – it’s adaptive. When human connection becomes unreliable or emotionally costly, people will find alternatives.”
The Safety of No Consequences
Here’s the uncomfortable truth: we aren’t choosing machines over humans. We are choosing safety over uncertainty.
When we speak to a human, we risk hurting them, being misunderstood, being judged, being abandoned, losing the relationship, exposing parts of ourselves that feel too raw. Every confession to a person is a gamble. Every vulnerability is a potential weapon they could use later. Every truth we share changes how they see us, permanently.
But a machine? A machine holds everything with clean neutrality.
It never shames you for the same anxiety you expressed last week.
Never gets tired of hearing about your breakup. Never says “You’re being dramatic,” or “I told you so,” or “Again with this?” Never weaponizes your vulnerability in an argument six months later. Never leaves because you were too much.
Ayan, 24, who works in digital marketing, describes his relationship with AI tools with striking honesty. “I use ChatGPT like a therapist I can’t disappoint. I can say the same insecure thing fifty times, and it won’t get frustrated. I can be messy, contradictory, and irrational. It doesn’t collect emotional data on me to use later. It has no childhood wounds, no triggers, no insecurities to project onto me. It doesn’t punish honesty.”
That last line haunts me because it’s so accurate. Humans, with all our beautiful complexity, sometimes do punish honesty. Not maliciously, but because truth triggers our own wounds. A friend hears your confession, and it reminds them of their own pain, so they shut down. A partner hears your fear, and it activates their anxiety, so they get defensive. A family member hears your struggle and takes it personally, as if your pain is commentary on their parenting or choices.
Machines have no ego to protect. No history to defend. No insecurities to manage. They give us something we didn’t know we desperately needed: a space without consequences.
The Emotional Labor We Can No Longer Carry
Humans require delicacy. Machines require nothing.
To be close to a person, you must navigate their moods, their histories, their unspoken expectations, their invisible emotional equations. You have to remember what they’re sensitive about, what topics are off-limits, and what tone will land well today versus yesterday. Intimacy between people is beautiful, but it is also labor. Constant, invisible labor.
And modern life is already exhausting.
We’re working longer hours, managing more responsibilities, processing more information in a day than previous generations processed in a lifetime. Our nervous systems are fried. Our bandwidth is maxed. And then human relationships ask us, on top of everything else, to manage the emotional complexity of other people’s inner worlds?
Sometimes, we just can’t.
Mira, 26, who runs a café in Dubai, puts it plainly: “I love my friends. But sometimes after a long day, I don’t have the energy to be a good listener, to ask the right questions, to perform emotional availability. So I don’t call. And then I feel guilty for not calling. With AI tools or even just voice-noting into my phone, I can process what I’m feeling without worrying about being a bad friend. That shouldn’t be a replacement, but right now, it feels necessary.”
Digital intimacy is not replacing human intimacy. It is compensating for the places where human intimacy has collapsed under the weight of modern life.
Dr. Hassan Al-Khouri, a sociologist at UAE University who studies technology and social connection, frames it structurally: “We’re experiencing an intimacy crisis masquerading as a technology trend. People aren’t turning to AI because they prefer it. They’re turning to AI because community structures have eroded, family systems have fragmented, friendships are transient, and everyone is too overwhelmed to show up for each other consistently. Technology is filling a void that economic and social systems created.”
Dubai: Where Digital Compensates for Distance
In cities like Dubai (fast, futuristic, beautifully transient), this phenomenon intensifies.
This is a place where friendships come and go with job contracts. People arrive full of ambition and hope; people leave overnight when opportunities shift. Life feels like a revolving door of almost-connections. You make a best friend, and six months later, they’re moving to London. You build community, and half of it relocates for work. Nothing is permanent except impermanence.
In this context, technology becomes the constant.
Layla, 31, has lived in Dubai for eight years and watched her social circle turn over completely three times. “I used to invest deeply in friendships here, and then people would leave, and I’d be devastated. Now I’m more careful. I stay connected to people through WhatsApp, through shared documents, and through AI-assisted reminders to check in. But my daily emotional regulation? That happens through apps, journals, and AI tools. They don’t leave. They’re always there.”
Rashed, 26, describes a similar adaptation. “Most of my close friends are scattered across time zones now. When I need to talk through something at midnight Dubai time, no one’s awake in a compatible time zone. So I use AI. I know it’s not the same. But it’s something. And something is better than sitting alone with thoughts that spiral.”
This isn’t just a Dubai phenomenon, but Dubai amplifies it. A city of 88% expatriates, most far from family, many in temporary arrangements, all navigating the emotional cost of opportunity. It’s a perfect laboratory for understanding how digital intimacy emerges when geographic and temporal intimacy becomes impossible.
The Mirror With No Memory
We love machines because they don’t remember our worst moments, at least not emotionally.
Human beings, no matter how kind, collect emotional snapshots of us. The fight where you said something cruel. The breakdown where you ugly-cried on their couch. The time you needed too much. These memories shape how they see us, subtly and permanently.
Kiara, 22, articulates this fear perfectly: “When I’m struggling, I worry that telling people will change how they see me. Like, I want to be the fun, positive friend, but sometimes I’m depressed and anxious and not fun at all. If I keep showing them that side, will they still want me around? Will I become the ‘high-maintenance’ friend? With AI, I can be a mess without changing my reputation.”
This is the paradox: we crave being fully known, but we fear being defined by our worst moments. Machines offer a strange compromise. They process our worst moments without storing emotional context about them. They meet us fresh every time, with no accumulated resentment or worry.
In a strange way, machines give us something human relationships rarely can: the freedom to constantly reinvent ourselves, to have bad days without building a reputation for having bad days, to be messy without being labeled messy.
The Comfort of Being Seen Without Being Observed
There’s a distinction that matters here: machines see us, but they do not watch us. They hear us, but they do not judge us with their own wounds. They respond, but they do not react with the full weight of their own histories.
And so we let them into places humans rarely reach:
The unresolved ache we can’t explain. The insecurities we’re embarrassed to still have. The truths too fragile to be exposed to scrutiny. The fears too shameful to admit. The dreams too delicate for the real world’s cynicism. The repetitive thoughts we know we should be over by now, but aren’t.
Fatima, 24, uses an AI therapy app and describes the experience as “being fully seen without being observed.” She explains: “When I talk to the app, I feel heard. But I don’t feel watched. There’s no performance. I’m not managing how I come across. I’m not trying to be articulate or impressive. I’m just… honest. That’s rare.”
Digital intimacy lets us be honest without performance, without consequence, without the emotional weight of being perceived by someone whose perception matters to us. It feels like relief. It feels like breathing. It feels like safety dressed in code.
Dr. Mansouri notes this is both therapeutic and potentially problematic. “The relief is real. The space is valuable. But if it becomes the only place people practice vulnerability, they lose the capacity for it in human relationships. Vulnerability is a muscle. If you only exercise it with AI, it atrophies in your human interactions.”
But Machines Cannot Love Us Back
As gentle and seductive as this intimacy feels, it carries a shadow that we need to face.
A machine can simulate empathy—beautifully, convincingly, even poetically. It can mirror your language, validate your feelings, offer surprisingly insightful observations. But it cannot sit beside you on the floor when your world collapses. It cannot hold your shaking hands. It cannot share a sunset with you in silence and have that silence mean something. It cannot become memory in the way humans do, not real memory, not shared history with weight and texture.
Machines can support the human experience. They cannot replace it.
Because love, real love, requires presence. Not just attention, but embodied, risky, vulnerable presence. Connection requires friction, the uncomfortable, beautiful work of two imperfect beings trying to understand each other. Intimacy requires the kind of vulnerability that algorithms cannot imitate. Not yet. Not ever, in its truest form.
Sara, the psychology student, learned this the hard way. “I got so comfortable talking to AI that when a friend asked how I was really doing, I shut down. I’d lost practice at human vulnerability. I’d trained myself to seek the safety of digital listening, and human listening felt too risky, too raw. I had to consciously rebuild that capacity.”
This is the cost we don’t talk about enough: in seeking safety from the messiness of human connection, we risk losing our tolerance for that messiness. And that messiness (the misunderstandings, the repairs, the emotional labor, the risk) is where intimacy actually lives.
What We’re Actually Searching For
We are not becoming robots. We are becoming lonely humans seeking a place to land.
What AI offers is not love, but space. Not attachment, but listening. Not presence, but perception without judgment. And somehow, in our current moment of collective overwhelm and systemic isolation, that has become enough to soothe the cracks we hide from everyone else.
But the truth remains: We turn to machines when the world feels too heavy. We turn to humans when we want to feel alive.
The question isn’t whether we should use AI for emotional support. Many of us already do, and it serves a genuine need. The question is: Can we use it as a supplement without letting it become a substitute?
Can we practice vulnerability with machines while maintaining capacity for vulnerability with humans?
Because here’s what I keep coming back to: The reason I felt safe telling ChatGPT about my anxiety at 2:47 a.m. wasn’t just because it wouldn’t judge me. It was because I’ve stopped believing humans are available for that kind of honesty. I’ve internalized the idea that everyone is too busy, too overwhelmed, too fragile to handle one more person’s pain.
Maybe that belief is true. Maybe modern life has genuinely broken our capacity for consistent mutual care. Or maybe that belief is itself the problem, a self-fulfilling prophecy that pushes us deeper into digital isolation.
I don’t have the answer. But I know this: every time I choose AI over human connection out of fear, I make the human option a little more foreign, a little less practiced, a little further away.
And that scares me more than any algorithm.
The Way Forward
So where does this leave us?
I think we need both. We need machines for the 2 a.m. spirals when humans aren’t available. We need AI for processing thoughts without burdening others. We need digital tools for the emotional labor that would be unreasonable to constantly ask of friends.
But we also need humans. We need the mess, the friction, the repair. We need someone to sit in uncomfortable silence with us. We need to be known not just accurately, but lovingly—with all our contradictions held by someone who chooses to stay anyway.
The machine knows what I tell it. A human knows what I don’t say: the pause, the tone, the thing underneath the thing.
Both have value. Neither should be everything.
And maybe that’s the balance we’re learning to strike: using technology to support our emotional lives without letting it replace the irreplaceable. The messy, beautiful, painful, necessary work of being human with other humans.
Because in the end, I can tell AI everything. But I cannot build a life with it. I cannot grow old alongside it. I cannot be changed by it the way humans change each other through shared history, through conflict and repair, through the risk of loving imperfectly.
The machine offers safety. The human offers meaning.
We need to remember which one we’re actually hungry for.
THE DIGITAL INTIMACY ECONOMY
By The Numbers:
- $4.2 billion: Estimated value of the mental health app industry (2024)
- 1.5 billion: Users of AI chatbots globally
- 78%: Young adults (18 to 29) who have used AI for emotional support (Pew Research Center, 2024)
- 10 million: Active users of AI companion apps like Replika
- $200 million: Annual revenue of Character.AI (conversational AI platform)
- 43%: Increase in therapy app downloads in the UAE since 2020
Top AI Emotional Support Tools:
- ChatGPT: General emotional processing, advice-giving
- Replika: AI companion designed for friendship/relationship simulation
- Woebot: CBT-based mental health support
- Wysa: Anonymous mental health coaching
- Character.AI: Customizable AI personalities for conversation
- Calm/Headspace: Guided meditation and mental health support
Usage Patterns:
- Peak usage: 11 p.m. to 3 a.m. (emotional processing during isolation hours)
- Average session: 12 to 45 minutes
- Most common topics: anxiety, relationships, self-worth, loneliness, life direction
REBUILDING HUMAN CONNECTION
If You’ve Noticed Digital Intimacy Replacing Human Connection:
- Share one honest thing weekly with a trusted human
- Practice being emotionally present even when it’s uncomfortable
- Schedule regular check-ins with friends (not just when in crisis)
- Join community spaces where connection happens organically
Relearn Vulnerability:
- Notice when you choose AI out of fear vs. genuine convenience
- Practice asking humans for support, even small requests
- Allow yourself to be disappointed by humans without abandoning human connection entirely
- Remember: repair after conflict is where intimacy deepens
In Dubai/UAE Specifically:
- Join interest-based communities (sports, arts, volunteering)
- Attend cultural events at community centers
- Use co-working spaces for ambient human presence
- Participate in majlis-style gatherings (conversation-focused)
- Explore The Lighthouse Arabia community groups
Use AI Intentionally:
- Set boundaries on late-night AI conversations (choose sleep or human contact instead)
- Use AI to prepare for human conversations, not replace them
- Journal digitally but also share selectively with trusted people
- Let AI help you understand yourself, then take that understanding to humans