Why talking to an AI feels easier than talking to a person
Not a bug in your personality. A feature of how humans process judgment.

It's 2am. You can't sleep. Something has been eating at you for days. A feeling you haven't told anyone about. Not your partner. Not your best friend. Not your therapist.
You open an app and type it out. All of it. The messy, contradictory, embarrassing truth. And somehow it feels... okay.
If you've ever found yourself being more honest with an AI than with the people in your life, you're not broken. You're not weird. You're experiencing something researchers have been documenting for sixty years, and we see it in our own data. One in five Onsen conversations happens between midnight and 6am, and those late-night sessions run 9% longer than daytime ones. The 2am confession isn't an edge case. It's a pattern.
Judgment, or the absence of it
The most important study on this topic recruited 239 people and had them all talk to the same virtual interviewer. The only difference: half were told the system was fully automated; half were told a human was controlling it. Same interviewer. Same questions. Same everything.
The people who believed they were talking to a machine showed significantly less fear of self-disclosure and less concern about managing how they came across. Automated facial analysis showed them displaying sadness more openly. Observers rated them as more willing to open up.
The key insight: it wasn't the technology that mattered. It was the belief that no one was judging them.
A follow-up study found that soldiers returning from Afghanistan reported more PTSD symptoms to a virtual interviewer than they did on their official post-deployment health assessments. When they believed no human was evaluating them, the truth came out.
The stranger on the train
There's a much older version of this phenomenon. Sociologist Georg Simmel described it in 1908: people confide in strangers they'll never see again. On trains, in airport lounges, in bar conversations with someone passing through town.
Research in actual airport departure lounges confirmed this: out-of-town visitors wrote longer, more intimate messages than locals. When there's no future relationship, no social consequence, no one who'll bring it up at dinner next week, the barriers drop.
AI takes this to its logical endpoint. An AI isn't just a stranger on a train. It's a stranger who will never get off the train, never tell anyone what you said, and never think differently of you because of it.
You can't burden an AI
With a person (even a therapist), every disclosure carries a cost. They might worry about you. They might judge you. They might treat you differently. They might tell you that what you're feeling is "not that bad" or, worse, confirm that it's as bad as you feared.
Research on why people hide things from their therapists paints a striking picture:
| Finding | Share |
|---|---|
| Hide symptoms from their own therapist | 54% |
| Avoid treatment entirely due to stigma (estimated) | 50-60% |
| Withhold secrets because of shame | 75% |
Over half of people in therapy conceal things from their own therapist. Three out of four who withhold secrets do it because of shame. And an estimated 50-60% of people who would benefit from treatment never seek it at all. Stigma stops them before they even start.
An AI eliminates the social cost entirely. You can't burden it. You can't disappoint it. You can't make it worry about you at 3am. That's not a limitation. For many people, it's exactly what makes honest self-expression possible for the first time.
You can be contradictory
With people, you curate. You present a consistent version of yourself because that's what relationships require. You said you loved your job last week, so you can't say you want to quit today without a whole conversation about it.
With AI, you can say "I love my job" on Monday and "I want to quit" on Tuesday. Both can be true. The AI won't call you on it, won't worry about your "mixed signals," won't try to reconcile your contradictions. It just listens.
This matters because emotional processing is inherently contradictory. You can be grateful for a relationship and exhausted by it. You can love your kids and resent the loss of your independence. These coexisting truths are hard to express to someone who expects consistency.
When you can't name what you feel
Some people don't avoid talking about their feelings because they're afraid. They avoid it because they genuinely don't have the words.
Psychologist Ronald Levant described "normative male alexithymia," a learned difficulty in identifying and articulating emotions. It's especially common in men, but not exclusive to them. When someone asks "how are you feeling?" and you honestly don't know, a human conversation is terrifying. What do you say?
An AI that asks "is it more like frustration or disappointment?" gives you vocabulary. It narrows the field. It turns a paralyzing open question into a manageable choice. Over time, this builds emotional intelligence: the capacity to identify, process, and communicate what you feel.
The ELIZA effect, sixty years later
In 1966, MIT professor Joseph Weizenbaum built ELIZA, a chatbot so simple it worked by reflecting users' words back as questions. It had no understanding, no memory, no intelligence whatsoever. And people became emotionally invested in it anyway. Weizenbaum's own secretary asked him to leave the room during her sessions.
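How simple was it? Simple enough that the core trick fits in a few lines. Here's a toy sketch in Python of ELIZA-style reflection (an illustration of the idea, not Weizenbaum's actual script, which used a larger set of hand-written pattern rules):

```python
# Toy ELIZA-style reflection (illustrative only, not Weizenbaum's
# original script): swap first- and second-person words, then bounce
# the user's statement back as a question.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my",
}

def reflect(statement: str) -> str:
    words = statement.lower().rstrip(".!?").split()
    swapped = [REFLECTIONS.get(word, word) for word in words]
    return "Why do you say " + " ".join(swapped) + "?"

print(reflect("I am tired of my job."))
# -> Why do you say you are tired of your job?
```

A handful of string substitutions, no understanding anywhere in sight, and people confided in it anyway.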
This tendency to attribute understanding to machines (now called the ELIZA effect) hasn't diminished with better technology. It's intensified. Modern AI produces such sophisticated communication that researchers describe it as "anthropomorphic seduction." We're not imagining connection anymore. The simulation is just that good.
This is powerful and it's also a responsibility. The feeling of being understood is real, even when the understanding is simulated. Products built on this phenomenon have a duty to use it for genuine wellbeing, not engagement maximization.
The honest limitation
AI can make it easier to express what you're feeling. It can build your emotional vocabulary. It can help you process contradictions and sit with difficult truths. But it can't do the thing that makes human connection genuinely healing: be another person who chooses to know you and shows up anyway.
Being truly known by someone — with all your contradictions, your shame, your 2am confessions — and having them still be there in the morning. That's irreplaceable.
The bridge, not the destination
The most promising research in this space found that when AI was used to help humans connect with each other, empathy in human-to-human conversations increased by nearly 20%. AI didn't replace connection. It made it better.
This is how we think about Onsen. The goal isn't to be the person you talk to forever. The goal is to be the place where you build the self-awareness, emotional vocabulary, and confidence that makes talking to real people feel possible.
Journal about what's bothering you until you understand it well enough to explain it to your partner. Practice reframing your thoughts until the catastrophic version stops feeling like the only one. Check in with your moods until you can answer "how are you feeling?" with something more honest than "fine."
Then take what you've learned and bring it to the people who matter.
Download Onsen and start a conversation. You might find it easier than you expect, and that ease is the first step toward something harder and more important.
Sources
- Lucas et al., 2014 — "It's only a computer": belief framing and self-disclosure
- Lucas et al., 2017 — Virtual humans and PTSD disclosure in veterans
- Rubin, 1975 — Self-disclosure to strangers in airport departure lounges
- Hook & Andrews, 2005 — Non-disclosure in therapy: 54% of patients conceal from therapists
- Levant, 1992 — Normative male alexithymia
- Qualter et al., 2019 — Emotional intelligence predicts loneliness resolution
- Weizenbaum, 1966 — ELIZA: natural language communication between man and machine
- Peter, Riemer & West, 2025 — "Anthropomorphic seduction" of conversational AI
- Sharma et al., 2023 — AI boosted human-to-human empathy by 19.6%
- Corrigan, 2004 — Stigma and treatment avoidance: an estimated 50-60% of people who would benefit never seek care


