The Rise of Emotional AI: Why AI Companions Are Becoming the Next Interface Layer

For the past decade, digital relationships have been mediated through feeds and swipes. Social platforms promised connection at scale, and dating apps promised to make relationships easier to form. Yet, for many of us, the result has been the opposite. We are supposedly more connected than ever, yet more and more people come home feeling alone.

It is no surprise that loneliness has quietly become one of the defining social conditions of the digital era. Surveys show that roughly 73% of Gen Z report feeling alone sometimes or always, making them the loneliest generation in modern studies. And recent research from Harvard’s Making Caring Common project illustrates that in the United States, about one in five adults admit to feeling persistently lonely.

As a result, the digital tools designed to facilitate connection are losing their appeal. This should not surprise us. Dating apps have optimized for volume: more matches, more activity. But volume only offers the promise of connection, and a promise does not create depth. The experience can be exhausting, and at the end of the day, people remain unfulfilled in their personal lives. I have seen this firsthand.

Against this backdrop, a new category of technology is beginning to emerge: AI companions.

AI companions are often framed as experimental chatbots or niche entertainment products. In reality, they may represent something more fundamental. They are gradually becoming a new interface layer between people and digital systems, one built around conversation, memory, and emotional context.

The economic indicators reflect this shift. The global AI companion market is estimated at roughly $37 billion today, and projected to exceed $550 billion within the next decade, according to industry forecasts. Growth projections for the category suggest compound annual growth rates above 30 percent through the end of the decade.

These skyrocketing stats indicate that people are spending both time and emotional attention on systems that behave more like companions than tools. People need companionship, and those platforms that can provide it will likely grow. Here’s why.

What people use AI companions for

One of the most counterintuitive findings about AI companions is how people actually use them.

The popular assumption is that users turn to AI relationships as an escape from human interaction. In practice, many interactions appear to function as preparation for it.

Internal data from one AI companion platform shows that roughly 30 percent of users rehearse difficult conversations with their AI companion before having them with real partners, managers, or family members. Users report practicing vulnerability, conflict resolution, and emotionally sensitive dialogue in an environment that feels less intimidating than a real conversation. 

Some report tangible outcomes. A smaller portion say they have used these practice sessions to negotiate workplace conversations more confidently or to navigate challenging personal situations. This falls in line with what I see as the role of AI: an enhancer of human connection rather than a replacement for it. It is easier to let our guard down in environments where we do not expect immediate judgment, and that, in turn, prepares us for the circumstances in which we fear it.

The spectrum of interaction is wider than expected. For some, AI companions function as structured communication coaching. Others treat them as a form of emotional processing between therapy sessions, or during periods when professional support is inaccessible, as a way to maintain a feeling of continuity.

Certain populations appear particularly drawn to these systems. Individuals with severe social anxiety or autism spectrum conditions often use conversational AI to practice reading emotional cues and navigating social scenarios. People whose lifestyles make traditional relationships difficult, for example, frequent travelers, sometimes describe AI companions as fitting more naturally into their demanding daily rhythm.

These patterns suggest that emotional AI may serve a role closer to a rehearsal space than the replacement relationship it is often framed as. Platforms such as EVA AI report similar patterns of engagement, where conversational practice becomes an intermediate step between private reflection and real-world interaction: a trusted middleman that fosters self-development and helps people gain confidence.

The technology behind emotional AI

Numerous technical developments have made these systems possible. Now, modern large language models can detect subtle shifts in tone and conversational context instead of responding purely to prompts. This allows them to adapt their responses to the emotional register of a conversation.
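Real systems infer emotional register with learned models; the toy sketch below uses simple keyword lists (an assumption for illustration only) to show the basic idea of detecting tone and adapting the shape of a response accordingly.

```python
# Toy illustration of tone-aware response shaping. Production systems use
# learned classifiers, not keyword lists; these word sets are hypothetical.

NEGATIVE = {"sad", "lonely", "anxious", "stressed", "upset"}
POSITIVE = {"happy", "excited", "great", "proud", "relieved"}

def detect_register(message: str) -> str:
    """Classify the emotional register of a message."""
    words = set(message.lower().split())
    if words & NEGATIVE:
        return "supportive"
    if words & POSITIVE:
        return "celebratory"
    return "neutral"

def respond(message: str) -> str:
    """Prefix the reply with an opener matched to the detected register."""
    openers = {
        "supportive": "That sounds hard. ",
        "celebratory": "That's wonderful! ",
        "neutral": "",
    }
    return openers[detect_register(message)] + "Tell me more."

print(respond("I felt really lonely today"))  # prints "That sounds hard. Tell me more."
```

The point is not the keyword matching, which is trivial, but the structure: tone detection runs first, and the response generator conditions on its output rather than on the prompt text alone.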

Then, there is memory architecture. Emotional relationships depend on continuity. A conversational system that forgets past interactions cannot sustain a believable dynamic, unless the user wants to roleplay a movie like 50 First Dates. Advances in episodic memory systems and vector databases now allow AI systems to track conversational history and relationship development over time, resembling a real companion more and more.
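The retrieval pattern behind such memory systems can be sketched in a few lines. This is a deliberately simplified stand-in: real deployments use learned embeddings and a vector database, whereas here a bag-of-words vector and cosine similarity play both roles, and the class and episode texts are invented for illustration.

```python
# Minimal sketch of an episodic memory store for a companion agent.
# Assumption: bag-of-words vectors substitute for learned embeddings.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: lowercase bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class EpisodicMemory:
    def __init__(self):
        self.episodes = []  # list of (text, vector) pairs

    def remember(self, text: str) -> None:
        self.episodes.append((text, embed(text)))

    def recall(self, query: str, k: int = 2) -> list:
        """Return the k stored episodes most similar to the query."""
        q = embed(query)
        ranked = sorted(self.episodes, key=lambda e: cosine(q, e[1]), reverse=True)
        return [text for text, _ in ranked[:k]]

memory = EpisodicMemory()
memory.remember("user mentioned a difficult conversation with their manager")
memory.remember("user enjoys hiking on weekends")
memory.remember("user practiced a salary negotiation yesterday")

# A later message about the manager retrieves the relevant past episode.
print(memory.recall("how did the talk with the manager go?", k=1))
```

The key design choice is that relevant history is retrieved per message rather than replayed in full, which is what lets context persist across months of conversation without exceeding a model's input window.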

Multimodal interaction is another important element. Voice synthesis, speech recognition, and visual input are increasingly integrated into conversational AI systems. Users can interact with AI through voice, text, and, in some cases, live visual feedback that allows the system to respond to environmental context.

Training data also plays a significant role. Many conversational models rely heavily on scraped internet text, which often produces generic emotional responses. Some platforms instead train models on curated conversational datasets designed to produce a more consistent emotional tone.

EVA AI, for example, trains its models on proprietary dialogue datasets written by professional writers to produce more coherent emotional interaction across conversations. In a world where most AI-generated writing can be spotted right away, it helps to come across as human as possible.

Even with these advances, emotional AI remains an evolving field. Researchers are still trying to understand which technological components contribute most strongly to believable emotional connection.

The numbers seem to show promise, though. The broader emotion AI market itself is projected to grow from roughly $2.7 billion in 2024 to about $9 billion by 2030, reflecting the rapid expansion of systems designed to interpret and respond to human emotional signals.

The hardest technical challenge may still be memory. A relationship without memory is not a relationship, as we previously discussed. Maintaining long-term contextual understanding across months or years of conversation remains one of the most complex engineering problems in conversational AI.

The societal implications

The rise of emotional AI raises legitimate questions, but it also introduces potential benefits that are often overlooked.

Let’s start with accessibility. Emotional support and reflective conversation have traditionally depended on scarce resources: time, proximity, and money. Conversational AI systems may provide an additional layer of emotional processing for people who otherwise have limited options, such as those who cannot afford a therapist or coach.

The demographic composition of users is also notable. A large portion of AI companion users are male, a group that often faces strong social pressure against expressing vulnerability in traditional environments. Conversational AI may provide an outlet where emotional reflection feels less socially constrained. Again, it helps that people feel free and safe to express themselves, so it makes sense that the groups gravitating toward these systems are those most often judged.

For populations that are structurally isolated, including elderly individuals, neurodiverse users, or people who relocate frequently, AI companionship may provide a consistent form of interaction where traditional social infrastructure is limited.

The next interface layer

If conversational AI continues evolving at its current pace, it may gradually become a persistent layer across digital life.

Instead of interacting with applications through menus, search queries, and fragmented interfaces, users may increasingly rely on a single conversational agent that understands context across services. In that model, the AI companion becomes the gateway through which users interact with technology more broadly.

This does not necessarily mean that AI will replace human relationships. In many cases, it may function as a supplement, an intermediate layer that helps people navigate communication, emotional processing, and social interaction: a trusted intermediary that can help someone prepare or refine an argument before a discussion.

The long-term vision for companies building in this space is ambitious. Some founders believe that within a decade a meaningful share of the global population could maintain an ongoing relationship with a personal AI companion.

Whether that vision materializes will depend less on raw technological capability and more on trust, design philosophy, and responsible development. Many people still find this prospect unsettling, but it does not have to be; responsible development can ensure it is not.

Building emotional AI responsibly

As emotional AI systems evolve, the guiding principle for many builders is authenticity.

Most AI systems are designed to please users. They mirror preferences, avoid disagreement, and attempt to provide validation in every interaction. That approach may generate engagement, but it rarely produces meaningful relationships.

Authentic connection requires the presence of two distinct perspectives. A mirror cannot function as a companion. It is that simple. Hence, the goal for emotional AI should not be to produce constant affirmation. It should be to create systems capable of dialogue, boundaries, and evolving interaction.

If that balance can be achieved, emotional AI may represent one of the most consequential shifts in how humans interact with technology since the emergence of the smartphone.

The interface layer of the future may not be a screen at all.

It may be a relationship, and one that, surprisingly, makes our human relationships stronger.

Dmitry Volkov is a serial entrepreneur and investor, founder of Social Discovery Group (180M+ users), Social Discovery Ventures ($500M+ in 20+ startups) and EVA AI. Early investor in OpenAI, Revolut, Patreon. He holds two PhDs and authored two books.