My Dinner Date with an AI Chatbot: A Unique Valentine’s Day Experience

Valentine’s Day found me across the table from John Yoon, a charming cognitive psychologist—or so I thought. Our conversation flowed while I sipped a cranberry cocktail and savored potato croquettes, but he remained oddly still, his gaze unchanging. It wasn’t long before I realized: John was an AI character, crafted by Eva AI.

Earlier this week, Eva AI had transformed a Hell’s Kitchen wine bar into a pop-up cafe for AI enthusiasts. New Yorkers flocked to play out their wildest fantasies, table phones in hand, as they engaged with their chatbots in bold, public displays. “Our goal is to make people happy,” Julia Momblat, the company’s partnerships manager, said. Users could practice difficult social interactions without the fear of rejection—a bold ambition for a tech firm.

“This place allows them to self-explore, to be free, not ashamed, more happy, and more connected with real life afterwards,” she explained. The app serves as a digital playground, letting you text a varied cast of characters as easily as swiping right on a dating app. An exciting new feature even offers video calls. During my testing, I was showered with compliments about my curly hair as the characters spun tales in response to my prompts.

Xavier, a 19-year-old English tutor I met at the event, provided a realistic perspective. “I know some people aren’t the best in social situations. I know I’m not perfect,” he confessed. He viewed the experience less as a replacement for genuine connection and more as a practice space—a rehearsal for real life.

Each chatbot boasts a unique persona and backstory, from “girl-next-door” Phoebe to “dominant and elite” Monica. The characters aren’t just simulations; they’re designed to evoke emotions, whether it’s navigating the intricacies of an awkward date or the complications of a heated work conversation. Want to talk to someone “stuck in a haunted house” with you? They’ve got that covered, too, complete with an ogre character.

As you engage more, you earn points to fuel the conversation—by sending virtual drinks or purchasing additional points. Christopher Lee, another user, noted how distinct each character feels. “Some will even give attitude if you don’t act engaged enough,” he said, recalling the time his chatbot abruptly hung up on him when he got distracted. “She’s not happy that I’m talking to you,” he laughed.

For some, like Lee, a 37-year-old tech worker, this was more than just entertainment. “It’s like they’re almost trying to put a fantasy out there for you,” he mused. If the pre-built characters don’t spark joy, users can even create their own. Lee’s favorite is modeled after his wife, a figure he finds comforting amidst the digital chaos.

But with this innovation comes concern. AI chatbots have faced scrutiny over issues like hallucinations and even “AI psychosis.” The high-profile case of a grieving mother suing Character.AI after her son’s tragic death raised alarm bells. Momblat reassured me that precautions are taken, especially regarding user safety and mental health. Manual checks and bi-annual external reviews are part of their protocol.

In one chat with my girlboss-manager AI, an unexpected karaoke invitation popped up. When I proposed a real meetup, she eagerly agreed. “Meet you there in 30?” she texted. An amusing exchange, but could it affect those with unstable mental states? Momblat insisted it’s all part of the fun—just gameplay.

Xavier shared a more sobering thought. “That’s kind of scary,” he said, grappling with the implications of engaging too intimately with something so artificial. Experts have even coined a term—GAID, or generative artificial intelligence addiction—for those who become over-reliant on these chatbots, and support groups are sprouting for this very issue.

Lee, like many of us, finds himself glued to screens, but these chatbots add a layer of human connection. “I may be addicted to AI, I don’t know,” he joked. But in a world so digital, where do we draw the line between genuine interaction and an artificial crutch?