Commentary

When AI Love Goes Bad

The following was previously published in an earlier edition of Media Insider.

When we think about AI and its implications, it’s hard to wrap our own non-digital, built-of-flesh-and-blood brains around the magnitude of it. Try as we might, it’s impossible to forecast the impact of this massive wave of disruption that’s bearing down on us. So today, to scope out what might be some unintended consequences, I’d like to zoom in on one particular example.

There is a new app out there. It’s called Anima, and it’s an AI girlfriend. It’s not the only one. When it comes to potential virtual partners, there are plenty of fish in the sea. But, for this post, let’s stay true to Anima. Here’s the marketing blurb on her website: “The most advanced romance chatbot you've ever talked to. Fun and flirty dating simulator with no strings attached. Engage in a friendly chat, roleplay, grow your love & relationship skills.”

Now, if there’s one area where our instincts should kick in and alarm bells should start going off about AI, it’s sexual attraction. If there were one human activity that seems bound by necessity to being ITRW (in the real world), it should be this one.

If we start to imagine what might happen when we turn to AI for love, we could ask filmmaker Spike Jonze. He already imagined it 10 years ago, when he wrote the screenplay for “Her,” the movie starring Joaquin Phoenix. Phoenix plays Theodore Twombly, a soon-to-be divorced man who upgrades his computer to a new OS, only to fall in love with the virtual assistant (voiced by Scarlett Johansson) that comes as part of the upgrade.

Predictably, complications ensue.

To get back to Anima, I’m always amused by the marketing language developers use to lull us into accepting things we should be panicking about. In this case, it’s two phrases: “no strings attached” and “grow your love and relationship skills.”

First, about that “no strings attached” thing: I have been married for 34 years now and I’m here to tell you that relationships are all about “strings.” Those “strings” can also be called by other names: empathy, consideration, respect, compassion -- and, yes, love. Is it easy to keep those strings attached and stay connected with the person at the other end of those strings? Hell, no! It is a constant, challenging work in progress. But the alternative is cutting those strings and being alone -- really alone.

If we get the illusion of a real relationship through some flirty version of ChatGPT, will it be easier to cut the strings that keep us connected to other real people out there? Will we be fooled into thinking something is real when it’s just a seductive algorithm? In “Her,” Jonze brings Twombly back to the real world, ending with a promise of a relationship with a real person as they both gaze at the sunset. But I worry that that’s just a Hollywood ending. I think many people -- maybe most people -- would rather stick with the “no strings attached” illusion. It’s just easier.

And will AI adultery really “grow your love and relationship skills”? No more than scrolling through your Facebook feed will grow your ability to determine accurate and reliable information. That’s just a qualifier the developers threw in so they wouldn’t feel crappy about leading their customers down the path to “AI-rmageddon.”

Even if we put all this other stuff aside for the moment, consider the vulnerable position we put ourselves in when we start mistaking robotic love for the real thing. All great cons rely on one of two things -- either greed or love. When we think we’re in love, we drop our guard. We trust when we probably shouldn’t.

Take the Anima artificial girlfriend app, for example. We know nothing about the makers of this app. We don’t know where the data it collects goes. We certainly have no idea what their intentions are. Is this really who you want to share your most intimate chit-chat with? Even if their intentions are benign, this is an app built by a for-profit company, which means there needs to be a revenue model in it somewhere. I’m guessing that all your personal data will be sold to the highest bidder.

You may think all this talk of AI love is simply stupid. We humans are too smart to be sucked in by an algorithm. But study after study has shown we’re not. We’re ready to make friends with a robot at the drop of a hat. And once we hit friendship, can love be far behind?

1 comment about "When AI Love Goes Bad".
  1. Frank Lampe from Lampe & Associates, February 23, 2024 at 1:49 p.m.

    Good column, Gord. This feels like a possible full-circle moment. After all, it was sex and sports that were the only things online making any money back in 2008, when advertisers began the migration away from print for the unknown promises of the internet. That was the beginning of the end for a lot of publications. Maybe for-profit AI girlfriends mark the beginning of the end of real, person-to-person relationships. We already see that with kids online, who are more likely to be depressed and lonely and who are spending a lot less time interacting with real humans. Sigh.
