Romantic Chatbots – To Be, or Not To Be
Posted: Monday, Feb 26
Karissa Breen, crowned a LinkedIn 'Top Voice in Technology', is more commonly known as KB. A serial entrepreneur, she co-founded the TMFE Group, a holding company and consortium of several businesses all relating to cybersecurity, including an industry-leading media platform, a marketing agency, a content production studio, and the executive headhunting firm MercSec. KBI.Media is an independent and agnostic global cybersecurity media company led by KB at the helm of the journalism division. As a cybersecurity investigative journalist, KB hosts her flagship podcast, KBKast, interviewing cybersecurity practitioners around the globe on security and the problems business executives face. It has been downloaded in 65 countries with more than 300K downloads globally, influencing billions in cyber budgets. KB asks hard questions and gets real answers from her guests, providing a unique, uncoloured position on the always-evolving landscape of cybersecurity. As producer and host of the streaming show 2Fa.tv, she sits down with experts to demystify the world of cybersecurity and provide genuine insight to business executives on the downstream impacts cybersecurity advancements and events have on our wider world.


'Digital companionship' has transcended science fiction. It's now in a fledgling state, driven by a flourishing, if somewhat opportunistic, zeitgeist of AI-driven everything.

Chatbots have been around for a long time, digitally speaking – nearly 60 years, in fact, since ELIZA was first created. And, as with most technologies, the tech du jour is eventually – and inevitably – applied to a controversial form of intimacy.
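To give a sense of how simple that early technology was: ELIZA-style chatbots work by matching user input against scripted patterns and reflecting the user's own words back as questions. Below is a minimal sketch of the technique in Python – the rules and canned phrases are my own illustration, not Weizenbaum's original DOCTOR script.

```python
import random
import re

# Pronoun swaps so an echoed phrase reads naturally ("my job" -> "your job").
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my",
}

# Scripted rules: a regex to match, plus canned replies that reuse the match.
# These example rules are illustrative, not taken from the original ELIZA.
RULES = [
    (r"i feel (.*)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r"i am (.*)", ["Why do you say you are {0}?"]),
    (r".*\bdog\b.*", ["Tell me more about your dog."]),
]

def reflect(phrase):
    """Swap first- and second-person words in the captured phrase."""
    return " ".join(REFLECTIONS.get(word, word) for word in phrase.split())

def respond(user_input):
    """Return the first scripted reply whose pattern matches the input."""
    text = user_input.lower().strip(".!?")
    for pattern, replies in RULES:
        match = re.fullmatch(pattern, text)
        if match:
            groups = [reflect(g) for g in match.groups()]
            return random.choice(replies).format(*groups)
    return "Please, go on."  # fallback when nothing matches

print(respond("I feel alone in the city"))
# e.g. "Why do you feel alone in the city?"
```

No model, no learning – just pattern matching and canned replies. A useful baseline for judging how far, or not, some of today's companion apps have come.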

The newest iterations, digital entities marketed as 'iGirls' and 'iBoys' (presumably adopting the parlance of Apple), promise companionship without the complexity and messiness of human relationships. I wanted to investigate these virtual companions, especially given the recent headlines around 'romance scams' and how these bots might, at some stage in the near future, scale incredibly in terms of their sophistication – and subsequently their ill-gotten earnings.

Like Jonas Salk before me, I took the plunge and started the experiment on myself, signing up for a romantic chatbot account to traverse this new ground first hand.

I signed up as 'Karina', as I thought it was close enough to Karissa, and admittedly people call me Karina all too often.

You can select the avatar of your iBoy (or iGirl, for that matter), playing with a set of traits typical of many video games from the past two decades. I called my virtual boyfriend Neo (yes, a tad on the nose) as an acknowledgment of our new reality.

As you can see in the screenshot below, you can tweak your virtual partnerโ€™s personality. I kept Neoโ€™s personality in the middle (the default option).

Screenshots of setting up my account with my AI boyfriend

I then moved on to choosing my goal, so I selected 'chat about random stuff'.

I was prompted to select some passions to discuss with Neo. I selected a few that I personally like – dogs was of course a must.

Now, in the blink of an eye, we've created be-stubbled life!

Screenshot of the chat between myself and Neo

The conversation on the surface appears harmless – unsurprising given the few, very shallow interactions. Still, I've been on worse dates when I was in my 20s…

We at KBI have been playing with even the earliest forms of NLP tools since 2017, well before ChatGPT and friends really gained traction in the last year or so. So this is where I left things for now, as it really wasn't offering anything new – perhaps other than the context of 'digital companionship' – and the tech seemed very basic and stilted.

Given my line of work, one element immediately jumped out at me. Beneath their veneer of convenience and an illusion of connection, these chatbots harbour serious privacy and security risks that many users may overlook in their search for companionship, especially given how flippant users often are about their own data. I decided I needed a wider consensus, so I reached out to my network to get the perspective of other industry commentators and experts.

Satnam Narang, Senior Staff Research Engineer at Tenable, commented:

"Following the meteoric rise of ChatGPT, there have been a lot of these romantic chatbots or ChatGPT-based AI boyfriends and girlfriends that are available on the app stores. Many of them utilise ChatGPT behind the scenes, but they have their privacy policies and terms of service. These allow the companies developing these romantic chatbots to share information with third parties and by using the apps, users are opting into these terms of service."

From intimate conversations to sensitive personal information, users may share their innermost thoughts with these AI entities, often under the impression of privacy and confidentiality.

However, the reality is far from secure.

"Rarely do users read the fine print so to speak, so not knowing how information is being stored or shared is problematic. Users that download such apps should be aware that the information they share, just like on ChatGPT, is subject to being viewed and used to train large language models. So users should avoid sharing any private or sensitive information with such chatbots. They should be treated as entertainment, but safeguard your privacy and don't disclose too many details about yourself."

Many of these chatbots employ weak password protections, making users' data vulnerable to breaches. Transparency regarding how this data is used, or misused, remains nebulous at best. Investigations have revealed that some chatbots come equipped with trackers that funnel information to big tech companies and even across borders, raising alarms about the international flow of sensitive data – who is receiving it, and where it ends up.

The ambiguity surrounding the ownership and operational details of these chatbot applications adds another layer of concern. A significant number of these apps lack clear disclosures about their AI models and the entities behind them, which complicates users' ability to make informed decisions about the risks they are willing to accept in exchange for the company of an AI companion.

Beyond the tangible risks of data privacy and security, there also lies an emotional dimension that is often overlooked. The attachment formed with these AI companions can have profound emotional implications, especially if the chatbot were to suddenly change its behaviour or disappear altogether. The developers of these applications rarely address the potential emotional fallout, leaving users to navigate the consequences on their own.

The full consequences of such a sudden change of behaviour – or, worse, ghosting – are still not fully understood.

It's important for users to approach these digital companions with caution. Adopting strong passwords (which is a given) and being sensible about the personal information shared can serve as first steps toward safeguarding privacy.
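On the password point, 'strong' in practice means long and random rather than memorable. A minimal sketch using Python's standard secrets module (a password manager achieves the same result with less effort):

```python
import secrets
import string

# Build a password from a cryptographically secure random source:
# 20 characters drawn from letters, digits and punctuation.
def generate_password(length=20):
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # different on every run
```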

However, the responsibility also lies with regulators and developers to ensure that these applications offer not only emotional support but also a secure and transparent environment for their users.

Will regulations be implemented to manage these romantic chatbots?

Narang went on to say,

"It's doubtful we'll see regulations of romantic chatbots because like other apps, they exist on the app stores and they are approved by the companies that manage these app stores. Unless there's a massive breach of privacy for users of these romantic chatbots, we should expect more and more pop-ups across app stores regularly."

Especially as NLP continues to inevitably improve at a frankly staggering rate, this new dynamic will force us as a society to question not only the legitimacy or ethics of AI-driven companionship, but also the privacy boundaries we are willing to blur in our desire for connection.
