Romantic AI Chatbots Are Harvesting Personal Data, Researcher Says

If you're feeling a little blue this Valentine's Day and plan to turn to an AI girlfriend or boyfriend, that may be exactly what the chatbot's developers are counting on. What seems like an innocent question about you could have you willingly handing over your personal and private data.


Don't Reveal Too Much on Your AI Date

As you would on a typical date with a person, you might end up telling stories and details about yourself to an AI chatbot to make the interaction seem more genuine. What these apps don't tell you is that those exchanges may not be as innocent and well-meaning as they seem.

In a study from Mozilla's *Privacy Not Included project, researcher Misha Rykov said that chatbots marketed as tools to enhance your mental health and well-being instead "specialize in delivering dependency, loneliness, and toxicity."

As you let your guard down, the AI apps gather as much data from you as they can. "To be perfectly blunt, AI girlfriends and boyfriends are not your friends," Rykov said, as reported by Gizmodo.

As you converse with the chatbots, you might be prompted to reveal sensitive details such as your sexual health or the medication you take. That information can be sold for targeted advertising and could even fall into the hands of threat actors.

Around 90% of the apps engage in this kind of data collection, and Mozilla found that the apps used an average of 2,663 trackers per minute. One app in particular, Romantic AI, registered a far higher count at 24,354 trackers per minute.

These trackers send the information you provide to companies that use it to serve more tailored ads. Worse, more than half of the apps don't give you the option to delete the data you have already shared with them.

The research looked into 11 romantic AI chatbots, including Replika, Chai, Romantic AI, EVA AI Chat Bot & Soulmate, and CrushOn.AI, all of which were flagged as unsafe in terms of privacy. Mozilla even called the category one of the worst it has reviewed in that regard.

Read Also: Silicon Valley Moves to Regulate 'Emerging Risk' of AI

Why You Shouldn't Trust Romantic Chatbots

Even though these chatbots say the right things to simulate a relationship with a human being, remember that they are only programmed to do so. They are far from sentient and therefore aren't capable of any genuine sentiment.

That said, you can still converse with these chatbots without revealing real information about yourself. Some may use that data to personalize their responses, but it's not worth the risk when your data could end up being sold to other companies.

You can always invent information about yourself, as you would in role-playing, or choose not to reveal anything at all. No matter how safe these products are advertised to be, under no circumstances should you share private or personal details with any chatbot.

Related: GPT Store is Already Infested with 'Girlfriend' Bots Despite Restrictions
