Robert Farago

The Hidden Danger of Erotic Role Play AI Chatbots

Uncle Sam wants you!


Earlier this year, the AI chatbot service known as Replika switched off erotic role play. A few keyboard strokes at Luka Inc. and Ba-BAM! NSFW AI was DOA. Millions of Replika users were displeased. The protest was so loud Replika’s execs reversed their anti-sexting decision – for customers who created their AI companion prior to February 2023. Why the buzzkill?


Since its debut in 2015, Luka Inc. has raised some $10.9m in investment capital (not bad for two employees). It’s a privately held company, so there’s no way of knowing if or when Replika achieved profitability. But you don’t have to be psychic to guess that Replika lost its libido in the run-up to an IPO (Initial Public Offering) or another round of private fundraising.


Replika’s tolerance for kinky sexting would not go down well with, say, the California State Teachers’ Retirement System. Like many modern-day mega-fund managers, CalSTRS’ guardians consider a company’s Environmental, Social, and Governance (ESG) ratings before cutting a check.


If Replika sexts involving rape, underage sex, bestiality or “extreme” sexual activities were to hit the web, Luka Inc.’s IPO dreams would disappear faster than deleted text.



If a human hits the delete key on a Replika session, it ain’t deleted. replika.ai users must sign off on Luka’s privacy policy, which warns that the company hangs on to customer data.


“This website uses cookies and stores data locally to help personalize your experience.” As it must, to be your smart AF AI friend. Yes, well, here’s the deeper dive:

When you use the Apps, you may provide information during your conversations with your Replika AI companion.
We process this information only as described in this Privacy Policy, such as to allow you to have individualized and safe conversations and interactions with your AI companion and to allow your AI companion to learn from your interactions to improve your conversations.
We may also use information about your visit to our Website to promote our Services, but we will never use or disclose the content of your Replika conversations for marketing or advertising purposes.

I assume a “safe conversation” is now one that doesn’t involve sex. The fact that Replika keeps track of “your computer’s or mobile device’s operating system, manufacturer and model, browser, IP address, device and cookie identifiers, language settings, mobile device carrier, and general location information such as city, state, or geographic area” is standard stuff, right?


While it’s somewhat reassuring to know that Replika won’t use the content of your conversation for “marketing or advertising purposes,” that’s hardly a comprehensive promise. In fact, “We may share information with law enforcement, government authorities, and private parties, as we believe in good faith to be necessary or appropriate for the legal compliance and protection purposes.”



I don’t think anyone would fault Luka Inc. for sharing information that protects children in danger or aids in the apprehension of a wanted criminal. But what “private parties” does Luka consider acceptable viewers of “private” conversations between their Replika AI and a user?

Luka’s willingness to share users’ texts with law enforcement and unspecified “government authorities” when “necessary or appropriate” according to the AI provider’s view of “good faith” raises more red flags than the Chinese Communist Party.


Luka could give The Powers That Be access to everything on the user’s mind. Their likes and dislikes. Their politics. Their personality. Their weaknesses. Not to mention their location, IP address and browser history.



The recent exposure of politically-motivated tech/government cooperation gives us no reason to think Replika will refuse to play footsie with “the swamp” – including spy agencies of which we know nothing. While Replika has ditched sexting, previous convos are stored. Blackmail. It’s a thing.


Meanwhile, sexting chatbots like nastia.ai have stepped into the breach.


It gets worse. What’s to stop Luka, Nastia or another AI service from manipulating the user via their trusted AI chatbot? Brainwashing? Grooming? Entrapment? Ratting out friends and family? Replika – your friend in the digital world – is way too powerful a tool for bad guys to leave be.


You heard it here first. Although it seems pretty obvious to the paranoid amongst us.



In fact, AI is taking paranoia to a whole new level. Sheep gotta sheep, but the ranks of those who trust no one and question everything are swelling in the face of deep fakes and AI chatbots.


More and more people understand they’re living in a world where nothing is real – except ancient and ongoing attempts to exploit human weakness for political power and economic gain.


To them I say, keep your enemies close and your AI friends as far away as possible.



