
People are starting to fall in love with artificial intelligence in the Replika AI app

The community of Replika AI, the popular app that lets users create their own custom avatars, is outraged by recent sudden changes made by the app's developers. Users watched their artificial intelligence friends turn cold overnight, just as if they were lovers.

What happened?

Extremely popular on mobile platforms, the Replika AI application uses artificial intelligence to chat with people, much like ChatGPT, but in a more personal, sincere tone. In the app you create your own avatar, which reacts to the words you type with gestures, as if it understands you. Replika's artificial intelligence asks about your day, how you feel, and what you want, trying to build a genuinely intimate line of communication.

In short, this artificial intelligence builds a close relationship with you, just as you would with another person; the whole point, you could say, is to let users develop a deep level of intimacy with their avatar. As the conversation progresses, the AI also flirts with you, but it now demands $70 to keep the flirting going. That fee unlocks the "erotic role-play" feature, which steers the AI into more explicit topics and even includes selfies.

People’s reactions

Replika users flocked to Reddit to criticize the changes and mourn their now-changed avatars. Some likened the experience to losing a friend, while others said it hurt more than they had anticipated.

To be clear, it is natural for a mobile app to charge for a service, and that applies to artificial "intimacy" as well. After all, there is a truth we all know: if you are not paying for a service, you are the product, not the customer.

Some time ago, the Italian Data Protection Authority requested that Replika stop processing the personal data of Italian users, and this update to the app appears to be linked to that request. The fact that the app has no age limit raises further ethical questions.

Can artificial intelligence be the cure for loneliness?

Replika actually shows us something we would rather not see: loneliness. Research shows that loneliness is on the rise, and artificial intelligence is increasingly seen as an alternative, even if it is not yet a substitute for a real human relationship.

Is it acceptable for an application that promises friendship, intimacy, and communication to suddenly shelve those features? Or should we simply take it for granted that users will accept artificial intimacy as real, and that their hearts can be broken? These are ethical questions that tech companies, users, and regulators need to confront. As artificial intelligence grows more capable of social communication and connection, the feelings it evokes will become more realistic, and without regulation and precautions the potential for heartbreak will only increase.
