Techno Blender

One Day We’ll All Have AI Friends if Humanity Can Behave Itself



In seven years we’ll all have an artificial friend, if Eugenia Kuyda’s vision pans out. And if the ugly side of human nature doesn’t quash it first.

Kuyda has some insight into this prediction as the chief executive officer of Replika, a startup that develops chatbots with generative AI capabilities. The app draws millions of dollars a month in subscription revenue from users, many of whom attest to being in love with their disembodied companion.

“Instead of having an iPhone, we’ll all have an AI friend,” Kuyda said. “By 2030, it will be ubiquitous.”

In this week’s episode of the Bloomberg Originals video series AI IRL, we talk about where the boundaries are on human interactions with chatbots and the ethical minefield that’s becoming even more difficult to navigate.

Before we can all have an AI friend in our pockets, Kuyda will have to navigate a rapidly evolving technology that’s capable of inspiring deep emotions in its human users. Replika was the subject of a debate that played out earlier this year about where to draw the line in conversation. In response to complaints that Replika’s chatbots could stray into discussing sexual content with minors, the company introduced filters that prevented adult themes from being raised at all. But that prompted emotional protests from grown-ups who said the change made it feel like a loved one had died or was rejecting them.

These themes were explored a decade ago in Spike Jonze’s movie Her. As with the AI character Samantha in that film, there may well be no comfortable way to deal with the fallout of emotionally impactful changes to an AI personality.

