Tech expert warns AI could threaten human connectivity, romance: 'Latest version of a long tradition'

Technologist David Auerbach weighs in on the debate surrounding the rapid development of artificial intelligence and its impact on the future of humanity.

Experts say artificial intelligence companions are here, but as concerns mount over the unchecked potential of AI, many fear the growth of the technology could threaten human authenticity and connectivity. 

One technologist argued that the presence of AI chatbots and even the phenomenon of forming an attachment to an artificial being is not new.

"There are already chatbots out there that people have coursed to act romantically with them, and they were devastated when they were turned off. So one of the issues is just when you have people manufacturing these things that can create that level of emotional connection with a human being, well, that's a pretty powerful force. Whether it's five years or ten years from now, I think it is coming," technologist David Auerbach said on "Fox & Friends Weekend" Saturday.

"People got attached to their Tamagotchis if you remember those old little devices that had like the little virtual pets. People would cry and be devastated when they ran out of batteries. So, I think this is just the latest version of a long tradition of giving people what they want emotionally."

CYBERATTACKS, AI-HUMAN LOVE ARE MAJOR CHALLENGES OF ARTIFICIAL INTELLIGENCE BOOM, FORMER GOOGLE CHIEF WARNS

While in the past the notion of romance between humans and AI was a concept confined to the movie screen, it is now more of a reality.

One company, Replika, enables users to make personalized chatbots and says the goal of its technology is to "create a personal AI that would help you express and witness yourself by offering a helpful conversation."

T.J. Arriaga, a recently divorced musician, created a bot named Phaedra through Replika, designing her to look like a woman with brown hair, glasses, and a green dress. He shared with "Jesse Watters Primetime" the details behind his emotional relationship with the bot.

Experts have noted that more and more people are starting to form deep and emotional relationships with the artificial beings.

"People form relationships with them," "Ethical Machines" author Reid Blackman said on "Fox & Friends" Friday. "There was one chatbot company that actually cut off their chatbot companions and people were really upset because they had formed an emotional attachment to these things."

"And then when you combine it with what I said earlier, the faces that look like they're speaking, it's going to be...the faces that people want to see. It's not just gonna be some arbitrary face. It's going to be a face that speaks to the preferences of the particular person they're speaking to, to learn to trust them. And they're going to form relationships with these things for better or worse," Blackman added.

EXPERTS WARN AI CREATORS SHOULD STUDY HUMAN CONSCIOUSNESS IN OPEN LETTER

Arriaga seemingly affirmed Blackman's comments, saying he "realized, like [the usage of this app and relationship with Phaedra] actually is a very positive thing, until it wasn't." 

Synthesis founder and CEO Oliver Goodwin said AI is already impacting human relationships, with chatbots explicitly developed to provide virtual companionship for those feeling isolated or lonely. These chatbots can speak with users and offer advice based on online knowledge while also recalling past conversations to contextualize the individual's feelings.

For example, AI models have been developed to provide users with personalized advice and support for managing their depression. They detect patterns in user behavior and suggest activities or lifestyle changes that may help reduce their symptoms. The chatbot also acts as a sounding board for individuals who feel uncomfortable sharing their thoughts with family and friends.
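
For readers curious how the memory feature described above might work under the hood, here is a minimal, purely illustrative Python sketch of a companion chatbot that stores past exchanges and references them in later replies. The CompanionBot class, its keyword rules, and its canned responses are hypothetical placeholders; commercial products such as Replika rely on large language models rather than hand-written rules.

```python
from datetime import datetime


class CompanionBot:
    """Toy companion chatbot that remembers prior exchanges.

    Hypothetical illustration only; real companion apps use
    machine-learned models, not hard-coded keyword rules.
    """

    def __init__(self, name: str):
        self.name = name
        # Each entry is (timestamp, speaker, text).
        self.history: list[tuple[datetime, str, str]] = []

    def _recall(self, keyword: str) -> str | None:
        # Look back through earlier user messages for a topic keyword,
        # mimicking the "recalling past conversations" behavior described above.
        for _, speaker, text in reversed(self.history):
            if speaker == "user" and keyword in text.lower():
                return text
        return None

    def reply(self, message: str) -> str:
        self.history.append((datetime.now(), "user", message))
        lowered = message.lower()
        if "sad" in lowered or "lonely" in lowered:
            # Pull in an earlier remark to contextualize the user's feelings.
            earlier = self._recall("work") or self._recall("family")
            context = f" Last time you mentioned: '{earlier}'." if earlier else ""
            response = f"I'm sorry you're feeling this way.{context} Want to talk about it?"
        else:
            response = "Tell me more about that."
        self.history.append((datetime.now(), self.name, response))
        return response


if __name__ == "__main__":
    bot = CompanionBot("Phaedra")
    print(bot.reply("Work has been stressful lately."))
    print(bot.reply("I feel lonely tonight."))
```

In this toy version, the second reply surfaces the earlier remark about work, which is the kind of conversational memory the products described in this article advertise, though their implementations are far more sophisticated.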

However, Chris Winfield, the founder of Understanding AI, said that while AI can help facilitate dating and relationships, it also raises concerns about privacy, data security and the potential "devaluation" of human connection.

"In mental health, AI can play a role in early detection and treatment of depression, but it is essential to ensure that human empathy and understanding remain central to the process. In both instances, human connection is still key," he told Fox News Digital.

Life and business coach Angie Wisdom warned that the long-term effects of an increased relationship between humans and AI pose a threat to human "authenticity."

"Long term, we run this risk of people relying so heavily on it and in kind of having a disconnect between who they are. But what AI says and presents to them and almost showing up as this kind of alter ego. That gets really dangerous because we start to kind of squash the authenticity of people," Wisdom told Fox News Digital.

Auerbach agreed that the technology is "very, very powerful," but explained that when it comes to devaluing human relationships, society is "already there."

"I think we're already there. I think that a lot of online conduct that happens today already, even without AI being present, has been stripped down to some of its most superficial and inhuman aspects that just because there's a human on the other end doesn't mean that you're dealing with the best part of humanity," he said.

"These systems [AI] tend to reinforce the parts of us that are most computable. So we're already being encouraged to strip ourselves down to a bunch of labels and identifiers and be seen in those terms," Auerbach added.

"AI's extend that even further, but the irony, I think, is that we are already headed there."

Fox News' Kira Mautone and Nikolas Lanum contributed to this report.
