AI could be 'nail in the coffin' for the internet, warns Neil deGrasse Tyson

Astrophysicist Neil deGrasse Tyson warned AI could be the 'final nail in the coffin' for the internet as concerns surrounding the technology continue to mount

Astrophysicist Neil deGrasse Tyson issued a stark warning on the rise of artificial intelligence, noting that the development of fake videos and other media could be the final "nail in the coffin" for the internet. 

The renowned astrophysicist and author discussed his thoughts surrounding the future of the global computer network during Thursday's episode of "The Fox News Rundown" podcast.

"Part of me wonders, maybe AI will create such good fakes that no one will trust the Internet anymore for anything, and we just have to simply shut it down," deGrasse Tyson said. "Maybe it's the final nail in the coffin in the internet."

"Thirty years, it was a good run from the early nineties to the early twenties and 2020s, now it's time for the next thing," he continued. "That could be the greatest gift of AI to the internet. The internet gets a vote of no confidence from us."

His remarks come as tech leaders are expected to meet with Vice President Kamala Harris at the White House Thursday to address concerns surrounding the technology's implementation. 

But even as key tech executives like Elon Musk have called for a pause in AI development, citing safety concerns, deGrasse Tyson called that expectation "completely unrealistic."

ELON MUSK, APPLE CO-FOUNDER, OTHER TECH EXPERTS CALL FOR PAUSE ON 'GIANT AI EXPERIMENTS': 'DANGEROUS RACE'

"The United Arab Emirates has a minister of AI, just to put this in context. Countries care about AI all around the world in a letter signed by a bunch of people, I don't think that's going to stop it or pause it," he said. 

"But what they should have said was given how fast it's going let's re-double our efforts to see all the bad things it could do and try to prevent that," he continued. 

Musk, Steve Wozniak and other tech leaders signed a letter in March asking AI developers to "immediately pause for at least six months the training of AI systems more powerful than GPT-4." The letter was published by the Future of Life Institute and signed by more than 1,000 people.

CREEPY APOLLO 11 NIXON DEEPFAKE VIDEO CREATED BY MIT TO SHOW DANGERS OF HIGH-TECH MISINFORMATION

Despite widespread concern over AI's long-term potential, deGrasse Tyson described its "immediate threat" as its capability to "fake" things, including replicating a human voice. 

Scammers have reportedly used the technology to clone voices in an effort to extract ransom money. One Arizona woman said her family fell victim to the "life-changing" scam, which she described as the "worst day" of her entire life. 

"There's a whole manner of evil, nefarious ways this can be abused," he said. 

"Maybe making things much worse… induces a level of distrust where it has no power over us at all," he continued. 

AI-generated pictures, videos and voices — called deepfakes — are so believable and widely available that people will soon not be able to discern between real and manipulated media, an image analyst told Fox News last month.

"When we enter this world where any audio, image or video can be manipulated, well, then how do you believe anything?" said Hany Farid, a professor at University of California, Berkeley's School of Information.

"Anytime I see the president speak or a candidate speak or a CEO speak or a reporter speak, now there's this lingering doubt."

Listen to the full interview here.

Fox News' Adam Sabes contributed to this report. 

Data & News supplied by www.cloudquote.io
Stock quotes supplied by Barchart
Quotes delayed at least 20 minutes.
By accessing this page, you agree to the following
Privacy Policy and Terms and Conditions.
 
 
Copyright © 2010-2020 Burlingame.com & California Media Partners, LLC. All rights reserved.