

This AI sounds just like Joe Rogan -- and the possibilities are disturbing

Impressive and scary at the same time. Brace yourself for a new era of fraud.

Mihai Andrei
May 23, 2019 @ 12:03 am


The world better brace itself for an era of deepfakes.

Lately, we’ve seen that you can’t always trust what you see — now, you shouldn’t trust your ears either. Up until recently, artificial voices have sounded robotic and metallic, but an AI startup has recently published an incredibly realistic fake voice, mimicking famous podcaster and announcer Joe Rogan.

Rogan’s voice was re-created to talk about a chimp hockey team and the advantages of being a robot — topics which, while not out of the realm of what Rogan might discuss, have never been addressed by the podcaster. Dessa, the company behind the new voice algorithm, says that the implications of this are massive.

“Clearly, the societal implications for technologies like speech synthesis are massive,” Dessa writes. “And the implications will affect everyone. Poor consumers and rich consumers. Enterprises and governments.”

The consequences can be both positive and negative. Just think about the possibility of offering realistic synthetic voices to people with speech impairments, or the revolution that can happen in audiobooks and dubbing. However, at the same time, the possibility for fraud is also very concerning. You think “fake news” is a problem now? Wait ’til something like this hits the shelves.

Understandably, Dessa has not released any details about how its AI works and will not be publishing the results in a scientific journal — the possibility for malicious use of the technology is simply too great. However, with over 1,300 episodes of the Joe Rogan podcast available, the AI certainly had plenty of material to train on.

It remains to be seen just how useful or dangerous this technology will be. Although deepfakes emerged quite a while ago, they have yet to make a real impact in the world, leading many to believe that the fears and concerns are overblown. However, if past technology cycles are any guide, these technologies take a while to mature and spread, so the issues may yet emerge in the not-too-distant future.

If deepfakes actually take off, hearing AI-Joe-Rogan saying that chimps will rip your balls off will be the least of our concerns.
