Researchers around the world are realizing how easy it is for criminals to manipulate voices and videos to make people appear to say things they never said.
Using an off-the-shelf voice-morphing tool, scientists at the University of Alabama at Birmingham (UAB) successfully got past both automated and human verification systems like those used to access bank accounts.
Criminals can "steal" a voice in a variety of ways. UAB researchers say they might find it online and copy it, get it from a public presentation, or record it during a spam call.
Then they build a model of the voice from just a few sentences. From there, the attacker can generate any message in the victim's voice.
UAB says the potential consequences are endless. The attacker could leave fake voice messages, fabricate audio evidence for court, or impersonate the victim in a real-time phone conversation. The researchers found that automated verification algorithms were largely ineffective against the morphed voices.
Advances in artificial intelligence and computer graphics make it possible to manipulate video footage of a public official so that they appear to say anything, as Stanford University researchers have demonstrated.
There are ways to determine whether you are watching legitimate footage. Mandy Jenkins works for Storyful, which verifies news content. She says people should be suspicious of things they can't quite believe are true and should be thinking, "This doesn't seem right. Even though I kind of wish this was true, it's not necessarily true. I want to look into this a little more."
Jenkins traces the source of the video, audio, or text, and then tries to figure out the poster's agenda through their social profile. She says it comes down to "old-fashioned journalism: who's putting it out and what their role is."