Deepfakes made by Artificial Intelligence can imitate anyone's voice - including Derek Kevra's

Artificial intelligence has come so far that you may not even know whether you're talking to your best friend - or a criminal looking to steal your money.

Recent examples of the Pope decked out in Balenciaga, comedian Bill Hader morphing into Tom Cruise, and an eerily convincing voice-dubbing demo featuring Morgan Freeman are all part of the parade of videos underscoring just how far A.I. has come. But that's only the tip of the iceberg.

They're entertaining to watch online, but there's a more nefarious temptation that comes with the technology behind them: deepfakes.

The mimicking of another person's voice could soon be a major source of online scams, misinformation, and trouble for governmental institutions. 

"These days, technology has advanced to the extent that we just need one minute of samples to create anyone's voice," said Dr. Khalid Malik, who studies A.I. at Oakland University.

That could mean online scammers targeting unsuspecting seniors and trying to steal their money.

"They'll say, 'hey, I'm in terrible trouble. I've been kidnapped, somebody's got a gun to my head,' you know, that kind of thing is going to come up very often," said J.J. Ryan, a grad student at Oakland University. "And it'll be believable because it's their voice."

It's not just celebrities' voices that people could imitate. It could be the voice of someone a victim knows closely. Ryan described one real-world example in which a bank CEO transferred 100,000 pounds after getting a call from a familiar voice.

"The chairman of the board or something called and said ‘transfer $100,000 pounds into this account.’ He did it - he thought it was that person. It was something that was 100% real to him," he said.

To show off just how powerful the technology has become, FOX 2's Derek Kevra tested it out on some of his co-workers. With the help of Malik and his students at Oakland University, he sent the team recordings of his voice.


From there, the students created completely artificial recordings of Kevra's voice. Armed with a made-up story about a haircut, he placed separate calls to FOX 2 meteorologists Alan Longstreet and Rich Luterman.

Once they answer, the deepfake recording of Kevra plays. It does not disappoint.

"Uhh, hello," says Longstreet when he picks up the phone.

"Right man, I gotta talk fast since I don't have much time. I made a big mistake and I need your help talking through it," says the deepfake Kevra. "I went to get a haircut today and the stylist went a little too far up with the clippers, essentially shaved half my head."

Longstreet laughs in response. "Okay, that's not funny. This is serious, okay."

"My whole head is shaved man. On the list of things honestly is to go look for a wig, so first off, can you work for me tomorrow and second, what the heck do I do?" the deepfake Kevra says.

"Wait, what the **** are you telling me? Yeah sure, wait, who's going to be mad at you?" Longstreet replies.

A similar exchange unfolded with Luterman. You can watch both phone calls, including a second deepfake telling the person on the other end of the line that the first call was a deepfake. Both were believed before Kevra broke things up.

That realism is what troubles experts: scammers could use the voices of real people to call their parents and ask for money.


Kevra also tested a separate deepfake recording on his mom and dad, both of whom were skeptical of the call because the voice asked for money.

That skepticism is what the students at Oakland University say will be important going forward as A.I. makes it easier to replicate people's voices and faces.

"Be skeptical. That’s going to be your No. 1 defense, be skeptical," Ryan said. "I saw a great meme on Facebook the other day: ‘treat every day as if it were April Fools' Day" and have that kind of skeptical-ness."

Malik says there's also a worry about what A.I. will mean for larger governing bodies and for elections.

"The biggest worry that people like us have is what if deepfakes start appearing in Democratic institutions," the professor said.

"What I would say is, if you thought fake news was bad in the last election, wait for this one," Ryan said. 

"We're hoping that people will have the ability to detect those and not air them, but it may go out on the airways at some point and I like to say sooner or later CNN is going to air something that's going to be 100% fake and the world's going to change on a dime."