Your connection to the University of Michigan-Dearborn | Fall 2019

Eye on Research: Battling the Future of Fake News

Technology is taking political disinformation to scary new extremes. Could real-time debunking be one way to fight back?

As the country heads toward the next presidential election year, get ready to hear a lot more about “deepfakes.” These are bogus, computer-generated audio and video clips, often of influential people saying things they didn’t really say. They look and sound so real, it’s hard to tell they’re not. And experts warn this kind of faked multimedia promises to be one of the scarier new tools for spreading political disinformation. 

Professor of Electrical and Computer Engineering Hafiz Malik saw it coming years ago and has spent the past decade researching possible defenses. As an expert in audio forensics, his first step was trying to discover a reliable way to sort real audio from fake. “Our initial hypothesis was that audio generated through computers, even if it was perceptually identical to genuine human speech, would look different on the microscopic level,” said Malik. That turned out to be the case. When Malik subjected fake audio to spectral analysis, he found some telltale markers, and the method he has since developed achieves near-100 percent accuracy. Plus, because audio is a crucial component of most video, his method can also be useful in calling out video deepfakes.
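The article doesn't describe which spectral markers Malik's method looks for, but as a rough illustration of what "spectral analysis" of audio means, the sketch below (in Python with NumPy; the function name `magnitude_spectrum` is hypothetical, not from Malik's software) computes the frequency-domain representation that such forensic features are typically derived from:

```python
import numpy as np

def magnitude_spectrum(signal, sample_rate):
    """Return the frequencies and magnitude spectrum of a 1-D audio signal."""
    spectrum = np.abs(np.fft.rfft(signal))                  # magnitudes of the real FFT
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)  # matching frequency bins (Hz)
    return freqs, spectrum

# Synthetic example: a pure 440 Hz tone sampled at 16 kHz for one second.
sample_rate = 16000
t = np.arange(sample_rate) / sample_rate
tone = np.sin(2 * np.pi * 440 * t)

freqs, spectrum = magnitude_spectrum(tone, sample_rate)
peak_freq = freqs[np.argmax(spectrum)]  # dominant frequency of the tone, ~440 Hz
```

A forensic detector would go further, comparing fine-grained statistics of such spectra (for real versus synthesized speech) rather than just locating a peak; this sketch only shows the transform step itself.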

Still, Malik said that’s only a partial solution. 

“With political disinformation, a deepfake will spread almost instantly on social media. But by the time law enforcement or the media or a campaign responds, in some ways the damage is already done,” he said. 

Because of this, Malik said a robust defense must not only be able to detect fake multimedia — it also needs to do it in real time. That’s where Malik is now turning his attention. The new software he’s developing would allow non-experts to generate an “authenticity score” for suspect audio. 

A tool like that could be particularly useful for journalists who need to fact-check digital audio or video. Social media platforms could even directly integrate the forensic software — automatically processing multimedia and alerting users to suspicious content. Even then, in a world where seeing is no longer believing, Malik said we could be in for a wild ride.
