Q: How can deepfake videos be debunked?

[Photo: video camera showing an image on its viewfinder. Photo by Donald Tong from Pexels]


Deepfake videos, which superimpose one person’s likeness onto footage of someone else, are becoming more sophisticated. Jordan Peele made one of President Barack Obama (which was shown in class) to illustrate the dangers. Others have been made for more humorous purposes, like this one showing a mashup of Mr. Bean and President Donald Trump and another combining the face of Steve Buscemi with Jennifer Lawrence.

When the images look real, how can a news literate viewer spot these hoaxes? 

There’s no “silver bullet” for detecting them, says Tom Van de Weghe, a John S. Knight Fellow at Stanford University. But Siwei Lyu, a computer science professor at SUNY Albany, discovered one reliable tell: blinking. Because the videos are built from vast numbers of photographs, and so few published photographs show subjects with their eyes closed, the people depicted in deepfake videos rarely blink.
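For the technically curious, blink analysis of the kind Lyu describes is often built on the "eye aspect ratio" (EAR): the ratio of an eye's height to its width, computed from facial landmark points, which collapses toward zero when the eye closes. The sketch below is an illustration of that idea, not Lyu's actual code; the landmark coordinates and the 0.2 blink threshold are assumptions for demonstration.

```python
from math import dist

def eye_aspect_ratio(eye):
    """Eye aspect ratio from six landmarks ordered
    left corner, top-left, top-right, right corner,
    bottom-right, bottom-left (a common landmark layout)."""
    a = dist(eye[1], eye[5])  # vertical distance, left side
    b = dist(eye[2], eye[4])  # vertical distance, right side
    c = dist(eye[0], eye[3])  # horizontal distance, corner to corner
    return (a + b) / (2.0 * c)

# Hypothetical landmark coordinates for an open and a closed eye.
open_eye = [(0, 2), (2, 4), (4, 4), (6, 2), (4, 0), (2, 0)]
closed_eye = [(0, 2), (2, 2.4), (4, 2.4), (6, 2), (4, 1.6), (2, 1.6)]

BLINK_THRESHOLD = 0.2  # assumed cutoff: below this, the eye counts as closed

for label, eye in [("open", open_eye), ("closed", closed_eye)]:
    ear = eye_aspect_ratio(eye)
    print(label, round(ear, 3), "blink" if ear < BLINK_THRESHOLD else "no blink")
```

A detector would run this per video frame and flag footage in which the subject's EAR almost never dips below the threshold, i.e., a person who seemingly never blinks.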

Jonathan Hui, a writer who studies deep learning, says there are also ways to identify deepfakes with the naked eye. He suggests looking for blurring in the face, particularly when the face is partially blocked by hands or another object. Also note whether the subject has a slight double chin, double eyebrows, or double edges on his or her face. Watch for changes in skin tone, especially near the edges of the face. Finally, take note of any flickering, particularly when the face is partially blocked.

If counting blinks or searching for blurry double chins sounds a bit daunting, the folks at Poynter have one piece of reassurance: It’s still very difficult to make a convincing deepfake video. And each deepfake that is done well and goes viral serves as a PSA to alert the public about the practice.