Purdue’s Video and Imaging Processing Laboratory is developing computer tools that see the differences between authentic and manipulated videos.
WEST LAFAYETTE, Ind. — We’ve grown up believing pictures don’t lie, but now they do.
New computer technology makes those lies more convincing and easier to tell than ever before. The stakes are so high that the Department of Defense is giving Purdue University millions of dollars to develop technology that can tell the difference between fact and fake.
So-called deepfakes are easy to find on the internet.
One of the best shows what looks like actor Tom Cruise running down a desert highway declaring, “I’m Tom Cruise and I’m running for president of the United States…This is no stunt.”
But it is a stunt. The person appearing to be Tom Cruise is really another actor whose face and voice have been digitally altered to look like the real movie star.
Pictures don’t lie? Tell that to Edward Delp, a professor and researcher at Purdue University.
“Oh, pictures always lie,” said Delp.
According to Delp, social media and computer technology make deceptive videos easier to create and more realistic than ever. He calls it an arms race.
“People are going to be using these to commit crimes and try to influence other people,” he said.
Software used to make deepfakes is easily available online. The program scans images and essentially “learns” a person’s precise facial features, mannerisms and idiosyncrasies. The software can then use artificial intelligence algorithms to create a digital imposter.
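In rough outline, many face-swap tools pair a shared encoder, which captures pose and expression, with a per-person decoder, which supplies that person's appearance. The toy sketch below illustrates only that swap idea; all names, numbers and "faces" are made-up stand-ins, not Purdue's software or any real deepfake system.

```python
import random
import statistics

# Toy sketch of the face-swap idea: a shared "encoder" strips identity
# from a face and keeps the expression; a per-person "decoder" adds that
# person's identity back. Swapping in the other person's decoder makes
# the fake. All data here is illustrative, not a real system.

random.seed(0)

# Pretend 8-value "face images" for two people: person A's values center
# near 0, person B's near 3 (a crude stand-in for distinct appearances).
faces_a = [[random.gauss(0.0, 1.0) for _ in range(8)] for _ in range(100)]
faces_b = [[random.gauss(3.0, 1.0) for _ in range(8)] for _ in range(100)]

def encode(face):
    """Shared encoder (toy): remove the identity signal (overall level),
    keeping only the deviation pattern, i.e. the 'expression'."""
    mean = statistics.mean(face)
    return [v - mean for v in face]

def make_decoder(faces):
    """'Train' a decoder for one person: remember their average face and
    re-apply any expression code to it (stand-in for a learned network)."""
    mean_face = [statistics.mean(col) for col in zip(*faces)]
    return lambda code: [m + c for m, c in zip(mean_face, code)]

decode_b = make_decoder(faces_b)

# The swap: encode a frame of person A, then decode with B's decoder.
# The result carries B's identity wearing A's expression.
fake_frame = decode_b(encode(faces_a[0]))
print(round(statistics.mean(fake_frame), 1))  # lands near B's level (~3), not A's (~0)
```

Real systems learn the encoder and decoders as neural networks from thousands of frames, but the structure of the swap, encode with the shared network and decode with the target person's network, is the same.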
The deepfake videos have the potential to cause real world headaches.
“Suppose you are the president of a Fortune 100 company and someone made a deepfake video of you announcing that the company had a product failure. Something like that could drive down the stock of the company,” Delp explained.
In an election year, the potential for making a political opponent look bad is huge.
“I would never say these things in a public address, but someone else would,” explained a deepfake version of former President Obama in a BuzzFeed public service announcement. “We need to be more vigilant with what we trust from the internet.”
Restoring that trust is part of the mission for Delp. He directs Purdue’s Video and Imaging Processing Laboratory. His team is developing computer tools that see the differences between authentic and manipulated, deepfake videos.
In a Facebook deepfake detection challenge, one model’s face is substituted with the face of another model. It is difficult to see the difference between the real model and the fake model.
A closer examination reveals subtle differences in the eyebrows, mouth and lips.
Delp said Purdue’s software could detect that in a heartbeat.
The Defense Advanced Research Projects Agency is spending tens of millions of dollars at Purdue and other research centers to develop programs that can identify increasingly sophisticated and potentially harmful deepfakes.
“The people who are making these manipulated videos are getting better and better,” Delp explained. “So the game is not over. I think we are doing pretty well, but I am not sure I would use the word winning.”