Congress hopes a $5 million prize competition will unlock the secret to automatically detecting deepfakes.

The annual defense policy bill, which the president signed into law Dec. 20, called on the Intelligence Advanced Research Projects Activity to start the competition as a way to stimulate the research, development, or commercialization of technologies that can automatically detect deepfakes. Congress authorized up to $5 million in cash prizes for the competition.

Deepfakes are machine-manipulated media that depict events that never happened. For example, many deepfakes superimpose one individual’s face onto another person’s head as a way to deceive viewers into thinking the first individual said or did things they never did. But the government’s broader definition of deepfakes includes any digitally altered video, audio or image that depicts something that does not exist or did not happen.

With the technology becoming more advanced and widespread, the Pentagon now views machine-manipulated media as a national security issue. Military leaders worry that a digitally altered video showing a national security leader giving orders they never gave or behaving unprofessionally could cause significant problems and confusion.

In addition to the competition, the new law requires the Director of National Intelligence to produce a report on the potential national security implications of deepfakes as well as the capabilities of foreign governments to produce and disseminate that media. Lawmakers are particularly concerned about the threat posed by Russia and China, and the law specifically requires information on those countries’ capabilities and intentions.

Furthermore, the law calls on the Director of National Intelligence to notify Congress whenever there is a credible attempt by a foreign entity to deploy machine-manipulated media or machine-generated text aimed at interfering with U.S. elections.

The United States has undertaken multiple efforts to develop technology that can automatically detect deepfakes — such as DARPA’s Media Forensics (MediFor) program — and the legislation requires the director to produce a report on those efforts.

Nathan Strout covers space, unmanned and intelligence systems for C4ISRNET.
