Defense and Intelligence

DeepMedia to Help AFRL Spot Deep Fakes

The U.S. Air Force Research Laboratory is working with small business DeepMedia to identify and protect users from the threat of deep fakes.

A deep fake is a form of synthetic media: false images, video or audio generated by software. The technology has been used to deceive people into believing that a well-known person said or did something they never did.

DeepMedia will use its technology to detect deep fakes and will offer analytic tools for enhanced exploitation of multimedia data and publicly available information. According to Daniel Selli, defense and intelligence industry outreach director for DeepMedia, military operations can be disrupted and warfighters could face increased threats when adversaries create realistic videos with superimposed figures.

He said the company is also developing synthetic face and voice technologies to help ensure that service members are less exposed to false information.

The partnership was made possible through the Air Force and Space Force Tech Connect, AFRL said.

Amanda Lannie, a computer scientist at AFRL, said the Department of the Air Force and the rest of the military need reliable information at all times.

Deep fakes are used to fool people into believing that false information is real. They can be used to carry out scams and hoaxes, social media manipulation, disinformation attacks, identity theft, financial fraud, election manipulation and other cyber-related attacks.

The Tech Connect tool gives small businesses an opportunity to reach out and share their proposals with the Department of the Air Force. According to Selli, the tool could give other small businesses an improved opportunity to compete with larger companies.
