DeepMedia to Help AFRL Spot Deep Fakes
A deep fake is a form of synthetic media: false images, videos or audio generated by software. The technology has been used to deceive people into thinking that a well-known person said or did something they never did.
DeepMedia will use its technology to detect deep fakes and will offer analytic tools for enhanced exploitation of multimedia data and publicly available information. According to Daniel Selli, defense and intelligence industry outreach director for DeepMedia, military operations can be disrupted and warfighters could face increased threats when adversaries create realistic videos with superimposed figures.
He said the company is also developing synthetic face and voice technologies to ensure that service members are less exposed to false information.
Amanda Lannie, a computer scientist at AFRL, said the Department of the Air Force and the rest of the military need reliable information at all times.
Deep fakes are used to fool people into believing that false information is real. They can be used to carry out scams and hoaxes, social media manipulation, disinformation attacks, identity theft, financial fraud, election manipulation and other cyber-related attacks.
The Tech Connect tool gives small businesses an opportunity to reach out and share their proposals with the Department of the Air Force. According to Selli, the tool could give other small businesses an improved opportunity to compete with larger companies.
Category: Defense and Intelligence
Tags: Air Force and Space Force, Tech Connect, analytic tools, data integrity, deep fake, DeepMedia, Defense and Intelligence, small business, synthetic media, US Air Force Research Laboratory