DARPA Awards Aptima Contract to Combat Deepfakes and Digital Disinformation
The Defense Advanced Research Projects Agency (DARPA) has taken a significant step forward in the fight against digital disinformation by awarding a commercialization contract to Aptima, Inc. The agreement marks a pivotal moment in government efforts to counter the rapidly growing threat of synthetic and manipulated media across public and private sectors.
This new contract, issued through DARPA’s Commercial Strategy Office, tasks Aptima with transforming years of government-funded research on media forensics into practical tools that can be deployed across industries vulnerable to deepfakes, AI-generated disinformation, and other forms of digital manipulation.
Aptima will lead the commercialization of DARPA’s Semantic Forensics program, known as SemaFor, building on its previous role as the initiative’s test and evaluation lead. Launched by DARPA’s Information Innovation Office in 2020, SemaFor represents a conceptual evolution from earlier forensics efforts by focusing not just on technical detection of manipulated content but on understanding the semantic meaning and intent behind such manipulations.
“As falsified media technologies improve, they move faster than traditional forensic tools, leaving industries without reliable ways to spot and fight advanced media manipulations, like deepfakes,” said Shawn Weil, Chief Growth Officer at Aptima. “DARPA is leading the way to fill this gap by going beyond improving detection capabilities, developing better ways to determine why and how content has been synthesized or manipulated – ultimately enabling trust and security in digital media across different sectors.”
The SemaFor program builds on DARPA’s earlier Media Forensics (MediFor) program, which began in 2016 and focused primarily on detecting visual and auditory manipulations through technical means. While MediFor concentrated on signal-level analysis—examining inconsistencies in lighting, shadows, geometry, and metadata—SemaFor takes a more sophisticated approach by analyzing the semantic content and context of potentially manipulated media.
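To make the signal-level approach concrete, the toy sketch below flags simple metadata inconsistencies of the kind MediFor-era tools examined. It is purely illustrative: the checks, field names, and thresholds are invented for this example and are not part of any DARPA tooling.

```python
from datetime import datetime

def signal_level_check(metadata: dict) -> list[str]:
    """Toy signal-level forensic check: flag basic metadata
    inconsistencies. Illustrative only -- not actual MediFor
    or SemaFor functionality."""
    flags = []
    captured = datetime.fromisoformat(metadata["capture_time"])
    modified = datetime.fromisoformat(metadata["file_modified"])
    # A capture time later than the file's last-modified time is suspicious.
    if captured > modified:
        flags.append("capture_time postdates file_modified")
    # Editor tags in the metadata hint at post-processing.
    if metadata.get("software", "").lower() in {"photoshop", "gimp"}:
        flags.append(f"edited with {metadata['software']}")
    return flags

suspect = {
    "capture_time": "2024-06-01T12:00:00",
    "file_modified": "2024-05-30T09:00:00",
    "software": "Photoshop",
}
print(signal_level_check(suspect))
```

Checks like these operate only on technical artifacts; a semantic approach would additionally ask whether the content of the image is consistent with its claimed context, which is the gap SemaFor targets.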
This shift reflects DARPA’s recognition that the most dangerous digital manipulations are increasingly difficult to detect through technical artifacts alone, particularly as generative AI technologies continue to advance at a rapid pace.
“Statistical detection techniques have been successful, but media generation and manipulation technologies applicable to imagery, voice, video, text, and other modalities are advancing rapidly. Purely statistical detection methods are now insufficient to detect these manipulations, especially when multiple modalities are involved,” DARPA explained in its FY 2025 budget justification document.
Under the new commercialization initiative, Aptima’s responsibilities extend beyond technical development. The company must identify viable markets for these capabilities, develop operational prototypes suitable for deployment in non-classified environments, and create engagement strategies with government agencies, civil society organizations, and private companies.
Potential applications include integration with content moderation workflows at social media companies, forensic analysis tools for newsrooms and fact-checking organizations, and early-warning systems for election officials and public safety agencies. The technology could also strengthen government communications and counterintelligence efforts as hostile foreign actors increasingly employ generative media in disinformation campaigns.
The urgency of this work has been highlighted during the 2024 U.S. election cycle, which saw numerous instances of AI-generated political disinformation. From deepfake videos of candidates making false statements to cloned voices in misleading robocalls, these sophisticated forgeries often spread widely before being debunked, underscoring the need for fast, reliable verification tools.
DARPA’s decision to partner with Aptima for this commercialization effort reflects its broader strategy of promoting dual-use technology—adapting innovations developed for defense and intelligence purposes for wider societal benefit. For Aptima, which has a long history of developing cognitive and behavioral technologies for the Department of Defense, the contract represents both a technological challenge and an opportunity to bridge the gap between classified research and commercial applications.
The road ahead will likely involve forming partnerships with cybersecurity firms, AI startups, content authenticity consortiums, and academic institutions. It may also include developing APIs that allow social media platforms to integrate semantic verification capabilities into their content pipelines, as well as user-facing tools like browser extensions that enable individuals to verify content authenticity on demand.
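A platform-side integration of the kind described above might look like the following sketch. Everything here is hypothetical: no public SemaFor API exists, and the verdict schema, field names, and thresholds are assumptions made for illustration of how a content pipeline could act on a semantic-forensics result.

```python
from dataclasses import dataclass

@dataclass
class ForensicVerdict:
    """Hypothetical response from a semantic-forensics service."""
    manipulated: bool   # was the media synthesized or altered?
    confidence: float   # 0.0 to 1.0
    intent: str         # e.g. "satire", "disinformation", "benign"

def moderate(post: dict, verdict: ForensicVerdict, threshold: float = 0.8) -> str:
    """Toy moderation gate: act only on high-confidence verdicts,
    and route likely disinformation to human review rather than
    auto-removing it."""
    if verdict.manipulated and verdict.confidence >= threshold:
        if verdict.intent == "disinformation":
            return "escalate_to_human_review"
        return "label_as_synthetic"
    return "publish"

verdict = ForensicVerdict(manipulated=True, confidence=0.93, intent="disinformation")
print(moderate({"id": "post-123"}, verdict))
```

Routing high-confidence disinformation verdicts to human review, rather than removing content automatically, is one way a design could address the transparency and censorship concerns such tools raise.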
However, commercialization faces significant challenges. Issues of trust, data privacy, and potential misuse of forensic tools must be addressed. If attribution algorithms lack transparency or sufficient error mitigation, they could mistakenly flag legitimate content or be misused to justify censorship. Ensuring these tools are auditable, explainable, and aligned with democratic values will be crucial to their responsible adoption across society.
As digital manipulation technologies continue to advance, DARPA’s investment in semantic forensics represents a critical effort to maintain information integrity in an increasingly complex media landscape.
15 Comments
Interesting development in the fight against deepfakes and digital disinformation. Glad to see DARPA taking proactive steps to commercialize advanced media forensics tools. Crucial that we stay ahead of these emerging threats.
Absolutely. Synthetic media is a serious concern that requires sophisticated solutions. This DARPA contract should help bring powerful forensic capabilities to the private sector.
Interesting to see DARPA focusing on the semantic aspects of media manipulation, not just the technical detection. A holistic approach to understanding the intent behind deepfakes is essential. This contract with Aptima could yield powerful new tools.
The government’s investment in media forensics technology is a prudent move given the growing deepfake problem. Glad to see DARPA taking a proactive stance and leveraging its research to develop commercial solutions.
Absolutely. Protecting the integrity of information is critical, and these advanced forensic tools could be invaluable in combating digital disinformation across various industries.
Deepfakes pose a serious threat to the credibility of information and media. This DARPA contract with Aptima is a welcome step in the right direction. Curious to see what practical tools emerge from this commercialization effort.
Glad to see the government taking proactive measures to address the deepfake threat. Commercializing DARPA’s media forensics research could have far-reaching implications across industries. Looking forward to seeing what practical tools emerge from this.
The rise of deepfakes is a major threat to the credibility of information. This DARPA contract with Aptima to commercialize advanced media forensics is a welcome development. Looking forward to seeing how these tools can be deployed to combat digital disinformation.
Agreed. Synthetic media is a growing concern, so proactive steps like this DARPA initiative are crucial. Commercializing these forensic capabilities could have a significant impact across various sectors.
This DARPA contract with Aptima seems like an important step in the right direction. Protecting the integrity of media and information is vital, especially as deepfake technology becomes more advanced and accessible.
The rise of digital disinformation is a major concern, so it’s reassuring to see DARPA investing in solutions. Curious to learn more about how the SemaFor program’s semantic approach differs from previous forensics efforts.
Yes, a semantic-based system that can understand the intent behind manipulated content could be a significant advancement. Detecting technical fakery is important, but grasping the underlying meaning is crucial.
Glad to see the government taking the deepfake threat seriously and investing in practical solutions. The SemaFor program’s semantic-focused approach to media forensics sounds intriguing. This Aptima contract could lead to some powerful new tools for combating digital manipulation.
The growing challenge of deepfakes is alarming, so it’s encouraging to see the government investing in practical tools to combat this issue. Curious to learn more about the SemaFor program and its semantic-focused approach.
Yes, a semantic-based forensics system could be a game-changer. Detecting the intent and meaning behind manipulated content, not just the technical fakery, is a crucial next step.