Upload a piece of media to be evaluated
Reality Defender runs a series of models to detect manipulation
Review each model's probability that the media has been manipulated
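The three steps above can be sketched as a tiny data-flow, purely for illustration. Reality Defender's actual API is not public, so every name below (`ModelResult`, `review`, the model names) is a hypothetical stand-in, not the real interface.

```python
# Hypothetical sketch of the upload -> detect -> review flow.
# All names here are illustrative; they do not reflect Reality Defender's API.
from dataclasses import dataclass

@dataclass
class ModelResult:
    model_name: str
    # 0.0 = likely original, 1.0 = likely manipulated
    manipulation_probability: float

def review(results):
    """Step 3: map each model to its probability that the media is manipulated."""
    return {r.model_name: r.manipulation_probability for r in results}

# Steps 1 and 2 (upload and model inference) would happen server-side;
# here we mock the per-model output a reviewer would see.
results = [
    ModelResult("detector_a", 0.91),
    ModelResult("detector_b", 0.78),
]
print(review(results))
```

The point of the sketch is simply that each detection model reports its own probability, and the reviewer sees all of them side by side rather than a single combined verdict.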
There is increasing evidence of synthetic media being abused in media, technology, and political campaigns. AI-generated deepfakes could become even more harmful in the 2020 election, where they could be used to create highly suggestive, damaging, and false versions of candidates' statements and actions. A concerted effort to establish the truth is essential.
We are doing this in a completely inclusive, non-partisan manner by bringing together the best minds in media forensic research, AI, technology and journalism. The RD 2020 tool is a realization of this motivation.
Reality Defender is powered by a global collaboration of stakeholders invested in the fight against misinformation.
Forensic researchers with advanced AI algorithms
Leaders in technology, media, and entertainment
Released in 2017 in partnership with TUM, FaceForensics and FaceForensics++ are the first large-scale video datasets containing faces that can be used to study image and video forgeries. With over 500,000 manipulated frames, FaceForensics was one of the first resources that enabled researchers and the public to participate in the detection of deepfakes.
To anticipate and counteract the dangers of AI by giving each of us the tools to protect our lives, dignity, and freedom.
Director, The AI Foundation Non-Profit
Chief AI Officer, The AI Foundation
Past President, Association for the Advancement of Artificial Intelligence
Professor, University of California, Berkeley
Board of Directors, The AI Foundation Non-Profit
Invite-only submission page where potential fake content is entered for analysis.
Utilizing independent AI-based and human-vetted analysis, Reality Defender 2020 scans and verifies whether content is fake, manipulated, or original. A report on each submitted video summarizes the findings without speculating on the intent of the manipulations.
Journalists / Campaigns
Help develop and scale our technology and research. We're always looking for new detection models, new expertise, infrastructure, and donations.