Synthetic human-like fakes

'''Countermeasures against synthetic human-like fakes'''
* [https://www.darpa.mil/program/media-forensics '''Media Forensics (MediFor)''' at darpa.mil] aims to develop technologies for the automated assessment of the integrity of an image or video and to integrate these into an end-to-end media forensics platform. (A simplified illustration of what such an integrity check can look like is sketched after this list.)
* '''[https://cyabra.com/ Cyabra.com]''' is an AI-based system that helps organizations stay on guard against disinformation attacks<ref group="1st seen in" name="ReutersDisinfomation2020">https://www.reuters.com/article/us-cyber-deepfake-activist/deepfake-used-to-attack-activist-couple-shows-new-disinformation-frontier-idUSKCN24G15E</ref>. [https://www.reuters.com/article/us-cyber-deepfake-activist/deepfake-used-to-attack-activist-couple-shows-new-disinformation-frontier-idUSKCN24G15E Reuters.com reporting] from July 2020.
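To give a concrete, if simplified, idea of what automated integrity assessment of an image can involve, below is a minimal sketch of one classic image-forensics heuristic, [[w:Error level analysis|error level analysis]], written in Python with the Pillow library. This is not the MediFor, Cyabra or NIST method, only a generic illustration; the file name <code>suspect_photo.jpg</code> and the re-save quality are hypothetical assumptions.

<syntaxhighlight lang="python">
# Minimal error level analysis (ELA) sketch: re-compress a JPEG at a known
# quality and inspect the per-pixel differences. Regions pasted in from
# another source often recompress differently and therefore stand out.
import io
from PIL import Image, ImageChops


def error_level_analysis(path: str, quality: int = 90) -> Image.Image:
    """Return an amplified difference between an image and its re-saved JPEG copy."""
    original = Image.open(path).convert("RGB")

    # Re-encode the image as JPEG at a fixed, known quality, in memory.
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    resaved = Image.open(buffer)

    # Pixel-wise absolute difference between original and re-saved image.
    diff = ImageChops.difference(original, resaved)

    # Amplify the (usually faint) differences so they become visible.
    extrema = diff.getextrema()                      # (min, max) per channel
    max_diff = max(high for _low, high in extrema) or 1
    scale = 255.0 / max_diff
    return diff.point(lambda value: min(255, int(value * scale)))


if __name__ == "__main__":
    # "suspect_photo.jpg" is a hypothetical input file used for illustration.
    error_level_analysis("suspect_photo.jpg").save("ela_result.png")
</syntaxhighlight>

Bright, uneven regions in the output can hint at local edits, but error level analysis is easy to misread; it is at most a starting point compared to the dedicated programs and challenges listed here.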
'''Studies and events'''
* [https://www.nist.gov/itl/iad/mig/media-forensics-challenge '''Media Forensics Challenge''' at nist.gov], an iterative research challenge by the [[w:National Institute of Standards and Technology]], with the upcoming challenge being the third iteration. Previous rounds were held in [https://www.nist.gov/itl/iad/mig/nimble-challenge-2017-evaluation 2017] and [https://www.nist.gov/itl/iad/mig/media-forensics-challenge-2018 2018], and [https://www.nist.gov/itl/iad/mig/media-forensics-challenge-2019-0 the evaluation criteria for the 2019 iteration are being formed].
* [https://arxiv.org/abs/2001.06564 '''Media Forensics and DeepFakes: an overview''' at arXiv.org], a '''2020 review''' on the subject of digital look-alikes and media forensics