When it cannot be determined by human testing or media forensics whether a fake voice is a synthetic fake of some person's voice or an actual recording of that person's real voice, it is a pre-recorded '''[[Synthetic human-like fakes#Digital sound-alikes|digital sound-alike]]'''.
Real-time digital look-and-sound-alike in a video call was used to defraud a substantial amount of money in 2023.<ref name="Reuters real-time digital look-and-sound-alike crime 2023">
{{cite web
| url = https://www.reuters.com/technology/deepfake-scam-china-fans-worries-over-ai-driven-fraud-2023-05-22/
| title = 'Deepfake' scam in China fans worries over AI-driven fraud
| date = 2023-05-22
| website = [[w:Reuters.com]]
| publisher = [[w:Reuters]]
| access-date = 2023-06-05
}}
</ref>
<section end=definitions-of-synthetic-human-like-fakes />
::[[Synthetic human-like fakes|Read more about '''synthetic human-like fakes''']], see and support '''[[organizations and events against synthetic human-like fakes]]''' and what they are doing, what kinds of '''[[Laws against synthesis and other related crimes]]''' have been formulated, [[Synthetic human-like fakes#Timeline of synthetic human-like fakes|examine the SSFWIKI '''timeline''' of synthetic human-like fakes]] or [[Mediatheque|view the '''Mediatheque''']].
== 2020's synthetic human-like fakes ==
* '''2023''' | '''<font color="orange">Real-time digital look-and-sound-alike crime</font>''' A man in northern China was defrauded of 4.3 million yuan by a criminal employing a digital look-and-sound-alike pretending to be his friend on a video call.<ref name="Reuters real-time digital look-and-sound-alike crime 2023"/>
* '''2023''' | '''<font color="orange">Election meddling with digital look-alikes</font>''' | The [[w:2023 Turkish presidential election]] saw numerous deepfake controversies.