Help against appearance and voice theft

'''Global'''


Information on '''[https://support.google.com/websearch/answer/9116649?hl=en removing involuntary fake pornography from Google at support.google.com]''' if it shows up in Google. '''[https://support.google.com/websearch/troubleshooter/3111061#ts=2889054%2C2889099%2C2889064%2C9171203 Form for removing involuntary fake pornography at support.google.com]''', select "''I want to remove: A fake nude or sexually explicit picture or video of myself''". Google added "'''involuntary synthetic pornographic imagery'''" to its '''ban list''' in '''September 2018''', allowing anyone to request that the search engine block results that falsely depict them as "nude or in a sexually explicit situation."<ref name="WashingtonPost2018">
{{cite web
|url= https://www.washingtonpost.com/technology/2018/12/30/fake-porn-videos-are-being-weaponized-harass-humiliate-women-everybody-is-potential-target/
|title= Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'
|last= Harwell
|first= Drew
|date= 2018-12-30
|website=
|publisher= [[w:The Washington Post]]
|access-date= 2021-01-31
|quote= In September [of 2018], Google added “involuntary synthetic pornographic imagery” to its ban list}}
 
</ref>


'''USA'''
'''Recent advances'''



On 3 October 2019, '''California outlawed''', with [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 AB-602], the use of [[w:human image synthesis]] technologies to make '''fake pornography without the consent''' of the people depicted. The law was authored by Assembly member [[w:Marc Berman]].<ref name="CNET2019">