How to protect yourself and others from covert modeling

== Help in case of appearance theft ==

If involuntary fake pornography of you shows up in Google search results, see the information on '''[https://support.google.com/websearch/answer/9116649?hl=en removing involuntary fake pornography from Google at support.google.com]'''. To request removal, fill in the '''[https://support.google.com/websearch/troubleshooter/3111061#ts=2889054%2C2889099%2C2889064%2C9171203 form for removing involuntary fake pornography at support.google.com]''' and select "''I want to remove: A fake nude or sexually explicit picture or video of myself''".

Google added “'''involuntary synthetic pornographic imagery'''” to its '''ban list''' in September 2018, allowing anyone to request that the search engine block results that falsely depict them as “nude or in a sexually explicit situation.”<ref name="WashingtonPost2018">
{{cite web
  |url= https://www.washingtonpost.com/technology/2018/12/30/fake-porn-videos-are-being-weaponized-harass-humiliate-women-everybody-is-potential-target/
  |title= Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'
  |date= 2018-12-30
  |publisher= [[w:The Washington Post]]
  |access-date= 2021-01-31
  |quote= In September [of 2018], Google added “involuntary synthetic pornographic imagery” to its ban list}}
</ref>

On 3 October 2019, '''California outlawed''' with [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 AB-602] the use of [[w:human image synthesis]] technologies to make '''fake pornography without the consent''' of the people depicted. The law was authored by Assembly member [[w:Marc Berman]].<ref name="CNET2019">
{{cite web
  | last = Mihalcik
  | first = Carrie
  | title = California laws seek to crack down on deepfakes in politics and porn
  | website = [[w:cnet.com]]
  | publisher = [[w:CNET]]
  | date = 2019-10-04
  | url = https://www.cnet.com/news/california-laws-seek-to-crack-down-on-deepfakes-in-politics-and-porn/
  | access-date = 2021-01-31 }}
</ref>

== Protect your appearance from covert modeling ==

== Protect your mind from the products of covert modeling ==
* '''Teach your loved ones''' the two key '''media literacy''' skills for this age of industrial disinformation:
*# Not everything that looks like a video of people is actually a video of people.
*# Not everything that sounds like a recording of a known human's voice is actually a recording of that person's voice.
* '''Don't''' watch '''porn'''. Even a dodgy porn site or two carries a hefty risk of showing you some [[digital look-alikes]].
* Be critical of gossip about things allegedly seen on the Internet.
== How to protect humankind from products of covert appearance modeling ==

'''[[Adequate Porn Watcher AI (concept)]]''' is the name of a concept under development for an AI that would look at and model all findable porn, protecting humankind and individual humans by modeling what it sees.

The reason APW_AI makes sense is that, if you can trust the service providers to keep your own model safe, the system will alert you when it finds something that looks very much like a match. This lifts the looming threat of [[digital look-alikes|digital look-alike]] attacks, makes the attacks that do take place considerably less destructive, and thus lowers their monetary value to the criminals.
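
Below is a minimal sketch of the matching step such a service might perform, assuming faces have already been reduced to fixed-length embedding vectors by some face-recognition model. The embedding size, threshold value, and function names here are hypothetical illustrations, not part of the APW_AI concept itself.

<syntaxhighlight lang="python">
import numpy as np

# Assumed setup: a subscriber has enrolled an embedding of their face,
# and the crawler extracts an embedding from each face it finds in porn.
# 512 dimensions and the 0.75 threshold are arbitrary example values;
# a real system would tune the threshold on labeled match/non-match data.
MATCH_THRESHOLD = 0.75

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def looks_like_match(found: np.ndarray, enrolled: np.ndarray) -> bool:
    """True if a face found by the crawler resembles the enrolled face."""
    return cosine_similarity(found, enrolled) >= MATCH_THRESHOLD

if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    enrolled = rng.normal(size=512)                # subscriber's stored model
    found = enrolled + 0.1 * rng.normal(size=512)  # a near-duplicate face
    if looks_like_match(found, enrolled):
        print("Possible digital look-alike found: alert the subscriber.")
</syntaxhighlight>

The design point the sketch illustrates is that only the similarity decision needs to touch the subscriber's stored model, which is why trusting the service provider to keep that model safe is the crux of the concept.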