How to protect yourself and others from covert modeling

Do not agree to, and do not be fooled into, having your reflectance field captured on a light stage, such as the ESPER LightCage.

“I feel pretty confident that mister photograph man will not be selling much of my data to the no camera scene.”

~ Honestly made up quote on the protecting power of e.g. niqāb

“If your whole industry's shared secret is that digital look-alikes are going to pass human testing (i.e. people in the delusion that they are seeing images of humans) then popular culture product such as w:The Matrix will appear. It is widely known that meetings of surfaces, especially soft ones are very very difficult to do convincingly. Look-alikes of eyes meeting look-alikes of eye-lids are additionally hard to do, because there is also a liquid phase in the equation.”

~ Juboxi on sunglasses


“Sunglasses or sun glasses (informally called shades) are a form of w:protective eyewear designed primarily to prevent bright w:sunlight and w:high-energy visible light from damaging or discomforting the eyes.”

~ Wikipedia on sunglasses

Some humans in burqas at the Bornholm burqa happening

Help in case of appearance theft

If involuntary fake pornography of you shows up in Google, see the information on removing it at https://support.google.com/websearch/answer/9116649?hl=en. To request removal, use the form at https://support.google.com/websearch/troubleshooter/3111061#ts=2889054%2C2889099%2C2889064%2C9171203 and select "I want to remove: A fake nude or sexually explicit picture or video of myself".

Google added “involuntary synthetic pornographic imagery” to its ban list in September 2018, allowing anyone to request that the search engine block results that falsely depict them as “nude or in a sexually explicit situation.”[1] On 3 October 2019, California outlawed with AB-602 (https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602) the use of w:human image synthesis technologies to make fake pornography without the consent of the people depicted. The law was authored by Assembly member w:Marc Berman.[2]

Protect your appearance from covert modeling

  • Avoid uploading facial and full-body photos and videos of yourself to services where they are exposed to the whole Internet.
  • If you need to upload photos, wear protective clothing, e.g. a niqāb or burqa, or protective accessories, e.g. sunglasses. Masking the faces in a photo before uploading can also help (see the sketch after this list).
  • Consider getting a non-photorealistic w:avatar of your liking and using pictures of it to shield your appearance.
  • Do not agree to, or get fooled into, having your reflectance field captured in a light stage.
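
A minimal sketch of masking faces before upload, assuming the opencv-python package and its bundled Haar cascade detector; the file names and the pixelation factor are illustrative, and a Haar cascade misses many non-frontal faces, so always check the output by eye before uploading anything.

  # Sketch: pixelate detected faces in a photo before uploading it anywhere public.
  # Assumes: pip install opencv-python. File names and pixelation factor are illustrative.
  import cv2

  def pixelate_faces(in_path: str, out_path: str, blocks: int = 12) -> int:
      """Detect frontal faces and replace each with a coarse pixelated block.
      Returns the number of faces that were masked."""
      image = cv2.imread(in_path)
      if image is None:
          raise FileNotFoundError(in_path)
      cascade = cv2.CascadeClassifier(
          cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
      gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
      faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
      for (x, y, w, h) in faces:
          roi = image[y:y + h, x:x + w]
          # Downscale, then upscale with nearest-neighbour interpolation to pixelate.
          small = cv2.resize(roi, (blocks, blocks), interpolation=cv2.INTER_LINEAR)
          image[y:y + h, x:x + w] = cv2.resize(small, (w, h),
                                               interpolation=cv2.INTER_NEAREST)
      cv2.imwrite(out_path, image)
      return len(faces)

  if __name__ == "__main__":
      print(pixelate_faces("photo.jpg", "photo_masked.jpg"), "face(s) masked")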

Protect your voice from covert modeling

  • Avoid uploading unaltered recordings of your w:human voice to services where they are exposed to the whole Internet.
  • If you must upload a recording to the Internet, consider altering the voice with a voice changer or replacing it with a synthetic voice that does not match any human's voice (a simple offline pass is sketched after this list).
  • Avoid getting recorded by parties whose identity and reliability you cannot verify, especially if they do not expressly state how, where and for what purpose they will use the recording.
  • Ask for a voice changer to be applied if you are being recorded for something that will be publicly broadcast.
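
A minimal sketch of such an offline voice-altering pass, assuming the librosa and soundfile packages; the pitch-shift amount and the file names are illustrative, and a plain pitch shift only avoids publishing your unaltered voice, it is not guaranteed to defeat determined voice modeling.

  # Sketch: shift the pitch of a recording before publishing it.
  # Assumes: pip install librosa soundfile. File names and n_steps are illustrative.
  import librosa
  import soundfile as sf

  def shift_voice(in_path: str, out_path: str, n_steps: float = 4.0) -> None:
      """Load a recording, shift its pitch by n_steps semitones and save the result."""
      samples, sample_rate = librosa.load(in_path, sr=None, mono=True)
      shifted = librosa.effects.pitch_shift(samples, sr=sample_rate, n_steps=n_steps)
      sf.write(out_path, shifted, sample_rate)

  if __name__ == "__main__":
      shift_voice("voice_note.wav", "voice_note_altered.wav")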

Protect your mind from the products of covert modeling

  • Teach your loved ones the 2 key media literacy skills for this age of industrial disinformation:
    1. Not everything that looks like a video of people is actually a video of people.
    2. Not everything that sounds like a recording of a known human's voice is actually a recording of that person's voice.
  • Don't watch porn. Even a few dodgy porn sites carry a hefty risk of showing you some digital look-alikes.
  • Be critical of gossip about things claimed to have been seen on the Internet.

Protect others from the products of covert modeling

How to protect humankind from products of covert appearance modeling

Adequate Porn Watcher AI (concept) is the name of a concept under development for an AI that would look at and model all findable porn, protecting humankind and individual humans by modeling what it sees.

The reason APW_AI makes sense is that, provided you can trust the service providers to keep your own model safe, it would alert you when it finds something that really looks like a match. This would lift the looming threat of digital look-alike attacks, make the attacks that do take place considerably less destructive, and thus also lower their monetary value to criminals.
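
At the core of such a service would be comparing a stored face model of a protected person against faces found in new material. Below is a toy sketch of only that matching step, assuming the open-source face_recognition package; the threshold and file names are illustrative, and a real APW_AI would need far more robust, consent-based infrastructure around it.

  # Toy sketch of the matching step: does a newly found image contain a face that
  # looks like a protected person? Assumes: pip install face_recognition.
  # The threshold and file names are illustrative.
  import face_recognition

  def looks_like_protected_person(known_photo: str, found_image: str,
                                  threshold: float = 0.5) -> bool:
      """Return True if any face in found_image is within `threshold`
      embedding distance of the face in known_photo."""
      known = face_recognition.face_encodings(
          face_recognition.load_image_file(known_photo))
      found = face_recognition.face_encodings(
          face_recognition.load_image_file(found_image))
      if not known or not found:
          return False  # no usable face found in one of the images
      distances = face_recognition.face_distance(found, known[0])
      return bool(min(distances) < threshold)

  if __name__ == "__main__":
      if looks_like_protected_person("my_reference_photo.jpg", "suspicious_frame.jpg"):
          print("Possible digital look-alike match found; alert the protected person.")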

How to protect our societies from covert modeling

Contact your representatives

Contact your representatives and ask them ...

  • Whether they are aware of these new classes of disinformation weapons.
  • What their position is on the question of criminalizing covert modeling.
  • Whether steps are being taken to protect the judiciary from covert modeling.
  • What, if anything, they are doing to put this hyper-modern lawlessness under some check.
  • To talk with colleagues, and also publicly, about the problems caused by covert modeling.

If they don't believe you, ask them to...

  • Work for the creation of a fact-finding task force to ascertain the truth that synthetic terror porn has already been used as a weapon for a long time.

How to protect judiciaries from covert modeling

Digital look-alike and digital sound-alike technologies prompt some changes to the w:rules of evidence and updates to what should be deemed deniable.

Recordings that sound like someone saying something may not be genuine, and therefore the suspect should be allowed to state to the court: "I never said that thing you have on tape."

Pictures and videos that look like someone doing something may not be genuine, and therefore the suspect should be allowed to state to the court: "I am not in that image/video."

If media forensics proves the genuineness of the media in question beyond suspicion, or if a credible witness to its creation is found, the media should be admissible as evidence.
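
One technical complement to witness testimony is cryptographic provenance: if the capture device, or a notary service, signs a hash of a recording at creation time, a court can later check that the file has not been altered since. A minimal sketch of that check, assuming an Ed25519 signature over the file's SHA-256 digest and the Python cryptography package; the signing scheme, key handling and names are illustrative assumptions.

  # Sketch: verify that a media file still matches a signature made over its
  # SHA-256 digest at capture time. Assumes: pip install cryptography.
  import hashlib
  from cryptography.exceptions import InvalidSignature
  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

  def media_is_untampered(media_path: str, signature: bytes,
                          device_public_key_bytes: bytes) -> bool:
      """Return True if `signature` is a valid Ed25519 signature by the capture
      device over the SHA-256 digest of the media file."""
      with open(media_path, "rb") as f:
          digest = hashlib.sha256(f.read()).digest()
      public_key = Ed25519PublicKey.from_public_bytes(device_public_key_bytes)
      try:
          public_key.verify(signature, digest)
          return True
      except InvalidSignature:
          return False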

References

  1. Harwell, Drew (2018-12-30). "Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'". w:The Washington Post. https://www.washingtonpost.com/technology/2018/12/30/fake-porn-videos-are-being-weaponized-harass-humiliate-women-everybody-is-potential-target/. Retrieved 2021-01-31. "In September [of 2018], Google added “involuntary synthetic pornographic imagery” to its ban list"
  2. Mihalcik, Carrie (2019-10-04). "California laws seek to crack down on deepfakes in politics and porn". w:CNET. https://www.cnet.com/news/california-laws-seek-to-crack-down-on-deepfakes-in-politics-and-porn/. Retrieved 2021-01-31.