[[File:20180613 Folkemodet Bornholm burka happening 0118 (42739707262).jpg|thumb|left|400px|Some humans in '''[[Glossary#Burqa|burqa]]s''' at the Bornholm burqa happening]]
== Help in case of appearance theft ==
Information on '''[https://support.google.com/websearch/answer/9116649?hl=en removing involuntary fake pornography from Google at support.google.com]''' if it shows up in Google. '''[https://support.google.com/websearch/troubleshooter/3111061#ts=2889054%2C2889099%2C2889064%2C9171203 Form for removing involuntary fake pornography at support.google.com]''', select "''I want to remove: A fake nude or sexually explicit picture or video of myself''"
Google added “'''involuntary synthetic pornographic imagery'''” to its '''ban list''' in September 2018, allowing anyone to request that the search engine block results that falsely depict them as “nude or in a sexually explicit situation.”<ref name="WashingtonPost2018">
{{cite web
|url= https://www.washingtonpost.com/technology/2018/12/30/fake-porn-videos-are-being-weaponized-harass-humiliate-women-everybody-is-potential-target/
|title= Fake-porn videos are being weaponized to harass and humiliate women: 'Everybody is a potential target'
|last= Harwell
|first= Drew
|date= 2018-12-30
|publisher= [[w:The Washington Post]]
|access-date= 2021-01-31
|quote= In September [of 2018], Google added “involuntary synthetic pornographic imagery” to its ban list}}
</ref>
On 3 October 2019, '''California outlawed''', with [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 AB-602], the use of [[w:human image synthesis]] technologies to make '''fake pornography without the consent''' of the people depicted. The law was authored by Assembly member [[w:Marc Berman]].<ref name="CNET2019">
{{cite web
| last = Mihalcik
| first = Carrie
| title = California laws seek to crack down on deepfakes in politics and porn
| website = [[w:cnet.com]]
| publisher = [[w:CNET]]
| date = 2019-10-04
| url = https://www.cnet.com/news/california-laws-seek-to-crack-down-on-deepfakes-in-politics-and-porn/
| access-date = 2021-01-31 }}
</ref>
== Protect your appearance from covert modeling ==
* '''Avoid uploading''' facial and full-body '''photos''' and '''video''' of yourself to services where they are exposed to the whole Internet.
* '''Do not agree''' to, and do not be fooled into, having your '''[[Glossary#Reflectance capture|reflectance captured]]''' in a '''[[Glossary#Light stage|light stage]]'''.
== Protect your voice from covert modeling ==
* '''Avoid uploading''' unaltered '''recordings of your [[w:Human voice|human voice]]''' to services where they are exposed to the whole Internet.
* '''Consider altering''' your voice with a '''[[Glossary#Voice changer|voice changer]]''', or replacing it with a '''synthetic voice''' that does not match any human's voice, if you must upload recordings to the Internet.
* Ask for a voice changer to be applied if you are being recorded for something that will be publicly broadcast.
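To make the voice-changer advice above concrete, here is a minimal sketch of the crudest possible pitch alteration, using only Python's standard-library <code>wave</code> module. This is an illustrative toy, not any particular voice-changer product: it shifts pitch by replaying the same samples at a higher frame rate, which also speeds the recording up, whereas real voice changers use time-stretching techniques (e.g. phase vocoders) to keep the speed intact.

```python
import wave

def shift_pitch_naive(in_path: str, out_path: str, factor: float = 1.3) -> None:
    """Crudely raise the pitch of a WAV file by `factor`.

    The samples are copied unchanged but written with a frame rate
    multiplied by `factor`, so every frequency (and the playback
    speed) rises by that factor. Good enough to demonstrate the
    idea; not good enough to anonymize a voice reliably.
    """
    with wave.open(in_path, "rb") as src:
        params = src.getparams()
        frames = src.readframes(src.getnframes())
    with wave.open(out_path, "wb") as dst:
        dst.setnchannels(params.nchannels)
        dst.setsampwidth(params.sampwidth)
        # Raising the playback rate raises every frequency by `factor`.
        dst.setframerate(int(params.framerate * factor))
        dst.writeframes(frames)
```

Note that naive pitch shifting like this is easy to reverse (divide the frame rate back out), which is one reason a synthetic voice that matches no human is the stronger option.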
== Protect your mind from the products of covert modeling ==
* '''Teach your loved ones''' the two key '''media literacy''' skills for this age of industrial disinformation:
*# Not everything that looks like a video of people is actually a video of people.
*# Not everything that sounds like a recording of a known human's voice is actually a recording of that person's voice.
* '''Don't''' watch '''porn'''. Even a few dodgy porn sites pose a hefty risk of exposing you to [[digital look-alikes]].
* Be critical of gossip about things allegedly seen on the Internet.
== Protect others from the products of covert modeling ==
* Read [[#Protect your mind from the products of covert modeling]] first; it is usually found above this section. Reading it to others may also help them.
* If you have strong evidence to '''suggest''' that a person may be '''under the influence''' of '''digital look-alikes''', '''try to talk to them'''. The '''[[Glossary|BCM! wiki glossary]]''' helps you discuss these things by their proper names.
== How to protect humankind from the products of covert appearance modeling ==
'''[[Adequate Porn Watcher AI (concept)]]''' is the name of a concept under development for an AI that would look at and model all findable porn, protecting humankind and individual humans by modeling what it sees.
The reason APW_AI makes sense is that, if you can trust the service providers to keep your own model safe, the system will alert you when it finds something that looks like a genuine match. This lifts the looming threat of [[digital look-alikes|digital look-alike]] attacks, considerably lowers the destructiveness of the attacks that do take place, and thus also reduces their monetary value to the criminals.
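The alerting step described above can be illustrated with a toy sketch. Everything here is a hypothetical assumption for illustration, not part of any described APW_AI implementation: the face embeddings, the cosine-similarity comparison, and the threshold value are all stand-ins for whatever matching machinery such a system would actually use.

```python
import math
from typing import List, Sequence

# Hypothetical alert threshold; a real system would calibrate this
# against a measured false-match rate. Illustrative only.
MATCH_THRESHOLD = 0.9

def cosine_similarity(a: Sequence[float], b: Sequence[float]) -> float:
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def find_matches(protected_model: Sequence[float],
                 found_faces: List[Sequence[float]],
                 threshold: float = MATCH_THRESHOLD) -> List[int]:
    """Return indices of found faces similar enough to the protected
    person's model to warrant alerting that person."""
    return [i for i, face in enumerate(found_faces)
            if cosine_similarity(protected_model, face) >= threshold]
```

The design point the sketch makes is the one in the paragraph above: the matching runs against a model the person has entrusted to the service, so the trustworthiness of the provider holding that model is the critical assumption.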
== How to protect our societies from covert modeling ==
=== Contact your representatives ===
'''Contact''' your '''representatives''' and '''ask''' them ...
* '''Whether''' they are '''aware''' of these '''new classes of disinformation weapons'''.
'''If they don't believe''' you, ask them to...
* Work for the creation of a '''fact-finding task force''' to '''ascertain''' the truth: that '''[[Glossary#Synthetic terror porn|synthetic terror porn]]''' has been used as a weapon for a long time.
=== How to protect judiciaries from covert modeling ===
'''[[Digital look-alikes]]''' and '''[[digital sound-alikes]]''' technologies prompt some changes to the '''[[w:Rules of evidence|rules of evidence]]''' and updates to what should be '''deemed deniable'''.
If '''[[Glossary#Media forensics|media forensics]]''' proves beyond suspicion that the media in question is genuine, or if a '''credible witness''' to its creation '''is found''', the media should be considered evidence.
== References ==
<references/>