Adequate Porn Watcher AI (concept)
Latest revision as of 09:06, 24 September 2024
Adequate Porn Watcher AI (APW_AI) is an w:AI and w:computer vision concept to search for any and all porn that should not be, by watching and modeling all porn ever found on the w:Internet, thus effectively protecting humans by exposing covert naked digital look-alike attacks and other contraband.
Note: see #A service identical to APW_AI used to exist - FacePinPoint.com
The method and the effect
The method by which APW_AI would provide safety and security to its users is this: a user briefly uploads a model they have obtained of themselves, and the APW_AI will then either say nothing matching found or be of the opinion that something matching found.
If people are able to check whether synthetic porn that looks like them exists, the products of the synthetic hate-illustration industrialists lose destructive potential: attacks that do happen are less damaging because they are exposed by the APW_AI, which decimates the monetary value of these disinformation weapons to the criminals.
If you feel comfortable leaving your model with the good people at the benefactor for safekeeping, you will be alerted and helped if you are ever attacked with synthetic porn.
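The upload-and-check step could be sketched as a similarity lookup of the uploaded model against everything APW_AI has modeled. The vector-embedding representation, the threshold value, and all names below are illustrative assumptions, not part of the concept:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_model(user_embedding, indexed_embeddings, threshold=0.9):
    """Return the verdict the service would give for an uploaded model:
    a match is reported when any indexed item is similar enough."""
    for candidate in indexed_embeddings:
        if cosine_similarity(user_embedding, candidate) >= threshold:
            return "something matching found"
    return "nothing matching found"
```

A real system would use a trained face-embedding model and an approximate nearest-neighbour index rather than a linear scan, but the verdict logic is the same.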
Rules
Looking up whether matches are found for anyone else's model is forbidden; this should probably be enforced with a biometric w:facial recognition app that verifies both that the model you want checked is yours and that you are awake.
Definition of adequacy
An adequate implementation should be nearly free of false positives, very good at finding true positives, and able to process more porn than is ever uploaded.
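The adequacy criteria above correspond to the standard precision and recall measures (plus throughput); a minimal sketch, with illustrative numbers:

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Precision: how often a reported match is real (adequacy demands ~no
    false positives). Recall: how many real matches are found (adequacy
    demands this be very high)."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall
```

An "adequate" system in the sense above is one where precision approaches 1.0, recall is very high, and processing throughput exceeds the upload rate.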
What about the people in the porn-industry?
People who openly do porn can help by opting in to the development, providing training material and material to test the AI on. People and companies who help train the AI naturally get credited for their help.
There are of course many human questions around this, and those questions need to be identified by professionals in psychology and the social sciences.
History
The idea of APW_AI occurred to User:Juho Kunsola on Friday 2019-07-12. Subsequently (the next day) this discovery caused the scrapping of the plea to ban covert modeling of human appearance, as that would have rendered APW_AI legally impossible.
Countermeasures elsewhere
Partial transclusion from Organizations, studies and events against synthetic human-like fakes
Companies against synthetic filth
- Alecto AI at alectoai.com[1st seen in 1], a provider of AI-based face information analytics, founded in 2021 in Palo Alto.
- Facenition.com, a New Zealand company founded in 2019 with an ingenious method for hunting fake human-like images. It has probably been purchased, merged, or licensed by ThatsMyFace.com.
- ThatsMyFace.com[1st seen in 1], an Australian company.[contacted 1] Previously, another company in the USA had this same name and domain name.[1]
A service identical to APW_AI used to exist - FacePinPoint.com
Partial transclusion from FacePinPoint.com
FacePinPoint.com was a for-a-fee service from 2017 to 2021 for pointing out where on pornography sites a particular face appears, or, in the case of synthetic pornography, where a digital look-alike makes make-believe of a face or body appearing.[contacted 2] The inventor and founder of FacePinPoint.com, Mr. Lionel Hagege, registered the domain name in 2015[2], when he set out to research the feasibility of his idea for an action plan against non-consensual pornography.[3] The description of how FacePinPoint.com worked is the same as the description of Adequate Porn Watcher AI (concept).
Resources
Tools
- w:PhotoDNA is an image-identification technology used for detecting w:child pornography and other illegal content reported to the w:National Center for Missing & Exploited Children (NCMEC) as required by law.[4] It was developed by w:Microsoft Research and w:Hany Farid, professor at w:Dartmouth College, beginning in 2009. (Wikipedia)
- The w:Child abuse image content list (CAIC List) is a list of URLs and image hashes provided by the w:Internet Watch Foundation to its partners to enable the blocking of w:child pornography & w:criminally obscene adult content in the UK and by major international technology companies. (Wikipedia).
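PhotoDNA's actual algorithm is proprietary, but the general idea behind hash-based image identification, as used by PhotoDNA and hash lists like the CAIC List, can be illustrated with a much simpler "average hash". The function names and the flattened-thumbnail representation below are illustrative assumptions, not PhotoDNA's real method:

```python
def average_hash(pixels):
    """Toy perceptual hash: bit i is set when pixel i is brighter than the mean.

    `pixels` is a flattened grayscale thumbnail, e.g. 8x8 = 64 values in 0..255.
    Unlike a cryptographic hash, small changes to the image barely change it,
    so near-duplicates can be recognized."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance suggests a near-duplicate."""
    return bin(h1 ^ h2).count("1")
```

A service can keep only such hashes of known illegal images and compare hashes of uploads against them, never redistributing the images themselves.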
Legal
Traditional porn-blocking
Traditional porn-blocking, as done by w:some countries, seems to use w:DNS to deny access to porn sites: if the domain name matches an item in a porn-site database, an unroutable address, usually w:0.0.0.0, is returned.
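The DNS-level blocking described above can be sketched in a few lines; the blocklist contents and function names here are illustrative:

```python
# Stand-in for a real porn-site database; ".test" is a reserved example TLD.
BLOCKED_DOMAINS = {"blocked-site.test"}

def resolve(domain, real_resolver):
    """DNS-level blocking: return an unroutable address for listed domains,
    otherwise delegate the query to the real resolver."""
    if domain.lower().rstrip(".") in BLOCKED_DOMAINS:
        return "0.0.0.0"
    return real_resolver(domain)
```

This is exactly why such blocking is easy to circumvent: pointing the device at a different resolver bypasses the check entirely.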
Topics on github.com
- Topic "porn-block" on github.com (8 repositories as of 2020-09)[1st seen in 2]
- Topic "pornblocker" on github.com (13 repositories as of 2020-09)[1st seen in 3]
- Topic "porn-filter" on github.com (35 repositories as of 2020-09)[1st seen in 4]
Curated lists and databases
- 'Awesome-Adult-Filtering-Accountability', a list at github.com curated by wesinator of tools and resources for adult content/porn accountability and filtering[1st seen in 2]
- 'Pornhosts' at github.com by Import-External-Sources is a hosts-file formatted version of a w:Response policy zone (RPZ) zone file. It describes itself as "a consolidated anti porn hosts file" and states its mission as "an endeavour to find all porn domains and compile them into a single hosts to allow for easy blocking of porn on your local machine or on a network."[1st seen in 2]
- 'Amdromeda blocklist for Pi-hole' at github.com by Amdromeda[1st seen in 2] lists 50 MB worth of porn host names alone: 1 (16.6 MB), 2 (16.8 MB), 3 (16.9 MB) (as of 2020-09)
- 'Pihole-blocklist' at github.com by mhakim[1st seen in 4] 1
- 'superhostsfile' at github.com by universalbyte is an ongoing effort to chart out "negative" hosts.[1st seen in 3]
- 'hosts' at github.com by StevenBlack is a hosts file for negative sites. It is updated constantly from these sources and it lists 559k (1.64MB) of porn and other dodgy hosts (as of 2020-09)
- 'Porn-domains' at github.com by Bon appétit was (as of 2020-09) last updated in March 2019 and lists more than 22k domains.
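The hosts-file formatted blocklists above share a simple line format (an unroutable address followed by one or more host names); a minimal parser sketch, with the format assumptions stated in the comments:

```python
def parse_hosts_blocklist(text):
    """Parse hosts-file formatted blocklist lines such as
    '0.0.0.0 bad.example' into a set of blocked host names.
    Assumes the common blocklist convention of mapping hosts
    to 0.0.0.0 or 127.0.0.1; comments (#) and blanks are skipped."""
    blocked = set()
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments
        if not line:
            continue
        parts = line.split()
        if len(parts) >= 2 and parts[0] in ("0.0.0.0", "127.0.0.1"):
            blocked.update(parts[1:])
    return blocked
```

Tools like Pi-hole consume such files to answer DNS queries for listed hosts with an unroutable address.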
Porn blocking services
- w:Pi-hole - https://pi-hole.net/ - Network-wide Ad Blocking
Software for nudity detection
- 'PornDetector' at github.com by bakwc consists of two porn (nudity) image detectors, both written in w:Python (programming language)[1st seen in 4]: pcr.py uses w:scikit-learn and the w:OpenCV Open Source Computer Vision Library, whereas nnpcr.py uses w:TensorFlow and reaches a higher accuracy.
- 'Laravel 7 Google Vision restringe pornografia detector de faces' ("Laravel 7 Google Vision restricts pornography, face detector"), a porn-restriction app in Portuguese at github.com by thelesson, utilizes the Google Vision API to help site maintainers stop users from uploading porn. It was written for the MiniBlog w:Laravel blog app.
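The detectors above rely on trained models (scikit-learn, TensorFlow, the Google Vision API). For illustration only, the crudest possible baseline is a color heuristic that flags images dominated by skin-like pixels; the RGB thresholds and names below are illustrative assumptions, and such a heuristic is far too weak for real use:

```python
def skin_ratio(pixels):
    """Fraction of pixels falling inside a crude RGB skin-tone range.
    `pixels` is a list of (r, g, b) tuples with components in 0..255."""
    def is_skin(r, g, b):
        return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - g) > 15
    skin = sum(1 for p in pixels if is_skin(*p))
    return skin / len(pixels)

def flag_for_review(pixels, threshold=0.5):
    """Flag an image for human review when most pixels look like skin."""
    return skin_ratio(pixels) > threshold
```

Modern detectors learn features from labeled data instead, precisely because color rules like this produce many false positives (faces, sand, wood) and false negatives.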
Links regarding pornography censorship
- w:Pornography laws by region
- w:Internet pornography
- w:Legal status of Internet pornography
- w:Sex and the law
Against pornography
- Reasons for w:opposition to pornography include w:religious views on pornography, w:feminist views of pornography, and claims of w:effects of pornography, such as w:pornography addiction. (Wikipedia as of 2020-09-19)
Technical means of censorship and how to circumvent
- w:Internet censorship and w:internet censorship circumvention
- w:Content-control software (Internet filter), a common approach to w:parental control.
- w:Accountability software
- w:Employee monitoring is often automated using w:employee monitoring software
- A w:wordfilter (sometimes referred to as just "filter" or "censor") is a script typically used on w:Internet forums or w:chat rooms that automatically scans users' posts or comments as they are submitted and automatically changes or w:censors particular words or phrases. (Wikipedia as of 2020-09)
- w:Domain fronting is a technique for w:internet censorship circumvention that uses different w:domain names in different communication layers of an w:HTTPS connection to discreetly connect to a different target domain than is discernable to third parties monitoring the requests and connections. (Wikipedia 2020-09-22)
- w:Internet censorship in China and w:some tips to how to evade internet censorship in China
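A wordfilter of the kind described in the list above can be sketched as a single regular-expression substitution; the replacement string and function name are illustrative:

```python
import re

def wordfilter(text, banned, replacement="****"):
    """Replace whole-word occurrences of banned words, case-insensitively,
    as a forum wordfilter would on submitted posts or comments."""
    pattern = re.compile(
        r"\b(" + "|".join(re.escape(w) for w in banned) + r")\b",
        re.IGNORECASE,
    )
    return pattern.sub(replacement, text)
```

The `\b` word boundaries keep the filter from mangling words that merely contain a banned substring.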
Sources for technologies
A map of technologies courtesy of Samsung Next, linked from 'Why it’s time to change the conversation around synthetic media' at venturebeat.com[1st seen in 5]
See also
Biblical connection - Revelation 13 and Daniel 7, wherein Daniel 7 and Revelation 13 warn us of this age of industrial filth. In Revelation 19:20 it says that the beast is taken prisoner; can we achieve this without APW_AI?
References
- ↑ https://www.crunchbase.com/organization/thatsmyface-com
- ↑ whois facepinpoint.com
- ↑ https://www.facepinpoint.com/aboutus
- ↑ "Microsoft tip led police to arrest man over child abuse images". w:The Guardian. 2014-08-07.
1st seen in
- ↑ 1.0 1.1 https://spectrum.ieee.org/deepfake-porn
- ↑ 2.0 2.1 2.2 2.3 Seen first in https://github.com/topics/porn-block, meta for actual use. The topic was stumbled upon.
- ↑ 3.0 3.1 Seen first in https://github.com/topics/pornblocker, found while looking at the https://github.com/topics/porn-block topic.
- ↑ 4.0 4.1 4.2 Seen first in https://github.com/topics/porn-filter, found while looking at the https://github.com/topics/porn-block topic.
- ↑ venturebeat.com found via some Facebook AI & ML group or page yesterday. Sorry, don't know precisely right now.