Laws against synthesis and other related crimes
= Laws and their application =
== Law on synthetic filth in Virginia ==
[[File:Flag of Virginia.svg|thumb|left|200px|[[w:Virginia]], an avant-garde state. Motto "[[w:Sic semper tyrannis]]"]]
<section begin=Virginia2019 />[[File:Marcus Simon.jpeg|thumb|right|108px|Homie [[w:Marcus Simon|w:Marcus Simon]] ([http://marcussimon.com/ marcussimon.com]) is a Member of the [[w:Virginia House of Delegates]] and a true pioneer in legislating against synthetic filth.]] Since July 1 2019<ref>
| quote = }}
</ref>, as [https://law.lis.virginia.gov/vacode/18.2-386.2/ § 18.2-386.2 titled '''''Unlawful dissemination or sale of images of another; penalty.'''''] became part of the '''[[w:Code of Virginia]]'''.
'''[https://law.lis.virginia.gov/vacode/ Code of Virginia (TOC)]''' » [https://law.lis.virginia.gov/vacode/title18.2/ Title 18.2. Crimes and Offenses Generally] » [https://law.lis.virginia.gov/vacode/title18.2/chapter8/ Chapter 8. Crimes Involving Morals and Decency] » [https://law.lis.virginia.gov/vacodefull/title18.2/chapter8/article5/ Article 5. Obscenity and Related Offenses] » [https://law.lis.virginia.gov/vacode/18.2-386.2/ § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty]
The Virginia law [https://law.lis.virginia.gov/vacode/18.2-386.2/ '''§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty.'''] is as follows:
A. ''Any [[w:person]] who, with the [[w:Intention (criminal law)|w:intent]] to [[w:coercion|w:coerce]], [[w:harassment|w:harass]], or [[w:intimidation|w:intimidate]], [[w:Malice_(law)|w:malicious]]ly [[w:dissemination|w:disseminates]] or [[w:sales|w:sells]] any videographic or still image created by any means whatsoever that [[w:Depiction|w:depicts]] another person who is totally [[w:nudity|w:nude]], or in a state of undress so as to expose the [[w:sex organs|w:genitals]], pubic area, [[w:buttocks]], or female [[w:breast]], where such person knows or has reason to know that he is not [[w:license]]d or [[w:authorization|w:authorized]] to disseminate or sell such [[w:Video|w:videographic]] or [[w:Film still|w:still image]] is [[w:Guilt (law)|w:guilty]] of a Class 1 [[w:Misdemeanor#United States|w:misdemeanor]]. For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's [[w:face]], [[w:Simulacrum|w:likeness]], or other distinguishing characteristic.''

B. ''If a person uses [[w:Service (economics)|w:services]] of an [[w:Internet service provider]], an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.''

C. ''Venue for a prosecution under this section may lie in the [[w:jurisdiction]] where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.''

D. ''The provisions of this section shall not preclude prosecution under any other [[w:statute]].''<ref name="Virginia2019Chapter515"/>
The bill was presented as [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+HB2678 House Bill 2678] by [[w:Delegate (American politics)|w:Delegate]] [[w:Marcus Simon]] to the [[w:Virginia House of Delegates]] on January 14 2019, and three days later an identical [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+SB1736 Senate Bill 1736] was introduced to the [[w:Senate of Virginia]] by Senator [[w:Adam Ebbin]].
<section end=Virginia2019 />
== Law on synthetic filth in Texas ==
[[File:Flag of Texas.svg|thumb|left|200px|[[w:Texas]], the Lone Star State, has protected political candidates, but not ordinary folk, against synthetic filth.]]
<section begin=Texas2019 />On September 1 2019 the [[w:amendment]]s to the election code made by [[w:Texas]] senate bill [https://capitol.texas.gov/tlodocs/86R/billtext/html/SB00751F.htm '''SB 751''' - '''''Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election'''''] came into effect in the [[w:Law of Texas]], giving [[w:candidates]] in [[w:elections]] a 30-day protection period before the election during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "''a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality''".<ref name="TexasSB751">
{{cite web
|quote= In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality}}
</ref>
<section end=Texas2019 />
== Law on synthetic filth in California ==
[[File:Flag of California.svg|thumb|left|200px|[[w:California]] moved later than Virginia, but it also outlawed the distribution of synthetic filth on Jan 1 2020.]]
<section begin=California2020 />[[File:Marc Berman.jpg|thumb|120px|right|Homie [[w:Marc Berman|w:Marc Berman]], a righteous fighter for our human rights in this age of industrial disinformation filth and a member of the [[w:California State Assembly]], most loved for authoring [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 '''AB-602''' - ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''], which came into effect on Jan 1 2020, banning both the manufacturing and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted.]] On January 1 2020<ref name="KFI2019">
{{cite web
|quote=}}
</ref> the [[w:California]] [[w:State law (United States)|w:US state law]] '''[https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 "AB-602 ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''."]''' came into effect in the [[w:California Civil Code|civil code]] of the [[w:California Codes]], banning the manufacturing and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted. AB-602 provides victims of synthetic pornography with [[w:injunction|w:injunctive relief]] and poses legal threats of [[w:statutory damages|w:statutory]] and [[w:punitive damages]] on [[w:criminal]]s making or distributing synthetic pornography without consent.

The bill AB-602 was signed into law by California [[w:Governor (United States)|w:Governor]] [[w:Gavin Newsom]] on October 3 2019. It was authored by [[w:California State Assembly]] member [[w:Marc Berman]], and an identical Senate bill was coauthored by [[w:California State Senate|w:California Senator]] [[w:Connie Leyva]].<ref name="OpenStates AB 602">
{{cite web
<section end=California2020 />
The California law is as follows:<ref name="CaliforniaStateLaw AB 602">
{{Citation
| title = "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action."
</ref>
* [[w:California State Legislature]]
== Law on synthetic filth in New York ==
[[File:Flag of New York (1909–2020).png|thumb|left|200px|[[w:New York State Legislature]] regular session 2021-2022 is contemplating the [https://www.nysenate.gov/legislation/bills/2021/S1641 New York senate bill '''S1641'''] and identical [https://www.nysenate.gov/legislation/bills/2021/A6517 assembly bill '''A6517'''] to ban sending unsolicited pornography.]]
In the 2021-2022 [[w:New York State Senate]] regular sessions, on 2021-01-14 Senators [[w:James Skoufis]], [[w:Brian Benjamin]] and [[w:Todd Kaminsky]] of the New York State Senate introduced [https://www.nysenate.gov/legislation/bills/2021/S1641 New York Senate bill '''S1641'''] to add section '''§ 250.70 ''UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL''''' to the [https://www.nysenate.gov/legislation/laws/PEN/P3TNA250 Article 250 of the penal law]. On 2021-03-19 an identical [https://www.nysenate.gov/legislation/bills/2021/A6517 New York Assembly bill '''A6517''' - ''Establishes the crime of unlawful electronic transmission of sexually explicit visual material''] was introduced to the [[w:New York State Assembly]] by Assembly Member [[w:Aileen Gunther]].<ref group="1st seen in">
First seen in the [https://trackbill.com/search/#/related=%7B%22id%22:%221690501%22,%22state%22:%22CA%22,%22session%22:%222019%22,%22billId%22:%22AB602%22%7D&direction=desc&page=1&resultsPerPage=25&sort=relevancy&tracked&upcoming_hearings&type=bills&state=all&session suggestions for bills similar to CA AB602 at trackbill.com].
:'''§ 2'''. This act shall take effect on the first of November next succeeding the date on which it shall have become a law."
== Law on synthetic filth in China == | |||
[[File:Flag of China.png|thumb|left|200px|China passed a law requiring faked footage to be labeled as such]] | |||
<section begin=China2020 />On January 1 2020 a Chinese law came into effect requiring that synthetically faked footage bear a clear notice about its fakeness. Failure to comply could be considered a [[w:crime]], the [[w:Cyberspace Administration of China]] ([http://www.cac.gov.cn/ cac.gov.cn]) stated on its website. China announced this new law in November 2019.<ref name="Reuters2019">
{{cite web | |||
| url = https://www.reuters.com/article/us-china-technology/china-seeks-to-root-out-fake-news-and-deepfakes-with-new-online-content-rules-idUSKBN1Y30VU | |||
| title = China seeks to root out fake news and deepfakes with new online content rules
| last = | |||
| first = | |||
| date = 2019-11-29 | |||
| website = [[w:Reuters.com]] | |||
| publisher = [[w:Reuters]] | |||
| access-date = 2021-01-23 | |||
| quote = }} | |||
</ref> The Chinese government seems to be reserving the right to prosecute both users and [[w:online video platform]]s failing to abide by the rules.<ref name="TheVerge2019">
{{cite web | |||
| url = https://www.theverge.com/2019/11/29/20988363/china-deepfakes-ban-internet-rules-fake-news-disclosure-virtual-reality | |||
| title = China makes it a criminal offense to publish deepfakes or fake news without disclosure | |||
| last = Statt | |||
| first = Nick | |||
| date = 2019-11-29 | |||
| website = | |||
| publisher = [[w:The Verge]] | |||
| access-date = 2021-01-23 | |||
| quote = }} | |||
</ref> | |||
<section end=China2020 />
== Law on synthetic filth in the UK ==
[[File:Flag of the United Kingdom.svg|thumb|left|200px|The UK needs to improve its legislation. Please [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images sign the petition] initiated by Helen Mort in late 2020.]]
[[File:Helen Mort (2014).jpg|thumb|right|245px|[[w:Helen Mort]] is a British poet, novelist and activist against [[synthetic human-like fakes]]. Please sign the petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org] originated by her and to be delivered to the [[w:Law Commission (England and Wales)]] and the prime minister.]]
UK law does not seem very up to date on the issue of synthetic filth.
The independent [[w:Law Commission (England and Wales)]] is currently reviewing the law as it applies to taking, making and sharing intimate images without consent. The outcome of the consultation is due to be published later in 2021.<ref name="BBC2021"> | |||
{{cite web | |||
|url = https://www.bbc.com/news/technology-55546372 | |||
|title = 'Deepfake porn images still give me nightmares' | |||
|last = Royle | |||
|first = Sara | |||
|date = 2021-01-05 | |||
|website = [[w:BBC Online]] | |||
|publisher = [[w:BBC]] | |||
|access-date = 2021-01-31 | |||
|quote = She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.}} | |||
</ref> | |||
"In 2019, law expert Dr Aislinn O'Connell told [[w:The Independent]] that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The [[w:Women and Equalities Committee]] called on the UK Government to introduce new legislation on image-based sexual abuse in order to '''criminalise ALL''' non-consensual creation and distribution of intimate sexual images."<ref name="MortPetition2020">
{{cite web
|url = https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images | |||
|title = Change.org petition: 'Tighten regulation on taking, making and faking explicit images' | |||
|last = Mort | |||
|first = Helen | |||
|date = 2020 | |||
|website = [[w:Change.org]] | |||
|publisher = [[w:Change.org]] | |||
|access-date = 2021-01-31 | |||
|quote = Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.}} | |||
</ref> This call is for laws similar to those California put in place on January 1 2020.
The petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org] by [[w:Helen Mort]] aims to petition the UK government for proper legislation against synthetic filth. See the [[mediatheque]] for a video by Helen Mort on her ordeal of becoming the victim of covert disinformation attacks.
{{#lst:Mediatheque|HelenMort2020}} | |||
----
* '''History''': This version is an evolution of a Finnish-language original written in 2016.
[[File:Suomen lippu valokuva.png|right|thumb|260px|[[w:Finland]] has very logical and very accessible laws, but here too the laws need updating for this age of industrial disinformation.]]
Existing law in <big>Chapter 24. of the Finnish Criminal Code - "''Offences against privacy, public peace and personal reputation''"</big> seems to be ineffective against many [[synthetic human-like fakes|synthetic human-like fake attacks]], and it seems it could even be used to frame victims for crimes with [[synthetic human-like fakes#Digital sound-alikes|digital sound-alikes]].
==== Afterwords ====
The original idea I had was to ban both the raw materials, i.e. the models used to make visual synthetic filth, and the end product, but then in July 2019 it appeared to me that [[Adequate Porn Watcher AI (concept)]] could really help in this age of industrial disinformation if it were built and trained, and that banning the modeling of human appearance was in conflict with the "original plan".
One would assume that collecting permissions to model each piece of pornography is not plausible, so the question is whether we can ban covert modeling from non-pornographic pictures while still retaining the ability to model all pornography found on the Internet.
If banning the modeling of people's appearance from non-pornographic images/videos without explicit permission is to be pursued, it must be <font color="green">formulated</font> so that it <font color="red">'''does not make'''</font> <big>'''[[Adequate Porn Watcher AI (concept)]]'''</big> <font color="red">illegal</font> / <font color="red">impossible</font>. This would seem to lead to a weird situation where modeling a human from non-pornographic media would be illegal, but modeling from pornography legal.
=== Law proposal to ban unauthorized modeling of human voice ===
{{#ev:youtube|0sR1rU3gLzQ|360px|right|[https://www.youtube.com/watch?v=0sR1rU3gLzQ Video 'This AI Clones Your Voice After Listening for 5 Seconds' by '2 minute papers' at YouTube] describes the voice thieving machine by Google Research in [[w:NeurIPS|w:NeurIPS]] 2018.}}
==== §1 Unauthorized modeling of a human voice ====
'''From Europe'''
[[File:Flag of Europe.svg|thumb|left|180px|The [[w:European Union]]'s [[w:European Parliament|Parliament]]'s [[w:European Parliamentary Research Service]] on [https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf The ethics of artificial intelligence: Issues and initiatives].]]
* [https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf '''''The ethics of artificial intelligence: Issues and initiatives''''' (.pdf) at europarl.europa.eu], a March 2020 study by the [[w:European Parliamentary Research Service]]. Starting from page 37, the .pdf lists organizations in the field.
== Synthetic filth in the law and media ==
* [https://carnegieendowment.org/2020/07/08/deepfakes-and-synthetic-media-in-financial-system-assessing-threat-scenarios-pub-82237 '''''Deepfakes and Synthetic Media in the Financial System: Assessing Threat Scenarios''''' at carnegieendowment.org], a 2020-07-08 assessment identifying some types of crime that can be committed using [[synthetic human-like fakes]].
* [https://scholarship.law.duke.edu/dltr/vol17/iss1/4/ 'Deepfakes: False Pornography Is Here and the Law Cannot Protect You'], published in 2019 in the [[w:Duke Law Journal|Duke Law Journal]], a student-run law review.
== The countries that have unfortunately banned the full face veil ==
= References =
<references />