Laws against synthesis and other related crimes
= Laws and their application =
== Law on synthetic filth in Virginia ==
[[File:Flag of Virginia.svg|thumb|left|200px|[[w:Virginia]], an avant-garde state. Motto "[[w:Sic semper tyrannis]]"]]
<section begin=Virginia2019 />[[File:Marcus Simon.jpeg|thumb|right|108px|Homie [[w:Marcus Simon|w:Marcus Simon]] ([http://marcussimon.com/ marcussimon.com]) is a Member of the [[w:Virginia House of Delegates]] and a true pioneer in legislating against synthetic filth.]] Since July 1 2019<ref>
| quote = }}
</ref>, as [https://law.lis.virginia.gov/vacode/18.2-386.2/ § 18.2-386.2 titled '''''Unlawful dissemination or sale of images of another; penalty.'''''] became part of the '''[[w:Code of Virginia]]'''.
'''[https://law.lis.virginia.gov/vacode/ Code of Virginia (TOC)]''' » [https://law.lis.virginia.gov/vacode/title18.2/ Title 18.2. Crimes and Offenses Generally] » [https://law.lis.virginia.gov/vacode/title18.2/chapter8/ Chapter 8. Crimes Involving Morals and Decency] » [https://law.lis.virginia.gov/vacodefull/title18.2/chapter8/article5/ Article 5. Obscenity and Related Offenses] » [https://law.lis.virginia.gov/vacode/18.2-386.2/ § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty]
The [https://law.lis.virginia.gov/vacode/18.2-386.2/ law '''§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty.''' of Virginia] is as follows:
A. ''Any [[w:person]] who, with the [[w:Intention (criminal law)|w:intent]] to [[w:coercion|w:coerce]], [[w:harassment|w:harass]], or [[w:intimidation|w:intimidate]], [[w:Malice_(law)|w:malicious]]ly [[w:dissemination|w:disseminates]] or [[w:sales|w:sells]] any videographic or still image created by any means whatsoever that [[w:Depiction|w:depicts]] another person who is totally [[w:nudity|w:nude]], or in a state of undress so as to expose the [[w:sex organs|w:genitals]], pubic area, [[w:buttocks]], or female [[w:breast]], where such person knows or has reason to know that he is not [[w:license]]d or [[w:authorization|w:authorized]] to disseminate or sell such [[w:Video|w:videographic]] or [[w:Film still|w:still image]] is [[w:Guilt (law)|w:guilty]] of a Class 1 [[w:Misdemeanor#United States|w:misdemeanor]]. For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's [[w:face]], [[w:Simulacrum|w:likeness]], or other distinguishing characteristic.''
B. ''If a person uses [[w:Service (economics)|w:services]] of an [[w:Internet service provider]], an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.''
C. ''Venue for a prosecution under this section may lie in the [[w:jurisdiction]] where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.''
D. ''The provisions of this section shall not preclude prosecution under any other [[w:statute]].''<ref name="Virginia2019Chapter515"/>
Two identical bills were introduced: [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+HB2678 House Bill 2678] was presented by [[w:Delegate (American politics)|w:Delegate]] [[w:Marcus Simon]] to the [[w:Virginia House of Delegates]] on January 14 2019, and three days later the identical [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+SB1736 Senate bill 1736] was introduced to the [[w:Senate of Virginia]] by Senator [[w:Adam Ebbin]].
<section end=Virginia2019 />
== Law on synthetic filth in Texas ==
[[File:Flag of Texas.svg|thumb|left|200px|[[w:Texas]], the Lone Star State, has protected political candidates, but not ordinary folk, against synthetic filth.]]
<section begin=Texas2019 />On September 1 2019 amendments to the election code made by the [[w:Texas Senate]] bill [https://capitol.texas.gov/tlodocs/86R/billtext/html/SB00751F.htm '''SB 751''' - '''''Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election'''''] came into effect in the [[w:Law of Texas]], giving [[w:candidates]] in [[w:elections]] a '''30-day protection period''' before elections, during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "''a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality''"<ref name="TexasSB751">
{{cite web
The text of '''S.B. No. 751''' is as follows:
'''''AN ACT relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election.'''''
* ('''e''') ''In this section, "deep fake video" means a video, created with the '''intent to deceive''', that '''appears to depict''' a real person performing an action that '''did not occur''' in '''reality'''.''
SECTION 2. ''This Act takes effect September 1, 2019.''
----
* [[w:Texas Legislature]]
== Law on synthetic filth in California ==
[[File:Flag of California.svg|thumb|left|200px|[[w:California]] moved later than Virginia, but it also outlawed the manufacture of synthetic filth on Jan 1 2020.]]
[[File:Connie Leyva 2015.jpg|thumb|right|240px|[[w:California]] [[w:California State Senate|w:Senator]] [[w:Connie Leyva]] sponsored [https://leginfo.legislature.ca.gov/faces/billCompareClient.xhtml?bill_id=201920200SB564&showamends=false '''California Senate Bill SB 564''' - ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''] in Feb '''2019'''. It is identical to Assembly Bill 602 authored by [[w:Marc Berman]]. The bill was [https://www.sagaftra.org/action-alert-support-california-bill-end-deepfake-porn endorsed by SAG-AFTRA]. It became law on 1 January 2020 in the [[w:California Civil Code|w:California Civil Code]] of the [[w:California Codes]].]]
<section begin=California2020 />[[File:Marc Berman.jpg|thumb|120px|right|Homie [[w:Marc Berman|w:Marc Berman]], a righteous fighter for our human rights in this age of industrial disinformation filth and a member of the [[w:California State Assembly]], most loved for authoring [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 '''AB-602''' - ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''], which came into effect on Jan 1 2020, banning both the manufacturing and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted.]] On January 1 2020<ref name="KFI2019">
{{cite web
|quote=}}
</ref> the [[w:California]] [[w:State law (United States)|w:US state law]] '''[https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 "AB-602 ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''."]''' came into effect in the [[w:California Civil Code|civil code]] of the [[w:California Codes]], banning the manufacturing and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted. AB-602 provides victims of synthetic pornography with [[w:injunction|w:injunctive relief]] and poses legal threats of [[w:statutory damages|w:statutory]] and [[w:punitive damages]] on [[w:criminal]]s making or distributing synthetic pornography without consent.
The bill AB-602 was signed into law by California [[w:Governor (United States)|w:Governor]] [[w:Gavin Newsom]] on October 3 2019. It was authored by [[w:California State Assembly]] member [[w:Marc Berman]], and an identical Senate bill was coauthored by [[w:California State Senate|w:California Senator]] [[w:Connie Leyva]].<ref name="OpenStates AB 602">
{{cite web
<section end=California2020 />
The law of California is as follows:
'''1708.86.'''
('''a''') '''For purposes of this section''':
* (1) “''Authorized Representative''” means an attorney, talent agent, or personal manager authorized to represent a depicted individual or, if the depicted individual does not have an attorney, talent agent, or personal manager, a labor union representing performers in audiovisual works.
* (2) (A) “''Consent''” means an agreement written in plain language signed knowingly and voluntarily by the depicted individual that includes a description of the sexually explicit material and the audiovisual work in which it will be incorporated.
* (2) (B) A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made unless one of the following requirements is satisfied:
:** (i) The depicted individual is given at least 72 hours to review the terms of the agreement before signing it.
:** (ii) The depicted individual’s authorized representative provides written approval of the signed agreement.
* (3) “''Depicted individual''” means an individual depicted in a realistic digitized performance in which the individual did not actually perform. For purposes of this paragraph, “''digitized''” includes depicting the nude body parts of another human being as being those of the individual or imposing digitally created nude body parts onto the individual.
* (4) “''Disclose''” means to transfer, publish, make available, or distribute.
* (5) “''Harm''” includes, but is not limited to, economic harm or emotional distress.
* (6) “''Individual''” means a natural person.
* (7) “''Nude''” means visible genitals, pubic area, anus, or a female’s postpubescent nipple or areola.
* (8) “''Person''” means a human being or legal entity.
* (9) “''Sexual conduct''” means any of the following:
** (A) Masturbation.
** (B) Sexual intercourse, including genital, oral, or anal, whether between persons regardless of sex or gender or between humans and animals.
** (C) Sexual penetration of the mouth, vagina, or rectum by, or with, an object.
** (D) The transfer of semen onto the depicted individual.
** (E) Sadomasochistic abuse involving the depicted individual.
* (10) “''Sexually explicit material''” means any portion of an audiovisual work that shows the depicted individual performing in the nude or appearing to engage in, or being subjected to, sexual conduct.
('''b''') A '''depicted individual''' who has suffered harm resulting from the intentional disclosure of sexually explicit material without the depicted individual’s consent has a '''cause of action''' against a person who does either of the following:
* (1) The person '''creates and discloses sexually explicit material''' in an audiovisual work and the person knew or reasonably should have known the depicted individual did not consent.
* (2) The person '''discloses, but did not create''', sexually explicit material in an audiovisual work and the person knew the depicted individual did not consent.
('''c''') (1) '''A person''' is '''not liable''' under this section if the person proves any of the following:
* (A) The person disclosed the sexually explicit material in the course of reporting unlawful activity, in the course of a legal proceeding, or the person is a member of law enforcement and disclosed the sexually explicit material in the course of exercising the person’s law enforcement duties.
* (B) The person disclosed the sexually explicit material in relation to a matter of legitimate public concern.
* (C) The person disclosed the sexually explicit material in a work of political or newsworthy value, or similar work.
* (D) The person disclosed the sexually explicit material for the purposes of commentary or criticism or the disclosure is otherwise protected by the California Constitution or the United States Constitution.
('''c''') (2) For purposes of this subdivision, sexually explicit material is not of newsworthy value because the depicted individual is a public figure.
('''d''') It shall '''not be''' a '''defense''' to an action under this section that there is a '''disclaimer included''' in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.
('''e''') (1) '''A prevailing plaintiff may recover any of the following''':
* (A) Economic and noneconomic damages proximately caused by the disclosure of the sexually explicit material, including damages for emotional distress.
* (B) An amount equal to the monetary gain made by the defendant from the creation, development, or disclosure of the sexually explicit material, or the plaintiff may, at any time before the final judgment is rendered, recover instead an award of statutory damages for all unauthorized acts involved in the action, with respect to any one work, in a sum not less than five thousand dollars ($5,000) or more than five hundred thousand dollars ($500,000).
* (C) Punitive damages.
* (D) Reasonable attorney’s fees and costs.
* (E) Any other available relief, including injunctive relief.
('''e''') (2) This act does not affect any right or remedy available under any other law.
('''f''') An action under this section shall be brought no later than three years from the date the unauthorized creation, development, or disclosure was discovered or should have been discovered with the exercise of reasonable diligence.
('''g''') (1) A '''plaintiff may''' proceed '''using''' a '''pseudonym''', either John Doe, Jane Doe, or Doe, for the true name of the plaintiff and may exclude or redact from all pleadings and documents filed in the action other identifying characteristics of the plaintiff. A plaintiff who proceeds using a pseudonym and excluding or redacting identifying characteristics as provided in this section shall file with the court and serve upon the defendant a confidential information form for this purpose that includes the plaintiff’s name and other identifying characteristics excluded or redacted. The court shall keep the plaintiff’s name and excluded or redacted characteristics confidential.
('''g''') (2) In cases where a '''plaintiff proceeds using a pseudonym''' under this section, the following provisions shall apply:
* (A) All other parties and their agents and attorneys shall use this pseudonym in all pleadings, discovery documents, and other documents filed or served in the action, and at hearings, trial, and other court proceedings that are open to the public.
* (B)
::* (i) Any party filing a pleading, discovery document, or other document in the action shall exclude or redact any identifying characteristics of the plaintiff from the pleading, discovery document, or other document, except for a confidential information form filed pursuant to this subdivision.
::* (ii) A party excluding or redacting identifying characteristics as provided in this section shall file with the court and serve upon all other parties a confidential information form that includes the plaintiff’s name and other identifying characteristics excluded or redacted. The court shall keep the plaintiff’s name and excluded or redacted characteristics confidential.
* (C) All court decisions, orders, petitions, discovery documents, and other documents shall be worded so as to protect the name or other identifying characteristics of the plaintiff from public revelation.
('''g''') (3) The following '''definitions''' apply to this subdivision:
* (A) “''Identifying characteristics''” means name or any part thereof, address or any part thereof, city or unincorporated area of residence, age, marital status, relationship to defendant, and race or ethnic background, telephone number, email address, social media profiles, online identifiers, contact information, or any other information, including images of the plaintiff, from which the plaintiff’s identity can be discerned.
* (B) “''Online identifiers''” means any personally identifying information or signifiers that would tie the plaintiff to a particular electronic service, device, or internet application, website, or platform account, including, but not limited to, access names, access codes, account names, aliases, avatars, credentials, gamer tags, display names, handles, login names, member names, online identities, pseudonyms, screen names, user accounts, user identifications, usernames, Uniform Resource Locators (URLs), domain names, Internet Protocol (IP) addresses, and media access control (MAC) addresses.
('''g''') (4) The '''responsibility''' for '''excluding or redacting the name or identifying characteristics of the plaintiff''' from '''all documents''' filed with the court rests solely with the parties and their attorneys. Nothing in this section requires the court to review pleadings or other papers for compliance with this provision.
('''g''') (5) Upon request of the plaintiff, the clerk shall allow access to the court file in an action filed under this section only as follows:
* (A) To a party to the action, including a party’s attorney.
* (B) To a person by order of the court on a showing of good cause for access.
* (C) To any person 60 days after judgment is entered unless the court grants a plaintiff’s motion to seal records pursuant to Chapter 3 of Division 4 of Title 2 of the California Rules of Court.
('''h''') The provisions of this section are severable. If any provision of this section or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.<ref name="CaliforniaStateLaw SB 564">
{{Citation
| title = "SB-564 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action."
| first = Marc
| last = Berman
| author-link = w:Marc Berman
| first2 = Connie
| last2 = Leyva
| author2-link = w:Connie Leyva
| series =
| year = 2019
| publisher = [[w:California]]
| url = https://leginfo.legislature.ca.gov/faces/billCompareClient.xhtml?bill_id=201920200SB564&showamends=false
}}
</ref><ref name="CaliforniaStateLaw AB 602">
{{Citation
| title = "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action."
* [[w:California State Legislature]]
== Law on synthetic filth in New York ==
[[File:Flag of New York (1909–2020).png|thumb|left|200px|[[w:New York State Legislature]] regular session 2021-2022 is contemplating the [https://www.nysenate.gov/legislation/bills/2021/S1641 New York senate bill '''S1641'''] and identical [https://www.nysenate.gov/legislation/bills/2021/A6517 assembly bill '''A6517'''] to ban sending unsolicited pornography.]]
In the 2021-2022 [[w:New York State Senate]] regular sessions, on 2021-01-14 Senator [[w:James Skoufis]] ([https://www.nysenate.gov/senators/james-skoufis official website]) sponsored, and Senators [[w:Brian Benjamin]] ([https://www.nysenate.gov/senators/brian-benjamin official website]) and [[w:Todd Kaminsky]] ([https://www.nysenate.gov/senators/todd-kaminsky official website]) of the New York State Senate co-sponsored, [https://www.nysenate.gov/legislation/bills/2021/S1641 New York Senate bill '''S1641'''], which would add section '''§ 250.70 ''UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL''''' to [https://www.nysenate.gov/legislation/laws/PEN/P3TNA250 Article 250 of the penal law].
On 2021-03-19 an identical [https://www.nysenate.gov/legislation/bills/2021/A6517 New York Assembly bill '''A6517''' - ''Establishes the crime of unlawful electronic transmission of sexually explicit visual material''] was introduced to the [[w:New York State Assembly]] by Assembly Member [[w:Aileen Gunther]] ([https://nyassembly.gov/mem/Aileen-M-Gunther official website]).<ref group="1st seen in">
:'''§ 2'''. This act shall take effect on the first of November next succeeding the date on which it shall have become a law."
----
* [[w:New York State Legislature]]
== Law on synthetic filth in China ==
[[File:Flag of China.png|thumb|left|200px|China passed a law requiring faked footage to be labeled as such]]
<section begin=China2020 />On January 1 2020 a Chinese law came into effect requiring that synthetically faked footage bear a clear notice of its fakeness. Failure to comply could be considered a [[w:crime]], the [[w:Cyberspace Administration of China]] ([http://www.cac.gov.cn/ cac.gov.cn]) stated on its website. China announced this new law in November 2019.<ref name="Reuters2019">
{{cite web
| url = https://www.reuters.com/article/us-china-technology/china-seeks-to-root-out-fake-news-and-deepfakes-with-new-online-content-rules-idUSKBN1Y30VU
| title = China seeks to root out fake news and deepfakes with new online content rules
| last =
| first =
| date = 2019-11-29
| website = [[w:Reuters.com]]
| publisher = [[w:Reuters]]
| access-date = 2021-01-23
| quote = }}
</ref> The Chinese government seems to be reserving the right to prosecute both users and [[w:online video platform]]s failing to abide by the rules.<ref name="TheVerge2019">
{{cite web
| url = https://www.theverge.com/2019/11/29/20988363/china-deepfakes-ban-internet-rules-fake-news-disclosure-virtual-reality
| title = China makes it a criminal offense to publish deepfakes or fake news without disclosure
| last = Statt
| first = Nick
| date = 2019-11-29
| website =
| publisher = [[w:The Verge]]
| access-date = 2021-01-23
| quote = }}
</ref>
<section end=China2020 />
== Law on synthetic filth in the UK ==
[[File:Flag of the United Kingdom.svg|thumb|left|200px|The UK needs to improve its legislation. Please [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images sign the petition initiated by Helen Mort] in late 2020.]]
[[File:Helen Mort (2014).jpg|thumb|right|245px|[[w:Helen Mort]] is a British poet, novelist and activist against [[synthetic human-like fakes]]. Please sign the petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org], originated by her and to be delivered to the [[w:Law Commission (England and Wales)]] and the prime minister.]]
UK law does not seem very up-to-date on the issue of synthetic filth.
The independent [[w:Law Commission (England and Wales)]] is currently reviewing the law as it applies to taking, making and sharing intimate images without consent. The outcome of the consultation is due to be published later in 2021.<ref name="BBC2021">
{{cite web
|url = https://www.bbc.com/news/technology-55546372
|title = 'Deepfake porn images still give me nightmares'
|last = Royle
|first = Sara
|date = 2021-01-05
|website = [[w:BBC Online]]
|publisher = [[w:BBC]]
|access-date = 2021-01-31
|quote = She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.}}
</ref>
"In 2019, law expert Dr Aislinn O’Connell told [[w:The Independent]] that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The [[w:Women and Equalities Committee]] called on the UK Government to introduce new legislation on image-based sexual abuse in order to '''criminalise ALL''' non-consensual creation and distribution of intimate sexual images."<ref name="MortPetition2020">
{{cite web
|url = https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images
|title = Change.org petition: 'Tighten regulation on taking, making and faking explicit images'
|last = Mort
|first = Helen
|date = 2020
|website = [[w:Change.org]]
|publisher = [[w:Change.org]]
|access-date = 2021-01-31
|quote = Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.}}
</ref> This call is for laws similar to those California put in place on January 1 2020.
The petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org] by [[w:Helen Mort]] aims to petition the UK government for proper legislation against synthetic filth. See the [[mediatheque]] for a video by Helen Mort on her ordeal of becoming the victim of covert disinformation attacks.
{{#lst:Mediatheque|HelenMort2020}}
----
* '''History''': This version is an evolution of a Finnish language original written in 2016. | * '''History''': This version is an evolution of a Finnish language original written in 2016. | ||
[[File:Suomen lippu valokuva.png|right|thumb|260px|[[w:Finland]] has very logical and very accessible laws, but here too the laws need updating for this age of industrial disinformation.]]
Existing law in <big>Chapter 24. of the Finnish Criminal Code - "''Offences against privacy, public peace and personal reputation''"</big> seems to be ineffective against many a [[synthetic human-like fakes|synthetic human-like fake attack]], and it seems it could even be used to frame victims for crimes with [[synthetic human-like fakes#Digital sound-alikes|digital sound-alikes]].
==== Afterwords ====
The original idea I had was to ban both the raw materials, i.e. the models for making the visual synthetic filth, and also the end product. Then, in July 2019, it appeared to me that [[Adequate Porn Watcher AI (concept)]] could really help in this age of industrial disinformation if it were built and trained, and that banning the modeling of human appearance was in conflict with that original plan.
One would assume that collecting permission to model each piece of pornography is not plausible, so the question is whether we can ban covert modeling from non-pornographic pictures while still retaining the ability to model all pornography found on the Internet.
If a ban on modeling people's appearance from non-pornographic images/videos without explicit permission is pursued, it must be <font color="green">formulated</font> so that this <font color="red">'''does not make'''</font> <big>'''[[Adequate Porn Watcher AI (concept)]]'''</big> <font color="red">illegal</font> / <font color="red">impossible</font>. This would seem to lead to a weird situation where modeling a human from non-pornographic media would be illegal, but modeling from pornography legal.
=== Law proposal to ban unauthorized modeling of human voice ===
{{#ev:youtube|0sR1rU3gLzQ|360px|right|[https://www.youtube.com/watch?v=0sR1rU3gLzQ Video 'This AI Clones Your Voice After Listening for 5 Seconds' by '2 minute papers' at YouTube] describes the voice thieving machine by Google Research in [[w:NeurIPS|w:NeurIPS]] 2018.}}
==== §1 Unauthorized modeling of a human voice ====
'''From Europe'''
[[File:Flag of Europe.svg|thumb|left|180px|The [[w:European Union]]'s [[w:European Parliament|Parliament]]'s [[w:European Parliamentary Research Service]] on [https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf The ethics of artificial intelligence: Issues and initiatives].]]
* [https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf ''''''The ethics of artificial intelligence: Issues and initiatives'''''' (.pdf) at europarl.europa.eu], a March 2020 study by the [[w:European Parliamentary Research Service]]. Starting from page 37, the .pdf lists organizations in the field.
== Synthetic filth in the law and media ==
* [https://carnegieendowment.org/2020/07/08/deepfakes-and-synthetic-media-in-financial-system-assessing-threat-scenarios-pub-82237 ''''''Deepfakes and Synthetic Media in the Financial System: Assessing Threat Scenarios'''''' at carnegieendowment.org], a 2020-07-08 assessment that identifies some types of crime that could be committed using [[synthetic human-like fakes]].
* [https://scholarship.law.duke.edu/dltr/vol17/iss1/4/ 'Deepfakes: False Pornography Is Here and the Law Cannot Protect You'], published in 2019 in the [[w:Duke Law Journal|Duke Law Journal]], a student-run law review.
== The countries that have unfortunately banned full face veil ==
= References =
<references />