Laws against synthesis and other related crimes

Laws and their application

Law on synthetic filth in Virginia

w:Virginia, an avant-garde state. Motto: "w:Sic semper tyrannis".
Homie w:Marcus Simon (marcussimon.com) is a Member of the w:Virginia House of Delegates and a true pioneer in legislating against synthetic filth.

Since July 1, 2019,[1] w:Virginia has criminalized the sale and dissemination of unauthorized synthetic pornography, but not its manufacture,[2] as § 18.2-386.2, titled 'Unlawful dissemination or sale of images of another; penalty.', became part of the w:Code of Virginia.

The section sits in the Code of Virginia under Title 18.2 (Crimes and Offenses Generally), Chapter 8 (Crimes Involving Morals and Decency), Article 5 (Obscenity and Related Offenses).

Virginia's law § 18.2-386.2, 'Unlawful dissemination or sale of images of another; penalty.', is as follows:

A. Any person who, with the intent to coerce, harass, or intimidate, maliciously disseminates or sells any videographic or still image created by any means whatsoever that depicts another person who is totally nude, or in a state of undress so as to expose the genitals, pubic area, buttocks, or female breast, where such person knows or has reason to know that he is not licensed or authorized to disseminate or sell such videographic or still image is guilty of a Class 1 misdemeanor. For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's face, likeness, or other distinguishing characteristic.

B. If a person uses services of an Internet service provider, an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.

C. Venue for a prosecution under this section may lie in the jurisdiction where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.

D. The provisions of this section shall not preclude prosecution under any other statute.[2]

The law began as identical bills: House Bill 2678, presented by w:Delegate w:Marcus Simon to the w:Virginia House of Delegates on January 14, 2019, and Senate Bill 1736, introduced three days later to the w:Senate of Virginia by Senator w:Adam Ebbin.


Law on synthetic filth in Texas

w:Texas, the Lone Star State, has protected political candidates, but not ordinary folk, against synthetic filth.

On September 1, 2019, the w:Texas Senate bill SB 751 - Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election - came into effect as w:amendments to the election code in the w:Law of Texas, giving w:candidates in w:elections a 30-day protection period before an election during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality".[3] SB 751 was introduced to the Senate by w:Bryan Hughes (politician).[4]


The text of S.B. No. 751 is as follows:

AN ACT relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election.

BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF TEXAS:

SECTION 1. Section 255.004, Election Code, is w:amended by adding Subsections (d) and (e) to read as follows:

  • (d) A person commits an offense if the person, with intent to injure a candidate or influence the result of an election:
    1. creates a deep fake video; and
    2. causes the deep fake video to be published or distributed within 30 days of an election.
  • (e) In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality.

SECTION 2. This Act takes effect September 1, 2019.
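
To make the 30-day timing rule concrete, here is a minimal Python sketch of the window test the statute describes. The function name, the inclusive reading of "within 30 days of an election", and the example dates are all illustrative assumptions, not part of the law.

  from datetime import date, timedelta

  def within_protected_window(published: date, election: date) -> bool:
      # Illustrative reading of SB 751: treat "within 30 days of an
      # election" as the 30 days leading up to and including election day.
      return election - timedelta(days=30) <= published <= election

  # Hypothetical dates: a Texas general election held on 2020-11-03.
  election_day = date(2020, 11, 3)
  print(within_protected_window(date(2020, 10, 10), election_day))  # True
  print(within_protected_window(date(2020, 9, 1), election_day))    # False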


Law on synthetic filth in California

w:California moved later than Virginia, but on January 1, 2020, it also outlawed the manufacture of synthetic filth.
Homie w:Marc Berman, a righteous fighter for our human rights in this age of industrial disinformation filth and a member of the w:California State Assembly, is most loved for authoring AB-602 - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action - which came into effect on January 1, 2020, banning both the manufacture and w:digital distribution of synthetic pornography without the w:consent of the people depicted.

On January 1, 2020,[5] the w:California state law "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action." came into effect in the civil code of the w:California Codes, banning the manufacture and w:digital distribution of synthetic pornography without the w:consent of the people depicted. AB-602 provides victims of synthetic pornography with w:injunctive relief and poses legal threats of w:statutory and w:punitive damages on w:criminals making or distributing synthetic pornography without consent. The bill was signed into law by California w:Governor w:Gavin Newsom on October 3, 2019. It was authored by w:California State Assembly member w:Marc Berman, and an identical Senate bill was coauthored by w:California Senator w:Connie Leyva.[6][7] See also AB602 at trackbill.com.


The law of California is as follows:


1708.86.

(a) For purposes of this section:

  • (1) “Authorized Representative” means an attorney, talent agent, or personal manager authorized to represent a depicted individual or, if the depicted individual does not have an attorney, talent agent, or personal manager, a labor union representing performers in audiovisual works.
  • (2) (A) “Consent” means an agreement written in plain language signed knowingly and voluntarily by the depicted individual that includes a description of the sexually explicit material and the audiovisual work in which it will be incorporated.
  • (2) (B) A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made unless one of the following requirements is satisfied:
    • (i) The depicted individual is given at least 72 hours to review the terms of the agreement before signing it.
    • (ii) The depicted individual’s authorized representative provides written approval of the signed agreement.
  • (3) “Depicted individual” means an individual depicted in a realistic digitized performance in which the individual did not actually perform. For purposes of this paragraph, “digitized” includes depicting the nude body parts of another human being as being those of the individual or imposing digitally created nude body parts onto the individual.
  • (4) “Disclose” means to transfer, publish, make available, or distribute.
  • (5) “Harm” includes, but is not limited to, economic harm or emotional distress.
  • (6) “Individual” means a natural person.
  • (7) “Nude” means visible genitals, pubic area, anus, or a female’s postpubescent nipple or areola.
  • (8) “Person” means a human being or legal entity.
  • (9) “Sexual conduct” means any of the following:
    • (A) Masturbation.
    • (B) Sexual intercourse, including genital, oral, or anal, whether between persons regardless of sex or gender or between humans and animals.
    • (C) Sexual penetration of the mouth, vagina, or rectum by, or with, an object.
    • (D) The transfer of semen onto the depicted individual.
    • (E) Sadomasochistic abuse involving the depicted individual.
  • (10) “Sexually explicit material” means any portion of an audiovisual work that shows the depicted individual performing in the nude or appearing to engage in, or being subjected to, sexual conduct.

(b) A depicted individual who has suffered harm resulting from the intentional disclosure of sexually explicit material without the depicted individual’s consent has a cause of action against a person who does either of the following:

  • (1) The person creates and discloses sexually explicit material in an audiovisual work and the person knew or reasonably should have known the depicted individual did not consent.
  • (2) The person discloses, but did not create, sexually explicit material in an audiovisual work and the person knew the depicted individual did not consent.

(c) (1) A person is not liable under this section if the person proves any of the following:

  • (A) The person disclosed the sexually explicit material in the course of reporting unlawful activity, in the course of a legal proceeding, or the person is a member of law enforcement and disclosed the sexually explicit material in the course of exercising the person’s law enforcement duties.
  • (B) The person disclosed the sexually explicit material in relation to a matter of legitimate public concern.
  • (C) The person disclosed the sexually explicit material in a work of political or newsworthy value, or similar work.
  • (D) The person disclosed the sexually explicit material for the purposes of commentary or criticism or the disclosure is otherwise protected by the California Constitution or the United States Constitution.

(c) (2) For purposes of this subdivision, sexually explicit material is not of newsworthy value because the depicted individual is a public figure.

(d) It shall not be a defense to an action under this section that there is a disclaimer included in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.

(e) (1) A prevailing plaintiff may recover any of the following:

  • (A) Economic and noneconomic damages proximately caused by the disclosure of the sexually explicit material, including damages for emotional distress.
  • (B) An amount equal to the monetary gain made by the defendant from the creation, development, or disclosure of the sexually explicit material, or the plaintiff may, at any time before the final judgment is rendered, recover instead an award of statutory damages for all unauthorized acts involved in the action, with respect to any one work, in a sum not less than five thousand dollars ($5,000) or more than five hundred thousand dollars ($500,000).
  • (C) Punitive damages.
  • (D) Reasonable attorney’s fees and costs.
  • (E) Any other available relief, including injunctive relief.

(e) (2) This act does not affect any right or remedy available under any other law.

(f) An action under this section shall be brought no later than three years from the date the unauthorized creation, development, or disclosure was discovered or should have been discovered with the exercise of reasonable diligence.

(g) (1) A plaintiff may proceed using a pseudonym, either John Doe, Jane Doe, or Doe, for the true name of the plaintiff and may exclude or redact from all pleadings and documents filed in the action other identifying characteristics of the plaintiff. A plaintiff who proceeds using a pseudonym and excluding or redacting identifying characteristics as provided in this section shall file with the court and serve upon the defendant a confidential information form for this purpose that includes the plaintiff’s name and other identifying characteristics excluded or redacted. The court shall keep the plaintiff’s name and excluded or redacted characteristics confidential.

(g) (2) In cases where a plaintiff proceeds using a pseudonym under this section, the following provisions shall apply:

  • (A) All other parties and their agents and attorneys shall use this pseudonym in all pleadings, discovery documents, and other documents filed or served in the action, and at hearings, trial, and other court proceedings that are open to the public.
  • (B)
    • (i) Any party filing a pleading, discovery document, or other document in the action shall exclude or redact any identifying characteristics of the plaintiff from the pleading, discovery document, or other document, except for a confidential information form filed pursuant to this subdivision.
    • (ii) A party excluding or redacting identifying characteristics as provided in this section shall file with the court and serve upon all other parties a confidential information form that includes the plaintiff’s name and other identifying characteristics excluded or redacted. The court shall keep the plaintiff’s name and excluded or redacted characteristics confidential.
  • (C) All court decisions, orders, petitions, discovery documents, and other documents shall be worded so as to protect the name or other identifying characteristics of the plaintiff from public revelation.

(g) (3) The following definitions apply to this subdivision:

  • (A) “Identifying characteristics” means name or any part thereof, address or any part thereof, city or unincorporated area of residence, age, marital status, relationship to defendant, and race or ethnic background, telephone number, email address, social media profiles, online identifiers, contact information, or any other information, including images of the plaintiff, from which the plaintiff’s identity can be discerned.
  • (B) “Online identifiers” means any personally identifying information or signifiers that would tie the plaintiff to a particular electronic service, device, or internet application, website, or platform account, including, but not limited to, access names, access codes, account names, aliases, avatars, credentials, gamer tags, display names, handles, login names, member names, online identities, pseudonyms, screen names, user accounts, user identifications, usernames, Uniform Resource Locators (URLs), domain names, Internet Protocol (IP) addresses, and media access control (MAC) addresses.

(g) (4) The responsibility for excluding or redacting the name or identifying characteristics of the plaintiff from all documents filed with the court rests solely with the parties and their attorneys. Nothing in this section requires the court to review pleadings or other papers for compliance with this provision.

(g) (5) Upon request of the plaintiff, the clerk shall allow access to the court file in an action filed under this section only as follows:

  • (A) To a party to the action, including a party’s attorney.
  • (B) To a person by order of the court on a showing of good cause for access.
  • (C) To any person 60 days after judgment is entered unless the court grants a plaintiff’s motion to seal records pursuant to Chapter 3 of Division 4 of Title 2 of the California Rules of Court.

(h) The provisions of this section are severable. If any provision of this section or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.[8][9]
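
As a small worked example of the statutory-damages band in subdivision (e)(1)(B) above, the Python sketch below clamps a hypothetical per-work award to the $5,000 to $500,000 range the statute allows. The function and the example figures are illustrative only, not legal advice.

  # Statutory damages band of AB-602, subdivision (e)(1)(B):
  # not less than $5,000 nor more than $500,000 with respect to any one work.
  STATUTORY_MIN = 5_000
  STATUTORY_MAX = 500_000

  def statutory_damages(requested: int) -> int:
      # Clamp a requested per-work award (in dollars) to the statutory band.
      return max(STATUTORY_MIN, min(requested, STATUTORY_MAX))

  # Hypothetical requests:
  print(statutory_damages(1_000))      # 5000   (raised to the statutory floor)
  print(statutory_damages(50_000))     # 50000  (within the band)
  print(statutory_damages(2_000_000))  # 500000 (capped at the statutory ceiling)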


Law on synthetic filth in New York

The w:New York State Legislature, in its regular session 2021-2022, is contemplating New York Senate bill S1641 and the identical Assembly bill A6517 to ban sending unsolicited pornography.

In the 2021-2022 w:New York State Senate regular sessions, on 2021-01-14 Senator w:James Skoufis (official website) sponsored, and Senators w:Brian Benjamin (official website) and w:Todd Kaminsky (official website) of the New York State Senate co-sponsored, New York Senate bill S1641, which would add section § 250.70 UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL to Article 250 of the penal law. On 2021-03-19 an identical New York Assembly bill A6517 - Establishes the crime of unlawful electronic transmission of sexually explicit visual material - was introduced to the w:New York State Assembly by Assembly Member w:Aileen Gunther (official website).[1st seen in 1]

If this bill passes, it will be codified in the w:Consolidated Laws of New York. View the Consolidated Laws of New York at nysenate.gov.

  • Title of bill: An act to amend the penal law, in relation to the creation of the criminal offense of unlawful electronic transmission of sexually explicit visual material
  • Purpose: The purpose of this bill is to make it unlawful to send sexually explicit material through electronic means unless the material is sent at the request of, or with the express consent of the recipient.
  • Summary of provisions: Adds a new section 250.70 to the penal law making it unlawful to knowingly transmit by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed unless the material is sent at the request of, or with the express consent of the recipient.
  • Justification: Currently under New York State law, indecent exposure in person is a crime, but it is not unlawful to send sexually explicit photos to nonconsenting adult recipients through electronic transmission. With the growing modern age of online dating, many individuals are receiving sexually explicit visual content without their consent from strangers. No person should be forced to view sexually explicit material without their consent.

The bill offers a clear deterrent to those considering sending unsolicited sexual pics and similar inappropriate conduct, and protects the unwilling recipients who currently have no legal recourse for such abuses.

What is illegal in the real world must be illegal in the digital world, and this legislation is a first step in the right direction in adding that accountability.

  • Legislative history:
    1. Senate - 2020 - S5949 Referred to Codes
    2. Assembly - 2020 - A7801 Referred to Codes
  • Fiscal implications: Minimal
  • Effective date: This act shall take effect on the first of November next succeeding the date on which it shall have become a law.

The text of the bill is, as of 2021-03-24, as follows:

"Section 1. The penal law is amended by adding a new section 250.70 to read as follows:
§ 250.70 UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL.
A person is guilty of unlawful electronic transmission of sexually explicit visual material if a person knowingly transmits by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed or depicts the covered genitals of a male person that are in a discernibly turgid state and such visual material is not sent at the request of or with the express consent of the recipient. For purposes of this section the term "intimate parts" means the naked genitals, pubic area, anus, or female postpubescent nipple of the person and the term "sexual conduct" shall have the same meaning as defined in section 130.00 (Sex offenses; definitions of terms) of this chapter. Unlawful electronic transmission of sexually explicit visual material is a class A misdemeanor.
§ 2. This act shall take effect on the first of November next succeeding the date on which it shall have become a law."

Law on synthetic filth in China

China passed a law requiring faked footage to be labeled as such

On January 1, 2020, a Chinese law requiring that synthetically faked footage bear a clear notice of its fakeness came into effect. Failure to comply could be considered a w:crime, the w:Cyberspace Administration of China (cac.gov.cn) stated on its website. China announced this new law in November 2019.[10] The Chinese government seems to be reserving the right to prosecute both users and w:online video platforms failing to abide by the rules.[11]


Law on synthetic filth in the UK

The UK needs to improve its legislation. Please sign the petition initiated by Helen Mort in late 2020.
w:Helen Mort is a British poet, novelist and activist against synthetic human-like fakes. Please sign her petition 'Tighten regulation on taking, making and faking explicit images' at Change.org, to be delivered to the w:Law Commission (England and Wales) and the prime minister.

The UK law does not seem very up-to-date on the issue of synthetic filth.

The independent w:Law Commission (England and Wales) is currently reviewing the law as it applies to taking, making and sharing intimate images without consent. The outcome of the consultation is due to be published later in 2021.[12]

"In 2019, law expert Dr Aislinn O’Connell told w:The Independent that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The w:Women and Equalities Committee called on the UK Government to introduce new legislation on image-based sexual abuse in order to criminalise ALL non-consensual creation and distribution of intimate sexual images."[13] This call is for similar laws as California put in place on January 1 2020.

The petition 'Tighten regulation on taking, making and faking explicit images' at Change.org by w:Helen Mort asks the UK government for proper legislation against synthetic filth. See the mediatheque for a video by Helen Mort on her ordeal of becoming the victim of covert disinformation attacks.


'My experience of harmful ‘deepfake’ images', a talk and poetry about w:Helen Mort's ordeal, uploaded to w:Vimeo by herself on 2020-12-18



Law proposals

Law proposals against synthetic filth by Juho Kunsola

w:Finland has very logical and very accessible laws, but here too the laws need updating for this age of industrial disinformation.

The existing law in Chapter 24 of the Finnish Criminal Code - "Offences against privacy, public peace and personal reputation" - seems to be ineffective against many synthetic human-like fake attacks, and it seems that digital sound-alikes could even be used to frame victims for crimes.

The portions affected by or affecting the synthetic filth situation are shown in bold font:

  • Section 1 - Invasion of domestic premises (879/2013)
  • Section 1(a) - Harassing communications (879/2013)
  • Section 2 - Aggravated invasion of domestic premises (531/2000)
  • Section 3 - Invasion of public premises (585/2005)
  • Section 4 - Aggravated invasion of public premises (531/2000)
  • Section 5 - Eavesdropping (531/2000)
  • Section 6 - Illicit observation (531/2000)
  • Section 7 - Preparation of eavesdropping or illicit observation (531/2000)
  • Section 8 - Dissemination of information violating personal privacy (879/2013)
  • Section 8(a) - Aggravated dissemination of information violating personal privacy (879/2013)
  • Section 9 - Defamation (879/2013)
  • Section 10 - Aggravated defamation (879/2013)
  • Section 11 - Definition (531/2000)
  • Section 12 - Right to bring charges (879/2013)
  • Section 13 - Corporate criminal liability (511/2011)
Subtraction of the diffuse reflection from the specular reflection; the image is scaled for luminosity. The diffuse reflection is acquired by placing the polarizers at a 90-degree angle and the specular at a 0-degree angle.

Original picture by Debevec et al. - Copyright ACM 2000 - https://dl.acm.org/citation.cfm?doid=311779.344855
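
The caption describes polarization-difference imaging: a cross-polarized (90-degree) capture passes mostly the diffuse reflection, a parallel (0-degree) capture contains diffuse plus specular, and subtracting the former from the latter isolates the specular component. Below is a minimal NumPy sketch of that subtraction; the tiny example arrays stand in for real captures and are purely hypothetical.

  import numpy as np

  # Cross-polarized capture (polarizers at 90 degrees): mostly diffuse.
  diffuse = np.array([[0.20, 0.30],
                      [0.25, 0.40]])
  # Parallel-polarized capture (0 degrees): diffuse plus specular.
  parallel = np.array([[0.55, 0.35],
                       [0.70, 0.45]])

  # The specular component is the difference, clipped to non-negative.
  specular = np.clip(parallel - diffuse, 0.0, None)

  # Scale for display, as in the caption's "scaled for luminosity".
  scaled = specular / specular.max() if specular.max() > 0 else specular
  print(scaled)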

Law proposal to ban visual synthetic filth

§1 Models of human appearance

A model of human appearance means

§2 Producing synthetic pornography

Making projections, still or videographic, in which targets are portrayed nude or in a sexual situation, from models of human appearance defined in §1, without the express consent of the targets, is illegal.

§3 Distributing synthetic pornography

Distributing, making available, public display, purchase, sale, yielding, import and export of non-authorized synthetic pornography defined in §2 are punishable.[footnote 1]

§4 Aggravated producing and distributing synthetic pornography

If the media described in §2 or §3 is made or distributed with the intent to frame for a crime or for blackmail, the crime should be judged as aggravated.

Afterword

The original idea I had was to ban both the raw materials, i.e. the models used to make visual synthetic filth, and the end product. But in July 2019 it appeared to me that Adequate Porn Watcher AI (concept) could really help in this age of industrial disinformation if it were built and trained, and banning the modeling of human appearance was in conflict with that original plan.

One would assume that collecting permission to model each piece of porn is not plausible, so the question is whether we can ban covert modeling from non-pornographic pictures while still retaining the ability to model all porn found on the Internet.

If we want to pursue banning the modeling of people's appearance from non-pornographic images and videos without explicit permission, the ban must be formulated so that it does not make Adequate Porn Watcher AI (concept) illegal or impossible. This would seem to lead to a weird situation where modeling a human from non-pornographic media would be illegal, but modeling from pornography legal.

Law proposal to ban unauthorized modeling of human voice

§1 Unauthorized modeling of a human voice

Acquiring a model of a human voice that deceptively resembles the voice of some dead or living person, as well as the possession, purchase, sale, yielding, import and export of such a model without the express consent of the target, are punishable.

§2 Application of unauthorized voice models

Producing and making available media from covert voice models defined in §1 is punishable.

§3 Aggravated application of unauthorized voice models

If the produced media is made for the purpose of

  • framing a human target or targets for crimes,
  • attempting extortion, or
  • defaming the target,

the crime should be judged as aggravated.


Resources and reporting on law

AI and law in general

Reviews and regulation from the w:Library of Congress:

w:Gibson Dunn & Crutcher (gibsondunn.com) publishes a quarterly legal update on 'Artificial Intelligence and Autonomous Systems'. Gibson Dunn & Crutcher is a global w:law firm, founded in Los Angeles in 1890.


From Europe

Synthetic filth in the law and media

The countries that have unfortunately banned the full face veil

“There are currently 16 nations that have banned the burqa (not to be confused with the hijab), including w:Tunisia,[14] w:Austria, w:Denmark, w:France, w:Belgium, w:Tajikistan, w:Latvia,[15] w:Bulgaria,[16] w:Cameroon, w:Chad, w:Congo-Brazzaville, w:Gabon, w:Netherlands,[17] w:China,[18] w:Morocco, and w:Switzerland.”

~ Wikipedia on w:Hijab by country as of 2021-03-13


Taking into consideration these times of industrial disinformation, it is vicious and uncivilized to have laws banning the wearing of the full face veil in public.

Quotes on the current laws and their application

“If no-one who wants to hurt you knows what you look like, how could someone malevolent make a covert digital look-alike of you?”

~ Juho Kunsola on The fine last line of defense with stopping power, i.e. Why banning the burka, niqāb and other forms of full facial veil is not a wise nor civilized move.




Footnotes

  1. People who are found in possession of this synthetic pornography should probably not be penalized, but rather advised to get some help.

1st seen in

References

  1. "New state laws go into effect July 1".
  2. "§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty". w:Virginia. Retrieved 2021-01-23.
  3. "Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election". w:Texas. 2019-06-14. Retrieved 2021-01-23. In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality
  4. https://capitol.texas.gov/BillLookup/History.aspx?LegSess=86R&Bill=SB751
  5. Johnson, R.J. (2019-12-30). "Here Are the New California Laws Going Into Effect in 2020". KFI. iHeartMedia. Retrieved 2021-01-23.
  6. "AB 602 - California Assembly Bill 2019-2020 Regular Session - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action". openstates.org. openstates.org. Retrieved 2021-03-24.
  7. Mihalcik, Carrie (2019-10-04). "California laws seek to crack down on deepfakes in politics and porn". w:cnet.com. w:CNET. Retrieved 2021-01-23.
  8. Berman, Marc; Leyva, Connie (2019), "SB-564 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action.", w:California
  9. Berman, Marc; Leyva, Connie (2019), "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action.", w:California
  10. "China seeks to root out fake news and deepfakes with new online content rules". w:Reuters.com. w:Reuters. 2019-11-29. Retrieved 2021-01-23.
  11. Statt, Nick (2019-11-29). "China makes it a criminal offense to publish deepfakes or fake news without disclosure". w:The Verge. Retrieved 2021-01-23.
  12. Royle, Sara (2021-01-05). "'Deepfake porn images still give me nightmares'". w:BBC Online. w:BBC. Retrieved 2021-01-31. She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.
  13. Mort, Helen (2020). "Change.org petition: 'Tighten regulation on taking, making and faking explicit images'". w:Change.org. w:Change.org. Retrieved 2021-01-31. Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.
  14. "Tunisian PM bans wearing of niqab in public institutions". Reuters. 5 July 2019. Retrieved 2021-03-13.
  15. "A European government has banned Islamic face veils despite them being worn by just three women". 21 April 2016. Retrieved 2021-03-13.
  16. "Bulgaria the latest European country to ban the burqa and niqab in public places". Smh.com.au. Accessed 5 December 2016.
  17. Halasz, Stephanie; McKenzie, Sheena (27 June 2018). "The Netherlands introduces burqa ban in some public spaces". CNN. Retrieved 2021-03-13.
  18. Phillips, Tom (13 January 2015). "China bans burqa in capital of Muslim region of Xinjiang". The Telegraph. Retrieved 2021-03-13.