Laws against synthesis and other related crimes
This article collects current laws against abusive uses of synthetic human-like fakes, information on what kinds of laws are being prepared, and two SSFWIKI original law proposals: one against digital look-alikes and one against digital sound-alikes.
New bills are currently in the works in several jurisdictions:
- China appears to be planning to ban all synthetic pornography, regardless of how consensual its making was
- The New York State Senate is considering banning the sending of unsolicited nudes in its 2021-2022 regular session
- The European Union is preparing a law package to regulate AI
- Canada's House of Commons is considering banning all pornographic content for which there is no proof of age and written consent from everyone visible in the pornographic recording.
- The UK's Online Safety Bill has been making progress and, if passed, would criminalize non-consensual synthetic pornography.
Information elsewhere (recommended)
- A Look at Global Deepfake Regulation Approaches at responsible.ai
- The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology at legaljournal.princeton.edu
Canada
Active bills
Stopping Internet Sexual Exploitation Act - House of Commons of Canada bill C-270
- House of Commons of Canada bill C-270 'Stopping Internet Sexual Exploitation Act' at parl.ca, an act to amend the Criminal Code regarding pornography was first read to the Commons by w:Arnold Viersen on Thursday 2022-04-28 at 10:15.
According to townandcountrytoday.com, the author of the bill had introduced an identical bill, C-302, on Thursday 2021-05-27, but that bill died when federal elections were called.
Stopping Internet Sexual Exploitation Act - An Act to amend the Criminal Code (pornographic material) is a private member's bill by Arnold Viersen (Official Site at arnoldviersen.ca)
Summary of the C-270 from parl.ca
"This enactment amends the Criminal Code to prohibit a person from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years of age or older and gave their express consent to their image being depicted."[1]
Sommaire en français / Summary in French
Le texte modifie le Code criminel afin d’interdire à toute personne de produire ou de distribuer du matériel pornographique à des fins commerciales, ou d’en faire la publicité, sans s’être au préalable assurée qu’au moment de la production du matériel, chaque personne dont l’image y est représentée était âgée de dix-huit ans ou plus et avait donné son consentement exprès à ce que son image y soit représentée.[2]
Links
- C-270 - Stopping Internet Sexual Exploitation Act in English et en français at publications.gc.ca
- Parliament of Canada LEGISinfo: C-270 - Stopping Internet Sexual Exploitation Act at parl.ca in English
- C-270 - Stopping Internet Sexual Exploitation Act at openparliament.ca includes motivation of Mr. Arnold Viersen and the co-sponsor of the bill Mr. Garnett Genuis.
Reporting
China
Law against synthesis crimes in China 2020
On January 1, 2020, a Chinese law came into effect requiring that synthetically faked footage bear a clear notice of its fakeness. Failure to comply could be considered a w:crime, the w:Cyberspace Administration of China (cac.gov.cn) stated on its website. China announced this new law in November 2019.[3] The Chinese government appears to reserve the right to prosecute both users and w:online video platforms that fail to abide by the rules.[4]
Draft bill against synthesis crimes China 2022
The w:Cyberspace Administration of China published a new draft bill in 2022, titled Provisions on the Administration of Deep Synthesis Internet Information Services (Draft for solicitation of comments), available in English translation at chinalawtranslate.com; the Chinese-language draft 国家互联网信息办公室关于《互联网信息服务深度合成管理规定(征求意见稿)》公开征求意见的通知 is at cac.gov.cn[1st seen in 1]
EU
EU Law on AI 20??
The European Union is planning a law on AI
- Read Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS at eur-lex.europa.eu[1st seen in 2] (also contains translations)
- https://artificialintelligenceact.eu/ is a website on the planned law by the Future of Life Institute, an American non-profit NGO.
- The Digital Services Act package at digital-strategy.ec.europa.eu - the w:Digital Services Act (DSA) came into force in November 2022.[5]
- The 2022 Code of Practice on Disinformation at digital-strategy.ec.europa.eu - Major online platforms, emerging and specialised platforms, players in the advertising industry, fact-checkers, research and civil society organisations delivered a strengthened Code of Practice on Disinformation following the Commission’s Guidance of May 2021.[6]
- Tackling deepfakes in European policy (.pdf) at europarl.europa.eu, a 2021 study by the Panel for the Future of Science and Technology and published by the w:European Parliamentary Research Service.
Finland
Laws in Finland
Law on sexual offences in Finland 2023
Law on sexual offences in Finland 2023 is found in Chapter 20 of the Finnish Criminal Code titled "Seksuaalirikoksista" ("Sexual offences") and came into effect on Sunday 2023-01-01.[7]
The new law in Finland protects adults against sexual image-based abuse, be it real or synthetic in origin.
Other countries have also woken up to the problems of synthesis crime and have enacted laws against synthesis and other related crimes.
Relevant sections of Chapter 20
- 7 § Non-consensual dissemination of a sexual image criminalizes the distribution of real and synthetic sexual images without the permission of the person depicted. (7 § Seksuaalisen kuvan luvaton levittäminen[7])
- 19 § Distribution of an image depicting a child in a sexual manner [7] criminalizes the distribution of real and synthetic child sexual abuse material (CSAM). Attempting this crime is also punishable. (19 § Lasta seksuaalisesti esittävän kuvan levittäminen[7])
- 20 § Aggravated distribution of an image depicting a child in a sexual manner [7] defines the parameters for aggravated form of the crime of making CSAM available. (20 § Törkeä lasta seksuaalisesti esittävän kuvan levittäminen[7])
- 21 § Possession of an image depicting a child in a sexual manner[7] criminalizes the possession of CSAM and the acquiring of access to CSAM with intent to view it. (21 § Lasta seksuaalisesti esittävän kuvan hallussapito[7])
This 2023 upgrade and consolidation of the Finnish Criminal Code's provisions on sexual offences was made upon the initiative of the 2019-2023 w:Marin Cabinet and was voted into law by the w:Members of the Parliament of Finland, 2019–2023, coming into effect on Sunday 2023-01-01.
Translation to English by the Ministry of Justice: Criminal Code (39/1889) - Chapter 20 - Sexual offences (translation) as .pdf at oikeusministerio.fi (subject to possible revisions)
Finland criminalized synthetic CSAM in 2011
Distribution, attempted distribution, and possession of synthetic CSAM were already criminalized earlier, on 2011-06-01, upon the initiative of the w:Vanhanen II Cabinet. Real CSAM was criminalized before this improvement. In the 2023 sexual offences legislation improvement, these protections against real and synthetic CSAM were moved within the criminal code to 19 §, 20 § and 21 § of Chapter 20.
South Africa
Cybercrimes Act 19 of 2020
South Africa's Cybercrimes Act 19 of 2020 (English / Afrikaans) at gov.za[1st seen in 3] came only partially into effect on Wednesday 2021-12-01.[8]
Links
- Cybercrimes Act at cybercrimesact.co.za by Accessible Law contains the law in website format
Reporting
Singapore
Law in Singapore
Protection from Online Falsehoods and Manipulation Act 2019
w:Protection from Online Falsehoods and Manipulation Act 2019[1st seen in 4] is a w:statute of the w:Parliament of Singapore that enables authorities to tackle the spread of w:fake news or w:false information. (Wikipedia)
UK
The Domestic Abuse Act 2021 Chapter 17, part 6 - Disclosure of private sexual photographs and films
w:Domestic Abuse Act 2021 / Chapter 17 / Part 6 - Offences involving abusive or violent behaviour / Disclosure of private sexual photographs and films - Threats to disclose private sexual photographs and films with intent to cause distress
According to the article 'What to do if someone is threatening to share your intimate images' by the UK-based Revenge Porn Helpline at revengepornhelpline.org.uk, threatening to share intimate images with intent to cause distress is now an offence in UK law. This is included within the w:Domestic Abuse Act 2021, which was enacted into UK law on 29 June 2021.[9]
It is not yet known whether the loophole has been fixed whereby, if the pictures are not actual pictures of the victim but synthetic human-like fakes, the police cannot do anything.
Links
Online Safety Bill
The UK's HL Bill 151 - Online Safety Bill at bills.parliament.uk has been making progress and, if passed, would criminalize non-consensual synthetic pornography. The bill originated in the House of Commons sessions 2021-22 and 2022-23.
- A guide to the Online Safety Bill at gov.uk
- Documents, publications and announcements relating to the government's Online Safety Bill at gov.uk
Law against synthesis crimes in the UK
The UK law does not seem very up-to-date on the issue of synthetic filth.
The independent w:Law Commission (England and Wales) is currently reviewing the law as it applies to taking, making and sharing intimate images without consent. The outcome of the consultation is due to be published later in 2021.[10]
"In 2019, law expert Dr Aislinn O’Connell told w:The Independent that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The w:Women and Equalities Committee called on the UK Government to introduce new legislation on image-based sexual abuse in order to criminalise ALL non-consensual creation and distribution of intimate sexual images."[11] This call is for laws similar to those California put in place on January 1, 2020.
The petition 'Tighten regulation on taking, making and faking explicit images' at Change.org by w:Helen Mort aims to petition the UK government for proper legislation against synthetic filth. See the mediatheque for a video by Helen Mort on her ordeal of becoming the victim of covert disinformation attacks.
USA
- The Legislative Process: Introduction and Referral of Bills (Video) at congress.gov
- Glossary of Legislative Terms at congress.gov
Law against synthesis crimes in Virginia 2019
Code of Virginia § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty
Since July 1, 2019,[12] w:Virginia has criminalized the sale and dissemination of unauthorized synthetic pornography, but not its manufacture,[13] as section § 18.2-386.2, titled 'Unlawful dissemination or sale of images of another; penalty', became part of the w:Code of Virginia.
Code of Virginia (TOC) » Title 18.2. Crimes and Offenses Generally » Chapter 8. Crimes Involving Morals and Decency » Article 5. Obscenity and Related Offenses » Section § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty
Section § 18.2-386.2, 'Unlawful dissemination or sale of images of another; penalty', of the Code of Virginia reads as follows:
A. Any w:person who, with the w:intent to w:coerce, w:harass, or w:intimidate, w:maliciously w:disseminates or w:sells any videographic or still image created by any means whatsoever that w:depicts another person who is totally w:nude, or in a state of undress so as to expose the w:genitals, pubic area, w:buttocks, or female w:breast, where such person knows or has reason to know that he is not w:licensed or w:authorized to disseminate or sell such w:videographic or w:still image is w:guilty of a Class 1 w:misdemeanor.
- For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's w:face, w:likeness, or other distinguishing characteristic.
B. If a person uses w:services of an w:Internet service provider, an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.
C. Venue for a prosecution under this section may lie in the w:jurisdiction where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.
D. The provisions of this section shall not preclude prosecution under any other w:statute.[13]
House Bill 2678 was presented by w:Delegate w:Marcus Simon to the w:Virginia House of Delegates on January 14, 2019; three days later the identical Senate Bill 1736 was introduced to the w:Senate of Virginia by Senator w:Adam Ebbin.
Law against synthesis crimes in Texas 2019
Texas SB 751 - Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election
On September 1, 2019, w:Texas Senate bill SB 751 - 'Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election' - came into effect as w:amendments to the election code in the w:Law of Texas, giving w:candidates in w:elections a 30-day protection period before the election during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality".[14] SB 751 was introduced to the Senate by w:Bryan Hughes (politician).[15]
The text of S.B. No. 751 is as follows:
AN ACT relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election.
BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF TEXAS:
SECTION 1. Section 255.004, Election Code, is w:amended by adding Subsections (d) and (e) to read as follows:
- (d) A person commits an offense if the person, with intent to injure a candidate or influence the result of an election:
- (1) creates a deep fake video; and
- (2) causes the deep fake video to be published or distributed within 30 days of an election.
- (e) In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality.
SECTION 2. This Act takes effect September 1, 2019.
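The 30-day protection period in Subsection (d) above can be sketched as a simple date check. This is an illustrative sketch only, not legal advice or logic taken from the statute; the function name and the inclusive treatment of election day itself are assumptions.

```python
from datetime import date

def within_protection_period(publication: date, election: date) -> bool:
    """True if publication falls on election day or in the 30 days before it."""
    # Days from publication to election day; negative means published after the election.
    delta = (election - publication).days
    # Inclusive bounds are an assumption; the statute says "within 30 days of an election".
    return 0 <= delta <= 30

# A video published 19 days before a 2020-11-03 election falls inside the window.
print(within_protection_period(date(2020, 10, 15), date(2020, 11, 3)))  # True
```

Note that the window alone is not the offense: the statute also requires intent to injure a candidate or influence the result of an election.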
Law against synthesis crimes in California 2020
California AB-602 - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action
On January 1, 2020,[16] the w:California w:US state law "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action." came into effect in the civil code of the w:California Codes, banning the manufacture and w:digital distribution of synthetic pornography without the w:consent of the people depicted. AB-602 provides victims of synthetic pornography with w:injunctive relief and poses legal threats of w:statutory and w:punitive damages on w:criminals making or distributing synthetic pornography without consent. The bill was signed into law by California w:Governor w:Gavin Newsom on October 3, 2019. It was authored by w:California State Assemblymember w:Marc Berman, and an identical Senate bill was coauthored by w:California Senator w:Connie Leyva.[17][18] AB602 at trackbill.com
Introduction by Assemblymember Marc Berman:
AB 602, Berman. Depiction of individual using digital or electronic technology: sexually explicit material: cause of action.
Existing law creates a private w:right of action against a person who intentionally distributes a photograph or recorded image of another that exposes the intimate body parts of that person or of a person engaged in a sexual act without the person’s consent if specified conditions are met.
This bill would provide that a depicted individual, as defined, has a w:cause of action against a person who either
- (1) creates and intentionally discloses sexually explicit material if the person knows or reasonably should have known the depicted individual did not w:consent to its creation or disclosure, or
- (2) intentionally discloses sexually explicit material that the person did not create if the person knows the depicted individual did not consent to its creation.
The bill would specify exceptions to those provisions, including if the material is a matter of legitimate public concern or a work of political or newsworthy value.
The bill would authorize a prevailing w:plaintiff who suffers harm to seek w:injunctive relief and recover reasonable w:attorney’s fees and costs as well as specified monetary w:damages, including statutory and w:punitive damages.
The law is as follows:
SECTION 1. Section 1708.86 is added to the Civil Code of California, to read:
1708.86. (a) For purposes of this section:
- (1) “Altered depiction” means a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section.
- (2) “Authorized Representative” means an attorney, talent agent, or personal manager authorized to represent a depicted individual if the depicted individual is represented.
- (3) (A) “Consent” means an agreement written in plain language signed knowingly and voluntarily by the depicted individual that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated.
- (3) (B) A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made, unless one of the following requirements is satisfied:
- (i) The depicted individual is given at least 72 hours to review the terms of the agreement before signing it.
- (ii) The depicted individual’s authorized representative provides written approval of the signed agreement.
- (4) “Depicted individual” means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction.
- (5) “Despicable conduct” means conduct that is so vile, base, or contemptible that it would be looked down on and despised by a reasonable person.
- (6) “Digitization” means to realistically depict any of the following:
- (A) The nude body parts of another human being as the nude body parts of the depicted individual.
- (B) Computer-generated nude body parts as the nude body parts of the depicted individual.
- (C) The depicted individual engaging in sexual conduct in which the depicted individual did not engage.
- (7) “Disclose” means to publish, make available, or distribute to the public.
- (8) “Individual” means a natural person.
- (9) “Malice” means that the defendant acted with intent to cause harm to the plaintiff or despicable conduct that was done with a willful and knowing disregard of the rights of the plaintiff. A person acts with knowing disregard within the meaning of this paragraph when they are aware of the probable harmful consequences of their conduct and deliberately fail to avoid those consequences.
- (10) “Nude” means visible genitals, pubic area, anus, or a female’s postpubescent nipple or areola.
- (11) “Person” means a human being or legal entity.
- (12) “Plaintiff” includes cross-plaintiff.
- (13) “Sexual conduct” means any of the following:
- (A) Masturbation.
- (B) Sexual intercourse, including genital, oral, or anal, whether between persons regardless of sex or gender or between humans and animals.
- (C) Sexual penetration of the vagina or rectum by, or with, an object.
- (D) The transfer of semen by means of sexual conduct from the penis directly onto the depicted individual as a result of ejaculation.
- (E) Sadomasochistic abuse involving the depicted individual.
(14) “Sexually explicit material” means any portion of an audiovisual work that shows the depicted individual performing in the nude or appearing to engage in, or being subjected to, sexual conduct.
(b) A depicted individual has a cause of action against a person who does either of the following:
- (1) Creates and intentionally discloses sexually explicit material and the person knows or reasonably should have known the depicted individual in that material did not consent to its creation or disclosure.
- (2) Intentionally discloses sexually explicit material that the person did not create and the person knows the depicted individual in that material did not consent to the creation of the sexually explicit material.
(c) (1) A person is not liable under this section in either of the following circumstances:
- (A) The person discloses the sexually explicit material in the course of any of the following:
- (i) Reporting unlawful activity.
- (ii) Exercising the person’s law enforcement duties.
- (iii) Hearings, trials, or other legal proceedings.
- (B) The material is any of the following:
- (i) A matter of legitimate public concern.
- (ii) A work of political or newsworthy value or similar work.
- (iii) Commentary, criticism, or disclosure that is otherwise protected by the California Constitution or the United States Constitution.
- (2) For purposes of this subdivision, sexually explicit material is not of newsworthy value solely because the depicted individual is a public figure.
(d) It shall not be a defense to an action under this section that there is a disclaimer included in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.
(e) (1) A prevailing plaintiff who suffers harm as a result of the violation of subdivision (b) may recover any of the following:
- (A) An amount equal to the monetary gain made by the defendant from the creation, development, or disclosure of the sexually explicit material.
- (B) One of the following:
- (i) Economic and noneconomic damages proximately caused by the disclosure of the sexually explicit material, including damages for emotional distress.
- (ii) Upon request of the plaintiff at any time before the final judgment is rendered, the plaintiff may instead recover an award of statutory damages for all unauthorized acts involved in the action, with respect to any one work, as follows:
- (I) A sum of not less than one thousand five hundred dollars ($1,500) but not more than thirty thousand dollars ($30,000).
- (II) If the unlawful act was committed with malice, the award of statutory damages may be increased to a maximum of one hundred fifty thousand dollars ($150,000).
- (C) Punitive damages.
- (D) Reasonable attorney’s fees and costs.
- (E) Any other available relief, including injunctive relief.
(2) The remedies provided by this section are cumulative and shall not be construed as restricting a remedy that is available under any other law.
(f) An action under this section shall be commenced no later than three years from the date the unauthorized creation, development, or disclosure was discovered or should have been discovered with the exercise of reasonable diligence.
(g) The provisions of this section are severable. If any provision of this section or its application is held invalid, that invalidity shall not affect other provisions.
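The statutory damages option in subdivision (e)(1)(B)(ii) above sets a per-work range of $1,500 to $30,000, raised to a $150,000 maximum when malice is shown. A minimal sketch of that clamping arithmetic, assuming a hypothetical function name and inputs (this is illustrative only; an actual award is set by the court):

```python
def statutory_damages(requested: int, malice: bool = False) -> int:
    """Clamp a requested per-work award to the range in Cal. Civ. Code 1708.86(e)."""
    floor = 1_500                              # minimum per work
    ceiling = 150_000 if malice else 30_000    # malice raises the maximum
    return min(max(requested, floor), ceiling)

print(statutory_damages(50_000))               # 30000 - capped without malice
print(statutory_damages(50_000, malice=True))  # 50000 - allowed when malice is shown
```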
Law against synthesis crimes in Georgia in 2021
Georgia Code § 16-11-90
Georgia Code / Title 16. Crimes and Offenses Chapter 11 "Offenses Against Public Order and Safety", Article 3 "Invasions of Privacy", Part 3 "Invasion of privacy" GA CODE § 16-11-90 [1st seen in 5]
The law, as of April 14, 2021,[20] reads as follows:
(a) As used in this Code section, the term:
- (1) “Harassment” means engaging in conduct directed at a depicted person that is intended to cause substantial emotional harm to the depicted person.
- (2) “Nudity” means:
- (A) The showing of the human male or female genitals, pubic area, or buttocks without any covering or with less than a full opaque covering;
- (B) The showing of the female breasts without any covering or with less than a full opaque covering; or
- (C) The depiction of covered male genitals in a discernibly turgid state.
- (3) “Sexually explicit conduct” shall have the same meaning as set forth in Code Section 16-12-100.
(b) A person violates this Code section if he or she, knowing the content of a transmission or post, knowingly and without the consent of the depicted person:
- (1) Electronically transmits or posts, in one or more transmissions or posts, a photograph or video which depicts nudity or sexually explicit conduct of an adult, including a falsely created videographic or still image, when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person; or
- (2) Causes the electronic transmission or posting, in one or more transmissions or posts, of a photograph or video which depicts nudity or sexually explicit conduct of an adult, including a falsely created videographic or still image, when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person.
- Nothing in this Code section shall be construed to impose liability on an interactive computer service, as such term is defined in 47 U.S.C. 230(f)(2), or an information service or telecommunications service, as such terms are defined in 47 U.S.C. 153, for content provided by another person.
(c) Any person who violates this Code section shall be guilty of a misdemeanor of a high and aggravated nature; provided, however, that upon a second or subsequent violation of this Code section, he or she shall be guilty of a felony and, upon conviction thereof, shall be punished by imprisonment of not less than one nor more than five years, a fine of not more than $100,000.00, or both.
(d) A person shall be subject to prosecution in this state pursuant to Code Section 17-2-1 for any conduct made unlawful by this Code section which the person engages in while:
- (1) Either within or outside of this state if, by such conduct, the person commits a violation of this Code section which involves an individual who resides in this state; or
- (2) Within this state if, by such conduct, the person commits a violation of this Code section which involves an individual who resides within or outside this state.
(e) The provisions of subsection (b) of this Code section shall not apply to:
- (1) The activities of law enforcement and prosecution agencies in the investigation and prosecution of criminal offenses;
- (2) Legitimate medical, scientific, or educational activities;
- (3) Any person who transmits or posts a photograph or video depicting only himself or herself engaged in nudity or sexually explicit conduct;
- (4) The transmission or posting of a photograph or video that was originally made for commercial purposes;
- (5) Any person who transmits or posts a photograph or video depicting a person voluntarily engaged in nudity or sexually explicit conduct in a public setting; or
- (6) A transmission that is made pursuant to or in anticipation of a civil action.
(f) There shall be a rebuttable presumption that an information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet, for content provided by another person, does not know the content of an electronic transmission or post.
(g) Any violation of this Code section shall constitute a separate offense and shall not merge with any other crimes set forth in this title.[20]
Law against synthesis crimes in New York State in 2021
New York State - CHAPTER 6 Civil Rights ARTICLE 5 Right of Privacy SECTION 52-C - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual
Consolidated Laws of New York / CHAPTER 6 Civil Rights ARTICLE 5 Right of Privacy / SECTION 52-C - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual[1st seen in 5]
The law is as follows:
§ 52-c - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual. (Note: there are two sections numbered § 52-c.)
1. For the purposes of this section:
- a. "depicted individual" means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section.
- b. "digitization" means to realistically depict the nude body parts of another human being as the nude body parts of the depicted individual, computer-generated nude body parts as the nude body parts of the depicted individual or the depicted individual engaging in sexual conduct, as defined in subdivision ten of section 130.00 of the penal law, in which the depicted individual did not engage.
- c. "individual" means a natural person.
- d. "person" means a human being or legal entity.
- e. "sexually explicit material" means any portion of an audio visual work that shows the depicted individual performing in the nude, meaning with an unclothed or exposed intimate part, as defined in section 245.15 of the penal law, or appearing to engage in, or being subjected to, sexual conduct, as defined in subdivision ten of section 130.00 of the penal law.
2. a. A depicted individual shall have a cause of action against a person who, discloses, disseminates or publishes sexually explicit material related to the depicted individual, and the person knows or reasonably should have known the depicted individual in that material did not consent to its creation, disclosure, dissemination, or publication.
- b. It shall not be a defense to an action under this section that there is a disclaimer in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.
3. a. A depicted individual may only consent to the creation, disclosure, dissemination, or publication of sexually explicit material by knowingly and voluntarily signing an agreement written in plain language that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated.
- b. A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made, unless one of the following requirements is satisfied:
- i. the depicted individual is given at least three business days to review the terms of the agreement before signing it; or
- ii. if the depicted individual is represented, the attorney, talent agent, or personal manager authorized to represent the depicted individual provides additional written approval of the signed agreement.
4. a. A person is not liable under this section if:
- i. the person discloses, disseminates or publishes the sexually explicit material in the course of reporting unlawful activity, exercising the person's law enforcement duties, or hearings, trials or other legal proceedings; or
- ii. the sexually explicit material is a matter of legitimate public concern, a work of political or newsworthy value or similar work, or commentary, criticism or disclosure that is otherwise protected by the constitution of this state or the United States; provided that sexually explicit material shall not be considered of newsworthy value solely because the depicted individual is a public figure.
5. In any action commenced pursuant to this section, the finder of fact, in its discretion, may award injunctive relief, punitive damages, compensatory damages, and reasonable court costs and attorney's fees.
6. A cause of action or special proceeding under this section shall be commenced the later of either:
- a. three years after the dissemination or publication of sexually explicit material; or
- b. one year from the date a person discovers, or reasonably should have discovered, the dissemination or publication of such sexually explicit material.
7. Nothing in this section shall be read to require a prior criminal complaint, prosecution or conviction to establish the elements of the cause of action provided for in this section.
8. The provisions of this section including the remedies are in addition to, and shall not supersede, any other rights or remedies available in law or equity.
9. If any provision of this section or its application to any person or circumstance is held invalid, the invalidity shall not affect other provisions or applications of this section which can be given effect without the invalid provision or application, and to this end the provisions of this section are severable.
10. Nothing in this section shall be construed to limit, or to enlarge, the protections that 47 U.S.C. § 230 confers on an interactive computer service for content provided by another information content provider, as such terms are defined in 47 U.S.C. § 230.[21]
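The limitation period in item 6 of the statute quoted above picks the later of two deadlines: three years from publication, or one year from discovery. As a minimal illustration only (not legal advice; the function names and the naive year arithmetic are this sketch's own assumptions, and real limitation periods turn on legal details this ignores), the "later of" computation can be sketched as:

```python
from datetime import date

def add_years(d: date, n: int) -> date:
    """Add n calendar years to a date, clamping Feb 29 to Feb 28."""
    try:
        return d.replace(year=d.year + n)
    except ValueError:  # source date was Feb 29 in a leap year
        return d.replace(year=d.year + n, day=28)

def limitations_deadline(published: date, discovered: date) -> date:
    """Later of: 3 years from publication, or 1 year from discovery."""
    return max(add_years(published, 3), add_years(discovered, 1))

# Example: late discovery extends the window past the 3-year mark
print(limitations_deadline(date(2022, 5, 1), date(2024, 12, 1)))  # 2025-12-01
```

Note how a late discovery date (item 6.b) can extend the window beyond the flat three-year period of item 6.a, which is the point of the "later of" wording.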
Current bills in the USA
NY Senate bill S5583 in the 2023-2024 regular session
NY Senate bill S5583 in the 2023-2024 regular session at nysenate.gov would establish the crime of aggravated harassment by means of electronic or digital communication and provide for a private right of action for the unlawful dissemination or publication of deep fakes.
Past bills in the USA
US Senate bill S.4991 - Preventing Rampant Online Technological Exploitation and Criminal Trafficking Act of 2022 - (117th Congress)
Senate bill S.4991 - 'PROTECT Act' at congress.gov, introduced by Senator w:Mike Lee, was read twice on Wednesday 2022-09-28.
New York Senate bill - Unlawful Electronic Transmission of Sexually Explicit Visual Material - in regular session 2021-2022
The bill 'Unlawful Electronic Transmission of Sexually Explicit Visual Material' is essentially a bill that aims to ban the sending of unsolicited nudes.
In the 2021-2022 w:New York State Senate regular sessions, on 2021-01-14 Senator w:James Skoufis (official website) sponsored, and Senators w:Brian Benjamin (official website) and w:Todd Kaminsky (official website) of the New York State Senate co-sponsored, New York Senate bill S1641 to add section § 250.70 UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL to Article 250 of the penal law. On 2021-03-19 an identical New York Assembly bill A6517 - Establishes the crime of unlawful electronic transmission of sexually explicit visual material - was introduced to the w:New York State Assembly by Assembly Member w:Aileen Gunther (official website).[1st seen in 6]
If this bill passes, it will be codified in the w:Consolidated Laws of New York. View the Consolidated Laws of New York at nysenate.gov.
- Title of bill: An act to amend the penal law, in relation to the creation of the criminal offense of unlawful electronic transmission of sexually explicit visual material
- Purpose: The purpose of this bill is to make it unlawful to send sexually explicit material through electronic means unless the material is sent at the request of, or with the express consent of the recipient.
- Summary of provisions: Adds a new section 250.70 to the penal law making it unlawful to knowingly transmit by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed unless the material is sent at the request of, or with the express consent of the recipient.
- Justification: Currently under New York State law, indecent exposure in person is a crime, but it is not unlawful to send sexually explicit photos to nonconsenting adult recipients through electronic transmission. In the growing modern age of online dating, many individuals are receiving sexually explicit visual content from strangers without their consent. No person should be forced to view sexually explicit material without their consent.
The bill offers a clear deterrent to those considering sending unsolicited sexual pics and similar inappropriate conduct, and protects the unwilling recipients who currently have no legal recourse for such abuses.
What is illegal in the real world must be illegal in the digital world, and this legislation is a first step in the right direction in adding that accountability.
- Legislative history:
- Senate - 2020 - S5949 Referred to Codes
- Assembly - 2020 - A7801 Referred to Codes
- Fiscal implications: Minimal
- Effective date: This act shall take effect on the first of November next succeeding the date on which it shall have become a law.
The text of the bill is, as of 2021-03-24, as follows:
- "Section 1. The penal law is amended by adding a new section 250.70 to read as follows:
- § 250.70 UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL.
- A person is guilty of unlawful electronic transmission of sexually explicit visual material if a person knowingly transmits by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed or depicts the covered genitals of a male person that are in a discernibly turgid state and such visual material is not sent at the request of or with the express consent of the recipient. For purposes of this section the term "intimate parts" means the naked genitals, pubic area, anus, or female postpubescent nipple of the person and the term "sexual conduct" shall have the same meaning as defined in section 130.00 (Sex offenses; definitions of terms) of this chapter. Unlawful electronic transmission of sexually explicit visual material is a class a misdemeanor.
- § 2. This act shall take effect on the first of November next succeeding the date on which it shall have become a law."
US Senate bill - Stop Internet Sexual Exploitation Act - 2019-2020 US Senate session (116th Congress)
The Stop Internet Sexual Exploitation Act (SISE) was a bill introduced in the 2019-2020 session of the US Senate.
US House bill - H.R.3230 - DEEP FAKES Accountability Act (116th Congress)
- H.R.3230 - DEEP FAKES Accountability Act at congress.gov[1st seen in 4], also known as the Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2019, aimed to require producers of synthetic human-like fakes to generally comply with certain w:digital watermark and disclosure requirements.[22]
US Senate bill - Malicious Deep Fake Prohibition Act of 2018 (115th Congress)
- S.3805 - Malicious Deep Fake Prohibition Act of 2018 at congress.gov[1st seen in 4] aimed to criminalize the synthesis of human-likeness media with the intent of breaking federal, state, local or tribal law
Law proposals
Law proposals against synthetic filth by Juho Kunsola
- Audience: Developed with suitability for national, supranational and UN treaty levels.
- Writing context:
- Written from context of inclusion to criminal codes.
- I'm a Finn, so this has been worded to fit into Chapter 24 of the Criminal Code of Finland (in Finnish at finlex.fi), titled "Offences against privacy, public peace and personal reputation"
- Access the English translations of the Finnish Criminal Code at finlex.fi or go straight to the latest .pdf from 2016. Chapter 24 starts on page 107.
- History: This version is an evolution of a Finnish language original written in 2016.
Existing law in Chapter 24 of the Finnish Criminal Code - "Offences against privacy, public peace and personal reputation" - seems ineffective against many synthetic human-like fake attacks, and it seems digital sound-alikes could even be used to frame victims for crimes.
The portions affected by, or affecting, the synthetic filth situation are in bold font:
- Section 1 - Invasion of domestic premises (879/2013)
- Section 1(a) - Harassing communications (879/2013)
- Section 2 - Aggravated invasion of domestic premises (531/2000)
- Section 3 - Invasion of public premises (585/2005)
- Section 4 - Aggravated invasion of public premises (531/2000)
- Section 5 - Eavesdropping (531/2000)
- Section 6 - Illicit observation (531/2000)
- Section 7 - Preparation of eavesdropping or illicit observation (531/2000)
- Section 8 - Dissemination of information violating personal privacy (879/2013)
- Section 8(a) - Aggravated dissemination of information violating personal privacy (879/2013)
- Section 9 - Defamation (879/2013)
- Section 10 - Aggravated defamation (879/2013)
- Section 11 - Definition (531/2000)
- Section 12 - Right to bring charges (879/2013)
- Section 13 - Corporate criminal liability (511/2011)
Law proposal to ban visual synthetic filth
§1 Models of human appearance
A model of human appearance means
- A realistic 3D model
- A 7D bidirectional reflectance distribution function model
- A direct-to-2D capable w:machine learning model
- Or a model made with any technology whatsoever that looks deceptively like the target person.
§2 Producing synthetic pornography
Making projections, still or videographic, in which targets are portrayed nude or in a sexual situation, from models of human appearance defined in §1, without the express consent of the targets, is illegal.
§3 Distributing synthetic pornography
Distributing, making available, public display, purchase, sale, yielding, import and export of non-authorized synthetic pornography defined in §2 are punishable.[footnote 1]
§4 Aggravated producing and distributing synthetic pornography
If the media described in §2 or §3 is made or distributed with the intent to frame for a crime or for blackmail, the crime should be judged as aggravated.
Afterwords
The original idea I had was to ban both the raw materials, i.e. the models used to make the visual synthetic filth, and also the end product, weaponized synthetic pornography. However, in July 2019 it occurred to me that Adequate Porn Watcher AI (concept) could really help in this age of industrial disinformation if it were built, trained and operational, and banning the modeling of human appearance was in conflict with that revised plan.
It is safe to assume that collecting permissions to model each pornographic recording is not plausible, so an interesting question is whether we can ban covert modeling from non-pornographic pictures while still retaining the ability to model all pornography found on the Internet.
If banning the modeling of people's appearance from non-pornographic images and videos without explicit permission is to be pursued, it must be formulated so that it does not make Adequate Porn Watcher AI (concept) illegal or impossible. This would seem to lead to a strange situation where modeling a human from non-pornographic media would be illegal, but modeling from pornography legal.
Law proposal to ban unauthorized modeling of human voice
Motivation: The current situation, in which criminals can freely trade and grow their libraries of stolen voices, is unwise.
§1 Unauthorized modeling of a human voice
Acquiring a model of a human voice that deceptively resembles some dead or living person's voice, as well as its possession, purchase, sale, yielding, import and export, without the express consent of the target, are punishable.
§2 Application of unauthorized voice models
Producing and making available media from covert voice models defined in §1 is punishable.
§3 Aggravated application of unauthorized voice models
If the produced media is made for the purpose of
- framing a human target or targets for crimes,
- attempting extortion, or
- defaming the target,
the crime should be judged as aggravated.
Resources and reporting on law
AI and law in general
Reviews and regulation from the w:Library of Congress:
- 'Regulation of Artificial Intelligence' at loc.gov
- 'Regulation of Artificial Intelligence: Comparative Summary' at loc.gov
- 'Regulation of Artificial Intelligence: International and Regional Approaches' (loc.gov)
- 'Regulation of Artificial Intelligence: The Americas and the Caribbean' (loc.gov)
- 'Regulation of Artificial Intelligence: East/South Asia and the Pacific' (loc.gov)
- 'Regulation of Artificial Intelligence: Europe and Central Asia' loc.gov
- 'Regulation of Artificial Intelligence: Middle East and North Africa' (loc.gov)
- 'Regulation of Artificial Intelligence: Sub-Saharan Africa' (loc.gov)
w:Gibson Dunn & Crutcher (gibsondunn.com) publishes a quarterly legal update on 'Artificial Intelligence and Autonomous Systems'. Gibson Dunn & Crutcher is a global w:law firm, founded in Los Angeles in 1890.
- 'Artificial Intelligence and Autonomous Systems Legal Update' Quarter 4 2018 at Gibson & Dunn
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 1 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 2 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 3 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 4 2019'
From Europe
- 'The ethics of artificial intelligence: Issues and initiatives' (.pdf) at europarl.europa.eu, a March 2020 study by the w:European Parliamentary Research Service. Starting from page 37, the .pdf lists organizations in the field.
Synthetic filth in the law and media
- '"The New Weapon of Choice": Law's Current Inability to Properly Address Deepfake Pornography' at scholarship.law.vanderbilt.edu, an October 2020 Note by Anne Pechenik Gieseke published in the w:Vanderbilt Law Review, the flagship w:academic journal of w:Vanderbilt University Law School.
- 'Deepfakes and Synthetic Media in the Financial System: Assessing Threat Scenarios' at carnegieendowment.org, a 2020-07-08 assessment that identifies some types of crime that can be committed using synthetic human-like fakes.
- 'Don’t Believe Your Eyes (or Ears): The Weaponization of Artificial Intelligence, Machine Learning, and Deepfakes' at ssri.duke.edu, an October 2019 news article by Joe Littell, published by the Social Science Research Institute at w:Duke University
- 'Deepfakes: False Pornography Is Here and the Law Cannot Protect You' at scholarship.law.duke.edu, published in 2019 in the Duke Law Journal, a student-run law review.
Countries that have unfortunately banned the full face veil
“There are currently 16 nations that have banned the burqa (not to be confused with the hijab), including w:Tunisia,[23] w:Austria, w:Denmark, w:France, w:Belgium, w:Tajikistan, w:Latvia,[24] w:Bulgaria,[25] w:Cameroon, w:Chad, w:Congo-Brazzaville, w:Gabon, w:Netherlands,[26] w:China,[27] w:Morocco, and w:Switzerland.”
Taking into consideration these times of industrial disinformation, it is vicious and uncivilized to have laws banning the wearing of the full face veil in public.
Quotes on the current laws and their application
“If no-one who wants to hurt you knows what you look like, how could someone malevolent make a covert digital look-alike of you?”
Footnotes
- ↑ People who are found in possession of this synthetic pornography should probably not be penalized, but rather advised to get some help.
1st seen in
- ↑ Politico AI: Decoded mailing list Wednesday 2022-02-02
- ↑ https://artificialintelligenceact.eu/the-act/ via https://futureoflife.org/ newsletter
- ↑ https://www.dailymaverick.co.za/article/2021-12-01-not-all-of-the-cogs-in-the-cybercrimes-act-machine-are-turning-at-once-we-still-remain-vulnerable/
- ↑ 4.0 4.1 4.2
Chatting with ChatGPT 2023
- First I asked ChatGPT to "list some legislative approaches against so-called "deep fakes" or "deepfakes"" and it mentioned the Singaporean #Protection from Online Falsehoods and Manipulation Act 2019 and the #Bill - Malicious Deep Fake Prohibition Act of 2018 (115th Congress)
- ↑ 5.0 5.1 https://cybercivilrights.org/deep-fake-laws/
- ↑ First seen in the suggestions for similar bills for Bills similar to CA AB602 by trackbill.com.
References
- ↑ Viersen, Arnold (2022-04-28). "Stopping Internet Sexual Exploitation Act - An Act to amend the Criminal Code (pornographic material)". parl.ca. w:House of Commons of Canada. Retrieved 2022-10-06.
- ↑ Bilingual version of C-270 https://publications.gc.ca/collections/collection_2022/parl/XB441-270-1.pdf
- ↑ "China seeks to root out fake news and deepfakes with new online content rules". w:Reuters.com. w:Reuters. 2019-11-29. Retrieved 2021-01-23.
- ↑ Statt, Nick (2019-11-29). "China makes it a criminal offense to publish deepfakes or fake news without disclosure". w:The Verge. Retrieved 2021-01-23.
- ↑ https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches
- ↑ https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation
- ↑ 7.0 7.1 7.2 7.3 7.4 7.5 7.6 7.7
Authoritative up-to-date version of the Criminal Code chapter 20 On sexual offences can always be found at finlex.fi
Translation to English by the Ministry of Justice: Criminal Code (39/1889) - Chapter 20 - Sexual offences (translation) as .pdf at oikeusministerio.fi (subject to possible revisions)
- ↑ https://www.dailymaverick.co.za/article/2021-12-01-not-all-of-the-cogs-in-the-cybercrimes-act-machine-are-turning-at-once-we-still-remain-vulnerable/
- ↑ https://revengepornhelpline.org.uk/information-and-advice/need-help-and-advice/threats-to-share-intimate-images/
- ↑ Royle, Sara (2021-01-05). "'Deepfake porn images still give me nightmares'". w:BBC Online. w:BBC. Retrieved 2021-01-31.
She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.
- ↑ Mort, Helen (2020). "Change.org petition: 'Tighten regulation on taking, making and faking explicit images'". w:Change.org. Retrieved 2021-01-31.
Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.
- ↑ "New state laws go into effect July 1".
- ↑ 13.0 13.1 "§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty". w:Virginia. Retrieved 2021-01-23.
- ↑ "Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election". w:Texas. 2019-06-14. Retrieved 2021-01-23.
In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality
- ↑ https://capitol.texas.gov/BillLookup/History.aspx?LegSess=86R&Bill=SB751
- ↑ Johnson, R.J. (2019-12-30). "Here Are the New California Laws Going Into Effect in 2020". KFI. iHeartMedia. Retrieved 2021-01-23.
- ↑ "AB 602 - California Assembly Bill 2019-2020 Regular Session - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action". openstates.org. openstates.org. Retrieved 2021-03-24.
- ↑ Mihalcik, Carrie (2019-10-04). "California laws seek to crack down on deepfakes in politics and porn". w:cnet.com. w:CNET. Retrieved 2021-01-23.
- ↑ Berman, Marc; Leyva, Connie (2019), "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action.", w:California
- ↑ 20.0 20.1 "Georgia Code Title 16. Crimes and Offenses § 16-11-90". w:FindLaw. w:Georgia (U.S. state). 2021-04-14. Retrieved 2022-01-04.
- ↑ "SECTION 52-C Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual". nysenate.gov. w:New York State Legislature. 2021-11-12. Retrieved 2021-01-04.
- ↑ https://www.congress.gov/bill/116th-congress/house-bill/3230
- ↑ "Tunisian PM bans wearing of niqab in public institutions". Reuters. 5 July 2019. Retrieved 2021-03-13.
- ↑ "A European government has banned Islamic face veils despite them being worn by just three women". 21 April 2016. Retrieved 2021-03-13.
- ↑ "Bulgaria the latest European country to ban the burqa and niqab in public places", Smh.com.au, accessed 5 December 2016.
- ↑ Halasz, Stephanie; McKenzie, Sheena (27 June 2018). "The Netherlands introduces burqa ban in some public spaces". CNN. Retrieved 2021-03-13.
- ↑ Phillips, Tom (13 January 2015). "China bans burqa in capital of Muslim region of Xinjiang". The Telegraph. Retrieved 2021-03-13.