Laws against synthesis and other related crimes
This article lists current laws against abusive uses of synthetic human-like fakes, information on what kinds of laws are being prepared, and two SSFWIKI original law proposals: one against digital look-alikes and one against digital sound-alikes.
New laws
- UK's Online Safety Act 2023 has been passed into law and reportedly criminalizes non-consensual synthetic pornography.
- The European Union has finalized a law package to regulate AI called the Artificial Intelligence Act
New bills currently in the works
- Current EU bills of high importance are the almost-finalized Directive on combating violence against women and domestic violence and the Regulation to Prevent and Combat Child Sexual Abuse
- US Senate is considering an anti-fake bill and the House is considering several bills
- Canada is considering C-27
- China seems to be planning to ban all synthetic pornography, regardless of how consensual its making was
Bills that didn't make it
- Canada's House of Commons was considering banning all pornographic content for which there is no proof of age and written consent from everyone visible in the pornographic recording.
- Past bills in the USA
Information elsewhere / legal information compilations (recommended)
- Existing Nonconsensual Pornography, Sextortion, and Deep Fake Laws at cybercivilrights.org
- A Look at Global Deepfake Regulation Approaches at responsible.ai[2] April 2023 compilation and reporting by Amanda Lawson of the Responsible Artificial Intelligence Institute.
- The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology at legaljournal.princeton.edu[3] compilation and reporting by Caroline Quirk. PLJ is Princeton’s only student-run law review.
- Exploring Legal Approaches to Regulating Nonconsensual Deepfake Pornography at techpolicy.press[4] May 2023 compilation and reporting by Kaylee Williams
- Deepfake AI laws for USA at foundationra.com, Sextortion laws for USA at foundationra.com and Revenge porn laws for USA at foundationra.com compilations by Foundation RA
- Deepfake laws: is AI outpacing legislation? at onfido.com[5] February 2024 summary and compilation by Aled Owen, Director of Global Policy at Onfido (for-profit)
- Is Deepfake Pornography Illegal? at criminaldefenselawyer.com[6] by Rebecca Pirius is a good overview of the current illegality/legality situation in the USA, federally and state-by-state. Published by w:Nolo (publisher), updated Feb 2024
- Deepfake Pornography: A Legal and Ethical Menace at tclf.in[7] October 2023 compilation and reporting by Janvhi Rastogi, published in The Contemporary Law Forum.
Australia
The Online Safety Act 2021 at legislation.gov.au[1st seen in 1] regulates the non-consensual sharing, or threatening to share, of sexual images.
If the synthetic fake human-like images depict illegal and restricted online content, then the Online Content Scheme, as defined in the Online Safety Act 2021, may apply.[8]
Office of the eSafety Commissioner at esafety.gov.au is Australia's independent regulator for online safety.
Canada
Existing Canadian law bans the non-consensual disclosure of intimate images.[9]
Active bills in Canada
Digital Charter Implementation Act - House of Commons of Canada bill C-27
Digital Charter Implementation Act at parl.ca or An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts
Past bills in Canada
Stopping Internet Sexual Exploitation Act - House of Commons of Canada bill C-270
- House of Commons of Canada bill C-270 'Stopping Internet Sexual Exploitation Act' at parl.ca, an act to amend the Criminal Code regarding pornography, received its first reading in the Commons from w:Arnold Viersen on Thursday 2022-04-28 at 10:15.
According to townandcountrytoday.com, the author of the bill introduced an identical bill, C-302, on Thursday 2021-05-27, but it died when the federal election was called.
Stopping Internet Sexual Exploitation Act - An Act to amend the Criminal Code (pornographic material) is a private member's bill by Arnold Viersen (Official Site at arnoldviersen.ca)
Summary of C-270 from parl.ca
"This enactment amends the Criminal Code to prohibit a person from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years of age or older and gave their express consent to their image being depicted."
Sommaire en français / Summary in French
Le texte modifie le Code criminel afin d’interdire à toute personne de produire ou de distribuer du matériel pornographique à des fins commerciales, ou d’en faire la publicité, sans s’être au préalable assurée qu’au moment de la production du matériel, chaque personne dont l’image y est représentée était âgée de dix-huit ans ou plus et avait donné son consentement exprès à ce que son image y soit représentée.
Links
- C-270 - Stopping Internet Sexual Exploitation Act in English and in French at publications.gc.ca
- Parliament of Canada LEGISinfo: C-270 - Stopping Internet Sexual Exploitation Act at parl.ca in English
- C-270 - Stopping Internet Sexual Exploitation Act at openparliament.ca includes the remarks of Mr. Arnold Viersen and the bill's co-sponsor, Mr. Garnett Genuis.
China
Law against synthesis crimes in China
Mandatory labeling of fake media in China since 2020
On Wednesday 2020-01-01 a Chinese law came into effect requiring that synthetically faked footage bear a clear notice of its fakeness. Failure to comply could be considered a w:crime, the w:Cyberspace Administration of China (cac.gov.cn) stated on its website. China announced this new law in November 2019.[12] The Chinese government seems to be reserving the right to prosecute both users and w:online video platforms failing to abide by the rules.[13]
Deep Synthesis Provisions 2023
On Tuesday 2023-01-10 the Deep Synthesis Provisions came into effect. They were originally drafted in 2022 by the w:Cyberspace Administration of China as Provisions on the Administration of Deep Synthesis Internet Information Services (Draft for solicitation of comments) at chinalawtranslate.com, or view the Chinese-language draft 国家互联网信息办公室关于《互联网信息服务深度合成管理规定(征求意见稿)》公开征求意见的通知 at cac.gov.cn[1st seen in 2].
Reporting
- China’s New Legislation on Deepfakes: Should the Rest of Asia Follow Suit? at thediplomat.com March 2023 reporting
EU
Laws in the EU
Artificial Intelligence Act
The European Union has a law on AI, the w:Artificial Intelligence Act. The European Commission proposed the AI Act in 2021, and on Wednesday 2024-03-13 the MEPs adopted it.[14] The AI Act will have a key role in the effective implementation of the upcoming EU Directive on combating violence against women and domestic violence and the Regulation to Prevent and Combat Child Sexual Abuse, which intend to protect us against synthetic pornography.
- The European Artificial Intelligence Act has been approved by the member countries and was on track for final approval by April 2024.[15] Read Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS at eur-lex.europa.eu[1st seen in 3] (also contains translations)
- Artificial Intelligence Act: MEPs adopt landmark law at europarl.europa.eu, a press announcement on the adoption of the Artificial Intelligence Act on Wednesday 2024-03-13.
Studies and information
- Tackling deepfakes in European policy at europarl.europa.eu, a 2021 study by the Panel for the Future of Science and Technology and published by the w:European Parliamentary Research Service. View .pdf at europarl.europa.eu
- The EU Artificial Intelligence Act at artificialintelligenceact.eu is a website on the new EU law by the Future of Life Institute, an American non-profit NGO, promising up-to-date developments and analyses of the EU AI Act.
Reporting
- The AI Act vs. deepfakes: A step forward, but is it enough? at euractiv.com, 2024-02-26 opinion piece by Cristina Vanberghen
Digital Services Act
The w:Digital Services Act (DSA) (see The Digital Services Act package at digital-strategy.ec.europa.eu) came into force in November 2022.[16]
The Artificial Intelligence Act and Digital Services Act together will help in the enforcement of the upcoming protections to shield us from synthesis crimes.
Bills in the EU
Directive on combating violence against women and domestic violence
The Directive on combating violence against women and domestic violence at commission.europa.eu will require, among other things, that member states criminalize non-consensual synthetic digital look-alike pornography in their criminal codes.
Official
- Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on combating violence against women and domestic violence at eur-lex.europa.eu has the directive text in all languages and various formats
- Legislative proposal on combating violence against women and domestic violence Legislative Train Schedule at europarl.europa.eu lists this EU bill as close to adoption. (As of March 2024)
- Commission welcomes political agreement on new rules to combat violence against women and domestic violence at ec.europa.eu, a 2024-02-06 press release by the w:European Commission.
- International Women's Day 2022: Commission proposes EU-wide rules to combat violence against women and domestic violence at ec.europa.eu, a 2022-03-08 press release by the EC.
Unofficial
- Deepfakes and dick pics: EU protects women from digital violence at hateaid.org, a 2024-02-07 press release by HateAid.org
- Disrupting the Deepfake Pipeline in Europe at futureoflife.org, 2024-02-22 article on the approach of Leveraging corporate criminal liability under the Violence Against Women Directive to safeguard against pornographic deepfake exploitation by Alexandra Tsalidis
Regulation to Prevent and Combat Child Sexual Abuse
- New legislation to fight child sexual abuse online Legislative Train Schedule at europarl.europa.eu lists this EU bill as tabled. (As of March 2024)
- Prevention of online child sexual abuse at consilium.europa.eu
- Timeline - Prevention of online child sexual abuse at consilium.europa.eu
“The Regulation to Prevent and Combat Child Sexual Abuse (Child Sexual Abuse Regulation, or CSAR) is a w:European Union regulation proposed by the w:European Commissioner for Home Affairs w:Ylva Johansson on 11 May 2022. The stated aim of the legislation is to prevent child sexual abuse online through the implementation of a number of measures, including the establishment of a framework that would make the detection and reporting of child sexual abuse material (CSAM) by digital platforms a legal requirement within the European Union.”
Reporting
- New EU rules will criminalise 'paedophilia handbooks' and deepfakes of child abuse at euronews.com, 2024-02-06 reporting
Code of practice on Disinformation 2022
The 2022 Code of Practice on Disinformation at digital-strategy.ec.europa.eu - Major online platforms, emerging and specialised platforms, players in the advertising industry, fact-checkers, research and civil society organisations delivered a strengthened Code of Practice on Disinformation following the Commission’s Guidance of May 2021.[17]
Finland
Laws in Finland
In Finnish: Suomen seksuaalirikoslaki 2023
Law on sexual offences in Finland 2023
Law on sexual offences in Finland 2023 is found in Chapter 20 of the Finnish Criminal Code titled "Seksuaalirikoksista" ("Sexual offences") and came into effect on Sunday 2023-01-01.[18]
The new law in Finland protects adults against sexual image-based abuse, be it real or synthetic in origin.
Other countries have also woken up to the problems of synthesis crime and have enacted laws against synthesis and other related crimes.
Relevant sections of Chapter 20
- 7 § Non-consensual dissemination of a sexual image criminalizes the unauthorized distribution of real and synthetic sexual images. (7 § Seksuaalisen kuvan luvaton levittäminen[18])
- 19 § Distribution of an image depicting a child in a sexual manner [18] criminalizes the distribution of real and synthetic child sexual abuse material (CSAM). Attempting this crime is also punishable. (19 § Lasta seksuaalisesti esittävän kuvan levittäminen[18])
- 20 § Aggravated distribution of an image depicting a child in a sexual manner [18] defines the parameters for aggravated form of the crime of making CSAM available. (20 § Törkeä lasta seksuaalisesti esittävän kuvan levittäminen[18])
- 21 § Possession of an image depicting a child in a sexual manner[18] criminalizes the possession of CSAM and acquiring access with the intent to access CSAM. (21 § Lasta seksuaalisesti esittävän kuvan hallussapito[18])
This 2023 overhaul and consolidation of the Finnish Criminal Code's provisions on sexual offences was made upon the initiative of the 2019-2023 w:Marin Cabinet, was voted into law by the w:Members of the Parliament of Finland, 2019–2023, and came into effect on Sunday 2023-01-01.
Translation to English by the Ministry of Justice: Criminal Code (39/1889) - Chapter 20 - Sexual offences (translation) as .pdf at oikeusministerio.fi (subject to possible revisions)
Finland criminalized synthetic CSAM in 2011
Distribution, attempted distribution, and possession of synthetic CSAM were already criminalized on 2011-06-01 upon the initiative of the w:Vanhanen II Cabinet. Real CSAM was criminalized before this improvement. In the 2023 sexual offences legislation improvement these protections against real and synthetic CSAM were moved to 19 §, 20 § and 21 § of Chapter 20 of the criminal code.
Germany
- Deepfakes and German law at mj-cohen.com, January 2020 summary by Maureen Cohen, states that the existing German laws are well equipped to render non-consensual digital look-alikes illegal.
India
Laws in India
Laws in India can be accessed through the India Code at indiacode.nic.in
w:Information Technology Act, 2000
- Section 66. - Computer related offences.[19]
- Section 66A. - Omitted (formerly 'Punishment for sending offensive messages through communication service, etc.'; struck down by the Supreme Court of India in 2015).
- Section 66B. - Punishment for dishonestly receiving stolen computer resource or communication device.
- Section 66C. - Punishment for identity theft.[20]
- Section 66D. - Punishment for cheating by personation by using computer resource.[21]
- Section 66E. - Punishment for violation of privacy.[22]
- Section 66F. - Punishment for cyber terrorism.
- Section 67. - Punishment for publishing or transmitting obscene material in electronic form.[23]
- Section 67A. - Punishment for publishing or transmitting of material containing sexually explicit act, etc., in electronic form.[24]
- Section 67B. - Punishment for publishing or transmitting of material depicting children in sexually explicit act, etc., in electronic form.[25]
- Section 67C. - Preservation and retention of information by intermediaries.
w:Information Technology Rules, 2021
Past bills in India
Legal compilations on the legal situation against synthetic filth in India
- Deepfake Pornography: A Legal and Ethical Menace at tclf.in[7] October 2023 compilation and reporting by Janvhi Rastogi, published in The Contemporary Law Forum.
- Deepfakes And Breach Of Personal Data – A Bigger Picture at livelaw.in[26] 2023-11-24 compilation on laws against synthetic fakes in India, by Vikrant Rana, Anuradha Gandhi and Rachita Thakur
- What Is Deep Fake Cyber Crime? What Does Indian Law Say About It? at cybercert.in[27]
- Cyberlaw in India at cybercert.in provides a wider look into the Indian Cyber Laws
New Zealand
- 'Harmful Digital Communications Act 2015' at legislation.govt.nz[1st seen in 1] criminalises the non-consensual sharing of intimate visual recordings and other image-based sexual abuse.
- w:Netsafe netsafe.org.nz is an online safety non-profit organisation in New Zealand. It provides educational, anti-bullying and support services. The organisation is contracted under the w:Harmful Digital Communications Act until 2026. (Wikipedia)
- CERT.govt.nz, the w:Computer emergency response team of New Zealand - "Responding to cyber security threats in New Zealand - CERT NZ is your first port of call when you need to report a cyber security problem."
- The w:Department of Internal Affairs investigates the possession of, and trading in, child exploitation material.[28]
Links regarding the Harmful Digital Communications Act 2015
- 'Harmful Digital Communications (HDC)' at police.govt.nz by the w:New Zealand Police
- 'Harmful digital communications' at justice.govt.nz by the w:Ministry of Justice (New Zealand)
- What is the HDCA? at netsafe.org.nz by w:Netsafe, the agency approved by the w:New Zealand Police to process complaints about harmful digital communications.
- See Wikipedia article on w:Harmful Digital Communications Act 2015 for more info
Singapore
Law in Singapore
Protection from Online Falsehoods and Manipulation Act 2019
w:Protection from Online Falsehoods and Manipulation Act 2019[1st seen in 4] is a w:statute of the w:Parliament of Singapore that enables authorities to tackle the spread of w:fake news or w:false information. (Wikipedia)
South Africa
Cybercrimes Act 19 of 2020
South Africa's Cybercrimes Act 19 of 2020 (English / Afrikaans) at gov.za[1st seen in 5] came partially into effect on Wednesday 2021-12-01.[29]
- Cybercrimes Act, Chapter 2 Cybercrimes, Section 16 - Disclosure of data message of intimate image at cybercrimesact.co.za Subsection 1 states
Any person (‘‘A’’) who unlawfully and intentionally discloses, by means of an electronic communications service, a data message of an intimate image of a person (‘‘B’’), without the consent of B, is guilty of an offence.
Any person who unlawfully and intentionally
- attempts;
- conspires with any other person; or
- aids, abets, induces, incites, instigates, instructs, commands or procures another person, to commit an offence in terms of Part I or Part II of this Chapter, is guilty of an offence and is liable on conviction to the punishment to which a person convicted of actually committing that offence would be liable.
Links
- Cybercrimes Act at cybercrimesact.co.za by Accessible Law contains the law in website format
South Korea
Law in South Korea
ACT ON SPECIAL CASES CONCERNING THE PUNISHMENT OF SEXUAL CRIMES
ACT ON SPECIAL CASES CONCERNING THE PUNISHMENT OF SEXUAL CRIMES at elaw.klri.re.kr is a law in South Korea.[1st seen in 6]
UK
Law against synthesis crimes in the UK
On Tuesday 2024-04-16 the UK announced that creating sexually explicit deepfake images is to be made an offence in the UK.[30] This is a very good move; the next logical step would be the prohibition of non-consensually possessing other people's appearance models and voice models, as suggested in Law proposals against synthetic filth by Juho Kunsola. There is no logical reason to allow criminal leagues to legally possess and trade their libraries of models; the acquisition of models through covert modeling, and the trading and possession of these raw materials (covert models) for producing disinformation weapons, should also be illegal.
Online Safety Act 2023
The w:Online Safety Act 2023 (Online Safety Act 2023 at legislation.gov.uk) received Royal Assent on 2023-10-26[31] and reportedly criminalizes non-consensual synthetic pornography.
- Part 4 - CSEA reporting of the Act has not yet come into effect as of February 2024.
- Section 66 requires providers of services regulated under the Act to have systems and processes in place (so far as possible) to ensure that they report all detected and unreported CSEA content present on the service to the NCA (National Crime Agency)
- Offence in relation to CSEA reporting in Section 69
- Part 7 – Enforcement offences
- Part 10 - Communication offences establishes
- False communications offence in Section 179
- Threatening communications offence in Section 181
- Offences of sending or showing flashing images electronically in Section 183
- Offence of encouraging or assisting serious self-harm in Section 184
- Sending etc photograph or film of genitals in Section 187
- Sharing or threatening to share intimate photograph or film in Section 188
The Online Safety Act 2023 was finalized as a House of Lords bill, HL Bill 151 - Online Safety Bill at bills.parliament.uk. The bill originated in the House of Commons sessions 2021-22 and 2022-23.
- Online Safety Act: new criminal offences circular at gov.uk, published 2024-01-31, is a circular issued to inform the police and other relevant public authorities of certain provisions of the Online Safety Act, in particular the new criminal offences.
- Creating a safer life online for people in the UK at ofcom.org.uk
- A guide to the Online Safety Bill at gov.uk
- Documents, publications and announcements relating to the government's Online Safety Bill at gov.uk
Reporting and summaries of the Online Safety Act
- The Online Safety Act 2023: a primer at infolaw.co.uk, a 2023 primer on the Online Safety Act by Alex Heshmaty on 2023-11-29
- UK: The Online Safety Act is now law; Ofcom’s powers as online safety regulator have officially commenced at epra.org states that there will be three successive implementation phases.
The Domestic Abuse Act 2021 Chapter 17, part 6 - Disclosure of private sexual photographs and films
w:Domestic Abuse Act 2021 / Chapter 17 / Part 6 - Offences involving abusive or violent behaviour / Disclosure of private sexual photographs and films - Threats to disclose private sexual photographs and films with intent to cause distress
According to the article What to do if someone is threatening to share your intimate images by the UK-based Revenge Porn Helpline at revengepornhelpline.org.uk, threatening to share intimate images with the intent to cause distress is now an offence in UK law. This is included within the w:Domestic Abuse Act 2021, which was enacted into UK law on 29th June 2021.[32]
It is not yet clear whether the loophole has been fixed whereby, if the pictures are not pictures of you but synthetic human-like fakes, the police cannot do anything.
History of the UK law against synthesis crimes
The UK law was not very up-to-date on the issue of synthetic filth until recent improvements.
The independent w:Law Commission (England and Wales) reviewed the law where it applies to taking, making and sharing intimate images without consent. The outcome of the consultation was due to be published later in 2021.[33]
"In 2019, law expert Dr Aislinn O’Connell told w:The Independent that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The w:Women and Equalities Committee called on the UK Government to introduce new legislation on image-based sexual abuse in order to criminalise ALL non-consensual creation and distribution of intimate sexual images."[34] This call is for laws similar to those California put in place on 2020-01-01.
USA
- Is Deepfake Pornography Illegal? at criminaldefenselawyer.com[6] by Rebecca Pirius is a good overview of the current illegality/legality situation in the USA, federally and state-by-state. Published by w:Nolo (publisher), updated Feb 2024
- Deepfake AI laws for USA at foundationra.com, Sextortion laws for USA at foundationra.com and Revenge porn laws for USA at foundationra.com compilations by Foundation RA
See also:
- Current bills in the USA and past bills in the USA
- The Legislative Process: Introduction and Referral of Bills (Video) at congress.gov
- Glossary of Legislative Terms at congress.gov
Law against synthesis crimes in Virginia 2019
Code of Virginia § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty
Since 2019-07-01[35] w:Virginia has criminalized the sale and dissemination of unauthorized synthetic pornography, but not its manufacture,[36] as section § 18.2-386.2, titled 'Unlawful dissemination or sale of images of another; penalty.', became part of the w:Code of Virginia.
Code of Virginia (TOC) » Title 18.2. Crimes and Offenses Generally » Chapter 8. Crimes Involving Morals and Decency » Article 5. Obscenity and Related Offenses » Section § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty
Section § 18.2-386.2. 'Unlawful dissemination or sale of images of another; penalty.' of the Code of Virginia reads as follows:
A. Any w:person who, with the w:intent to w:coerce, w:harass, or w:intimidate, w:maliciously w:disseminates or w:sells any videographic or still image created by any means whatsoever that w:depicts another person who is totally w:nude, or in a state of undress so as to expose the w:genitals, pubic area, w:buttocks, or female w:breast, where such person knows or has reason to know that he is not w:licensed or w:authorized to disseminate or sell such w:videographic or w:still image is w:guilty of a Class 1 w:misdemeanor.
- For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's w:face, w:likeness, or other distinguishing characteristic.
B. If a person uses w:services of an w:Internet service provider, an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.
C. Venue for a prosecution under this section may lie in the w:jurisdiction where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.
D. The provisions of this section shall not preclude prosecution under any other w:statute.[36]
The law arose from identical bills: House Bill 2678, presented by w:Delegate w:Marcus Simon to the w:Virginia House of Delegates on 2019-01-14, and Senate bill 1736, introduced to the w:Senate of Virginia three days later by Senator w:Adam Ebbin.
Law against synthesis crimes in Texas 2019
Texas SB 751 - Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election
On 2019-09-01 the w:Texas Senate bill SB 751 - Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election came into effect as w:amendments to the election code in the w:Law of Texas, giving w:candidates in w:elections a 30-day protection period before an election during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality".[37] SB 751 was introduced to the Senate by w:Bryan Hughes (politician).[38]
The text of S.B. No. 751 is as follows
AN ACT relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election.
BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF TEXAS:
SECTION 1. Section 255.004, Election Code, is w:amended by adding Subsections (d) and (e) to read as follows:
- (d) A person commits an offense if the person, with intent to injure a candidate or influence the result of an election:
- creates a deep fake video; and
- causes the deep fake video to be published or distributed within 30 days of an election.
- (e) In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality.
SECTION 2. This Act takes effect September 1, 2019.
Law against synthesis crimes in California 2020
California AB-602 - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action
On 2020-01-01[39] the w:California w:US state law "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action." came into effect in the civil code of the w:California Codes, banning the manufacture and w:digital distribution of synthetic pornography without the w:consent of the people depicted. AB-602 provides victims of synthetic pornography with w:injunctive relief and poses legal threats of w:statutory and w:punitive damages on w:criminals making or distributing synthetic pornography without consent. The bill AB-602 was signed into law by California w:Governor w:Gavin Newsom on 2019-10-03. It was authored by w:California State Assemblymember w:Marc Berman, and an identical Senate bill was coauthored by w:California Senator w:Connie Leyva.[40][41] AB602 at trackbill.com
Introduction by Assemblymember Marc Berman:
AB 602, Berman. Depiction of individual using digital or electronic technology: sexually explicit material: cause of action.
Existing law creates a private w:right of action against a person who intentionally distributes a photograph or recorded image of another that exposes the intimate body parts of that person or of a person engaged in a sexual act without the person’s consent if specified conditions are met.
This bill would provide that a depicted individual, as defined, has a w:cause of action against a person who either
- (1) creates and intentionally discloses sexually explicit material if the person knows or reasonably should have known the depicted individual did not w:consent to its creation or disclosure or
- (2) who intentionally discloses sexually explicit material that the person did not create if the person knows the depicted individual did not consent to its creation.
The bill would specify exceptions to those provisions, including if the material is a matter of legitimate public concern or a work of political or newsworthy value.
The bill would authorize a prevailing w:plaintiff who suffers harm to seek w:injunctive relief and recover reasonable w:attorney’s fees and costs as well as specified monetary w:damages, including statutory and w:punitive damages.
The law is as follows:
SECTION 1. Section 1708.86 is added to the Civil Code of California, to read:
1708.86. (a) For purposes of this section:
- (1) “Altered depiction” means a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section.
- (2) “Authorized Representative” means an attorney, talent agent, or personal manager authorized to represent a depicted individual if the depicted individual is represented.
- (3) (A) “Consent” means an agreement written in plain language signed knowingly and voluntarily by the depicted individual that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated.
- (3) (B) A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made, unless one of the following requirements is satisfied:
- (i) The depicted individual is given at least 72 hours to review the terms of the agreement before signing it.
- (ii) The depicted individual’s authorized representative provides written approval of the signed agreement.
- (4) “Depicted individual” means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction.
- (5) “Despicable conduct” means conduct that is so vile, base, or contemptible that it would be looked down on and despised by a reasonable person.
- (6) “Digitization” means to realistically depict any of the following:
- (A) The nude body parts of another human being as the nude body parts of the depicted individual.
- (B) Computer-generated nude body parts as the nude body parts of the depicted individual.
- (C) The depicted individual engaging in sexual conduct in which the depicted individual did not engage.
- (7) “Disclose” means to publish, make available, or distribute to the public.
- (8) “Individual” means a natural person.
- (9) “Malice” means that the defendant acted with intent to cause harm to the plaintiff or despicable conduct that was done with a willful and knowing disregard of the rights of the plaintiff. A person acts with knowing disregard within the meaning of this paragraph when they are aware of the probable harmful consequences of their conduct and deliberately fail to avoid those consequences.
- (10) “Nude” means visible genitals, pubic area, anus, or a female’s postpubescent nipple or areola.
- (11) “Person” means a human being or legal entity.
- (12) “Plaintiff” includes cross-plaintiff.
- (13) “Sexual conduct” means any of the following:
- (A) Masturbation.
- (B) Sexual intercourse, including genital, oral, or anal, whether between persons regardless of sex or gender or between humans and animals.
- (C) Sexual penetration of the vagina or rectum by, or with, an object.
- (D) The transfer of semen by means of sexual conduct from the penis directly onto the depicted individual as a result of ejaculation.
- (E) Sadomasochistic abuse involving the depicted individual.
- (14) “Sexually explicit material” means any portion of an audiovisual work that shows the depicted individual performing in the nude or appearing to engage in, or being subjected to, sexual conduct.
(b) A depicted individual has a cause of action against a person who does either of the following:
- (1) Creates and intentionally discloses sexually explicit material and the person knows or reasonably should have known the depicted individual in that material did not consent to its creation or disclosure.
- (2) Intentionally discloses sexually explicit material that the person did not create and the person knows the depicted individual in that material did not consent to the creation of the sexually explicit material.
(c) (1) A person is not liable under this section in either of the following circumstances:
- (A) The person discloses the sexually explicit material in the course of any of the following:
- (i) Reporting unlawful activity.
- (ii) Exercising the person’s law enforcement duties.
- (iii) Hearings, trials, or other legal proceedings.
- (B) The material is any of the following:
- (i) A matter of legitimate public concern.
- (ii) A work of political or newsworthy value or similar work.
- (iii) Commentary, criticism, or disclosure that is otherwise protected by the California Constitution or the United States Constitution.
- (2) For purposes of this subdivision, sexually explicit material is not of newsworthy value solely because the depicted individual is a public figure.
(d) It shall not be a defense to an action under this section that there is a disclaimer included in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.
(e) (1) A prevailing plaintiff who suffers harm as a result of the violation of subdivision (b) may recover any of the following:
- (A) An amount equal to the monetary gain made by the defendant from the creation, development, or disclosure of the sexually explicit material.
- (B) One of the following:
- (i) Economic and noneconomic damages proximately caused by the disclosure of the sexually explicit material, including damages for emotional distress.
- (ii) Upon request of the plaintiff at any time before the final judgment is rendered, the plaintiff may instead recover an award of statutory damages for all unauthorized acts involved in the action, with respect to any one work, as follows:
- (I) A sum of not less than one thousand five hundred dollars ($1,500) but not more than thirty thousand dollars ($30,000).
- (II) If the unlawful act was committed with malice, the award of statutory damages may be increased to a maximum of one hundred fifty thousand dollars ($150,000).
- (C) Punitive damages.
- (D) Reasonable attorney’s fees and costs.
- (E) Any other available relief, including injunctive relief.
(2) The remedies provided by this section are cumulative and shall not be construed as restricting a remedy that is available under any other law.
(f) An action under this section shall be commenced no later than three years from the date the unauthorized creation, development, or disclosure was discovered or should have been discovered with the exercise of reasonable diligence.
(g) The provisions of this section are severable. If any provision of this section or its application is held invalid, that invalidity shall not affect other provisions.
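The statutory damages range in subdivision (e)(1)(B)(ii) above amounts to a simple clamping rule: a floor of $1,500, a ceiling of $30,000, raised to $150,000 when the act was committed with malice. A minimal sketch of that arithmetic follows; the function name and interface are hypothetical, and this is an illustration of the statute's numbers, not legal advice.

```python
def statutory_damages(requested: float, malice: bool = False) -> float:
    """Clamp a requested statutory damages award to the range in
    Cal. Civ. Code section 1708.86(e)(1)(B)(ii).

    Illustrative sketch only: $1,500 floor, $30,000 ceiling,
    or a $150,000 ceiling if the unlawful act was committed
    with malice as defined in subdivision (a)(9).
    """
    floor = 1_500
    ceiling = 150_000 if malice else 30_000
    return min(max(requested, floor), ceiling)
```

For example, a $500 request is raised to the $1,500 floor, while a $1,000,000 request is capped at $30,000, or $150,000 if malice is shown.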
Law against synthesis crimes in Georgia in 2021
Georgia Code § 16-11-90
Georgia Code / Title 16. Crimes and Offenses Chapter 11 "Offenses Against Public Order and Safety", Article 3 "Invasions of Privacy", Part 3 "Invasion of privacy" GA CODE § 16-11-90 [1st seen in 7]
The law is as of April 14, 2021[43] as follows:
(a) As used in this Code section, the term:
- (1) “Harassment” means engaging in conduct directed at a depicted person that is intended to cause substantial emotional harm to the depicted person.
- (2) “Nudity” means:
- (A) The showing of the human male or female genitals, pubic area, or buttocks without any covering or with less than a full opaque covering;
- (B) The showing of the female breasts without any covering or with less than a full opaque covering; or
- (C) The depiction of covered male genitals in a discernibly turgid state.
- (3) “Sexually explicit conduct” shall have the same meaning as set forth in Code Section 16-12-100 .
(b) A person violates this Code section if he or she, knowing the content of a transmission or post, knowingly and without the consent of the depicted person:
- (1) Electronically transmits or posts, in one or more transmissions or posts, a photograph or video which depicts nudity or sexually explicit conduct of an adult, including a falsely created videographic or still image, when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person; or
- (2) Causes the electronic transmission or posting, in one or more transmissions or posts, of a photograph or video which depicts nudity or sexually explicit conduct of an adult, including a falsely created videographic or still image, when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person.
- Nothing in this Code section shall be construed to impose liability on an interactive computer service, as such term is defined in 47 U.S.C. 230(f)(2) , or an information service or telecommunications service, as such terms are defined in 47 U.S.C. 153 , for content provided by another person.
(c) Any person who violates this Code section shall be guilty of a misdemeanor of a high and aggravated nature; provided, however, that upon a second or subsequent violation of this Code section, he or she shall be guilty of a felony and, upon conviction thereof, shall be punished by imprisonment of not less than one nor more than five years, a fine of not more than $100,000.00, or both.
(d) A person shall be subject to prosecution in this state pursuant to Code Section 17-2-1 for any conduct made unlawful by this Code section which the person engages in while:
- (1) Either within or outside of this state if, by such conduct, the person commits a violation of this Code section which involves an individual who resides in this state; or
- (2) Within this state if, by such conduct, the person commits a violation of this Code section which involves an individual who resides within or outside this state.
(e) The provisions of subsection (b) of this Code section shall not apply to:
- (1) The activities of law enforcement and prosecution agencies in the investigation and prosecution of criminal offenses;
- (2) Legitimate medical, scientific, or educational activities;
- (3) Any person who transmits or posts a photograph or video depicting only himself or herself engaged in nudity or sexually explicit conduct;
- (4) The transmission or posting of a photograph or video that was originally made for commercial purposes;
- (5) Any person who transmits or posts a photograph or video depicting a person voluntarily engaged in nudity or sexually explicit conduct in a public setting; or
- (6) A transmission that is made pursuant to or in anticipation of a civil action.
(f) There shall be a rebuttable presumption that an information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet, for content provided by another person, does not know the content of an electronic transmission or post.
(g) Any violation of this Code section shall constitute a separate offense and shall not merge with any other crimes set forth in this title.
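Subsection (c) of the Georgia statute above grades the offense by prior record: a first violation is a misdemeanor of a high and aggravated nature, and any repeat violation is a felony punishable by one to five years' imprisonment, a fine of up to $100,000, or both. A minimal sketch of that grading rule follows; the function name and return shape are hypothetical, and this is an illustration, not legal advice.

```python
def classify_ga_offense(prior_convictions: int) -> dict:
    """Grade a violation of GA Code section 16-11-90 per subsection (c).

    Illustrative sketch only: first offense is a misdemeanor of a
    high and aggravated nature; second or subsequent offenses are
    felonies (1-5 years' imprisonment, fine up to $100,000, or both).
    """
    if prior_convictions == 0:
        return {"grade": "misdemeanor of a high and aggravated nature"}
    return {
        "grade": "felony",
        "prison_years": (1, 5),   # not less than one nor more than five
        "max_fine": 100_000,
    }
```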
Law against synthesis crimes in New York State in 2021
New York State - CHAPTER 6 Civil Rights ARTICLE 5 Right of Privacy SECTION 52-C - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual
Consolidated Laws of New York / CHAPTER 6 Civil Rights ARTICLE 5 Right of Privacy / SECTION 52-C - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual[1st seen in 7]
The law is as follows:
§ 52-c - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual. (Note: the Consolidated Laws contain two sections numbered § 52-c.)
1. For the purposes of this section:
- a. "depicted individual" means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section.
- b. "digitization" means to realistically depict the nude body parts of another human being as the nude body parts of the depicted individual, computer-generated nude body parts as the nude body parts of the depicted individual or the depicted individual engaging in sexual conduct, as defined in subdivision ten of section 130.00 of the penal law, in which the depicted individual did not engage.
- c. "individual" means a natural person.
- d. "person" means a human being or legal entity.
- e. "sexually explicit material" means any portion of an audio visual work that shows the depicted individual performing in the nude, meaning with an unclothed or exposed intimate part, as defined in section 245.15 of the penal law, or appearing to engage in, or being subjected to, sexual conduct, as defined in subdivision ten of section 130.00 of the penal law.
2. a. A depicted individual shall have a cause of action against a person who, discloses, disseminates or publishes sexually explicit material related to the depicted individual, and the person knows or reasonably should have known the depicted individual in that material did not consent to its creation, disclosure, dissemination, or publication.
- b. It shall not be a defense to an action under this section that there is a disclaimer in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.
3. a. A depicted individual may only consent to the creation, disclosure, dissemination, or publication of sexually explicit material by knowingly and voluntarily signing an agreement written in plain language that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated.
- b. A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made, unless one of the following requirements is satisfied:
- i. the depicted individual is given at least three business days to review the terms of the agreement before signing it; or
- ii. if the depicted individual is represented, the attorney, talent agent, or personal manager authorized to represent the depicted individual provides additional written approval of the signed agreement.
4. a. A person is not liable under this section if:
- i. the person discloses, disseminates or publishes the sexually explicit material in the course of reporting unlawful activity, exercising the person's law enforcement duties, or hearings, trials or other legal proceedings; or
- ii. the sexually explicit material is a matter of legitimate public concern, a work of political or newsworthy value or similar work, or commentary, criticism or disclosure that is otherwise protected by the constitution of this state or the United States; provided that sexually explicit material shall not be considered of newsworthy value solely because the depicted individual is a public figure.
5. In any action commenced pursuant to this section, the finder of fact, in its discretion, may award injunctive relief, punitive damages, compensatory damages, and reasonable court costs and attorney's fees.
6. A cause of action or special proceeding under this section shall be commenced the later of either:
- a. three years after the dissemination or publication of sexually explicit material; or
- b. one year from the date a person discovers, or reasonably should have discovered, the dissemination or publication of such sexually explicit material.
7. Nothing in this section shall be read to require a prior criminal complaint, prosecution or conviction to establish the elements of the cause of action provided for in this section.
8. The provisions of this section including the remedies are in addition to, and shall not supersede, any other rights or remedies available in law or equity.
9. If any provision of this section or its application to any person or circumstance is held invalid, the invalidity shall not affect other provisions or applications of this section which can be given effect without the invalid provision or application, and to this end the provisions of this section are severable.
10. Nothing in this section shall be construed to limit, or to enlarge, the protections that 47 U.S.C. § 230 confers on an interactive computer service for content provided by another information content provider, as such terms are defined in 47 U.S.C. § 230.
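Item 6 of the New York statute above sets the filing deadline as the later of two dates: three years after the dissemination or publication, or one year after actual or constructive discovery. That later-of rule can be sketched as simple date arithmetic; the function name is hypothetical, the sketch ignores leap-day edge cases, and real limitations analysis belongs to a lawyer, not a script.

```python
from datetime import date


def ny_filing_deadline(published: date, discovered: date) -> date:
    """Later of: three years after publication, or one year after
    discovery, per N.Y. Civil Rights Law section 52-c(6).

    Illustrative sketch only; ignores leap-day edge cases
    (date.replace raises ValueError for Feb 29 anniversaries).
    """
    three_years_after_publication = published.replace(year=published.year + 3)
    one_year_after_discovery = discovered.replace(year=discovered.year + 1)
    return max(three_years_after_publication, one_year_after_discovery)
```

For example, material published on 2021-01-01 but only discovered on 2023-06-01 could be acted on until 2024-06-01, the later of the two dates.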
Current bills in the USA
US Senate bill S. 3696 - Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act of 2024) (118th Congress - 2023-2024)
S.3696 - DEFIANCE Act of 2024 at congress.gov, a bipartisan Senate bill against synthetic filth.
- Introductory Statement on S. 3696; Congressional Record Vol. 170, No. 17 at congress.gov 2024-01-30
- Durbin, Graham, Klobuchar, Hawley Introduce DEFIANCE Act to Hold Accountable Those Responsible for the Proliferation of Nonconsensual, Sexually-Explicit “Deepfake” Images and Videos at judiciary.senate.gov[1st seen in 8]
- S. 3696 (IS) - Disrupt Explicit Forged Images And Non-Consensual Edits Act of 2024 at govinfo.gov
- DEFIANCE Act of 2024 at durbin.senate.gov explains that it would create a federal civil remedy for victims who are identifiable in a "digital forgery".
Reporting and commentary on DEFIANCE Act of 2024 bill
- Taylor Swift AI images prompt US bill to tackle nonconsensual, sexual deepfakes at theguardian.com
- Lawmakers propose anti-nonconsensual AI porn bill after Taylor Swift controversy at theverge.com
US House bill H.R.7123 - Quashing Unwanted and Interruptive Electronic Telecommunications Act (QUIET Act) (118th Congress - 2023-2024)
Quashing Unwanted and Interruptive Electronic Telecommunications Act at congress.gov[1st seen in 9]
US House bill H.R.6943 - No AI FRAUD Act (118th Congress - 2023-2024)
No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act of 2024 or the (No AI FRAUD Act) at congress.gov was introduced to the 118th Congress 2nd session on 2024-01-24.[1st seen in 10]
US House bill H.R.5586 - Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023 (118th Congress - 2023-2024)
“Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023” or the “DEEPFAKES Accountability Act” at congress.gov is a reintroduction of earlier House bill H.R.3230 to the 118th Congress (2023-2024 session)
US House bill H.R. 3106 - Preventing Deepfakes of Intimate Images Act (118th Congress - 2023-2024)
Preventing Deepfakes of Intimate Images Act at congress.gov was introduced in the House on 2023-05-05. It was a reintroduction of H.R. 9631 from the 117th Congress.
NY Senate bill S5583 in the 2023-2024 regular session
NY Senate bill S5583 in the 2023-2024 regular session at nysenate.gov would establish the crime of aggravated harassment by means of electronic or digital communication and provide for a private right of action for the unlawful dissemination or publication of deep fakes.
Past bills in the USA
US Senate bill S.4991 - Preventing Rampant Online Technological Exploitation and Criminal Trafficking Act of 2022 (117th Congress)
Senate bill S.4991 - 'PROTECT Act' at congress.gov was introduced by Senator w:Mike Lee and read twice on Wednesday 2022-09-28.
New York Senate bill - Unlawful Electronic Transmission of Sexually Explicit Visual Material - in regular session 2021-2022
The bill 'Unlawful Electronic Transmission of Sexually Explicit Visual Material' is essentially a bill that aims to ban the sending of unsolicited nudes.
In the 2021-2022 w:New York State Senate regular sessions, on 2021-01-14 Senator w:James Skoufis (official website) sponsored and Senators w:Brian Benjamin (official website) and w:Todd Kaminsky (official website) of the New York State Senate co-sponsored New York Senate bill S1641 to add section § 250.70 UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL to the Article 250 of the penal law. On 2021-03-19 an identical New York Assembly bill A6517 - Establishes the crime of unlawful electronic transmission of sexually explicit visual material was introduced to the w:New York State Assembly by Assembly Member w:Aileen Gunther (official website).[1st seen in 11]
Had this bill passed, it would have been codified in the w:Consolidated Laws of New York. View the Consolidated Laws of New York at nysenate.gov.
- Title of bill: An act to amend the penal law, in relation to the creation of the criminal offense of unlawful electronic transmission of sexually explicit visual material
- Purpose: The purpose of this bill is to make it unlawful to send sexually explicit material through electronic means unless the material is sent at the request of, or with the express consent of the recipient.
- Summary of provisions: Adds a new section 250.70 to the penal law making it unlawful to knowingly transmit by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed unless the material is sent at the request of, or with the express consent of the recipient.
- Justification: Currently under New York State law, indecent exposure in person is a crime, but it is not unlawful to send sexually explicit photos to nonconsenting adult recipients through electronic transmission. In the modern age of online dating, many individuals receive sexually explicit visual content from strangers without their consent. No person should be forced to view sexually explicit material without their consent.
The bill offers a clear deterrent to those considering sending unsolicited sexual pics and similar inappropriate conduct, and protects the unwilling recipients who currently have no legal recourse for such abuses.
What is illegal in the real world must be illegal in the digital world, and this legislation is a first step in the right direction in adding that accountability.
- Legislative history:
- Senate - 2020 - S5949 Referred to Codes
- Assembly - 2020 - A7801 Referred to Codes
- Fiscal implications: Minimal
- Effective date: This act shall take effect on the first of November next succeeding the date on which it shall have become a law.
The text of the bill is, as of 2021-03-24, as follows:
- "Section 1. The penal law is amended by adding a new section 250.70 to read as follows:
- § 250.70 UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL.
- A person is guilty of unlawful electronic transmission of sexually explicit visual material if a person knowingly transmits by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed or depicts the covered genitals of a male person that are in a discernibly turgid state and such visual material is not sent at the request of or with the express consent of the recipient. For purposes of this section the term "intimate parts" means the naked genitals, pubic area, anus, or female postpubescent nipple of the person and the term "sexual conduct" shall have the same meaning as defined in section 130.00 (Sex offenses; definitions of terms) of this chapter. Unlawful electronic transmission of sexually explicit visual material is a class a misdemeanor.
- § 2. This act shall take effect on the first of November next succeeding the date on which it shall have become a law."
US Senate bill - Stop Internet Sexual Exploitation Act - 2019-2020 US Senate session (116th Congress)
The Stop Internet Sexual Exploitation Act (SISE) was a bill introduced to the 2019-2020 session of the US Senate.
US House bill - H.R.3230 - DEEP FAKES Accountability Act (116th Congress)
- H.R.3230 - DEEP FAKES Accountability Act at congress.gov[1st seen in 4], also known as the Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2019, aimed to require producers of synthetic human-like fakes to generally comply with certain w:digital watermark and disclosure requirements.[45]
US Senate bill - Malicious Deep Fake Prohibition Act of 2018 (115th Congress)
- S.3805 - Malicious Deep Fake Prohibition Act of 2018 at congress.gov[1st seen in 4] aimed at criminalizing the synthesis of human-likeness media with the intent of breaking federal, state, local or tribal law
Law proposals
Law proposals against synthetic filth by Juho Kunsola
- Audience: Developed with suitability for national, supranational and UN treaty levels.
- Writing context:
- Written from context of inclusion to criminal codes.
- I'm a Finn, so this has been worded to fit into Chapter 24 of the Criminal Code of Finland (in Finnish at finlex.fi), titled "Offences against privacy, public peace and personal reputation"
- Access the English translations of the Finnish Criminal Code at finlex.fi or go straight to the latest .pdf from 2016. Chapter 24 starts on page 107.
- History: This version is an evolution of a Finnish language original written in 2016.
Existing law in Chapter 24 of the Finnish Criminal Code - "Offences against privacy, public peace and personal reputation" - seems ineffective against many synthetic human-like fake attacks, and it seems it could even be used to frame victims for crimes committed with digital sound-alikes.
The portions affected by, or affecting, the synthetic filth situation are shown in bold:
- Section 1 - Invasion of domestic premises (879/2013)
- Section 1(a) - Harassing communications (879/2013)
- Section 2 - Aggravated invasion of domestic premises (531/2000)
- Section 3 - Invasion of public premises (585/2005)
- Section 4 - Aggravated invasion of public premises (531/2000)
- Section 5 - Eavesdropping (531/2000)
- Section 6 - Illicit observation (531/2000)
- Section 7 - Preparation of eavesdropping or illicit observation (531/2000)
- Section 8 - Dissemination of information violating personal privacy (879/2013)
- Section 8(a) - Aggravated dissemination of information violating personal privacy (879/2013)
- Section 9 - Defamation (879/2013)
- Section 10 - Aggravated defamation (879/2013)
- Section 11 - Definition (531/2000)
- Section 12 - Right to bring charges (879/2013)
- Section 13 - Corporate criminal liability (511/2011)
Law proposal to ban visual synthetic filth
§1 Models of human appearance
A model of human appearance means
- A realistic 3D model
- A 7D bidirectional reflectance distribution function model
- A direct-to-2D capable w:machine learning model
- Or a model made with any technology whatsoever that looks deceptively like the target person.
§2 Producing synthetic pornography
Making projections, still or videographic, in which targets are portrayed nude or in a sexual situation, from models of human appearance defined in §1, without the express consent of the targets, is illegal.
§3 Distributing synthetic pornography
Distributing, making available, public display, purchase, sale, yielding, import and export of non-authorized synthetic pornography defined in §2 are punishable.[footnote 1]
§4 Aggravated producing and distributing synthetic pornography
If the media described in §2 or §3 is made or distributed with the intent to frame for a crime or for blackmail, the crime should be judged as aggravated.
Afterword
The original idea was to ban both the raw materials, i.e. the models used to make visual synthetic filth, and the end product, weaponized synthetic pornography. But in July 2019 it appeared to me that the Adequate Porn Watcher AI (concept) could really help in this age of industrial disinformation if it were built, trained and operational; banning the modeling of human appearance was in conflict with this revised plan.
It is safe to assume that collecting permissions to model each pornographic recording is not plausible, so an interesting question is whether we can ban covert modeling from non-pornographic pictures while still retaining the ability to model all porn found on the Internet.
If banning the modeling of people's appearance from non-pornographic images and videos without explicit permission is pursued, the ban must be formulated so that it does not make the Adequate Porn Watcher AI (concept) illegal or impossible. This would seem to lead to a weird situation where modeling a human from non-pornographic media would be illegal, but modeling from pornography legal.
Law proposal to ban unauthorized modeling of human voice
Motivation: The current situation, where criminals can freely trade and grow their libraries of stolen voices, is unwise.
§1 Unauthorized modeling of a human voice
Acquiring a model of a human voice that deceptively resembles the voice of any living or dead person, as well as its possession, purchase, sale, yielding, import and export, without the express consent of the target, is punishable.
§2 Application of unauthorized voice models
Producing and making available media from covert voice models defined in §1 is punishable.
§3 Aggravated application of unauthorized voice models
If the media is produced in order to
- frame a human target or targets for crimes,
- attempt extortion, or
- defame the target,
the crime should be judged as aggravated.
Resources and reporting on law
AI and law in general
Reviews and regulation from the w:Library of Congress:
- 'Regulation of Artificial Intelligence' at loc.gov
- 'Regulation of Artificial Intelligence: Comparative Summary' at loc.gov
- 'Regulation of Artificial Intelligence: International and Regional Approaches' (loc.gov)
- 'Regulation of Artificial Intelligence: The Americas and the Caribbean' (loc.gov)
- 'Regulation of Artificial Intelligence: East/South Asia and the Pacific' (loc.gov)
- 'Regulation of Artificial Intelligence: Europe and Central Asia' loc.gov
- 'Regulation of Artificial Intelligence: Middle East and North Africa' (loc.gov)
- 'Regulation of Artificial Intelligence: Sub-Saharan Africa' (loc.gov)
w:Gibson Dunn & Crutcher (gibsondunn.com) publishes a quarterly legal update on 'Artificial Intelligence and Autonomous Systems'. Gibson Dunn & Crutcher is a global w:law firm, founded in Los Angeles in 1890.
- 'Artificial Intelligence and Autonomous Systems Legal Update' Quarter 4 2018 at Gibson & Dunn
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 1 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 2 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 3 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 4 2019'
From Europe
- 'The ethics of artificial intelligence: Issues and initiatives' (.pdf) at europarl.europa.eu, a March 2020 study by the w:European Parliamentary Research Service. Starting from page 37, the .pdf lists organizations in the field.
Synthetic filth in the law and media
- '"The New Weapon of Choice": Law's Current Inability to Properly Address Deepfake Pornography' at scholarship.law.vanderbilt.edu, October 2020 Notes by Anne Pechenik Gieseke, published in The w:Vanderbilt Law Review, the flagship w:academic journal of w:Vanderbilt University Law School.
- 'Deepfakes and Synthetic Media in the Financial System: Assessing Threat Scenarios' at carnegieendowment.org, a 2020-07-08 assessment that identifies some types of criminality that can be committed using synthetic human-like fakes.
- 'Don’t Believe Your Eyes (or Ears): The Weaponization of Artificial Intelligence, Machine Learning, and Deepfakes' at ssri.duke.edu, an October 2019 news article by Joe Littell, published by the Social Science Research Institute at the w:Duke University
- 'Deepfakes: False Pornography Is Here and the Law Cannot Protect You' at scholarship.law.duke.edu, published in 2019 in the Duke Law Journal, a student-run law review.
- States Are Rushing to Regulate Deepfakes as AI Goes Mainstream at bloomberg.com (paywalled), June 2023 reporting on US legislation against synthetic human-like fakes. It lists CA, WA, WY, MN, TX, GA and VA as having anti-deepfake laws and lists LA, IL, MA and NJ as planning to legislate.
The countries that have unfortunately banned full face veil
“There are currently 16 nations that have banned the burqa (not to be confused with the hijab), including w:Tunisia,[46] w:Austria, w:Denmark, w:France, w:Belgium, w:Tajikistan, w:Latvia,[47] w:Bulgaria,[48] w:Cameroon, w:Chad, w:Congo-Brazzaville, w:Gabon, w:Netherlands,[49] w:China,[50] w:Morocco, and w:Switzerland.”
Taking into consideration these times of industrial disinformation, it is vicious and uncivilized to have laws banning the wearing of the full face veil in public.
Quotes on the current laws and their application
“If no-one who wants to hurt you knows what you look like, how could someone malevolent make a covert digital look-alike of you?”
Footnotes
- ↑ People who are found in possession of this synthetic pornography should probably not be penalized, but rather advised to get some help.
1st seen in
- ↑ 1.0 1.1 https://equalitynow.org/resource/briefing-paper-deepfake-image-based-sexual-abuse-tech-facilitated-sexual-exploitation-and-the-law/
- ↑ Politico AI: Decoded mailing list Wednesday 2022-02-02
- ↑ https://artificialintelligenceact.eu/the-act/ via https://futureoflife.org/ newsletter
- ↑ 4.0 4.1 4.2
Chatting with ChatGPT 2023
- First I asked ChatGPT to "list some legislative approaches against so-called "deep fakes" or "deepfakes"" and it mentioned the Singaporean #Protection from Online Falsehoods and Manipulation Act 2019 and the #Bill - Malicious Deep Fake Prohibition Act of 2018 (115th Congress)
- ↑ https://www.dailymaverick.co.za/article/2021-12-01-not-all-of-the-cogs-in-the-cybercrimes-act-machine-are-turning-at-once-we-still-remain-vulnerable/
- ↑ https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches
- ↑ 7.0 7.1 https://cybercivilrights.org/deep-fake-laws/
- ↑ https://onfido.com/blog/deepfake-law/
- ↑ https://www.newyorker.com/science/annals-of-artificial-intelligence/the-terrifying-ai-scam-that-uses-your-loved-ones-voice
- ↑ https://onfido.com/blog/deepfake-law/
- ↑ First seen in the suggestions for similar bills for Bills similar to CA AB602 by trackbill.com.
References
- ↑
"You Won't Believe What Obama Says In This Video!". w:YouTube. w:BuzzFeed. 2018-04-17. Retrieved 2022-01-05.
We're entering an era in which our enemies can make anyone say anything at any point in time.
- ↑ Lawson, Amanda (2023-04-24). "A Look at Global Deepfake Regulation Approaches". responsible.ai. Responsible Artificial Intelligence Institute. Retrieved 2024-02-14.
- ↑ Quirk, Caroline (2023-06-19). "The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology". legaljournal.princeton.edu. Princeton Legal Journal. Retrieved 2024-02-14.
- ↑ Williams, Kaylee (2023-05-15). "Exploring Legal Approaches to Regulating Nonconsensual Deepfake Pornography". techpolicy.press. Retrieved 2024-02-14.
- ↑ Owen, Aled (2024-02-02). "Deepfake laws: is AI outpacing legislation?". onfido.com. Onfido. Retrieved 2024-02-14.
- ↑ 6.0 6.1 Pirius, Rebecca (2024-02-07). "Is Deepfake Pornography Illegal?". Criminaldefenselawyer.com. w:Nolo (publisher). Retrieved 2024-02-22.
- ↑ 7.0 7.1 Rastogi, Janvhi (2023-10-16). "Deepfake Pornography: A Legal and Ethical Menace". tclf.in. The Contemporary Law Forum. Retrieved 2024-02-14.
- ↑ https://equalitynow.org/resource/briefing-paper-deepfake-image-based-sexual-abuse-tech-facilitated-sexual-exploitation-and-the-law/
- ↑ https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches
- ↑ Viersen, Arnold (2022-04-28). "Stopping Internet Sexual Exploitation Act - An Act to amend the Criminal Code (pornographic material)". parl.ca. w:House of Commons of Canada. Retrieved 2022-10-06.
- ↑ Bilingual version of C-270 https://publications.gc.ca/collections/collection_2022/parl/XB441-270-1.pdf
- ↑ "China seeks to root out fake news and deepfakes with new online content rules". w:Reuters.com. w:Reuters. 2019-11-29. Retrieved 2021-01-23.
- ↑ Statt, Nick (2019-11-29). "China makes it a criminal offense to publish deepfakes or fake news without disclosure". w:The Verge. Retrieved 2021-01-23.
- ↑
"Artificial Intelligence Act: MEPs adopt landmark law". europarl.europa.eu. w:European Parliament. 2024-03-13. Retrieved 2024-03-22.
The regulation, agreed in negotiations with member states in December 2023, was endorsed by MEPs with 523 votes in favour, 46 against and 49 abstentions.
- ↑ https://www.politico.eu/article/eu-artificial-intelligence-act-ai-technology-risk-rules/
- ↑ https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches
- ↑ https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation
- ↑ 18.0 18.1 18.2 18.3 18.4 18.5 18.6 18.7
The authoritative, up-to-date version of the Criminal Code chapter 20 "On sexual offences" can always be found at finlex.fi
Translation to English by the Ministry of Justice: Criminal Code (39/1889) - Chapter 20 - Sexual offences (translation) as .pdf at oikeusministerio.fi (subject to possible revisions)
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=76&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=79&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=80&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=81#:~:text=Whoever%2C%20intentionally%20or%20knowingly%20captures,two%20lakh%20rupees%2C%20or%20with
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=83&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=84&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=85&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ Rana, Vikrant; Gandhi, Anuradha; Thakur, Rachita (2023-11-24). "Deepfakes And Breach Of Personal Data – A Bigger Picture". livelaw.in. Retrieved 2024-02-21.
- ↑
"What Is Deep Fake Cyber Crime? What Does Indian Law Say About It?". cybercert.in. Retrieved 2024-03-23.
At present, India does not have any law specifically for deep fake cybercrime, but various other laws can be combined to deal with it.
- ↑ https://www.police.govt.nz/advice-services/cybercrime-and-internet/online-child-safety
- ↑ https://www.dailymaverick.co.za/article/2021-12-01-not-all-of-the-cogs-in-the-cybercrimes-act-machine-are-turning-at-once-we-still-remain-vulnerable/
- ↑ 'Creating sexually explicit deepfake images to be made offence in UK' at theguardian.com
- ↑ https://www.gov.uk/government/publications/online-safety-act-new-criminal-offences-circular/online-safety-act-new-criminal-offences-circular
- ↑ https://revengepornhelpline.org.uk/information-and-advice/need-help-and-advice/threats-to-share-intimate-images/
- ↑
Royle, Sara (2021-01-05). "'Deepfake porn images still give me nightmares'". w:BBC Online. w:BBC. Retrieved 2021-01-31.
She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.
- ↑
Mort, Helen (2020). "Change.org petition: 'Tighten regulation on taking, making and faking explicit images'". w:Change.org. w:Change.org. Retrieved 2021-01-31.
Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.
- ↑ "New state laws go into effect July 1".
- ↑ 36.0 36.1 "§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty". w:Virginia. Retrieved 2021-01-23.
- ↑
"Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election". w:Texas. 2019-06-14. Retrieved 2021-01-23.
In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality
- ↑ https://capitol.texas.gov/BillLookup/History.aspx?LegSess=86R&Bill=SB751
- ↑ Johnson, R.J. (2019-12-30). "Here Are the New California Laws Going Into Effect in 2020". KFI. iHeartMedia. Retrieved 2021-01-23.
- ↑ "AB 602 - California Assembly Bill 2019-2020 Regular Session - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action". openstates.org. openstates.org. Retrieved 2021-03-24.
- ↑ Mihalcik, Carrie (2019-10-04). "California laws seek to crack down on deepfakes in politics and porn". w:cnet.com. w:CNET. Retrieved 2021-01-23.
- ↑ Berman, Marc; Leyva, Connie (2019), "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action.", w:California
- ↑ 43.0 43.1 "Georgia Code Title 16. Crimes and Offenses § 16-11-90". w:FindLaw. w:Georgia (U.S. state). 2021-04-14. Retrieved 2022-01-04.
- ↑ "SECTION 52-C Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual". nysenate.gov. w:New York State Legislature. 2021-11-12. Retrieved 2021-01-04.
- ↑ https://www.congress.gov/bill/116th-congress/house-bill/3230
- ↑ "Tunisian PM bans wearing of niqab in public institutions". Reuters. 5 July 2019. Retrieved 2021-03-13.
- ↑ "A European government has banned Islamic face veils despite them being worn by just three women". 21 April 2016. Retrieved 2021-03-13.
- ↑ "Bulgaria the latest European country to ban the burqa and niqab in public places", Smh.com.au, accessed 5 December 2016.
- ↑ Halasz, Stephanie; McKenzie, Sheena (27 June 2018). "The Netherlands introduces burqa ban in some public spaces". CNN. Retrieved 2021-03-13.
- ↑ Phillips, Tom (13 January 2015). "China bans burqa in capital of Muslim region of Xinjiang". The Telegraph. Retrieved 2021-03-13.