Laws against synthesis and other related crimes
[[File:Justice scale silhouette, medium.svg|thumb|right|200px|This article is an attempt to track legislation against synthetic human-like fakes world-wide. Do you know of a law or jurisdiction that this list does not include? Add it! Or let us know through the chat in the lower right-hand corner!]]
This article contains some current laws against abusive uses of synthetic human-like fakes, information on what kinds of laws are being prepared, and two SSFWIKI original law proposals, [[#Law proposal to ban visual synthetic filth|one against digital look-alikes]] and [[#Law proposal to ban unauthorized modeling of human voice|one against digital sound-alikes]].
New bills are currently in the works:
* The [[#EU|European Union]] is preparing a law package to regulate AI
* [[#China|China]] seems to be planning to ban all synthetic pornography, however consensual its making was
* [[#Canada|Canada's House of Commons]] is considering banning all pornographic content for which there is no proof of age and written consent from everyone visible in the recording.
'''Information elsewhere''' (recommended)
* [https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches '''''A Look at Global Deepfake Regulation Approaches''''' at responsible.ai]
* [https://legaljournal.princeton.edu/the-high-stakes-of-deepfakes-the-growing-necessity-of-federal-legislation-to-regulate-this-rapidly-evolving-technology/ '''''The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology''''' at legaljournal.princeton.edu]
= Canada =
[[File:Flag of Canada (Pantone).svg|thumb|right|200px|The House of Commons of Canada was contemplating banning distribution of all pornography for which there are no consent declarations and proof of age for the people depicted.]]
== Active bills ==
=== Stopping Internet Sexual Exploitation Act - House of Commons of Canada bill C-270 ===
'''Summary of the C-270 from parl.ca'''
"''This enactment amends the Criminal Code to prohibit a person from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, '''each person''' whose image is '''depicted''' in the material was '''18 years of age''' or older and '''gave their express consent''' to their image being depicted.''"<ref name="Canada C-270">
{{cite web
'''Sommaire en français / Summary in French'''
''Le texte modifie le Code criminel afin d’interdire à toute personne de produire ou de distribuer du matériel pornographique à des fins commerciales, ou d’en faire la publicité, sans s’être au préalable assurée qu’au moment de la production du matériel, '''chaque personne''' dont l’image y est représentée '''était âgée de dix-huit ans''' ou plus et '''avait donné son consentement exprès''' à ce que son image y soit représentée''.<ref>Bilingual version of C-270 https://publications.gc.ca/collections/collection_2022/parl/XB441-270-1.pdf</ref>
'''Links'''
= China =
This information should be updated.
== Law against synthesis crimes in China 2020 ==
[[File:Flag of China.png|thumb|right|200px|China passed a law requiring faked footage to be labeled as such, effective 2020-01-01]]
<section begin=China2020 />On Wednesday, January 1, 2020, a Chinese law requiring that synthetically faked footage bear a clear notice of its fakeness came into effect. Failure to comply could be considered a [[w:crime]], the [[w:Cyberspace Administration of China]] ([http://www.cac.gov.cn/ cac.gov.cn]) stated on its website. China announced this new law in November 2019.<ref name="Reuters2019">
</ref>
<section end=China2020 />
== Draft bill against synthesis crimes in China 2022 ==
In 2022 the '''[[w:Cyberspace Administration of China]]''' published a draft bill called [https://www.chinalawtranslate.com/en/deep-synthesis-draft/ '''''Provisions on the Administration of Deep Synthesis Internet Information Services''''' (Draft for solicitation of comments) at chinalawtranslate.com], or [http://www.cac.gov.cn/2022-01/28/c_1644970458520968.htm view the Chinese-language draft '''''国家互联网信息办公室关于《互联网信息服务深度合成管理规定(征求意见稿)》公开征求意见的通知''''' at cac.gov.cn]<ref group="1st seen in">'''Politico AI: Decoded''' mailing list, Wednesday 2022-02-02</ref>
----
= EU =
== EU law on AI ==
[[File:Flag of Europe.svg|thumb|right|200px|The EU must address the malicious uses of AI in the law with which it plans to regulate AI.]]
The European Union is planning a law on AI called the [[w:Artificial Intelligence Act]]. The European Commission proposed the AI Act in 2021.
* The European '''Artificial Intelligence Act''' has been approved by the member countries and is on track for final approval by April 2024.<ref>https://www.politico.eu/article/eu-artificial-intelligence-act-ai-technology-risk-rules/</ref> Read the [https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52021PC0206 Proposal for a '''''REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS''''' at eur-lex.europa.eu]<ref group="1st seen in">https://artificialintelligenceact.eu/the-act/ via the https://futureoflife.org/ newsletter</ref> (also contains translations)
* The [https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package '''Digital Services Act''' package at digital-strategy.ec.europa.eu] ([[w:Digital Services Act]], DSA) came into force in November 2022.<ref>https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches</ref>
* The Artificial Intelligence Act and the Digital Services Act together are intended to shield us from synthesis crimes.
* There is also a [https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation '''2022 Code of Practice on Disinformation''' at digital-strategy.ec.europa.eu] - ''Major online platforms, emerging and specialised platforms, players in the advertising industry, fact-checkers, research and civil society organisations delivered a strengthened Code of Practice on Disinformation following the Commission’s Guidance of May 2021''.<ref>https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation</ref>
'''Studies and information'''
* [https://www.europarl.europa.eu/stoa/en/document/EPRS_STU(2021)690039 '''''Tackling deepfakes in European policy''''' at europarl.europa.eu], a 2021 study by the Panel for the Future of Science and Technology, published by the [[w:European Parliamentary Research Service]]. [https://www.europarl.europa.eu/RegData/etudes/STUD/2021/690039/EPRS_STU(2021)690039_EN.pdf View .pdf at europarl.europa.eu]
* https://artificialintelligenceact.eu/ is a website on the planned law, maintained by the [https://futureoflife.org/ Future of Life Institute], an American non-profit NGO.
----
= Finland =
=== Finland criminalized synthetic CSAM in 2011 ===
{{#lst:Law on sexual offences in Finland 2023|history}}
----
----
= Singapore =
[[File:Flag of Singapore.svg|thumb|right|200px|Singapore]]
== Law in Singapore ==
=== Protection from Online Falsehoods and Manipulation Act 2019 ===
The [[w:Protection from Online Falsehoods and Manipulation Act 2019]]<ref group="1st seen in" name="ChatGPT 2023 inquiry" /> is a [[w:statute]] of the [[w:Parliament of Singapore]] that enables authorities to tackle the spread of [[w:fake news]] or [[w:false information]]. ([https://en.wikipedia.org/w/index.php?title=Protection_from_Online_Falsehoods_and_Manipulation_Act_2019&oldid=1134029282 Wikipedia])
----
= UK =
* [https://www.parliament.uk/site-information/glossary/ 'Glossary' at parliament.uk]
* [https://www.lawsociety.org.uk/public/for-public-visitors/resources/glossary 'Legal glossary' at lawsociety.org.uk]
== Law against synthesis crimes in the UK ==
[[File:Flag of the United Kingdom.svg|thumb|right|200px|The UK has improved its legislation.]]
=== The Domestic Abuse Act 2021 Chapter 17, part 6 - Disclosure of private sexual photographs and films ===
'''Links'''
* [https://www.legislation.gov.uk/ukpga/2021/17/part/6/crossheading/disclosure-of-private-sexual-photographs-and-films/enacted '''Domestic Abuse Act 2021''' / '''Chapter 17''' / '''Part 6''' / '''''Disclosure of private sexual photographs and films''''' - '''''Threats to disclose private sexual photographs and films with intent to cause distress''''' at legislation.gov.uk]
=== Online Safety Act 2023 ===
The [[w:Online Safety Act 2023]] ([https://www.legislation.gov.uk/ukpga/2023/50/enacted '''Online Safety Act 2023''' at legislation.gov.uk]) reportedly criminalizes non-consensual synthetic pornography.
The Online Safety Act 2023 developed from the House of Lords bill [https://bills.parliament.uk/bills/3137 '''HL Bill 151''' - '''''Online Safety Bill''''' at bills.parliament.uk], which originated in the House of Commons during the 2021-22 and 2022-23 sessions.
* [https://www.gov.uk/guidance/a-guide-to-the-online-safety-bill ''A guide to the Online Safety Bill'' at gov.uk]
* [https://www.gov.uk/government/collections/online-safety-bill ''Documents, publications and announcements relating to the government's Online Safety Bill'' at gov.uk]
=== History of the UK law against synthesis crimes ===
= USA =
[[File:Flag of the United States.svg|thumb|right|200px|Various US states have enacted state laws aimed against synthetic human-like fakes, but it seems that the USA has no federal legislation against this menace, even though [[#Past bills in the USA|federal bills have been introduced in the USA]].]]
* [https://www.congress.gov/legislative-process/introduction-and-referral-of-bills '''The Legislative Process: Introduction and Referral of Bills (Video)''' at congress.gov]
* {{#lst:Glossary|US-Congress-glossary}}
The [https://law.lis.virginia.gov/vacode/18.2-386.2/ section '''§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty.''' of Virginia] is as follows:
'''A'''. ''Any [[w:person]] who, with the [[w:Intention (criminal law)|w:intent]] to [[w:coercion|w:coerce]], [[w:harassment|w:harass]], or [[w:intimidation|w:intimidate]], [[w:Malice_(law)|w:malicious]]ly [[w:dissemination|w:disseminates]] or [[w:sales|w:sells]] any videographic or still image created by any means whatsoever that [[w:Depiction|w:depicts]] another person who is totally [[w:nudity|w:nude]], or in a state of undress so as to expose the [[w:sex organs|w:genitals]], pubic area, [[w:buttocks]], or female [[w:breast]], where such person knows or has reason to know that he is not [[w:license]]d or [[w:authorization|w:authorized]] to disseminate or sell such [[w:Video|w:videographic]] or [[w:Film still|w:still image]] is [[w:Guilt (law)|w:guilty]] of a Class 1 [[w:Misdemeanor#United States|w:misdemeanor]].''
::''For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's [[w:face]], [[w:Simulacrum|w:likeness]], or other distinguishing characteristic.''
The identical bills were [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+HB2678 House Bill 2678], presented by [[w:Delegate (American politics)|w:Delegate]] [[w:Marcus Simon]] to the [[w:Virginia House of Delegates]] on January 14, 2019; three days later the identical [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+SB1736 Senate bill 1736] was introduced to the [[w:Senate of Virginia]] by Senator [[w:Adam Ebbin]].
<section end=Virginia2019 />
The text of '''S.B. No. 751''' is as follows:
'''''AN ACT relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election.'''''
* ('''e''') ''In this section, "deep fake video" means a video, created with the '''intent to deceive''', that '''appears to depict''' a real person performing an action that '''did not occur''' in '''reality'''.''
SECTION 2. ''This Act takes effect September 1, 2019.''
----
Existing law creates a private [[w:Cause of action|w:right of action]] against a person who intentionally distributes a photograph or recorded image of another that exposes the intimate body parts of that person or of a person engaged in a sexual act '''without the person’s consent''' if specified conditions are met.
This bill would provide that a '''depicted individual''', as defined, has a '''[[w:cause of action]] against''' a person who either
* (1) '''creates''' and intentionally discloses sexually explicit material if the person knows or reasonably should have known the '''depicted''' individual '''did not [[w:consent]]''' to its creation or disclosure, or
* (2) '''intentionally discloses''' sexually explicit material that the person did not create if the person knows the '''depicted''' individual '''did not consent''' to its creation.
'''The law is as follows''':
SECTION 1. Section 1708.86 is added to the [https://leginfo.legislature.ca.gov/faces/codesTOCSelected.xhtml?tocCode=CIV&tocTitle=+Civil+Code+-+CIV Civil Code of California], to read:
('''g''') The provisions of this section are severable. If any provision of this section or its application is held invalid, that invalidity shall not affect other provisions.
<ref name="CaliforniaStateLaw AB 602">
{{Citation
'''The law is''' as of April 14, 2021<ref name="Georgia Code § 16-11-90 at findlaw.com" /> as follows:
('''a''') As used in this Code section, the term:
('''f''') There shall be a rebuttable presumption that an information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet, for content provided by another person, does not know the content of an electronic transmission or post.
('''g''') Any violation of this Code section shall constitute a separate offense and shall not merge with any other crimes set forth in this title.<ref name="Georgia Code § 16-11-90 at findlaw.com">
{{cite web
'''The law is as follows:'''
<big>'''§ 52-c''' - '''''Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual.'''''</big> <sup>Note: there are two § 52-c's</sup>
'''10.''' Nothing in this section shall be construed to limit, or to enlarge, the protections that 47 U.S.C. § 230 confers on an interactive computer service for content provided by another information content provider, as such terms are defined in 47 U.S.C. § 230.<ref name="New York State Civil Rights CHAPTER 6, ARTICLE 5, SECTION 52-C">
{{cite web
Line 863: | Line 603: | ||
== Current bills in the USA == | == Current bills in the USA == | ||
=== US House bill H.R.5586 - Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023 === | |||
=== US House bill H.R.5586 - Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023 | |||
[https://www.congress.gov/bill/118th-congress/house-bill/5586/text “Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023” or the “DEEPFAKES Accountability Act” at congress.gov] is a reintroduction of earlier House bill H.R.3230 to the 118th Congress (2023-2024 session) | [https://www.congress.gov/bill/118th-congress/house-bill/5586/text “Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023” or the “DEEPFAKES Accountability Act” at congress.gov] is a reintroduction of earlier House bill H.R.3230 to the 118th Congress (2023-2024 session) | ||
=== NY Senate bill S5583 in the 2023-2024 regular session === | === NY Senate bill S5583 in the 2023-2024 regular session === | ||
* [https://ssri.duke.edu/news/don%E2%80%99t-believe-your-eyes-or-ears-weaponization-artificial-intelligence-machine-learning-and '''''Don’t Believe Your Eyes (or Ears): The Weaponization of Artificial Intelligence, Machine Learning, and Deepfakes''''' at ssri.duke.edu], an October 2019 news article by Joe Littell, published by the Social Science Research Institute at [[w:Duke University]]
* [https://scholarship.law.duke.edu/dltr/vol17/iss1/4/ '''''Deepfakes: False Pornography Is Here and the Law Cannot Protect You''''' at scholarship.law.duke.edu], published in 2019 in the [[w:Duke Law Journal|Duke Law Journal]], a student-run law review.
== The countries that have unfortunately banned full face veil ==