Laws against synthesis and other related crimes
[[File:No-synthetic-F.png|thumb|right|260px|link=Main Page|This article is an attempt to track legislation against '''[[synthetic human-like fakes]]''' world-wide.]]
[[File:Screenshot at 27s of a moving digital-look-alike made to appear Obama-like by Monkeypaw Productions and Buzzfeed 2018.png|thumb|right|480px|link=Mediatheque/2018/Obama's appearance thieved - a public service announcement digital look-alike by Monkeypaw Productions and Buzzfeed|{{#lst:Mediatheque|Obama-like-fake-2018}}]]
[[File:Appearance of Richard Nixon stolen by the Center for Advanced Virtuality of MIT for their awareness raising project In The Event of Moon Disaster 2020 screenshot at 376s.png|thumb|right|480px|link=Mediatheque/2020/Nixon's appearance and voice stolen - a public service announcement digital look-alike and sound-alike by MIT's Center for Advanced Virtuality|{{#lst:Mediatheque|Nixon-like-fake-2020}}]]
[[File:Justice scale silhouette, medium.svg|thumb|right|200px|Do you know of a law or jurisdiction that this list does not include? Add it! Or let us know through the chat in the lower right-hand corner!]]
This article contains some current laws against abusive uses of synthetic human-like fakes, information on what kinds of laws are being prepared, and two SSFWIKI original law proposals: [[#Law proposal to ban visual synthetic filth|one against digital look-alikes]] and [[#Law proposal to ban unauthorized modeling of human voice|one against digital sound-alikes]].
'''New laws'''
* The [[#UK|UK]]'s [[#Online Safety Act 2023|Online Safety Act 2023]] has been passed into law and reportedly criminalizes non-consensual synthetic pornography.
* The [[#EU|European Union]] has finalized a law package to regulate AI, called the [[#Artificial Intelligence Act|Artificial Intelligence Act]].
'''New bills are currently in the works'''
* Current [[#Bills in the EU|bills in the EU]] of high importance are the almost finalized [[#Directive on combating violence against women and domestic violence|Directive on combating violence against women and domestic violence]] and the [[#Regulation to Prevent and Combat Child Sexual Abuse|Regulation to Prevent and Combat Child Sexual Abuse]].
* [[#Current bills in the USA|The US Senate is considering an anti-fake bill and the House is considering several bills]].
* [[#Canada|Canada]] is considering bill C-27.
* [[#China|China]] seems to be planning to ban all synthetic pornography, however consensual its making may have been.
'''Bills that didn't make it'''
* [[#Canada|Canada's House of Commons]] was considering banning all pornographic content for which there is no proof of age and written consent from everybody visible in the pornographic recording.
* [[#Past bills in the USA|Past bills in the USA]]
'''Information elsewhere / legal information compilations''' (recommended)
<section begin=anti-fake-law-compilations/>
* {{#lst:Organizations, studies and events against synthetic human-like fakes|cybercivilrights.org law compilations}}
* <section begin=responsible.ai 2023/>[https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches '''''A Look at Global Deepfake Regulation Approaches''''' at responsible.ai]<ref name="responsible.ai 2023">
{{cite web
| url = https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches
| title = A Look at Global Deepfake Regulation Approaches
| last = Lawson
| first = Amanda
| date = 2023-04-24
| website = responsible.ai
| publisher = Responsible Artificial Intelligence Institute
| access-date = 2024-02-14
| quote =
}}
</ref> April 2023 compilation and reporting by Amanda Lawson of the Responsible Artificial Intelligence Institute.<section end=responsible.ai 2023/>
* <section begin=Princeton Legal Journal 2023/>[https://legaljournal.princeton.edu/the-high-stakes-of-deepfakes-the-growing-necessity-of-federal-legislation-to-regulate-this-rapidly-evolving-technology/ '''''The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology''''' at legaljournal.princeton.edu]<ref name="Princeton Legal Journal 2023">
{{cite web
| url = https://legaljournal.princeton.edu/the-high-stakes-of-deepfakes-the-growing-necessity-of-federal-legislation-to-regulate-this-rapidly-evolving-technology/
| title = The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology
| last = Quirk
| first = Caroline
| date = 2023-06-19
| website = legaljournal.princeton.edu
| publisher = Princeton Legal Journal
| access-date = 2024-02-14
| quote =
}}
</ref> Compilation and reporting by Caroline Quirk. The PLJ is Princeton’s only student-run law review.<section end=Princeton Legal Journal 2023/>
* <section begin=tech policy press 2023/>[https://www.techpolicy.press/exploring-legal-approaches-to-regulating-nonconsensual-deepfake-pornography/ '''''Exploring Legal Approaches to Regulating Nonconsensual Deepfake Pornography''''' at techpolicy.press]<ref name="tech policy press 2023">
{{cite web
| url = https://www.techpolicy.press/exploring-legal-approaches-to-regulating-nonconsensual-deepfake-pornography/
| title = Exploring Legal Approaches to Regulating Nonconsensual Deepfake Pornography
| last = Williams
| first = Kaylee
| date = 2023-05-15
| website = techpolicy.press
| publisher =
| access-date = 2024-02-14
| quote =
}}
</ref> May 2023 compilation and reporting by Kaylee Williams.<section end=tech policy press 2023/>
* <section begin=Foundation RA deepfake AI laws for USA/>[https://foundationra.com/deepfake-ai-laws/ '''''Deepfake AI laws for USA''''' at foundationra.com], [https://foundationra.com/sextortion-laws/ '''''Sextortion laws for USA''''' at foundationra.com] and [https://foundationra.com/revenge-porn-laws-us/ '''''Revenge porn laws for USA''''' at foundationra.com], compilations by [https://foundationra.com/ Foundation RA]<section end=Foundation RA deepfake AI laws for USA/>
* <section begin=Onfido 2024/>[https://onfido.com/blog/deepfake-law/ '''''Deepfake laws: is AI outpacing legislation?''''' at onfido.com]<ref name="Onfido 2024">
{{cite web
| url = https://onfido.com/blog/deepfake-law/
| title = Deepfake laws: is AI outpacing legislation?
| last = Owen
| first = Aled
| date = 2024-02-02
| website = onfido.com
| publisher = Onfido
| access-date = 2024-02-14
| quote =
}}
</ref> February 2024 summary and compilation by Aled Owen, Director of Global Policy at Onfido (a for-profit).<section end=Onfido 2024/>
* <section begin=criminaldefenselawyer.com 2024/>[https://www.criminaldefenselawyer.com/resources/is-deepfake-pornography-illegal.html '''''Is Deepfake Pornography Illegal?''''' at criminaldefenselawyer.com]<ref name="criminaldefenselawyer.com 2024">
{{cite web
| url = https://www.criminaldefenselawyer.com/resources/is-deepfake-pornography-illegal.html
| title = Is Deepfake Pornography Illegal?
| last = Pirius
| first = Rebecca
| date = 2024-02-07
| website = Criminaldefenselawyer.com
| publisher = [[w:Nolo (publisher)]]
| access-date = 2024-02-22
| quote =
}}
</ref> This article by Rebecca Pirius is a good overview of the current illegality/legality situation in the USA, both federally and state by state. Published by [[w:Nolo (publisher)]]; updated February 2024.<section end=criminaldefenselawyer.com 2024/>
* <section begin=tclf 2023/>[https://tclf.in/2023/10/16/deepfake-pornography-a-legal-and-ethical-menace/ '''''Deepfake Pornography: A Legal and Ethical Menace''''' at tclf.in]<ref name="tclf 2023">
{{cite web
| url = https://tclf.in/2023/10/16/deepfake-pornography-a-legal-and-ethical-menace/
| title = Deepfake Pornography: A Legal and Ethical Menace
| last = Rastogi
| first = Janvhi
| date = 2023-10-16
| website = tclf.in
| publisher = The Contemporary Law Forum
| access-date = 2024-02-14
| quote =
}}
</ref> October 2023 compilation and reporting by Janvhi Rastogi, published in The Contemporary Law Forum.<section end=tclf 2023/>
<section end=anti-fake-law-compilations/>
= Australia =
[[File:Flag of Australia (converted).svg|thumb|right|200px|Australia]]
The [https://www.legislation.gov.au/C2021A00076/latest/text '''Online Safety Act 2021''' at legislation.gov.au]<ref group="1st seen in" name="Equality Now and AUDRi 2024 briefing paper">https://equalitynow.org/resource/briefing-paper-deepfake-image-based-sexual-abuse-tech-facilitated-sexual-exploitation-and-the-law/</ref> regulates the non-consensual sharing, or threatening to share, of sexual images.
If the synthetic human-like fake images depict illegal and restricted online content, then the Online Content Scheme, as defined in the Online Safety Act 2021, may apply.<ref name="Equality Now and AUDRi 2024 briefing">https://equalitynow.org/resource/briefing-paper-deepfake-image-based-sexual-abuse-tech-facilitated-sexual-exploitation-and-the-law/</ref>
The [https://www.esafety.gov.au/ '''Office of the eSafety Commissioner''' at esafety.gov.au] is Australia's independent regulator for online safety.
'''Links'''
* [https://www.esafety.gov.au/newsroom/whats-on/online-safety-act '''''Learn about the Online Safety Act''''' at esafety.gov.au]
* [https://www.infrastructure.gov.au/media-technology-communications/internet/online-safety/current-legislation '''''Key elements of the Online Safety Act 2021''''' at infrastructure.gov.au]
* [https://www.aph.gov.au/Parliamentary_Business/Bills_LEGislation/Bills_Search_Results/Result?bId=r6680 '''Online Safety Bill 2021''' at aph.gov.au]
* [https://foundationra.com/australia/ '''Revenge porn laws in Australia''' at foundationra.com]
= Canada =
[[File:Flag of Canada (Pantone).svg|thumb|right|200px|The House of Commons of Canada was contemplating banning the distribution of all pornography for which there are no consent declarations and proof of age for the people depicted.]]
Existing Canadian law bans the non-consensual disclosure of intimate images.<ref>https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches</ref>
== Active bills in Canada ==
=== Digital Charter Implementation Act - House of Commons of Canada bill C-27 ===
[https://www.parl.ca/legisinfo/en/bill/44-1/c-27 '''''Digital Charter Implementation Act''''' at parl.ca], or ''An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts''
== Past bills in Canada ==
=== Stopping Internet Sexual Exploitation Act - House of Commons of Canada bill C-270 ===
* [https://www.parl.ca/DocumentViewer/en/44-1/bill/C-270/first-reading House of Commons of Canada bill '''C-270''', the '''''Stopping Internet Sexual Exploitation Act''''', at parl.ca], an act to amend the [[w:Criminal Code (Canada)|Criminal Code]] regarding pornography, was first read to the Commons by [[w:Arnold Viersen]] on Thursday 2022-04-28 at 10:15.
According to [https://www.townandcountrytoday.com/local-news/mp-submits-private-members-bill-for-second-time-5346232 townandcountrytoday.com], the author of the bill had introduced an identical bill, '''C-302''', on Thursday 2021-05-27, but that bill was killed off by the oncoming federal elections.
The '''Stopping Internet Sexual Exploitation Act''' ('''An Act to amend the Criminal Code (pornographic material)''') is a private member's bill by Arnold Viersen ([https://www.arnoldviersen.ca/ official site at arnoldviersen.ca]).
'''Summary of C-270 from parl.ca'''
<blockquote>
"''This enactment amends the Criminal Code to prohibit a person from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, '''each person''' whose image is '''depicted''' in the material was '''18 years of age''' or older and '''gave their express consent''' to their image being depicted.''"</blockquote><ref name="Canada C-270">
{{cite web
|url=https://www.parl.ca/DocumentViewer/en/44-1/bill/C-270/first-reading
|title= Stopping Internet Sexual Exploitation Act - An Act to amend the Criminal Code (pornographic material)
|last=Viersen
|first=Arnold
|date=2022-04-28
|website=parl.ca
|publisher=[[w:House of Commons of Canada]]
|access-date=2022-10-06
|quote=}}
</ref>
'''Sommaire en français / Summary in French'''
<blockquote>
''Le texte modifie le Code criminel afin d’interdire à toute personne de produire ou de distribuer du matériel pornographique à des fins commerciales, ou d’en faire la publicité, sans s’être au préalable assurée qu’au moment de la production du matériel, '''chaque personne''' dont l’image y est représentée '''était âgée de dix-huit ans''' ou plus et '''avait donné son consentement exprès''' à ce que son image y soit représentée''.</blockquote><ref>Bilingual version of C-270: https://publications.gc.ca/collections/collection_2022/parl/XB441-270-1.pdf</ref>
'''Links'''
* [https://publications.gc.ca/collections/collection_2022/parl/XB441-270-1.pdf C-270 - ''Stopping Internet Sexual Exploitation Act'' in English et en français at publications.gc.ca]
* [https://www.parl.ca/legisinfo/en/bill/44-1/C-270 Parliament of Canada LEGISinfo: C-270 - ''Stopping Internet Sexual Exploitation Act'' at parl.ca] in English
* [https://openparliament.ca/bills/44-1/C-270/ C-270 - ''Stopping Internet Sexual Exploitation Act'' at openparliament.ca] includes the motivations of Mr. Arnold Viersen and of the bill's co-sponsor, Mr. Garnett Genuis.
'''Reporting'''
* [https://www.conservative.ca/mp-viersen-reintroduces-the-stopping-internet-sexual-exploitation-sise-act/ ''MP Viersen reintroduces the Stopping Internet Sexual Exploitation (SISE) Act'' at conservative.ca]
----
= China =
''This information should be updated.''
[[File:Flag of China.png|thumb|right|200px|China passed a law requiring faked footage to be labeled as such, effective 2020-01-01]]
== Law against synthesis crimes in China ==
=== Mandatory labeling of fake media in China since 2020 ===
<section begin=China2020 />On Wednesday 2020-01-01 a Chinese law came into effect requiring that synthetically faked footage bear a clear notice of its fakeness. Failure to comply could be considered a [[w:crime]], the [[w:Cyberspace Administration of China]] ([http://www.cac.gov.cn/ cac.gov.cn]) stated on its website. China announced this new law in November 2019.<ref name="Reuters2019">
{{cite web
| url = https://www.reuters.com/article/us-china-technology/china-seeks-to-root-out-fake-news-and-deepfakes-with-new-online-content-rules-idUSKBN1Y30VU
| title = China seeks to root out fake news and deepfakes with new online content rules
| last =
| first =
| date = 2019-11-29
| website = [[w:Reuters.com]]
| publisher = [[w:Reuters]]
| access-date = 2021-01-23
| quote = }}
</ref> The Chinese government seems to be reserving the right to prosecute both users and [[w:online video platform]]s failing to abide by the rules.<ref name="TheVerge2019">
{{cite web
| url = https://www.theverge.com/2019/11/29/20988363/china-deepfakes-ban-internet-rules-fake-news-disclosure-virtual-reality
| title = China makes it a criminal offense to publish deepfakes or fake news without disclosure
| last = Statt
| first = Nick
| date = 2019-11-29
| website =
| publisher = [[w:The Verge]]
| access-date = 2021-01-23
| quote = }}
</ref>
<section end=China2020 />
=== Deep Synthesis Provisions 2023 ===
On Tuesday 2023-01-10 the Deep Synthesis Provisions came into effect. They were originally drafted in 2022 by the '''[[w:Cyberspace Administration of China]]''' as the [https://www.chinalawtranslate.com/en/deep-synthesis-draft/ '''''Provisions on the Administration of Deep Synthesis Internet Information Services''''' (Draft for solicitation of comments) at chinalawtranslate.com], or [http://www.cac.gov.cn/2022-01/28/c_1644970458520968.htm view the Chinese-language draft '''''国家互联网信息办公室关于《互联网信息服务深度合成管理规定(征求意见稿)》公开征求意见的通知''''' at cac.gov.cn].<ref group="1st seen in">'''Politico AI: Decoded''' mailing list, Wednesday 2022-02-02</ref>
'''Reporting'''
* [https://thediplomat.com/2023/03/chinas-new-legislation-on-deepfakes-should-the-rest-of-asia-follow-suit/ '''''China’s New Legislation on Deepfakes: Should the Rest of Asia Follow Suit?''''' at thediplomat.com] March 2023 reporting
----
= EU =
[[File:Flag of Europe.svg|thumb|right|200px|The EU must address the malicious uses of AI in the law with which it regulates AI.]]
== Laws in the EU ==
=== Artificial Intelligence Act ===
The European Union has a law on AI called the '''[[w:Artificial Intelligence Act]]'''. The European Commission proposed the AI Act in 2021. On Wednesday 2024-03-13 the MEPs adopted this law.<ref>
{{cite web
| url = https://www.europarl.europa.eu/news/en/press-room/20240308IPR19015/artificial-intelligence-act-meps-adopt-landmark-law
| title = Artificial Intelligence Act: MEPs adopt landmark law
| last =
| first =
| date = 2024-03-13
| website = europarl.europa.eu
| publisher = [[w:European Parliament]]
| access-date = 2024-03-22
| quote = The regulation, agreed in negotiations with member states in December 2023, was endorsed by MEPs with 523 votes in favour, 46 against and 49 abstentions.
}}
</ref> The AI Act will have a key role in the effective implementation of the upcoming EU [[#Directive on combating violence against women and domestic violence|Directive on combating violence against women and domestic violence]] and the [[#Regulation to Prevent and Combat Child Sexual Abuse|Regulation to Prevent and Combat Child Sexual Abuse]], which intend to protect us against synthetic pornography.
* The European '''Artificial Intelligence Act''' has been approved by the member countries.<ref>https://www.politico.eu/article/eu-artificial-intelligence-act-ai-technology-risk-rules/</ref> Read the [https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52021PC0206 Proposal for a '''''REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS''''' at eur-lex.europa.eu]<ref group="1st seen in">https://artificialintelligenceact.eu/the-act/ via https://futureoflife.org/ newsletter</ref> (also contains translations)
* [https://www.europarl.europa.eu/legislative-train/theme-a-europe-fit-for-the-digital-age/file-regulation-on-artificial-intelligence '''''Artificial intelligence act''''' in the Legislative Train Schedule at europarl.europa.eu]
* [https://www.europarl.europa.eu/news/en/press-room/20240308IPR19015/artificial-intelligence-act-meps-adopt-landmark-law '''''Artificial Intelligence Act: MEPs adopt landmark law''''' at europarl.europa.eu], a press announcement on the adoption of the Artificial Intelligence Act on Wednesday 2024-03-13.
'''Studies and information'''
* [https://www.europarl.europa.eu/stoa/en/document/EPRS_STU(2021)690039 '''''Tackling deepfakes in European policy''''' at europarl.europa.eu], a 2021 study by the Panel for the Future of Science and Technology, published by the [[w:European Parliamentary Research Service]]. [https://www.europarl.europa.eu/RegData/etudes/STUD/2021/690039/EPRS_STU(2021)690039_EN.pdf View .pdf at europarl.europa.eu]
* [https://artificialintelligenceact.eu/ '''''The EU Artificial Intelligence Act''''' at artificialintelligenceact.eu] is a website promising ''Up-to-date developments and analyses of the EU AI Act'', run by the [https://futureoflife.org/ Future of Life Institute], an American non-profit NGO.
'''Reporting'''
* [https://www.euractiv.com/section/artificial-intelligence/opinion/the-ai-act-vs-deepfakes-a-step-forward-but-is-it-enough/ '''''The AI Act vs. deepfakes: A step forward, but is it enough?''''' at euractiv.com], a 2024-02-26 opinion piece by Cristina Vanberghen
=== Digital Services Act ===
The [https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package '''Digital Services Act''' package at digital-strategy.ec.europa.eu] ([[w:Digital Services Act]], DSA) came into force in November 2022.<ref>https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches</ref>
Together, the Artificial Intelligence Act and the Digital Services Act will help in the enforcement of the upcoming protections that shield us from synthesis crimes.
== Bills in the EU ==
=== Directive on combating violence against women and domestic violence ===
The [https://commission.europa.eu/document/28552314-3316-40b4-ae0f-17ff8ab9f43f_en '''''Directive on combating violence against women and domestic violence''''' at commission.europa.eu] will require, among other things, that member states criminalize non-consensual [[Synthetic human-like fakes#Digital look-alikes|synthetic digital look-alike]] pornography in their criminal codes.
'''Official'''
* [https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52022PC0105 '''''Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on combating violence against women and domestic violence''''' at eur-lex.europa.eu] has the directive text in all languages and various formats
* [https://www.europarl.europa.eu/legislative-train/theme-a-new-push-for-european-democracy/file-legislative-proposal-on-gender-based-violence '''''Legislative proposal on combating violence against women and domestic violence''''' Legislative Train Schedule at europarl.europa.eu] lists this EU bill as ''close to adoption'' (as of March 2024).
* [https://ec.europa.eu/commission/presscorner/detail/en/ip_24_649 '''''Commission welcomes political agreement on new rules to combat violence against women and domestic violence''''' at ec.europa.eu], a 2024-02-06 press release by the [[w:European Commission]].
* [https://ec.europa.eu/commission/presscorner/detail/en/ip_22_1533 '''''International Women's Day 2022: Commission proposes EU-wide rules to combat violence against women and domestic violence''''' at ec.europa.eu], a 2022-03-08 press release by the EC.
'''Unofficial'''
* [https://hateaid.org/en/eu-protects-women-from-digital-violence/ '''''Deepfakes and dick pics: EU protects women from digital violence''''' at hateaid.org], a 2024-02-07 press release by HateAid.org
* [https://futureoflife.org/ai-policy/disrupting-the-deepfake-pipeline-in-europe/ '''''Disrupting the Deepfake Pipeline in Europe''''' at futureoflife.org], a 2024-02-22 article on the approach of ''Leveraging corporate criminal liability under the Violence Against Women Directive to safeguard against pornographic deepfake exploitation'', by Alexandra Tsalidis
=== Regulation to Prevent and Combat Child Sexual Abuse ===
* [https://www.europarl.europa.eu/legislative-train/theme-promoting-our-european-way-of-life/file-combating-child-sexual-abuse-online '''''New legislation to fight child sexual abuse online''''' Legislative Train Schedule at europarl.europa.eu] lists this EU bill as ''tabled'' (as of March 2024).
* [https://www.consilium.europa.eu/en/policies/prevent-child-sexual-abuse-online/ '''''Prevention of online child sexual abuse''''' at consilium.europa.eu]
* [https://www.consilium.europa.eu/en/policies/prevent-child-sexual-abuse-online/timeline-prevention-of-online-child-sexual-abuse/ '''''Timeline - Prevention of online child sexual abuse''''' at consilium.europa.eu]
{{Q|The '''Regulation to Prevent and Combat Child Sexual Abuse''' ('''Child Sexual Abuse Regulation''', or '''CSAR''') is a [[w:European Union regulation]] proposed by the [[w:European Commissioner for Home Affairs]] [[w:Ylva Johansson]] on 11 May 2022. The stated aim of the legislation is to prevent child sexual abuse online through the implementation of a number of measures, including the establishment of a framework that would make the detection and reporting of child sexual abuse material ([[w:Child pornography|CSAM]]) by digital platforms a legal requirement within the European Union.|Wikipedia as of March 2024|[[w:Regulation to Prevent and Combat Child Sexual Abuse]]}}
'''Reporting'''
* [https://www.euronews.com/my-europe/2024/02/06/new-eu-rules-will-criminalise-paedophilia-handbooks-and-deepfakes-of-child-abuse '''''New EU rules will criminalise 'paedophilia handbooks' and deepfakes of child abuse''''' at euronews.com], 2024-02-06 reporting
== Code of Practice on Disinformation 2022 ==
The [https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation '''2022 Code of Practice on Disinformation''' at digital-strategy.ec.europa.eu]: ''Major online platforms, emerging and specialised platforms, players in the advertising industry, fact-checkers, research and civil society organisations delivered a strengthened Code of Practice on Disinformation following the Commission’s Guidance of May 2021''.<ref>https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation</ref>
= Finland =
[[File:Flag of Finland.svg|thumb|right|alt=An image of a Finnish flag|200px|In 2023 Finland finally updated its laws against synthetic filth attacks on adults.]]
[[File:Suomen lippu valokuva.png|thumb|right|alt=An image of a Finnish flag|255px|Finland strongly criminalized synthetic sexual pictures depicting children in 2011, upon the initiative of the [[w:Vanhanen II Cabinet]].]]
== Laws in Finland ==
[[File:Flag_of_Finland.svg|link=Suomen seksuaalirikoslaki 2023|25px]] Suomeksi / in Finnish: [[Suomen seksuaalirikoslaki 2023]]
=== Law on sexual offences in Finland 2023 ===
{{#lst:Law on sexual offences in Finland 2023|what-is-it}}
=== Finland criminalized synthetic CSAM in 2011 ===
{{#lst:Law on sexual offences in Finland 2023|history}}
----
= Germany =
[[File:Flag of Germany.svg|thumb|right|200px|Existing German laws provide protection against appearance theft.]]
* [https://mj-cohen.com/2020/01/03/deepfakes-and-german-law/ '''''Deepfakes and German law''''' at mj-cohen.com], a January 2020 summary by Maureen Cohen, states that existing German laws are well equipped to render non-consensual digital look-alikes illegal.
----
= India =
[[File:Flag of India.svg|thumb|right|200px|India]]
'''Laws in India'''
Laws in India can be accessed through the [https://www.indiacode.nic.in/ '''India Code''' at indiacode.nic.in].
'''[[w:Information Technology Act, 2000]]'''
* Section 66. - '''Computer related offences.'''<ref>https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=76&orgactid=AC_CEN_45_76_00001_200021_1517807324077</ref>
** Section 66A. - Omitted. Would have been ''Punishment for sending offensive messages through communication service, etc.''
** Section 66B. - Punishment for dishonestly receiving stolen computer resource or communication device.
** Section 66C. - '''Punishment for identity theft.'''<ref>https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=79&orgactid=AC_CEN_45_76_00001_200021_1517807324077</ref>
** Section 66D. - '''Punishment for cheating by personation by using computer resource.'''<ref>https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=80&orgactid=AC_CEN_45_76_00001_200021_1517807324077</ref>
** Section 66E. - '''Punishment for violation of privacy.'''<ref>https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=81#:~:text=Whoever%2C%20intentionally%20or%20knowingly%20captures,two%20lakh%20rupees%2C%20or%20with</ref>
** Section 66F. - Punishment for cyber terrorism.
* Section 67. - '''Punishment for publishing or transmitting obscene material in electronic form.'''<ref>https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=83&orgactid=AC_CEN_45_76_00001_200021_1517807324077</ref>
** Section 67A. - '''Punishment for publishing or transmitting of material containing sexually explicit act, etc., in electronic form.'''<ref>https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=84&orgactid=AC_CEN_45_76_00001_200021_1517807324077</ref>
** Section 67B. - '''Punishment for publishing or transmitting of material depicting children in sexually explicit act, etc., in electronic form.'''<ref>https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=85&orgactid=AC_CEN_45_76_00001_200021_1517807324077</ref>
** Section 67C. - Preservation and retention of information by intermediaries.
'''[[w:Information Technology Rules, 2021]]'''
'''Past bills in India'''
* [[w:Personal Data Protection Bill, 2019]]
'''Legal compilations on the legal situation against synthetic filth in India'''
* {{#lst:{{ARTICLEPAGENAME}}|tclf 2023}}
* [https://www.livelaw.in/law-firms/law-firm-articles-/deepfakes-personal-data-artificial-intelligence-machine-learning-ministry-of-electronics-and-information-technology-information-technology-act-242916 '''''Deepfakes And Breach Of Personal Data – A Bigger Picture''''' at livelaw.in]<ref name="Livelaw.in 2023"> | |||
{{cite web | |||
| url = https://www.livelaw.in/law-firms/law-firm-articles-/deepfakes-personal-data-artificial-intelligence-machine-learning-ministry-of-electronics-and-information-technology-information-technology-act-242916 | |||
| title = Deepfakes And Breach Of Personal Data – A Bigger Picture | |||
| last = Rana | |||
| first = Vikrant | |||
| last2 = Gandhi | |||
| first2 = Anuradha | |||
| last3 = Thakur | |||
| first3 = Rachita | |||
| date = 2023-11-24 | |||
| website = livelaw.in | |||
| publisher = | |||
| access-date = 2024-02-21 | |||
| quote = | |||
}} | |||
</ref> 2023-11-24 compilation on laws against synthetic fakes in India, by Vikrant Rana, Anuradha Gandhi and Rachita Thakur | |||
* [https://cybercert.in/what-is-deep-fake-cyber-crime-what-does-indian-law-say-about-it/ '''''What Is Deep Fake Cyber Crime? What Does Indian Law Say About It?''''' at cybercert.in]<ref> | |||
{{cite web | |||
| url = https://cybercert.in/what-is-deep-fake-cyber-crime-what-does-indian-law-say-about-it/ | |||
| title = What Is Deep Fake Cyber Crime? What Does Indian Law Say About It? | |||
| last = | |||
| first = | |||
| date = | |||
| website = cybercert.in | |||
| publisher = | |||
| access-date = 2024-03-23 | |||
| quote = At present, India does not have any law specifically for deep fake cybercrime, but various other laws can be combined to deal with it. | |||
}} | |||
</ref> | |||
* [https://cybercert.in/indian-cyber-laws/ '''''Cyberlaw in India''''' at cybercert.in] provides a broader overview of Indian cyber laws
---- | |||
= New Zealand = | |||
[[File:Flag of New Zealand.svg|thumb|right|200px|New Zealand]] | |||
* [https://www.legislation.govt.nz/act/public/2015/0063/latest/whole.html '''''Harmful Digital Communications Act 2015''''' at legislation.govt.nz]<ref group="1st seen in" name="Equality Now and AUDRi 2024 briefing paper"/> criminalises the non-consensual posting of intimate visual recordings and other image-based sexual abuse.
* '''[[w:Netsafe]]''' [https://netsafe.org.nz/ netsafe.org.nz] is an online safety non-profit organisation in New Zealand. It provides educational, anti-bullying and support services. The organisation is contracted under the [[w:Harmful Digital Communications Act]] until 2026. (Wikipedia) | |||
* [https://www.cert.govt.nz/ '''CERT.govt.nz'''], the [[w:Computer emergency response team]] of New Zealand - "''Responding to cyber security threats in New Zealand - CERT NZ is your first port of call when you need to report a cyber security problem.''" | |||
* The '''[[w:Department of Internal Affairs]]''' investigates the possession of, and trading in, child exploitation material.<ref>https://www.police.govt.nz/advice-services/cybercrime-and-internet/online-child-safety</ref> | |||
'''Links regarding the Harmful Digital Communications Act 2015''' | |||
* [https://www.police.govt.nz/advice-services/cybercrime-and-internet/harmful-digital-communications-hdc '''''Harmful Digital Communications (HDC)''''' at police.govt.nz] by the [[w:New Zealand Police]]
* [https://www.justice.govt.nz/courts/civil/harmful-digital-communications/ '''''Harmful digital communications''''' at justice.govt.nz] by the [[w:Ministry of Justice (New Zealand)]]
* [https://netsafe.org.nz/what-is-the-hdca/ '''''What is the HDCA?''''' at netsafe.org.nz] by [[w:Netsafe]], the agency approved by the [[w:New Zealand Police]] to process complaints about harmful digital communications. | |||
* See Wikipedia article on '''[[w:Harmful Digital Communications Act 2015]]''' for more info | |||
---- | |||
= Singapore = | |||
[[File:Flag of Singapore.svg|thumb|right|200px|Singapore]] | |||
== Law in Singapore == | |||
=== Protection from Online Falsehoods and Manipulation Act 2019 === | |||
[[w:Protection from Online Falsehoods and Manipulation Act 2019]]<ref group="1st seen in" name="ChatGPT 2023 inquiry" /> is a [[w:statute]] of the [[w:Parliament of Singapore]] that enables authorities to tackle the spread of [[w:fake news]] or [[w:false information]]. ([https://en.wikipedia.org/w/index.php?title=Protection_from_Online_Falsehoods_and_Manipulation_Act_2019&oldid=1134029282 Wikipedia]) | |||
---- | |||
= South Africa = | |||
[[File:Flag of the Republic of South Africa.svg|thumb|right|200px|[https://www.gov.za/documents/cybercrimes-act-19-2020-1-jun-2021-0000 South Africa's relatively new Cybercrimes Act (Cybercrimes Act 19 of 2020)] came only partially into effect on Dec 1 2021]]
== Cybercrimes Act 19 of 2020 == | |||
[https://www.gov.za/documents/cybercrimes-act-19-2020-1-jun-2021-0000 '''South African Cybercrimes Act 19 of 2020''' (English / Afrikaans) at gov.za]<ref group="1st seen in">https://www.dailymaverick.co.za/article/2021-12-01-not-all-of-the-cogs-in-the-cybercrimes-act-machine-are-turning-at-once-we-still-remain-vulnerable/</ref> came only partially into effect on Wednesday 2021-12-01.<ref>https://www.dailymaverick.co.za/article/2021-12-01-not-all-of-the-cogs-in-the-cybercrimes-act-machine-are-turning-at-once-we-still-remain-vulnerable/</ref>
* [https://cybercrimesact.co.za/section-16-distribution-of-data-message-of-intimate-image/ '''Cybercrimes Act, Chapter 2 Cybercrimes, Section 16 - Disclosure of data message of intimate image''' at cybercrimesact.co.za] Subsection 1 states <blockquote>''Any person (‘‘A’’) who unlawfully and intentionally discloses, by means of an electronic communications service, a data message of an intimate image of a person (‘‘B’’), without the consent of B, is guilty of an offence.'' | |||
</blockquote> | |||
* [https://cybercrimesact.co.za/section-17-attempting-conspiring-aiding-abetting-inducing-inciting-instigating-instructing-commanding-or-procuring-to-commit-offence/ '''Cybercrimes Act, Chapter 2 Cybercrimes, Section 17 - Attempting, conspiring, aiding, abetting, inducing, inciting, instigating, instructing, commanding or procuring to commit offence''' at cybercrimesact.co.za] states | |||
<blockquote> | |||
Any person who unlawfully and intentionally | |||
# attempts; | |||
# conspires with any other person; or | |||
# aids, abets, induces, incites, instigates, instructs, commands or procures another person, to commit an offence in terms of Part I or Part II of this Chapter, is guilty of an offence and is liable on conviction to the punishment to which a person convicted of actually committing that offence would be liable. | |||
</blockquote> | |||
'''Links''' | |||
* [https://cybercrimesact.co.za/ '''Cybercrimes Act''' at cybercrimesact.co.za] by Accessible Law contains the law in website format | |||
'''Reporting''' | |||
* [https://www.dailymaverick.co.za/article/2021-12-01-not-all-of-the-cogs-in-the-cybercrimes-act-machine-are-turning-at-once-we-still-remain-vulnerable/ '''''Not all of the cogs in the Cybercrimes Act machine are turning at once — we still remain vulnerable''''' at dailymaverick.co.za] | |||
---- | |||
= South Korea = | |||
[[File:Flag of South Korea.svg|thumb|right|200px|South Korea has invested heavily in AI research and also has legislation to counter the menaces of synthetic human-like fakes.]]
== Law in South Korea == | |||
=== ACT ON SPECIAL CASES CONCERNING THE PUNISHMENT OF SEXUAL CRIMES === | |||
[https://elaw.klri.re.kr/eng_service/lawView.do?hseq=40947&lang=ENG '''ACT ON SPECIAL CASES CONCERNING THE PUNISHMENT OF SEXUAL CRIMES''' at elaw.klri.re.kr] is a law in South Korea.<ref group="1st seen in">https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches</ref> | |||
---- | |||
= UK = | |||
* {{#lst:Glossary|UK-Parliament-glossary}} | |||
* {{#lst:Glossary|Law-Society-glossary}} | |||
== Law against synthesis crimes in the UK == | |||
[[File:Flag of the United Kingdom.svg|thumb|right|200px|The UK has recently improved its legislation against synthesis crimes.]]
On Tuesday 2024-04-16 the UK announced that '''creating sexually explicit deepfake images is to be made an offence in the UK'''.<ref>[https://www.theguardian.com/technology/2024/apr/16/creating-sexually-explicit-deepfake-images-to-be-made-offence-in-uk '''''Creating sexually explicit deepfake images to be made offence in UK''''' at theguardian.com]</ref> This is a very good move, and the next logical step would be to prohibit the non-consensual possession of other people's appearance models and voice models, as suggested in [[#Law proposals against synthetic filth by Juho Kunsola|Law proposals against synthetic filth by Juho Kunsola]]. There is no logical reason to allow criminal leagues to legally possess and trade their libraries of models; the acquisition of models through covert modeling, and the trading in and possession of these raw materials (covert models) for producing disinformation weapons, should also be made illegal.
* [https://www.theguardian.com/technology/2024/apr/16/creating-sexually-explicit-deepfake-images-to-be-made-offence-in-uk '''''Creating sexually explicit deepfake images to be made offence in UK''''' at theguardian.com] | |||
=== Online Safety Act 2023 === | |||
[[w:Online Safety Act 2023]] [https://www.legislation.gov.uk/ukpga/2023/50/enacted '''Online Safety Act 2023''' at legislation.gov.uk] received Royal Assent on 2023-10-26<ref>https://www.gov.uk/government/publications/online-safety-act-new-criminal-offences-circular/online-safety-act-new-criminal-offences-circular</ref> and it reportedly criminalizes non-consensual synthetic pornography. | |||
* '''Part 4''' - '''CSEA reporting''' of the Act has not yet come into effect as of February 2024.
** Section 66 requires providers of services regulated under the Act to have systems and processes in place (so far as possible) to ensure that they report all detected and unreported CSEA content present on the service to the NCA (National Crime Agency) | |||
** '''Offence in relation to CSEA reporting''' in Section 69 | |||
* '''Part 7''' – '''Enforcement offences''' | |||
* '''Part 10''' - '''Communication offences''' establishes | |||
** '''False communications offence''' in Section 179 | |||
** '''Threatening communications offence''' in Section 181 | |||
** '''Offences of sending or showing flashing images electronically''' in Section 183 | |||
** '''Offence of encouraging or assisting serious self-harm''' in Section 184 | |||
** '''Sending etc photograph or film of genitals''' in Section 187 | |||
** '''Sharing or threatening to share intimate photograph or film''' in Section 188 | |||
The Online Safety Act 2023 started out as [https://bills.parliament.uk/bills/3137 '''HL Bill 151''' - '''''Online Safety Bill''''' at bills.parliament.uk], a bill that originated in the House of Commons during the 2021-22 and 2022-23 sessions.
* [https://www.gov.uk/government/publications/online-safety-act-new-criminal-offences-circular/online-safety-act-new-criminal-offences-circular '''''Online Safety Act: new criminal offences circular''''' at gov.uk], published 2024-01-31 - a circular ''issued to inform the police and other relevant public authorities of certain provisions of the Online Safety Act, in particular new criminal offences''.
* [https://www.ofcom.org.uk/news-centre/2023/safer-life-online-for-people-in-uk ''Creating a safer life online for people in the UK'' at ofcom.org.uk] | |||
* [https://www.gov.uk/guidance/a-guide-to-the-online-safety-bill ''A guide to the Online Safety Bill'' at gov.uk] | |||
* [https://www.gov.uk/government/collections/online-safety-bill ''Documents, publications and announcements relating to the government's Online Safety Bill'' at gov.uk] | |||
'''Reporting and summaries of the Online Safety Act''' | |||
* [https://www.infolaw.co.uk/newsletter/2023/11/the-online-safety-act-2023-a-primer/ '''''The Online Safety Act 2023: a primer''''' at infolaw.co.uk], a 2023 primer on the Online Safety Act by Alex Heshmaty on 2023-11-29 | |||
* [https://www.epra.org/news_items/uk-the-online-safety-act-is-now-law-ofcom-s-powers-as-online-safety-regulator-have-officially-commenced '''''UK: The Online Safety Act is now law; Ofcom’s powers as online safety regulator have officially commenced''''' at epra.org] states that there will be three successive implementation phases.
=== The Domestic Abuse Act 2021 Chapter 17, part 6 - Disclosure of private sexual photographs and films === | |||
'''[[w:Domestic Abuse Act 2021]]''' / [https://www.legislation.gov.uk/ukpga/2021/17/contents/enacted Chapter 17] / [https://www.legislation.gov.uk/ukpga/2021/17/part/6/enacted Part 6 - ''Offences involving abusive or violent behaviour''] / [https://www.legislation.gov.uk/ukpga/2021/17/part/6/crossheading/disclosure-of-private-sexual-photographs-and-films/enacted ''Disclosure of private sexual photographs and films'' - ''Threats to disclose private sexual photographs and films with intent to cause distress''] | |||
According to the UK-based [https://revengepornhelpline.org.uk/ '''Revenge Porn Helpline''' at revengepornhelpline.org.uk] article [https://revengepornhelpline.org.uk/information-and-advice/need-help-and-advice/threats-to-share-intimate-images/ '''''What to do if someone is threatening to share your intimate images'''''], threats to share intimate images with the intent to cause distress are now an offence in UK law. This is included in the '''[[w:Domestic Abuse Act 2021]]''', which was enacted into UK law on 29th June 2021.<ref>https://revengepornhelpline.org.uk/information-and-advice/need-help-and-advice/threats-to-share-intimate-images/</ref>
It is not yet known whether the bug has been fixed whereby, if the pictures <u>are not pictures of you</u> but [[synthetic human-like fakes]], the police cannot do anything.
'''Links''' | |||
* [https://www.legislation.gov.uk/ukpga/2021/17/part/6/crossheading/disclosure-of-private-sexual-photographs-and-films/enacted '''Domestic Abuse Act 2021''' / '''Chapter 17''' / '''Part 6''' / '''''Disclosure of private sexual photographs and films''''' - '''''Threats to disclose private sexual photographs and films with intent to cause distress''''' at legislation.gov.uk] | |||
=== Historical about the UK law against synthesis crimes === | |||
UK law was not up to date on the issue of synthetic filth until the recent improvements.
The independent [[w:Law Commission (England and Wales)]] reviewed the law where it applies to taking, making and sharing intimate images without consent. The outcome of the consultation was due to be published later in 2021.<ref name="BBC2021"> | |||
{{cite web | |||
|url = https://www.bbc.com/news/technology-55546372 | |||
|title = 'Deepfake porn images still give me nightmares' | |||
|last = Royle | |||
|first = Sara | |||
|date = 2021-01-05 | |||
|website = [[w:BBC Online]] | |||
|publisher = [[w:BBC]] | |||
|access-date = 2021-01-31 | |||
|quote = She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.}} | |||
</ref> | |||
"In 2019, law expert Dr Aislinn O’Connell told [[w:The Independent]] that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The [[w:Women and Equalities Committee]] called on the UK Government to introduce new legislation on image-based sexual abuse in order to '''criminalise ALL''' non-consensual creation and distribution of intimate sexual images."<ref name="MortPetition2020"> | |||
{{cite web | |||
|url = https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images | |||
|title = Change.org petition: 'Tighten regulation on taking, making and faking explicit images' | |||
|last = Mort | |||
|first = Helen | |||
|date = 2020 | |||
|website = [[w:Change.org]] | |||
|publisher = [[w:Change.org]] | |||
|access-date = 2021-01-31 | |||
|quote = Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.}} | |||
</ref> This call is for laws similar to those California put in place on January 1 2020.
---- | |||
= USA = | |||
[[File:Flag of the United States.svg|thumb|right|200px|Various US states have enacted state laws aimed against synthetic human-like fakes, but it seems that the USA has no federal legislation against this menace, even though [[#Past bills in the USA|federal bills have been introduced in the USA]].]]
* {{#lst:{{ARTICLEPAGENAME}}|criminaldefenselawyer.com 2024}} | |||
* {{#lst:{{ARTICLEPAGENAME}}|Foundation RA deepfake AI laws for USA}} | |||
{{#lst:Organizations, studies and events against synthetic human-like fakes|cybercivilrights.org law compilations}} | |||
See also: | |||
* [[#Current bills in the USA|Current bills in the USA]] and [[#Past bills in the USA|past bills in the USA]] | |||
* [https://www.congress.gov/legislative-process/introduction-and-referral-of-bills '''The Legislative Process: Introduction and Referral of Bills (Video)''' at congress.gov] | |||
* {{#lst:Glossary|US-Congress-glossary}} | |||
== Law against synthesis crimes in Virginia 2019 == | |||
=== Code of Virginia § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty === | |||
[[File:Flag of Virginia.svg|thumb|right|200px|[[w:Virginia]], an avant-garde state. Motto "[[w:Sic semper tyrannis]]"]] | |||
<section begin=Virginia2019 />[[File:Marcus Simon.jpeg|thumb|right|108px|Homie [[w:Marcus Simon|w:Marcus Simon]] ([http://marcussimon.com/ marcussimon.com]) is a Member of the [[w:Virginia House of Delegates]] and a true pioneer in legislating against synthetic filth.]] Since July 1 2019<ref>
| quote = }}
</ref>, [https://law.lis.virginia.gov/vacode/18.2-386.2/ § 18.2-386.2, titled '''''Unlawful dissemination or sale of images of another; penalty''''',] has been part of the '''[[w:Code of Virginia]]'''.
[https://law.lis.virginia.gov/vacode/ '''Code of Virginia''' (TOC)] » [https://law.lis.virginia.gov/vacode/title18.2/ '''Title 18.2.''' Crimes and Offenses Generally] » [https://law.lis.virginia.gov/vacode/title18.2/chapter8/ '''Chapter 8.''' Crimes Involving Morals and Decency] » [https://law.lis.virginia.gov/vacodefull/title18.2/chapter8/article5/ '''Article 5.''' Obscenity and Related Offenses] » '''Section''' [https://law.lis.virginia.gov/vacode/18.2-386.2/ § '''18.2-386.2.''' Unlawful dissemination or sale of images of another; penalty] | |||
[https://law.lis.virginia.gov/vacode/18.2-386.2/ Section '''§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty.'''] of the Code of Virginia reads as follows:
<blockquote> | |||
'''A'''. ''Any [[w:person]] who, with the [[w:Intention (criminal law)|w:intent]] to [[w:coercion|w:coerce]], [[w:harassment|w:harass]], or [[w:intimidation|w:intimidate]], [[w:Malice_(law)|w:malicious]]ly [[w:dissemination|w:disseminates]] or [[w:sales|w:sells]] any videographic or still image created by any means whatsoever that [[w:Depiction|w:depicts]] another person who is totally [[w:nudity|w:nude]], or in a state of undress so as to expose the [[w:sex organs|w:genitals]], pubic area, [[w:buttocks]], or female [[w:breast]], where such person knows or has reason to know that he is not [[w:license]]d or [[w:authorization|w:authorized]] to disseminate or sell such [[w:Video|w:videographic]] or [[w:Film still|w:still image]] is [[w:Guilt (law)|w:guilty]] of a Class 1 [[w:Misdemeanor#United States|w:misdemeanor]]. '' | |||
::''For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's [[w:face]], [[w:Simulacrum|w:likeness]], or other distinguishing characteristic.''
'''B'''. ''If a person uses [[w:Service (economics)|w:services]] of an [[w:Internet service provider]], an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.'' | |||
'''C.''' ''Venue for a prosecution under this section may lie in the [[w:jurisdiction]] where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.'' | |||
'''D.''' ''The provisions of this section shall not preclude prosecution under any other [[w:statute]].''<ref name="Virginia2019Chapter515"/> | |||
Identical bills were introduced: [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+HB2678 House Bill 2678], presented by [[w:Delegate (American politics)|w:Delegate]] [[w:Marcus Simon]] to the [[w:Virginia House of Delegates]] on January 14 2019, and, three days later, the identical [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+SB1736 Senate Bill 1736], introduced to the [[w:Senate of Virginia]] by Senator [[w:Adam Ebbin]].
</blockquote> | |||
<section end=Virginia2019 />
== Law against synthesis crimes in Texas 2019 ==
=== Texas SB 751 - Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election ===
[[File:Flag of Texas.svg|thumb|right|200px|[[w:Texas]], the Lone Star State has protected the political candidates, but not ordinary folk against synthetic filth.]] | |||
<section begin=Texas2019 />On September 1 2019 the [[w:Texas Senate]] bill [https://capitol.texas.gov/tlodocs/86R/billtext/html/SB00751F.htm '''SB 751''' - '''''Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election'''''] came into effect, making [[w:amendment]]s to the election code in the [[w:Law of Texas]] that give [[w:candidates]] in [[w:elections]] a '''30-day protection period''' before an election during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "''a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality''"<ref name="TexasSB751">
{{cite web
|quote= In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality}}
</ref> SB 751 was introduced to the Senate by [[w:Bryan Hughes (politician)]].<ref name="Texas SB 751 history">https://capitol.texas.gov/BillLookup/History.aspx?LegSess=86R&Bill=SB751</ref>
<section end=Texas2019 />
The text of '''S.B. No. 751''' is as follows:
<blockquote>
'''''AN ACT relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election.''''' | |||
''BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF TEXAS:'' | |||
SECTION 1. ''[https://statutes.capitol.texas.gov/Docs/EL/htm/EL.255.htm Section 255.004, Election Code], is [[w:Amend (motion)|w:amended]] by adding Subsections (d) and (e) to read as follows:'' | |||
* ('''d''') ''A person commits an offense if the person, with intent to injure a candidate or influence the result of an election:'' | |||
*# '''''creates''' a [[w:deepfake|deep fake]] video; and'' | |||
*# ''causes the deep fake video to be '''published''' or '''distributed''' within '''30 days''' of an '''election'''.'' | |||
* ('''e''') ''In this section, "deep fake video" means a video, created with the '''intent to deceive''', that '''appears to depict''' a real person performing an action that '''did not occur''' in '''reality'''.'' | |||
SECTION 2. ''This Act takes effect September 1, 2019.'' | |||
</blockquote> | |||
---- | |||
* [[w:Texas Legislature]] | |||
== Law against synthesis crimes in California 2020 == | |||
=== California AB-602 - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action === | |||
[[File:Flag of California.svg|thumb|right|200px|[[w:California]] moved later than Virginia, but it also outlawed the manufacture of synthetic filth on Jan 1 2020.]]
<section begin=California2020 />[[File:Marc Berman.jpg|thumb|120px|right|Homie [[w:Marc Berman|w:Marc Berman]], a righteous fighter for our human rights in this age of industrial disinformation filth and an Assemblymember of the [[w:California State Assembly]], most loved for authoring [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 '''AB-602''' - ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''], which came into effect on Jan 1 2020, banning both the manufacturing and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted.]] On January 1 2020<ref name="KFI2019">
{{cite web
|quote=}}
</ref> the [[w:California]] [[w:State law (United States)|w:US state law]] '''[https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 "AB-602 ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''."]''' came into effect in the [[w:California Civil Code|civil code]] of the [[w:California Codes]], banning the manufacturing and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted. AB-602 provides victims of synthetic pornography with [[w:injunction|w:injunctive relief]] and poses legal threats of [[w:statutory damages|w:statutory]] and [[w:punitive damages]] on [[w:criminal]]s making or distributing synthetic pornography without consent. The bill AB-602 was signed into law by California [[w:Governor (United States)|w:Governor]] [[w:Gavin Newsom]] on October 3 2019; it was authored by [[w:California State Assembly]] member [[w:Marc Berman]], and an identical Senate bill was coauthored by [[w:California State Senate|w:California Senator]] [[w:Connie Leyva]].<ref name="OpenStates AB 602">
{{cite web | |||
|url=https://openstates.org/ca/bills/20192020/AB602/ | |||
|title=AB 602 - California Assembly Bill 2019-2020 Regular Session - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action. | |||
|last= | |||
|first= | |||
|date= | |||
|website=openstates.org | |||
|publisher=openstates.org | |||
|access-date=2021-03-24 | |||
|quote=}} | |||
</ref><ref name="CNET2019"> | |||
{{cite web
| url = https://www.cnet.com/news/california-laws-seek-to-crack-down-on-deepfakes-in-politics-and-porn/
| access-date = 2021-01-23
}} | |||
</ref> [https://trackbill.com/bill/california-assembly-bill-602-depiction-of-individual-using-digital-or-electronic-technology-sexually-explicit-material-cause-of-action/1690501/ AB602 at trackbill.com] | |||
<section end=California2020 /> | |||
[[File:Connie Leyva 2015.jpg|thumb|right|240px|[[w:California]] [[w:California State Senate|w:Senator]] [[w:Connie Leyva]] sponsored [https://leginfo.legislature.ca.gov/faces/billCompareClient.xhtml?bill_id=201920200SB564&showamends=false '''California Senate Bill SB 564''' - ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''] in Feb '''2019''' and the Senate Bill was [https://www.sagaftra.org/action-alert-support-california-bill-end-deepfake-porn endorsed by SAG-AFTRA], but the Assembly Bill AB-602 authored by [[w:California State Assembly]]member [[w:Marc Berman]] was the one that became law on 1 January 2020 in the [[w:California Civil Code|w:California Civil Code]] of the [[w:California Codes]].]] | |||
'''Introduction''' by [https://a24.asmdc.org/ Assemblymember Marc Berman]: | |||
'''AB 602''', Berman. '''''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action'''''. | |||
Existing law creates a private [[w:Cause of action|w:right of action]] against a person who intentionally distributes a photograph or recorded image of another that exposes the intimate body parts of that person or of a person engaged in a sexual act '''without the person’s consent''' if specified conditions are met. | |||
This bill would provide that a '''depicted individual''', as defined, has a '''[[w:cause of action]] against''' a person who '''either''' | |||
* (1) '''creates''' and intentionally discloses sexually explicit material if the person knows or reasonably should have known the '''depicted''' individual '''did not [[w:consent]]''' to its creation or disclosure or | |||
* (2) '''intentionally discloses''' sexually explicit material that the person did not create if the person knows the '''depicted''' individual '''did not consent''' to its creation.
The bill would specify exceptions to those provisions, including if the material is a matter of legitimate public concern or a work of political or newsworthy value. | |||
The bill would authorize a prevailing [[w:plaintiff]] who suffers harm to seek [[w:Injunction|w:injunctive]] relief and recover reasonable [[w:attorney’s fee]]s and costs as well as specified monetary [[w:damages]], including [[w:Statutory damages|statutory]] and [[w:punitive damages]]. | |||
'''The law is as follows''': | |||
<blockquote> | |||
SECTION 1. Section 1708.86 is added to the [https://leginfo.legislature.ca.gov/faces/codesTOCSelected.xhtml?tocCode=CIV&tocTitle=+Civil+Code+-+CIV Civil Code of California], to read: | |||
'''1708.86.''' ('''a''') '''For purposes of this section''': | |||
* (1) “''Altered depiction''” means a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section. | |||
* (2) “''Authorized Representative''” means an attorney, talent agent, or personal manager authorized to represent a depicted individual if the depicted individual is represented. | |||
* (3) (A) “''Consent''” means an agreement written in plain language signed knowingly and voluntarily by the depicted individual that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated. | |||
* (3) (B) A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made, unless one of the following requirements is satisfied: | |||
:**(i) The depicted individual is given at least 72 hours to review the terms of the agreement before signing it. | |||
:**(ii) The depicted individual’s authorized representative provides written approval of the signed agreement. | |||
* (4) “''Depicted individual''” means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction. | |||
* (5) “''Despicable conduct''” means conduct that is so vile, base, or contemptible that it would be looked down on and despised by a reasonable person. | |||
* (6) “''Digitization''” means to realistically depict any of the following: | |||
** (A) The nude body parts of another human being as the nude body parts of the depicted individual. | |||
** (B) Computer-generated nude body parts as the nude body parts of the depicted individual. | |||
** (C) The depicted individual engaging in sexual conduct in which the depicted individual did not engage. | |||
* (7) “''Disclose''” means to publish, make available, or distribute to the public. | |||
* (8) “''Individual''” means a natural person. | |||
* (9) “''Malice''” means that the defendant acted with intent to cause harm to the plaintiff or despicable conduct that was done with a willful and knowing disregard of the rights of the plaintiff. A person acts with knowing disregard within the meaning of this paragraph when they are aware of the probable harmful consequences of their conduct and deliberately fail to avoid those consequences. | |||
* (10) “''Nude''” means visible genitals, pubic area, anus, or a female’s postpubescent nipple or areola. | |||
* (11) “''Person''” means a human being or legal entity. | |||
* (12) “''Plaintiff''” includes cross-plaintiff. | |||
* (13) “''Sexual conduct''” means any of the following: | |||
** (A) Masturbation. | |||
** (B) Sexual intercourse, including genital, oral, or anal, whether between persons regardless of sex or gender or between humans and animals. | |||
** (C) Sexual penetration of the vagina or rectum by, or with, an object. | |||
** (D) The transfer of semen by means of sexual conduct from the penis directly onto the depicted individual as a result of ejaculation. | |||
** (E) Sadomasochistic abuse involving the depicted individual. | |||
* (14) “''Sexually explicit material''” means any portion of an audiovisual work that shows the depicted individual performing in the nude or appearing to engage in, or being subjected to, sexual conduct. | |||
('''b''') '''A depicted individual has a cause of action against''' a person who does '''either of the following''': | |||
* (1) '''Creates and intentionally discloses sexually explicit material''' and the person knows or reasonably should have known the '''depicted individual''' in that material '''did not consent''' to its creation or disclosure. | |||
* (2) '''Intentionally discloses''' sexually explicit material that the person did not create and the person knows the depicted individual in that material '''did not consent''' to the creation of the sexually explicit material. | |||
('''c''') (1) A person is '''not liable''' under this section in either of the following circumstances: | |||
* (A) The person discloses the sexually explicit material in the course of any of the following: | |||
:** (i) Reporting unlawful activity. | |||
:** (ii) Exercising the person’s law enforcement duties. | |||
:**(iii) Hearings, trials, or other legal proceedings. | |||
* (B) The material is any of the following: | |||
:** (i) A matter of legitimate public concern. | |||
:** (ii) A work of political or newsworthy value or similar work. | |||
:** (iii) Commentary, criticism, or disclosure that is otherwise protected by the California Constitution or the United States Constitution. | |||
* (2) For purposes of this subdivision, sexually explicit material is not of newsworthy value solely because the depicted individual is a public figure. | |||
('''d''') It shall not be a defense to an action under this section that there is a disclaimer included in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material. | |||
('''e''') (1) A prevailing plaintiff who suffers harm as a result of the violation of subdivision (b) may recover any of the following: | |||
* (A) An amount equal to the monetary gain made by the defendant from the creation, development, or disclosure of the sexually explicit material. | |||
* (B) One of the following: | |||
:** (i) Economic and noneconomic damages proximately caused by the disclosure of the sexually explicit material, including damages for emotional distress. | |||
:** (ii) Upon request of the plaintiff at any time before the final judgment is rendered, the plaintiff may instead recover an award of statutory damages for all unauthorized acts involved in the action, with respect to any one work, as follows: | |||
::** (I) A sum of not less than one thousand five hundred dollars ($1,500) but not more than thirty thousand dollars ($30,000). | |||
::** (II) If the unlawful act was committed with malice, the award of statutory damages may be increased to a maximum of one hundred fifty thousand dollars ($150,000). | |||
* (C) Punitive damages. | |||
* (D) Reasonable attorney’s fees and costs. | |||
* (E) Any other available relief, including injunctive relief. | |||
(2) The remedies provided by this section are cumulative and shall not be construed as restricting a remedy that is available under any other law. | |||
('''f''') An action under this section shall be commenced no later than three years from the date the unauthorized creation, development, or disclosure was discovered or should have been discovered with the exercise of reasonable diligence. | |||
('''g''') The provisions of this section are severable. If any provision of this section or its application is held invalid, that invalidity shall not affect other provisions. | |||
</blockquote> | |||
<ref name="CaliforniaStateLaw AB 602"> | |||
{{Citation | |||
| title = "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action." | |||
| first = Marc | |||
| last = Berman | |||
| author-link = w:Marc Berman | |||
| first2 = Connie | |||
| last2 = Leyva | |||
| author2-link = w:Connie Leyva | |||
| series = | |||
| year = 2019 | |||
| publisher = [[w:California]] | |||
| url = https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 | |||
}} | |||
</ref> | |||
---- | |||
* [[w:California State Legislature]] | |||
== Law against synthesis crimes in Georgia in 2021 == | |||
[[File:Flag of Georgia (U.S. state) II.png|thumb|right|200px|Georgia's motto is ''"Wisdom, Justice, Moderation"'', and it is well manifested in how thoughtfully their law against synthetic filth, [https://codes.findlaw.com/ga/title-16-crimes-and-offenses/ga-code-sect-16-11-90.html '''GA CODE § 16-11-90'''], is written.]] | |||
=== Georgia Code § 16-11-90 === | |||
'''[https://codes.findlaw.com/ga/ Georgia Code]''' / '''[https://codes.findlaw.com/ga/title-16-crimes-and-offenses/ Title 16. Crimes and Offenses]''' / [https://codes.findlaw.com/ga/title-16-crimes-and-offenses/ga-code-sect-16-11-90.html Chapter 11 "Offenses Against Public Order and Safety", Article 3 "Invasions of Privacy", Part 3 "Invasion of privacy" '''GA CODE § 16-11-90'''] | |||
<ref group="1st seen in" name="CCRI US deepfake laws list">https://cybercivilrights.org/deep-fake-laws/</ref> | |||
'''The law is''' as of April 14, 2021<ref name="Georgia Code § 16-11-90 at findlaw.com" /> as follows: | |||
<blockquote> | |||
('''a''') As used in this Code section, the term: | |||
* ('''1''') “''Harassment''” means engaging in conduct directed at a depicted person that is intended to cause substantial emotional harm to the depicted person. | |||
* ('''2''') “''Nudity''” means: | |||
::* (A) The showing of the human male or female genitals, pubic area, or buttocks without any covering or with less than a full opaque covering; | |||
::* (B) The showing of the female breasts without any covering or with less than a full opaque covering; or | |||
::* (C) The depiction of covered male genitals in a discernibly turgid state. | |||
* ('''3''') “''Sexually explicit conduct''” shall have the same meaning as set forth in Code Section 16-12-100. | |||
('''b''') '''A person violates this Code section if he or she, knowing the content of a transmission or post, knowingly and without the consent of the depicted person''': | |||
* ('''1''') '''Electronically transmits or posts''', in one or more transmissions or posts, a photograph or video which depicts nudity or sexually explicit conduct of an adult, '''including a falsely created videographic or still image''', when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person; or | |||
* ('''2''') '''Causes the electronic transmission or posting''', in one or more transmissions or posts, of a photograph or video which depicts nudity or sexually explicit conduct of an adult, '''including a falsely created videographic or still image''', when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person. | |||
:Nothing in this Code section shall be construed to impose liability on an interactive computer service, as such term is defined in 47 U.S.C. 230(f)(2), or an information service or telecommunications service, as such terms are defined in 47 U.S.C. 153, for content provided by another person. | |||
('''c''') Any person who violates this Code section shall be guilty of a misdemeanor of a high and aggravated nature; provided, however, that upon a second or subsequent violation of this Code section, he or she shall be guilty of a felony and, upon conviction thereof, shall be punished by imprisonment of not less than one nor more than five years, a fine of not more than $100,000.00, or both. | |||
('''d''') A person shall be '''subject to prosecution in this state''' pursuant to Code Section 17-2-1 for any conduct made unlawful by this Code section which the person engages in while: | |||
* ('''1''') '''Either within''' or '''outside of this state''' if, by such conduct, the person commits a violation of this Code section which involves an '''individual who resides in this state'''; or | |||
* ('''2''') '''Within this state''' if, by such conduct, the person commits a violation of this Code section which involves an '''individual who resides within or outside this state'''. | |||
('''e''') The provisions of subsection (b) of this Code section shall not apply to: | |||
* ('''1''') The activities of law enforcement and prosecution agencies in the investigation and prosecution of criminal offenses; | |||
* ('''2''') Legitimate medical, scientific, or educational activities; | |||
* ('''3''') Any person who transmits or posts a photograph or video depicting only himself or herself engaged in nudity or sexually explicit conduct; | |||
* ('''4''') The transmission or posting of a photograph or video that was originally made for commercial purposes; | |||
* ('''5''') Any person who transmits or posts a photograph or video depicting a person voluntarily engaged in nudity or sexually explicit conduct in a public setting; or | |||
* ('''6''') A transmission that is made pursuant to or in anticipation of a civil action. | |||
('''f''') There shall be a rebuttable presumption that an information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet, for content provided by another person, does not know the content of an electronic transmission or post. | |||
('''g''') Any violation of this Code section shall constitute a separate offense and shall not merge with any other crimes set forth in this title.</blockquote><ref name="Georgia Code § 16-11-90 at findlaw.com"> | |||
{{cite web | |||
|url=https://codes.findlaw.com/ga/title-16-crimes-and-offenses/ga-code-sect-16-11-90.html | |||
|title=Georgia Code Title 16. Crimes and Offenses § 16-11-90 | |||
|last= | |||
|first= | |||
|date=2021-04-14 | |||
|website=[[w:FindLaw]] | |||
|publisher=[[w:Georgia (U.S. state)]] | |||
|access-date=2022-01-04 | |||
|quote=}} | |||
</ref> | |||
== Law against synthesis crimes in New York State in 2021 == | |||
[[File:Flag of New York.svg|thumb|right|200px|New York State]] | |||
=== New York State - CHAPTER 6 Civil Rights ARTICLE 5 Right of Privacy SECTION 52-C - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual === | |||
[https://www.nysenate.gov/legislation/laws/CONSOLIDATED '''Consolidated Laws of New York'''] / [https://www.nysenate.gov/legislation/laws/CVR/-CH6 '''CHAPTER 6 Civil Rights'''] / [https://www.nysenate.gov/legislation/laws/CVR/A5 '''ARTICLE 5 Right of Privacy'''] / [https://www.nysenate.gov/legislation/laws/CVR/52-C#:~:text=Section%2052%2DC%20Private%20right,explicit%20depiction%20of%20an%20individual '''SECTION 52-C''' - '''Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual''']<ref group="1st seen in" name="CCRI US deepfake laws list" /> | |||
'''The law is as follows:''' | |||
<blockquote> | |||
<big>'''§ 52-c''' - '''''Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual.''''' </big><sup>N.B. There are two § 52-c's</sup> | |||
'''1. For the purposes of this section:''' | |||
* '''a.''' "'''''depicted individual'''''" means an individual who '''appears''', as a result of digitization, '''to be giving a performance they did not actually perform''' or '''to be performing in a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section'''. | |||
* '''b.''' "'''''digitization'''''" means to '''realistically depict the nude body parts of another human being as the nude body parts of the depicted individual''', '''computer-generated nude body parts as the nude body parts of the depicted individual''' or the '''depicted individual engaging in sexual conduct''', as defined in subdivision ten of section 130.00 of the penal law, '''in which the depicted individual did not engage'''. | |||
* '''c.''' "''individual''" means a natural person. | |||
* '''d.''' "''person''" means a human being or legal entity. | |||
* '''e.''' "''sexually explicit material''" means any portion of an audio visual work that shows the depicted individual performing in the nude, meaning with an unclothed or exposed intimate part, as defined in section 245.15 of the penal law, or appearing to engage in, or being subjected to, sexual conduct, as defined in subdivision ten of section 130.00 of the penal law. | |||
'''2. a.''' A depicted individual shall have a '''cause of action''' against a person who, '''discloses''', '''disseminates''' or '''publishes''' sexually explicit material related to the depicted individual, and the person knows or reasonably should have known the depicted individual in that material '''did not consent''' to its '''creation''', disclosure, dissemination, or publication. | |||
* '''b.''' It shall '''not be a defense''' to an action under this section that '''there is a disclaimer''' in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material. | |||
'''3. a.''' A depicted individual may only consent to the creation, disclosure, dissemination, or publication of sexually explicit material by knowingly and voluntarily signing an agreement written in plain language that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated. | |||
* '''b.''' A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made, unless one of the following requirements is satisfied: | |||
:* i. the depicted individual is given at least three business days to review the terms of the agreement before signing it; or | |||
:* ii. if the depicted individual is represented, the attorney, talent agent, or personal manager authorized to represent the depicted individual provides additional written approval of the signed agreement. | |||
'''4. a.''' A person is not liable under this section if: | |||
:* i. the person discloses, disseminates or publishes the sexually explicit material in the course of reporting unlawful activity, exercising the person's law enforcement duties, or hearings, trials or other legal proceedings; or | |||
:* ii. the sexually explicit material is a matter of legitimate public concern, a work of political or newsworthy value or similar work, or commentary, criticism or disclosure that is otherwise protected by the constitution of this state or the United States; provided that sexually explicit material shall not be considered of newsworthy value solely because the depicted individual is a public figure. | |||
'''5.''' In any action commenced pursuant to this section, the finder of fact, in its discretion, '''may award injunctive relief''', '''punitive damages''', '''compensatory damages''', and '''reasonable court costs''' and '''attorney's fees'''. | |||
'''6.''' A cause of action or special proceeding under this section shall be commenced the later of either: | |||
* '''a.''' three years after the dissemination or publication of sexually explicit material; or | |||
* '''b.''' one year from the date a person discovers, or reasonably should have discovered, the dissemination or publication of such sexually explicit material. | |||
'''7.''' Nothing in this section shall be read to require a prior criminal complaint, prosecution or conviction to establish the elements of the cause of action provided for in this section. | |||
'''8.''' The provisions of this section including the remedies are in addition to, and shall not supersede, any other rights or remedies available in law or equity. | |||
'''9.''' If any provision of this section or its application to any person or circumstance is held invalid, the invalidity shall not affect other provisions or applications of this section which can be given effect without the invalid provision or application, and to this end the provisions of this section are severable. | |||
'''10.''' Nothing in this section shall be construed to limit, or to enlarge, the protections that 47 U.S.C. § 230 confers on an interactive computer service for content provided by another information content provider, as such terms are defined in 47 U.S.C. § 230.</blockquote><ref name="New York State Civil Rights CHAPTER 6, ARTICLE 5, SECTION 52-C"> | |||
{{cite web | |||
|url=https://www.nysenate.gov/legislation/laws/CVR/52-C#:~:text=Section%2052%2DC%20Private%20right,explicit%20depiction%20of%20an%20individual | |||
|title=SECTION 52-C Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual | |||
|last= | |||
|first= | |||
|date=2021-11-12 | |||
|website=nysenate.gov | |||
|publisher=[[w:New York State Legislature]] | |||
|access-date=2021-01-04 | |||
|quote=}} | |||
</ref> | |||
== Current bills in the USA == | |||
=== US Senate bill S. 3696 - Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act of 2024) (118th Congress - 2023-2024) === | |||
[https://www.congress.gov/bill/118th-congress/senate-bill/3696/text '''''S.3696 - DEFIANCE Act of 2024''''' at congress.gov], a bipartisan Senate bill against synthetic filth. | |||
* [https://www.congress.gov/congressional-record/volume-170/issue-17/senate-section/article/S289-2 '''''Introductory Statement on S. 3696; Congressional Record Vol. 170, No. 17''''' at congress.gov] 2024-01-30 | |||
* [https://www.judiciary.senate.gov/press/releases/durbin-graham-klobuchar-hawley-introduce-defiance-act-to-hold-accountable-those-responsible-for-the-proliferation-of-nonconsensual-sexually-explicit-deepfake-images-and-videos '''''Durbin, Graham, Klobuchar, Hawley Introduce DEFIANCE Act to Hold Accountable Those Responsible for the Proliferation of Nonconsensual, Sexually-Explicit “Deepfake” Images and Videos''''' at judiciary.senate.gov]<ref group="1st seen in">https://onfido.com/blog/deepfake-law/</ref> | |||
* [https://www.govinfo.gov/app/details/BILLS-118s3696is '''''S. 3696 (IS) - Disrupt Explicit Forged Images And Non-Consensual Edits Act of 2024''''' at govinfo.gov] | |||
* [https://www.durbin.senate.gov/imo/media/doc/defiance_act_of_2024.pdf '''DEFIANCE Act of 2024''' at durbin.senate.gov] explains that it would create ''a federal civil remedy for victims who are identifiable in a “digital forgery”''. | |||
'''Reporting and commentary on DEFIANCE Act of 2024 bill''' | |||
* [https://www.theguardian.com/technology/2024/jan/30/taylor-swift-ai-deepfake-nonconsensual-sexual-images-bill '''''Taylor Swift AI images prompt US bill to tackle nonconsensual, sexual deepfakes''''' at theguardian.com] | |||
* [https://www.theverge.com/2024/1/30/24056385/congress-defiance-act-proposed-ban-nonconsensual-ai-porn '''''Lawmakers propose anti-nonconsensual AI porn bill after Taylor Swift controversy''''' at theverge.com] | |||
=== US House bill H.R.7123 - Quashing Unwanted and Interruptive Electronic Telecommunications Act (QUIET Act) (118th Congress - 2023-2024) === | |||
[https://www.congress.gov/bill/118th-congress/house-bill/7123/text?s=1&r=3 '''''Quashing Unwanted and Interruptive Electronic Telecommunications Act''''' at congress.gov]<ref group="1st seen in">https://www.newyorker.com/science/annals-of-artificial-intelligence/the-terrifying-ai-scam-that-uses-your-loved-ones-voice</ref> | |||
=== US House bill H.R.6943 - No AI FRAUD Act (118th Congress - 2023-2024) === | |||
[https://www.congress.gov/bill/118th-congress/house-bill/6943/text '''No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act of 2024''' or the ('''No AI FRAUD Act''') at congress.gov] was introduced to the 118th Congress 2nd session on 2024-01-24.<ref group="1st seen in">https://onfido.com/blog/deepfake-law/</ref> | |||
=== US House bill H.R.5586 - Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023 (118th Congress - 2023-2024) === | |||
[https://www.congress.gov/bill/118th-congress/house-bill/5586/text “Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023” or the “DEEPFAKES Accountability Act” at congress.gov] is a reintroduction of the earlier House bill H.R.3230 to the 118th Congress (2023-2024 session). | |||
=== US House bill H.R. 3106 - Preventing Deepfakes of Intimate Images Act (118th Congress - 2023-2024) === | |||
[https://www.congress.gov/bill/118th-congress/house-bill/3106/text '''''Preventing Deepfakes of Intimate Images Act''''' at congress.gov] was introduced in the House on 2023-05-05. It was [https://www.congress.gov/bill/117th-congress/house-bill/9631 a reintroduction of H.R. 9631 from the 117th Congress]. | |||
=== NY Senate bill S5583 in the 2023-2024 regular session === | |||
[https://www.nysenate.gov/legislation/bills/2023/S5583 '''NY Senate bill S5583 in the 2023-2024 regular session''' at nysenate.gov] would establish ''the crime of aggravated harassment by means of electronic or digital communication and provides for a private right of action for the unlawful dissemination or publication of deep fakes''. | |||
== Past bills in the USA == | |||
=== US Senate bill S.4991 - Preventing Rampant Online Technological Exploitation and Criminal Trafficking Act of 2022 (PROTECT Act) - (117th Congress) === | |||
The bill known as [https://www.congress.gov/bill/117th-congress/senate-bill/4991/text?r=4&s=1 Senate bill '''S.4991''' - '''''PROTECT Act''''' at congress.gov] was introduced by Senator [[w:Mike Lee]] and read twice on Wednesday 2022-09-28. | |||
=== New York Senate bill - Unlawful Electronic Transmission of Sexually Explicit Visual Material - in regular session 2021-2022 === | |||
[[File:Flag of New York (1909–2020).png|thumb|right|200px|[[w:New York State Legislature]] regular session 2021-2022 is contemplating the [https://www.nysenate.gov/legislation/bills/2021/S1641 New York senate bill '''S1641'''] and identical [https://www.nysenate.gov/legislation/bills/2021/A6517 assembly bill '''A6517'''] to ban sending unsolicited pornography.]] | |||
The bill '''''Unlawful Electronic Transmission of Sexually Explicit Visual Material''''' essentially aims to ban sending unsolicited nudes. | |||
In the 2021-2022 [[w:New York State Senate]] regular session, on 2021-01-14 Senator [[w:James Skoufis]] ([https://www.nysenate.gov/senators/james-skoufis official website]) sponsored and Senators [[w:Brian Benjamin]] ([https://www.nysenate.gov/senators/brian-benjamin official website]) and [[w:Todd Kaminsky]] ([https://www.nysenate.gov/senators/todd-kaminsky official website]) of the New York State Senate co-sponsored [https://www.nysenate.gov/legislation/bills/2021/S1641 New York Senate bill '''S1641'''] to add section '''§ 250.70 ''UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL''''' to [https://www.nysenate.gov/legislation/laws/PEN/P3TNA250 Article 250 of the penal law]. On 2021-03-19 an identical [https://www.nysenate.gov/legislation/bills/2021/A6517 New York Assembly bill '''A6517''' - ''Establishes the crime of unlawful electronic transmission of sexually explicit visual material''] was introduced to the [[w:New York State Assembly]] by Assembly Member [[w:Aileen Gunther]] ([https://nyassembly.gov/mem/Aileen-M-Gunther official website]).<ref group="1st seen in"> | |||
First seen in the [https://trackbill.com/search/#/related=%7B%22id%22:%221690501%22,%22state%22:%22CA%22,%22session%22:%222019%22,%22billId%22:%22AB602%22%7D&direction=desc&page=1&resultsPerPage=25&sort=relevancy&tracked&upcoming_hearings&type=bills&state=all&session suggestions for similar bills for Bills similar to CA AB602 by trackbill.com]. | |||
</ref> | |||
If this bill passes it will be codified in the [[w:Consolidated Laws of New York]]. View the [https://www.nysenate.gov/legislation/laws/CONSOLIDATED '''Consolidated Laws of New York''' at nysenate.gov]. | |||
* '''Title of bill''': ''An act to amend the penal law, in relation to the creation of the criminal offense of unlawful electronic transmission of sexually explicit visual material'' | |||
* '''Purpose''': ''The purpose of this bill is to make it unlawful to send sexually explicit material through electronic means unless the material is sent at the request of, or with the express consent of the recipient.'' | |||
* '''Summary of provisions''': Adds a '''new section 250.70''' to the penal law making it ''unlawful to knowingly transmit by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed unless the material is sent at the request of, or with the express consent of the recipient''. | |||
* '''Justification''': ''Currently under New York State law, indecent exposure in person is a crime, but it is not unlawful to send sexually explicit photos to nonconsenting adult recipients through electronic transmission. With the growing modern age of online dating, many individuals are receiving sexually explicit visual content without their consent from strangers. No person should be forced to view sexually explicit material without their consent.'' | |||
''The bill offers a clear deterrent to those considering sending unsolicited sexual pics and similar inappropriate conduct, and protects the unwilling recipients who currently have no legal recourse for such abuses.'' | |||
''What is illegal in the real world must be illegal in the digital world, and this legislation is a first step in the right direction in adding that accountability.'' | |||
* '''Legislative history''': | |||
*# Senate - 2020 - S5949 Referred to Codes | |||
*# Assembly - 2020 - A7801 Referred to Codes | |||
* '''Fiscal implications''': ''Minimal'' | |||
* '''Effective date''': ''This act shall take effect on the first of November next succeeding the date on which it shall have become a law.'' | |||
The text of the bill is, as of 2021-03-24, as follows: | |||
:"Section 1. The penal law is amended by adding a new section 250.70 to read as follows: | |||
:'''§ 250.70 ''UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL'''''. | |||
:A person is guilty of unlawful electronic transmission of sexually explicit visual material if a person knowingly transmits by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed or depicts the covered genitals of a male person that are in a discernibly turgid state and such visual material is not sent at the request of or with the express consent of the recipient. For purposes of this section the term "intimate parts" means the naked genitals, pubic area, anus, or female postpubescent nipple of the person and the term "sexual conduct" shall have the same meaning as defined in [https://www.nysenate.gov/legislation/laws/PEN/130.00 section 130.00 (Sex offenses; definitions of terms)] of this chapter. Unlawful electronic transmission of sexually explicit visual material is a class a misdemeanor. | |||
:'''§ 2'''. This act shall take effect on the first of November next succeeding the date on which it shall have become a law." | |||
---- | |||
<gallery> | |||
File:Nopic.jpg|Senator [[w:James Skoufis]] ([https://www.nysenate.gov/senators/james-skoufis official website]) sponsored [https://www.nysenate.gov/legislation/bills/2021/S1641 New York Senate bill '''S1641'''] to add section '''§ 250.70 ''UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL''''' to [https://www.nysenate.gov/legislation/laws/PEN/P3TNA250 Article 250 of the penal law]. | |||
File:LIRR Elmont, Brian Benjamin (cropped).jpg|Senator [[w:Brian Benjamin]] ([https://www.nysenate.gov/senators/brian-benjamin official website]) is a cosponsor of S1641 | |||
File:Todd_Kaminsky_Head_Shot.jpg|Senator [[w:Todd Kaminsky]] ([https://www.nysenate.gov/senators/todd-kaminsky official website]) is a cosponsor of S1641 | |||
File:Aileen_Gunther.jpg|NY Assembly Member [[w:Aileen Gunther]] ([https://nyassembly.gov/mem/Aileen-M-Gunther official website]) presented an identical [https://www.nysenate.gov/legislation/bills/2021/A6517 New York Assembly bill '''A6517''' - ''Establishes the crime of unlawful electronic transmission of sexually explicit visual material''] to the [[w:New York State Assembly]] on 2021-03-19. | |||
</gallery> | |||
* [[w:New York State Legislature]] | |||
=== US Senate bill - Stop Internet Sexual Exploitation Act - 2019-2020 US Senate session (116th Congress) === | |||
The Stop Internet Sexual Exploitation Act (SISE) was a bill introduced to the 2019-2020 session of the US Senate. | |||
* [https://www.congress.gov/bill/116th-congress/senate-bill/5054?r=1&s=1 US Senate Bill '''''S.5054 - Stop Internet Sexual Exploitation Act''''' at congress.gov] | |||
=== US House bill - H.R.3230 - DEEP FAKES Accountability Act (116th Congress) === | |||
* [https://www.congress.gov/bill/116th-congress/house-bill/3230/text '''H.R.3230''' - '''''DEEP FAKES Accountability Act''''' at congress.gov]<ref group="1st seen in" name="ChatGPT 2023 inquiry" />, also known as the ''Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2019'', aimed to require producers of synthetic human-like fakes ''to generally comply with certain [[w:Digital watermarking|digital watermark]] and disclosure requirements''.<ref>https://www.congress.gov/bill/116th-congress/house-bill/3230</ref> | |||
=== US Senate bill - Malicious Deep Fake Prohibition Act of 2018 (115th Congress) === | |||
* [https://www.congress.gov/bill/115th-congress/senate-bill/3805/text '''S.3805''' - '''''Malicious Deep Fake Prohibition Act of 2018''''' at congress.gov]<ref group="1st seen in" name="ChatGPT 2023 inquiry">
Chatting with ChatGPT 2023 | |||
* First I asked ChatGPT to "''list some legislative approaches against so-called "deep fakes" or "deepfakes"''" and it mentioned the Singaporean [[#Protection from Online Falsehoods and Manipulation Act 2019]] and the [[#Bill - Malicious Deep Fake Prohibition Act of 2018 (115th Congress)]] | |||
</ref> aimed at criminalizing the synthesis of human-likeness media with the intent to break federal, state, local or tribal law.
---- | |||
= Law proposals =
== Law proposals against synthetic filth by [[User:Juho Kunsola|Juho Kunsola]] == | |||
* '''Audience''': Developed with suitability for national, supranational and UN treaty levels.
* '''History''': This version is an evolution of a Finnish-language original written in 2016.
[[File:Suomen lippu valokuva.png|right|thumb|260px|[[w:Finland]] has very logical and very [https://www.finlex.fi/en/laki/kaannokset/ accessible laws], but here too the laws need updating for this age of industrial disinformation.]]
Existing law in <big>Chapter 24. of the Finnish Criminal Code - "''Offences against privacy, public peace and personal reputation''"</big> seems to be ineffective against many a [[synthetic human-like fakes|synthetic human-like fake attack]], and it seems it could even be used to frame victims for crimes with [[synthetic human-like fakes#Digital sound-alikes|digital sound-alikes]].
* Section 12 - '''Right to bring charges''' (879/2013)
* Section 13 - '''Corporate criminal liability''' (511/2011)
[[File:The-diffuse-reflection-deducted-from-the-specular-reflection-Debevec-2000.png|thumb|right|260px|Subtraction of the diffuse reflection from the specular reflection. Image is scaled for luminosity. The diffuse reflection is acquired by placing the polarizers at a 90-degree angle and the specular at a 0-degree angle.
[[:File:Deb-2000-reflectance-separation.png|Original picture]] by [[w:Paul Debevec|Debevec]] et al. - Copyright ACM 2000 https://dl.acm.org/citation.cfm?doid=311779.344855]]
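The polarization-difference idea in the caption above can be sketched in a few lines of code: the cross-polarized (90-degree) capture records mostly diffuse reflection, the parallel-polarized (0-degree) capture records diffuse plus specular, so a per-pixel subtraction isolates the specular component. This is only a minimal illustration, not the actual pipeline of Debevec et al.; the function name <code>isolate_specular</code>, the scale factor and the toy pixel values are invented for the example.

```python
def isolate_specular(parallel, cross, scale=1.0):
    """Per-pixel subtraction of two grayscale images given as lists of rows.

    `parallel` is the 0-degree (diffuse + specular) capture, `cross` the
    90-degree (diffuse-only) capture. `scale` is an illustrative luminosity
    factor; results are clamped to the displayable range [0, 255].
    """
    result = []
    for row_p, row_c in zip(parallel, cross):
        result.append([
            max(0, min(255, round((p - c) * scale)))
            for p, c in zip(row_p, row_c)
        ])
    return result

# Tiny 2x2 example: the specular highlight appears only in the parallel capture.
parallel_img = [[200, 80], [90, 250]]
cross_img = [[120, 80], [90, 100]]
print(isolate_specular(parallel_img, cross_img))  # [[80, 0], [0, 150]]
```

Clamping to [0, 255] mirrors the caption's note that the difference image is rescaled for luminosity before display.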
=== Law proposal to ban visual synthetic filth ===
==== §1 Models of human appearance ==== | |||
A '''model of human appearance''' means | |||
::* A realistic '''3D model''' | |||
::* A '''[[Glossary#Bidirectional reflectance distribution function|7D bidirectional reflectance distribution function]] model''' | |||
::* A '''direct-to-2D''' capable [[w:machine learning]] model | |||
::* Or a model made with any technology whatsoever that looks deceptively like the target person.
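For illustration, one common way to tally the seven dimensions of such a reflectance model (an assumed breakdown for the reader's orientation; the linked glossary entry is authoritative) is two angles for the incident light direction, two for the outgoing viewing direction, two coordinates for the position on the surface, and the wavelength:

```latex
% Spatially varying, spectral BRDF as a 7-dimensional function:
% (\theta_i, \phi_i) incident direction, (\theta_o, \phi_o) outgoing
% (viewing) direction, (u, v) position on the surface, \lambda wavelength.
f_r(\theta_i, \phi_i, \theta_o, \phi_o, u, v, \lambda)
```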
==== §2 Producing synthetic pornography ====
Making '''projections''', still or videographic, where targets are portrayed in the '''nude''' or in a '''sexual situation''', from models of human appearance defined in §1 '''without the express consent''' of the targets is '''illegal'''.
==== §3 Distributing synthetic pornography ====
'''Distributing''', '''making available''', '''public display''', '''purchase''', '''sale''', '''yielding''', '''import''' and '''export''' of non-authorized '''synthetic pornography''' defined in §2 are '''punishable'''.<ref group="footnote">People who are found in possession of this synthetic pornography should probably not be penalized, but rather advised to get some help.</ref>
==== §4 Aggravated producing and distributing synthetic pornography ==== | |||
If the media described in §2 or §3 is made or distributed with the '''intent''' to '''frame for a crime''' or for '''blackmail''', the crime should be judged as '''aggravated'''. | |||
==== Afterword ====
The original idea I had was to ban both the raw materials, i.e. the models used to make the visual synthetic filth, and the end product, weaponized synthetic pornography, but then in July 2019 it appeared to me that [[Adequate Porn Watcher AI (concept)]] could really help in this age of industrial disinformation if it were built, trained and operational. Banning the modeling of human appearance was in conflict with the revised plan.
It is safe to assume that collecting permissions to model each pornographic recording is not plausible, so an interesting question is whether we can ban covert modeling from non-pornographic pictures while still retaining the ability to model all porn found on the Internet.
If banning the modeling of people's appearance from non-pornographic images/videos without explicit permission is to be pursued, the ban must be <font color="green">formulated</font> so that it <font color="red">'''does not make'''</font> <big>'''[[Adequate Porn Watcher AI (concept)]]'''</big> <font color="red">illegal</font> / <font color="red">impossible</font>. This would seem to lead to a weird situation where modeling a human from non-pornographic media would be illegal, but modeling from pornography legal.
=== Law proposal to ban unauthorized modeling of human voice === | |||
{{#ev:youtube|0sR1rU3gLzQ|360px|right|[https://www.youtube.com/watch?v=0sR1rU3gLzQ Video 'This AI Clones Your Voice After Listening for 5 Seconds' by '2 minute papers' at YouTube] describes the voice thieving machine by Google Research in [[w:NeurIPS|w:NeurIPS]] 2018.}} | |||
'''Motivation''': The current situation, where criminals can freely trade and grow their libraries of stolen voices, is unwise.
==== §1 Unauthorized modeling of a human voice ==== | |||
'''Acquiring''' a '''model of a human's voice''' that '''deceptively resembles''' some '''dead''' or '''living''' person's voice, as well as the '''possession''', '''purchase''', '''sale''', '''yielding''', '''import''' and '''export''' of such a model '''without''' the '''express consent''' of the target, are '''punishable'''.
==== §2 Application of unauthorized voice models ==== | |||
'''Producing''' and '''making available''' media from covert voice models defined in §1 is '''punishable'''. | |||
==== §3 Aggravated application of unauthorized voice models ==== | |||
If the produced media is for a '''purpose''' to | |||
::* '''frame''' a '''human''' target or targets for '''crimes'''
::* to attempt '''extortion''' or
the crime should be judged as '''aggravated'''.
----
= Resources and reporting on law = | |||
== AI and law in general == | |||
[[File:LOC_Main_Reading_Room_Highsmith.jpg|thumb|right|340px|[[w:Library of Congress]] ([https://loc.gov/ loc.gov]) reading room]] | |||
'''Reviews and regulation'''
From the '''[[w:Library of Congress]]''': | |||
* [https://www.loc.gov/law/help/artificial-intelligence/index.php '''''Regulation of Artificial Intelligence''''' at loc.gov]
** [https://www.loc.gov/law/help/artificial-intelligence/compsum.php 'Regulation of Artificial Intelligence: Comparative Summary' at loc.gov] | |||
** [https://www.loc.gov/law/help/artificial-intelligence/international.php 'Regulation of Artificial Intelligence: International and Regional Approaches' (loc.gov)] | |||
** [https://www.loc.gov/law/help/artificial-intelligence/americas.php 'Regulation of Artificial Intelligence: The Americas and the Caribbean' (loc.gov)] | |||
** [https://www.loc.gov/law/help/artificial-intelligence/asia-pacific.php 'Regulation of Artificial Intelligence: East/South Asia and the Pacific' (loc.gov)] | |||
** [https://www.loc.gov/law/help/artificial-intelligence/europe-asia.php 'Regulation of Artificial Intelligence: Europe and Central Asia' (loc.gov)]
** [https://www.loc.gov/law/help/artificial-intelligence/middleeast-northafrica.php 'Regulation of Artificial Intelligence: Middle East and North Africa' (loc.gov)] | |||
** [https://www.loc.gov/law/help/artificial-intelligence/africa.php 'Regulation of Artificial Intelligence: Sub-Saharan Africa' (loc.gov)] | |||
'''[[w:Gibson, Dunn & Crutcher|Gibson Dunn & Crutcher]]''' (gibsondunn.com) publishes a quarterly legal update on 'Artificial Intelligence and Autonomous Systems'. Gibson Dunn & Crutcher is a global [[w:law firm|law firm]], founded in Los Angeles in 1890.
* [https://www.gibsondunn.com/artificial-intelligence-and-autonomous-systems-legal-update-4q18/ 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 4 2018' at gibsondunn.com]
* [https://www.gibsondunn.com/artificial-intelligence-and-autonomous-systems-legal-update-1q19/ 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 1 2019'] | |||
* [https://www.gibsondunn.com/artificial-intelligence-and-autonomous-systems-legal-update-2q19/ 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 2 2019'] | |||
* [https://www.gibsondunn.com/artificial-intelligence-and-autonomous-systems-legal-update-3q19/ 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 3 2019'] | |||
* [https://www.gibsondunn.com/artificial-intelligence-and-automated-systems-legal-update-4q19/ 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 4 2019'] | |||
'''From Europe'''
[[File:Flag of Europe.svg|thumb|right|180px|The [[w:European Union]]'s [[w:European Parliament|Parliament]]'s [[w:European Parliamentary Research Service]] on [https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf The ethics of artificial intelligence: Issues and initiatives].]] | |||
* [https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf '''''The ethics of artificial intelligence: Issues and initiatives''''' (.pdf) at europarl.europa.eu], a March 2020 study by the [[w:European Parliamentary Research Service]]. Starting from page 37, the .pdf lists organizations in the field.
== Synthetic filth in the law and media == | |||
* [https://scholarship.law.vanderbilt.edu/cgi/viewcontent.cgi?article=4409&context=vlr '''''"The New Weapon of Choice": Law's Current Inability to Properly Address Deepfake Pornography''''' at scholarship.law.vanderbilt.edu], an October 2020 Note by Anne Pechenik Gieseke published in the [[w:Vanderbilt Law Review]], the flagship [[w:academic journal]] of [[w:Vanderbilt University Law School]].
* [https://carnegieendowment.org/2020/07/08/deepfakes-and-synthetic-media-in-financial-system-assessing-threat-scenarios-pub-82237 '''''Deepfakes and Synthetic Media in the Financial System: Assessing Threat Scenarios''''' at carnegieendowment.org], a 2020-07-08 assessment that identifies some types of crimes that can be committed using [[synthetic human-like fakes]].
* [https://ssri.duke.edu/news/don%E2%80%99t-believe-your-eyes-or-ears-weaponization-artificial-intelligence-machine-learning-and '''''Don’t Believe Your Eyes (or Ears): The Weaponization of Artificial Intelligence, Machine Learning, and Deepfakes''''' at ssri.duke.edu], an October 2019 news article by Joe Littell, published by the Social Science Research Institute at [[w:Duke University]].
* [https://scholarship.law.duke.edu/dltr/vol17/iss1/4/ '''''Deepfakes: False Pornography Is Here and the Law Cannot Protect You''''' at scholarship.law.duke.edu], published in 2019 in the [[w:Duke Law & Technology Review]], a student-run law review.
* [https://www.bloomberg.com/news/articles/2023-06-20/deepfake-porn-political-ads-push-states-to-curb-rampant-ai-use '''''States Are Rushing to Regulate Deepfakes as AI Goes Mainstream''''' at bloomberg.com] (paywalled), June 2023 reporting on US legislation against synthetic human-like fakes. It lists CA, WA, WY, MN, TX, GA and VA as having anti-deepfake laws and lists LA, IL, MA and NJ as planning to legislate. | |||
== The countries that have unfortunately banned the full face veil ==
{{Q|There are currently 16 nations that have banned the burqa (not to be confused with the hijab), including [[w:Tunisia]],<ref name="reuters.com"> | |||
{{cite news | |||
|url=https://www.reuters.com/article/uk-tunisia-niqab-security-idUKKCN1U01CH | |||
|title= Tunisian PM bans wearing of niqab in public institutions | |||
|date=5 July 2019 | |||
|access-date=2021-03-13 | |||
|newspaper= Reuters | |||
}} | |||
</ref> [[w:Austria]], [[w:Denmark]], [[w:France]], [[w:Belgium]], [[w:Tajikistan]], [[w:Latvia]],<ref name="independent.co.uk"> | |||
{{cite web
|url=https://www.independent.co.uk/news/islamic-muslim-face-veil-niqab-burqa-banned-latvia-despite-being-worn-by-just-three-women-entire-a6993991.html | |||
|title=A European government has banned Islamic face veils despite them being worn by just three women | |||
|date=21 April 2016 | |||
|access-date=2021-03-13 | |||
}} | |||
</ref> [[w:Bulgaria]],<ref> | |||
[https://www.smh.com.au/world/bulgaria-the-latest-european-country-to-ban-the-burqa-and-niqab-in-public-places-20160930-grss9q.html Bulgaria the latest European country to ban the burqa and niqab in public places], Smh.com.au: accessed 5 December 2016.
</ref> [[w:Cameroon]], [[w:Chad]], [[w:Congo-Brazzaville]], [[w:Gabon]], [[w:Netherlands]],<ref>
{{cite news | |||
|last1=Halasz | |||
|first1=Stephanie | |||
|last2=McKenzie | |||
|first2=Sheena | |||
|title=The Netherlands introduces burqa ban in some public spaces | |||
|url=https://edition.cnn.com/2018/06/27/europe/netherlands-partial-burqa-ban-intl/index.html | |||
|access-date=2021-03-13 | |||
|agency=CNN | |||
|issue=27 June 2018 | |||
|publisher=CNN | |||
|date=27 June 2018 | |||
}} | |||
</ref> [[w:China]],<ref>
{{cite news | |||
|last1=Phillips | |||
|first1=Tom | |||
|title=China bans burqa in capital of Muslim region of Xinjiang | |||
|url=https://www.telegraph.co.uk/news/worldnews/asia/china/11342070/China-bans-burqa-in-capital-of-Muslim-region-of-Xinjiang.html | |||
|access-date=2021-03-13 | |||
|agency=The Telegraph | |||
|issue=13 January 2015 | |||
|newspaper=The Telegraph | |||
|date=13 January 2015 | |||
}} | |||
</ref> [[w:Morocco]], and [[w:Switzerland]].|Wikipedia|[[w:Hijab by country]] as of 2021-03-13}} | |||
Taking into consideration these times of industrial disinformation, it is vicious and uncivilized to have laws banning the wearing of the full face veil in public.
== Quotes on the current laws and their application == | |||
{{#lst:Quotes|FinalLineOfDefenseForTheTimeBeing}} | |||
---- | ---- | ||
= Footnotes = | |||
<references group="footnote" /> | |||
= 1st seen in = | |||
<references group="1st seen in" /> | |||
= References = | |||
<references /> | |||
[[Category:In English]] | |||
[[Category:Law]] | |||
[[Category:Antifake]] |
Latest revision as of 15:06, 7 October 2024
This article contains some current laws against abusive uses of synthetic human-like fakes and also information on what kinds of laws are being prepared, plus two SSFWIKI original law proposals, one against digital look-alikes and one against digital sound-alikes.
New laws
- UK's Online Safety Act 2023 has been passed into law and reportedly criminalizes non-consensual synthetic pornography.
- The European Union has finalized a law package to regulate AI called the Artificial Intelligence Act
New bills are currently in the works in
- Current bills in the EU of high importance are the almost finalized Directive on combating violence against women and domestic violence and the Regulation to Prevent and Combat Child Sexual Abuse
- The US Senate is considering an anti-fake bill and the House is considering several bills
- Canada is considering C-27
- China seems to be planning to ban all synthetic pornography, however consensually it was made
Bills that didn't make it
- Canada's House of Commons was considering banning all pornographic content for which there is no proof of age and written consent from everyone visible in the pornographic recording.
- Past bills in the USA
Information elsewhere / legal information compilations (recommended)
- Existing Nonconsensual Pornography, Sextortion, and Deep Fake Laws at cybercivilrights.org
- A Look at Global Deepfake Regulation Approaches at responsible.ai[2] April 2023 compilation and reporting by Amanda Lawson of the Responsible Artificial Intelligence Institute.
- The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology at legaljournal.princeton.edu[3] compilation and reporting by Caroline Quirk. PLJ is Princeton’s only student-run law review.
- Exploring Legal Approaches to Regulating Nonconsensual Deepfake Pornography at techpolicy.press[4] May 2023 compilation and reporting by Kaylee Williams
- Deepfake AI laws for USA at foundationra.com, Sextortion laws for USA at foundationra.com and Revenge porn laws for USA at foundationra.com compilations by Foundation RA
- Deepfake laws: is AI outpacing legislation? at onfido.com[5] February 2024 summary and compilation by Aled Owen, Director of Global Policy at Onfido (for-profit)
- Is Deepfake Pornography Illegal? at criminaldefenselawyer.com [6] by Rebecca Pirius is a good overview of the current illegality/legality situation in the USA, federally and state-wise. Published by w:Nolo (publisher), updated February 2024
- Deepfake Pornography: A Legal and Ethical Menace at tclf.in[7] October 2023 compilation and reporting by Janvhi Rastogi, published in the The Contemporary Law Forum.
Australia
The Online Safety Act 2021 at legislation.gov.au[1st seen in 1] regulates the non-consensual sharing of, and threatening to share, sexual images.
If the synthetic human-like fake images depict illegal and restricted online content, then the Online Content Scheme, as defined in the Online Safety Act 2021, may apply.[8]
Office of the eSafety Commissioner at esafety.gov.au is Australia's independent regulator for online safety.
Links
- Learn about the Online Safety Act at esafety.gov.au
- Key elements of the Online Safety Act 2021 at infrastructure.gov.au
- Online Safety Bill 2021 at aph.gov.au
- Revenge porn laws in Australia at foundationra.com
Canada
Existing Canadian law bans the non-consensual distribution of intimate images.[9]
Active bills in Canada
Digital Charter Implementation Act - House of Commons of Canada bill C-27
Digital Charter Implementation Act at parl.ca or An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts
Past bills in Canada
Stopping Internet Sexual Exploitation Act - House of Commons of Canada bill C-270
- House of Commons of Canada bill C-270 'Stopping Internet Sexual Exploitation Act' at parl.ca, an act to amend the Criminal Code regarding pornography was first read to the Commons by w:Arnold Viersen on Thursday 2022-04-28 at 10:15.
According to townandcountrytoday.com, the author of the bill introduced an identical bill C-302 on Thursday 2021-05-27, but it was killed off by the oncoming federal elections.
Stopping Internet Sexual Exploitation Act - An Act to amend the Criminal Code (pornographic material) is a private member's bill by Arnold Viersen (Official Site at arnoldviersen.ca)
Summary of the C-270 from parl.ca
"This enactment amends the Criminal Code to prohibit a person from making, distributing or advertising pornographic material for commercial purposes without having first ascertained that, at the time the material was made, each person whose image is depicted in the material was 18 years of age or older and gave their express consent to their image being depicted."
Sommaire en français / Summary in French
Le texte modifie le Code criminel afin d’interdire à toute personne de produire ou de distribuer du matériel pornographique à des fins commerciales, ou d’en faire la publicité, sans s’être au préalable assurée qu’au moment de la production du matériel, chaque personne dont l’image y est représentée était âgée de dix-huit ans ou plus et avait donné son consentement exprès à ce que son image y soit représentée.
Links
- C-270 - Stopping Internet Sexual Exploitation Act in English et en français at publications.gc.ca
- Parliament of Canada LEGISinfo: C-270 - Stopping Internet Sexual Exploitation Act at parl.ca in English
- C-270 - Stopping Internet Sexual Exploitation Act at openparliament.ca includes motivation of Mr. Arnold Viersen and the co-sponsor of the bill Mr. Garnett Genuis.
Reporting
China
This information should be updated.
Law against synthesis crimes in China
Mandatory labeling of fake media in China since 2020
On Wednesday 2020-01-01 a Chinese law came into effect requiring that synthetically faked footage bear a clear notice of its fakeness. Failure to comply could be considered a w:crime, the w:Cyberspace Administration of China (cac.gov.cn) stated on its website. China announced this new law in November 2019.[12] The Chinese government seems to be reserving the right to prosecute both users and w:online video platforms that fail to abide by the rules.[13]
Deep Synthesis Provisions 2023
On Tuesday 2023-01-10 the Deep Synthesis Provisions came into effect. It was originally drafted in 2022 by the w:Cyberspace Administration of China as Provisions on the Administration of Deep Synthesis Internet Information Services (Draft for solicitation of comments) at chinalawtranslate.com or view the Chinese language draft 国家互联网信息办公室关于《互联网信息服务深度合成管理规定(征求意见稿)》公开征求意见的通知 at cac.gov.cn[1st seen in 2].
Reporting
- China’s New Legislation on Deepfakes: Should the Rest of Asia Follow Suit? at thediplomat.com March 2023 reporting
EU
Laws in the EU
Artificial Intelligence Act
The European Union has a law on AI called the w:Artificial Intelligence Act. The European Commission proposed the AI Act in 2021. On Wednesday 2024-03-13 the MEPs adopted this law.[14] The AI Act will have a key role in the effective implementation of the upcoming EU Directive on combating violence against women and domestic violence and the Regulation to Prevent and Combat Child Sexual Abuse, which intend to protect us against synthetic pornography.
- The European Artificial Intelligence Act was approved by the member countries and was on track for final approval by April 2024.[15] Read Proposal for a REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS at eur-lex.europa.eu[1st seen in 3] (also contains translations)
- Artificial Intelligence Act: MEPs adopt landmark law at europarl.europa.eu, a press announcement on the adoption of the Artificial Intelligence Act on Wednesday 2024-03-13.
Studies and information
- Tackling deepfakes in European policy at europarl.europa.eu, a 2021 study by the Panel for the Future of Science and Technology and published by the w:European Parliamentary Research Service. View .pdf at europarl.europa.eu
- The EU Artificial Intelligence Act at artificialintelligenceact.eu is a website on the new EU law by the Future of Life Institute, an American non-profit NGO, promising up-to-date developments and analyses of the EU AI Act.
Reporting
- The AI Act vs. deepfakes: A step forward, but is it enough? at euractiv.com, 2024-02-26 opinion piece by Cristina Vanberghen
Digital Services Act
The w:Digital Services Act (DSA) (see The Digital Services Act package at digital-strategy.ec.europa.eu) came into force in November 2022.[16]
The Artificial Intelligence Act and Digital Services Act together will help in the enforcement of the upcoming protections to shield us from synthesis crimes.
Bills in the EU
Directive on combating violence against women and domestic violence
The Directive on combating violence against women and domestic violence at commission.europa.eu will require, among other things, that member states criminalize non-consensual synthetic digital look-alike pornography in their criminal codes.
Official
- Proposal for a DIRECTIVE OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL on combating violence against women and domestic violence at eur-lex.europa.eu has the directive text in all languages and various formats
- Legislative proposal on combating violence against women and domestic violence Legislative Train Schedule at europarl.europa.eu lists this EU bill as close to adoption. (As of March 2024)
- Commission welcomes political agreement on new rules to combat violence against women and domestic violence at ec.europa.eu, a 2024-02-06 press release by the w:European Commission.
- International Women's Day 2022: Commission proposes EU-wide rules to combat violence against women and domestic violence at ec.europa.eu, a 2024-03-08 press release by the EC.
Unofficial
- Deepfakes and dick pics: EU protects women from digital violence at hateaid.org, a 2024-02-07 press release by HateAid.org
- Disrupting the Deepfake Pipeline in Europe at futureoflife.org, 2024-02-22 article on the approach of Leveraging corporate criminal liability under the Violence Against Women Directive to safeguard against pornographic deepfake exploitation by Alexandra Tsalidis
Regulation to Prevent and Combat Child Sexual Abuse
- New legislation to fight child sexual abuse online Legislative Train Schedule at europarl.europa.eu lists this EU bill as tabled. (As of March 2024)
- Prevention of online child sexual abuse at consilium.europa.eu
- Timeline - Prevention of online child sexual abuse at consilium.europa.eu
“The Regulation to Prevent and Combat Child Sexual Abuse (Child Sexual Abuse Regulation, or CSAR) is a w:European Union regulation proposed by the w:European Commissioner for Home Affairs w:Ylva Johansson on 11 May 2022. The stated aim of the legislation is to prevent child sexual abuse online through the implementation of a number of measures, including the establishment of a framework that would make the detection and reporting of child sexual abuse material (CSAM) by digital platforms a legal requirement within the European Union.”
Reporting
- New EU rules will criminalise 'paedophilia handbooks' and deepfakes of child abuse at euronews.com, 2024-02-06 reporting
Code of practice on Disinformation 2022
The 2022 Code of Practice on Disinformation at digital-strategy.ec.europa.eu - Major online platforms, emerging and specialised platforms, players in the advertising industry, fact-checkers, research and civil society organisations delivered a strengthened Code of Practice on Disinformation following the Commission’s Guidance of May 2021.[17]
Finland
Laws in Finland
Suomeksi / in Finnish: Suomen seksuaalirikoslaki 2023
Law on sexual offences in Finland 2023
Law on sexual offences in Finland 2023 is found in Chapter 20 of the Finnish Criminal Code titled "Seksuaalirikoksista" ("Sexual offences") and came into effect on Sunday 2023-01-01.[18]
The new law in Finland protects adults against sexual image-based abuse, be it real or synthetic in origin.
Other countries have also woken up to the problems of synthesis crime and have enacted laws against synthesis and other related crimes.
Relevant sections of Chapter 20
- 7 § Non-consensual dissemination of a sexual image criminalizes the distribution of real and synthetic sexual images without permission. (7 § Seksuaalisen kuvan luvaton levittäminen[18])
- 19 § Distribution of an image depicting a child in a sexual manner [18] criminalizes the distribution of real and synthetic child sexual abuse material (CSAM). Attempting this crime is also punishable. (19 § Lasta seksuaalisesti esittävän kuvan levittäminen[18])
- 20 § Aggravated distribution of an image depicting a child in a sexual manner [18] defines the parameters for aggravated form of the crime of making CSAM available. (20 § Törkeä lasta seksuaalisesti esittävän kuvan levittäminen[18])
- 21 § Possession of an image depicting a child in a sexual manner[18] criminalizes the possession of CSAM and acquiring access with the intent to access CSAM. (21 § Lasta seksuaalisesti esittävän kuvan hallussapito[18])
This 2023 upgrade and gathering-together of the Finnish Criminal Code on sexual offences was made upon the initiative of the 2019-2023 w:Marin Cabinet, was voted into law by the w:Members of the Parliament of Finland, 2019–2023 and came into effect on Sunday 2023-01-01.
Translation to English by the Ministry of Justice: Criminal Code (39/1889) - Chapter 20 - Sexual offences (translation) as .pdf at oikeusministerio.fi (subject to possible revisions)
Finland criminalized synthetic CSAM in 2011
Distribution, attempted distribution and possession of synthetic CSAM were criminalized earlier, on 2011-06-01, upon the initiative of the w:Vanhanen II Cabinet. Real CSAM was already criminalized before this improvement. These protections against real and synthetic CSAM were moved within the criminal code to 19 §, 20 § and 21 § of Chapter 20 in the 2023 sexual offences legislation improvement.
Germany
- Deepfakes and German law at mj-cohen.com, a January 2020 summary by Maureen Cohen, states that the existing German laws are well equipped to render non-consensual digital look-alikes illegal.
India
Laws in India
Laws in India can be accessed through the India Code at indiacode.nic.in
w:Information Technology Act, 2000
- Section 66. - Computer related offences.[19]
- Section 66A. - Omitted. Would have been Punishment for sending offensive messages through communication service, etc.
- Section 66B. - Punishment for dishonestly receiving stolen computer resource or communication device.
- Section 66C. - Punishment for identity theft.[20]
- Section 66D. - Punishment for cheating by personation by using computer resource.[21]
- Section 66E. - Punishment for violation of privacy.[22]
- Section 66F. - Punishment for cyber terrorism.
- Section 67. - Punishment for publishing or transmitting obscene material in electronic form.[23]
- Section 67A. - Punishment for publishing or transmitting of material containing sexually explicit act, etc., in electronic form.[24]
- Section 67B. - Punishment for publishing or transmitting of material depicting children in sexually explicit act, etc., in electronic form.[25]
- Section 67C. - Preservation and retention of information by intermediaries.
w:Information Technology Rules, 2021
Past bills in India
Legal compilations on the legal situation against synthetic filth in India
- Deepfake Pornography: A Legal and Ethical Menace at tclf.in[7], an October 2023 compilation and report by Janvhi Rastogi, published in The Contemporary Law Forum.
- Deepfakes And Breach Of Personal Data – A Bigger Picture at livelaw.in[26] 2023-11-24 compilation on laws against synthetic fakes in India, by Vikrant Rana, Anuradha Gandhi and Rachita Thakur
- What Is Deep Fake Cyber Crime? What Does Indian Law Say About It? at cybercert.in[27]
- Cyberlaw in India at cybercert.in provides a wider look into the Indian Cyber Laws
New Zealand
- 'Harmful Digital Communications Act 2015' at legislation.govt.nz[1st seen in 1] criminalises non-consensual intimate visual recordings and image-based sexual abuse.
- w:Netsafe netsafe.org.nz is an online safety non-profit organisation in New Zealand. It provides educational, anti-bullying and support services. The organisation is contracted under the w:Harmful Digital Communications Act until 2026. (Wikipedia)
- CERT.govt.nz, the w:Computer emergency response team of New Zealand - "Responding to cyber security threats in New Zealand - CERT NZ is your first port of call when you need to report a cyber security problem."
- The w:Department of Internal Affairs investigates the possession of, and trading in, child exploitation material.[28]
Links regarding the Harmful Digital Communications Act 2015
- 'Harmful Digital Communications (HDC)' at police.govt.nz by the w:New Zealand Police
- 'Harmful digital communications' at justice.govt.nz by the w:Ministry of Justice (New Zealand)
- What is the HDCA? at netsafe.org.nz by w:Netsafe, the agency approved by the w:New Zealand Police to process complaints about harmful digital communications.
- See Wikipedia article on w:Harmful Digital Communications Act 2015 for more info
Singapore
Law in Singapore
Protection from Online Falsehoods and Manipulation Act 2019
w:Protection from Online Falsehoods and Manipulation Act 2019[1st seen in 4] is a w:statute of the w:Parliament of Singapore that enables authorities to tackle the spread of w:fake news or w:false information. (Wikipedia)
South Africa
Cybercrimes Act 19 of 2020
South Africa's Cybercrimes Act 19 of 2020 (English / Afrikaans) at gov.za[1st seen in 5] came only partially into effect on Wednesday 2021-12-01.[29]
- Cybercrimes Act, Chapter 2 Cybercrimes, Section 16 - Disclosure of data message of intimate image at cybercrimesact.co.za Subsection 1 states
Any person (‘‘A’’) who unlawfully and intentionally discloses, by means of an electronic communications service, a data message of an intimate image of a person (‘‘B’’), without the consent of B, is guilty of an offence.
Any person who unlawfully and intentionally
- attempts;
- conspires with any other person; or
- aids, abets, induces, incites, instigates, instructs, commands or procures another person, to commit an offence in terms of Part I or Part II of this Chapter, is guilty of an offence and is liable on conviction to the punishment to which a person convicted of actually committing that offence would be liable.
Links
- Cybercrimes Act at cybercrimesact.co.za by Accessible Law contains the law in website format
South Korea
Law in South Korea
ACT ON SPECIAL CASES CONCERNING THE PUNISHMENT OF SEXUAL CRIMES
ACT ON SPECIAL CASES CONCERNING THE PUNISHMENT OF SEXUAL CRIMES at elaw.klri.re.kr is a law in South Korea.[1st seen in 6]
UK
Law against synthesis crimes in the UK
On Tuesday 2024-04-16 the UK announced that creating sexually explicit deepfake images is to be made an offence in the UK.[30] This is a very good move, and the next logical step would be to prohibit the non-consensual possession of other people's appearance models and voice models, as suggested in Law proposals against synthetic filth by Juho Kunsola. There is no logical reason to allow criminal leagues to legally possess and trade their libraries of models while acquiring models through covert modeling, and trading and possessing these raw materials (covert models) for producing disinformation weapons, remain legal.
Online Safety Act 2023
The w:Online Safety Act 2023 (Online Safety Act 2023 at legislation.gov.uk) received Royal Assent on 2023-10-26[31] and reportedly criminalizes non-consensual synthetic pornography.
- Part 4 - CSEA reporting of the Act has not yet come into effect as of February 2024.
- Section 66 requires providers of services regulated under the Act to have systems and processes in place (so far as possible) to ensure that they report all detected and unreported CSEA content present on the service to the NCA (National Crime Agency)
- Offence in relation to CSEA reporting in Section 69
- Part 7 – Enforcement offences
- Part 10 - Communication offences establishes
- False communications offence in Section 179
- Threatening communications offence in Section 181
- Offences of sending or showing flashing images electronically in Section 183
- Offence of encouraging or assisting serious self-harm in Section 184
- Sending etc photograph or film of genitals in Section 187
- Sharing or threatening to share intimate photograph or film in Section 188
The Online Safety Act 2023 came to be from a House of Lords bill, UK's HL Bill 151 - Online Safety Bill at bills.parliament.uk. The bill originated in the House of Commons sessions 2021-22 and 2022-23.
- Online Safety Act: new criminal offences circular at gov.uk, published 2024-01-31 - a circular issued to inform the police and other relevant public authorities of certain provisions of the Online Safety Act, in particular the new criminal offences.
- Creating a safer life online for people in the UK at ofcom.org.uk
- A guide to the Online Safety Bill at gov.uk
- Documents, publications and announcements relating to the government's Online Safety Bill at gov.uk
Reporting and summaries of the Online Safety Act
- The Online Safety Act 2023: a primer at infolaw.co.uk, a 2023 primer on the Online Safety Act by Alex Heshmaty on 2023-11-29
- UK: The Online Safety Act is now law; Ofcom’s powers as online safety regulator have officially commenced at epra.org states that there will be three successive implementation phases.
The Domestic Abuse Act 2021 Chapter 17, part 6 - Disclosure of private sexual photographs and films
w:Domestic Abuse Act 2021 / Chapter 17 / Part 6 - Offences involving abusive or violent behaviour / Disclosure of private sexual photographs and films - Threats to disclose private sexual photographs and films with intent to cause distress
According to the article What to do if someone is threatening to share your intimate images by the UK-based Revenge Porn Helpline at revengepornhelpline.org.uk, threatening to share intimate images with the intent to cause distress is now an offence in UK law. This is included in the w:Domestic Abuse Act 2021, which was enacted into UK law on 29 June 2021.[32]
It is not yet known whether the loophole has been fixed whereby, if the pictures are not actual pictures of you but synthetic human-like fakes, the police cannot do anything.
History of the UK law against synthesis crimes
The UK law was not very up-to-date on the issue of synthetic filth until recent improvements.
The independent w:Law Commission (England and Wales) reviewed the law where it applies to taking, making and sharing intimate images without consent. The outcome of the consultation was due to be published later in 2021.[33]
"In 2019, law expert Dr Aislinn O’Connell told w:The Independent that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 the w:Women and Equalities Committee called on the UK Government to introduce new legislation on image-based sexual abuse in order to criminalise ALL non-consensual creation and distribution of intimate sexual images."[34] This call is for laws similar to those California put in place on January 1, 2020.
USA
- Is Deepfake Pornography Illegal? at criminaldefenselawyer.com[6] by Rebecca Pirius is a good overview of the current legal situation in the USA, both federally and state by state. Published by w:Nolo (publisher), updated February 2024.
- Deepfake AI laws for USA at foundationra.com, Sextortion laws for USA at foundationra.com and Revenge porn laws for USA at foundationra.com compilations by Foundation RA
See also:
- Current bills in the USA and past bills in the USA
- The Legislative Process: Introduction and Referral of Bills (Video) at congress.gov
- Glossary of Legislative Terms at congress.gov
Law against synthesis crimes in Virginia 2019
Code of Virginia § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty
Since July 1, 2019[35] w:Virginia has criminalized the sale and dissemination of unauthorized synthetic pornography, but not its manufacture,[36] as section § 18.2-386.2, titled 'Unlawful dissemination or sale of images of another; penalty', became part of the w:Code of Virginia.
Code of Virginia (TOC) » Title 18.2. Crimes and Offenses Generally » Chapter 8. Crimes Involving Morals and Decency » Article 5. Obscenity and Related Offenses » Section § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty
Section § 18.2-386.2 'Unlawful dissemination or sale of images of another; penalty' of the Code of Virginia is as follows:
A. Any w:person who, with the w:intent to w:coerce, w:harass, or w:intimidate, w:maliciously w:disseminates or w:sells any videographic or still image created by any means whatsoever that w:depicts another person who is totally w:nude, or in a state of undress so as to expose the w:genitals, pubic area, w:buttocks, or female w:breast, where such person knows or has reason to know that he is not w:licensed or w:authorized to disseminate or sell such w:videographic or w:still image is w:guilty of a Class 1 w:misdemeanor.
- For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's w:face, w:likeness, or other distinguishing characteristic.
B. If a person uses w:services of an w:Internet service provider, an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.
C. Venue for a prosecution under this section may lie in the w:jurisdiction where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.
D. The provisions of this section shall not preclude prosecution under any other w:statute.[36]
The bills were identical: House Bill 2678 was presented by w:Delegate w:Marcus Simon to the w:Virginia House of Delegates on January 14, 2019, and three days later the identical Senate bill 1736 was introduced to the w:Senate of Virginia by Senator w:Adam Ebbin.
Law against synthesis crimes in Texas 2019
Texas SB 751 - Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election
On September 1, 2019 the w:Texas Senate bill SB 751 - Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election brought w:amendments to the election code into effect in the w:Law of Texas, giving w:candidates in w:elections a 30-day protection period before the election during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality".[37] SB 751 was introduced to the Senate by w:Bryan Hughes (politician).[38]
The text of S.B. No. 751 is as follows
AN ACT relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election.
BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF TEXAS:
SECTION 1. Section 255.004, Election Code, is w:amended by adding Subsections (d) and (e) to read as follows:
- (d) A person commits an offense if the person, with intent to injure a candidate or influence the result of an election:
- (1) creates a deep fake video; and
- (2) causes the deep fake video to be published or distributed within 30 days of an election.
- (e) In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality.
SECTION 2. This Act takes effect September 1, 2019.
Law against synthesis crimes in California 2020
California AB-602 - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action
On January 1, 2020[39] the w:California w:US state law "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action." came into effect in the civil code of the w:California Codes, banning the manufacture and w:digital distribution of synthetic pornography without the w:consent of the people depicted. AB-602 provides victims of synthetic pornography with w:injunctive relief and poses legal threats of w:statutory and w:punitive damages on w:criminals making or distributing synthetic pornography without consent. The bill AB-602 was signed into law by California w:Governor w:Gavin Newsom on October 3, 2019. It was authored by w:California State Assemblymember w:Marc Berman, and an identical Senate bill was coauthored by w:California Senator w:Connie Leyva.[40][41] AB602 at trackbill.com
Introduction by Assemblymember Marc Berman:
AB 602, Berman. Depiction of individual using digital or electronic technology: sexually explicit material: cause of action.
Existing law creates a private w:right of action against a person who intentionally distributes a photograph or recorded image of another that exposes the intimate body parts of that person or of a person engaged in a sexual act without the person’s consent if specified conditions are met.
This bill would provide that a depicted individual, as defined, has a w:cause of action against a person who either
- (1) creates and intentionally discloses sexually explicit material if the person knows or reasonably should have known the depicted individual did not w:consent to its creation or disclosure or
- (2) intentionally discloses sexually explicit material that the person did not create if the person knows the depicted individual did not consent to its creation.
The bill would specify exceptions to those provisions, including if the material is a matter of legitimate public concern or a work of political or newsworthy value.
The bill would authorize a prevailing w:plaintiff who suffers harm to seek w:injunctive relief and recover reasonable w:attorney’s fees and costs as well as specified monetary w:damages, including statutory and w:punitive damages.
The law is as follows:
SECTION 1. Section 1708.86 is added to the Civil Code of California, to read:
1708.86. (a) For purposes of this section:
- (1) “Altered depiction” means a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section.
- (2) “Authorized Representative” means an attorney, talent agent, or personal manager authorized to represent a depicted individual if the depicted individual is represented.
- (3) (A) “Consent” means an agreement written in plain language signed knowingly and voluntarily by the depicted individual that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated.
- (3) (B) A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made, unless one of the following requirements is satisfied:
- (i) The depicted individual is given at least 72 hours to review the terms of the agreement before signing it.
- (ii) The depicted individual’s authorized representative provides written approval of the signed agreement.
- (4) “Depicted individual” means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction.
- (5) “Despicable conduct” means conduct that is so vile, base, or contemptible that it would be looked down on and despised by a reasonable person.
- (6) “Digitization” means to realistically depict any of the following:
- (A) The nude body parts of another human being as the nude body parts of the depicted individual.
- (B) Computer-generated nude body parts as the nude body parts of the depicted individual.
- (C) The depicted individual engaging in sexual conduct in which the depicted individual did not engage.
- (7) “Disclose” means to publish, make available, or distribute to the public.
- (8) “Individual” means a natural person.
- (9) “Malice” means that the defendant acted with intent to cause harm to the plaintiff or despicable conduct that was done with a willful and knowing disregard of the rights of the plaintiff. A person acts with knowing disregard within the meaning of this paragraph when they are aware of the probable harmful consequences of their conduct and deliberately fail to avoid those consequences.
- (10) “Nude” means visible genitals, pubic area, anus, or a female’s postpubescent nipple or areola.
- (11) “Person” means a human being or legal entity.
- (12) “Plaintiff” includes cross-plaintiff.
- (13) “Sexual conduct” means any of the following:
- (A) Masturbation.
- (B) Sexual intercourse, including genital, oral, or anal, whether between persons regardless of sex or gender or between humans and animals.
- (C) Sexual penetration of the vagina or rectum by, or with, an object.
- (D) The transfer of semen by means of sexual conduct from the penis directly onto the depicted individual as a result of ejaculation.
- (E) Sadomasochistic abuse involving the depicted individual.
- (14) “Sexually explicit material” means any portion of an audiovisual work that shows the depicted individual performing in the nude or appearing to engage in, or being subjected to, sexual conduct.
(b) A depicted individual has a cause of action against a person who does either of the following:
- (1) Creates and intentionally discloses sexually explicit material and the person knows or reasonably should have known the depicted individual in that material did not consent to its creation or disclosure.
- (2) Intentionally discloses sexually explicit material that the person did not create and the person knows the depicted individual in that material did not consent to the creation of the sexually explicit material.
(c) (1) A person is not liable under this section in either of the following circumstances:
- (A) The person discloses the sexually explicit material in the course of any of the following:
- (i) Reporting unlawful activity.
- (ii) Exercising the person’s law enforcement duties.
- (iii) Hearings, trials, or other legal proceedings.
- (B) The material is any of the following:
- (i) A matter of legitimate public concern.
- (ii) A work of political or newsworthy value or similar work.
- (iii) Commentary, criticism, or disclosure that is otherwise protected by the California Constitution or the United States Constitution.
- (2) For purposes of this subdivision, sexually explicit material is not of newsworthy value solely because the depicted individual is a public figure.
(d) It shall not be a defense to an action under this section that there is a disclaimer included in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.
(e) (1) A prevailing plaintiff who suffers harm as a result of the violation of subdivision (b) may recover any of the following:
- (A) An amount equal to the monetary gain made by the defendant from the creation, development, or disclosure of the sexually explicit material.
- (B) One of the following:
- (i) Economic and noneconomic damages proximately caused by the disclosure of the sexually explicit material, including damages for emotional distress.
- (ii) Upon request of the plaintiff at any time before the final judgment is rendered, the plaintiff may instead recover an award of statutory damages for all unauthorized acts involved in the action, with respect to any one work, as follows:
- (I) A sum of not less than one thousand five hundred dollars ($1,500) but not more than thirty thousand dollars ($30,000).
- (II) If the unlawful act was committed with malice, the award of statutory damages may be increased to a maximum of one hundred fifty thousand dollars ($150,000).
- (C) Punitive damages.
- (D) Reasonable attorney’s fees and costs.
- (E) Any other available relief, including injunctive relief.
(2) The remedies provided by this section are cumulative and shall not be construed as restricting a remedy that is available under any other law.
(f) An action under this section shall be commenced no later than three years from the date the unauthorized creation, development, or disclosure was discovered or should have been discovered with the exercise of reasonable diligence.
(g) The provisions of this section are severable. If any provision of this section or its application is held invalid, that invalidity shall not affect other provisions.
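As a reading aid only (not legal advice), the statutory-damages bounds in subdivision (e)(1)(B)(ii) above can be expressed as a small sketch. The function name and the (minimum, maximum) tuple representation are assumptions of this illustration, not anything defined in AB-602 itself:

```python
# Illustrative sketch of Civil Code section 1708.86(e)(1)(B)(ii) as quoted above.
# Not legal advice; names here are assumptions of this illustration.
def statutory_damages_range(malice: bool) -> tuple[int, int]:
    """Return the (minimum, maximum) statutory damages per work."""
    floor = 1_500  # "not less than one thousand five hundred dollars ($1,500)"
    # With malice the cap may be increased from $30,000 to $150,000.
    ceiling = 150_000 if malice else 30_000
    return floor, ceiling
```

Note that the malice finding only raises the ceiling; the $1,500 floor is the same in both cases.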
Law against synthesis crimes in Georgia in 2021
Georgia Code § 16-11-90
Georgia Code / Title 16 "Crimes and Offenses" / Chapter 11 "Offenses Against Public Order and Safety" / Article 3 "Invasions of Privacy" / Part 3 "Invasion of privacy" / GA CODE § 16-11-90[1st seen in 7]
The law, as of April 14, 2021,[43] is as follows:
(a) As used in this Code section, the term:
- (1) “Harassment” means engaging in conduct directed at a depicted person that is intended to cause substantial emotional harm to the depicted person.
- (2) “Nudity” means:
- (A) The showing of the human male or female genitals, pubic area, or buttocks without any covering or with less than a full opaque covering;
- (B) The showing of the female breasts without any covering or with less than a full opaque covering; or
- (C) The depiction of covered male genitals in a discernibly turgid state.
- (3) “Sexually explicit conduct” shall have the same meaning as set forth in Code Section 16-12-100.
(b) A person violates this Code section if he or she, knowing the content of a transmission or post, knowingly and without the consent of the depicted person:
- (1) Electronically transmits or posts, in one or more transmissions or posts, a photograph or video which depicts nudity or sexually explicit conduct of an adult, including a falsely created videographic or still image, when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person; or
- (2) Causes the electronic transmission or posting, in one or more transmissions or posts, of a photograph or video which depicts nudity or sexually explicit conduct of an adult, including a falsely created videographic or still image, when the transmission or post is harassment or causes financial loss to the depicted person and serves no legitimate purpose to the depicted person.
- Nothing in this Code section shall be construed to impose liability on an interactive computer service, as such term is defined in 47 U.S.C. 230(f)(2), or an information service or telecommunications service, as such terms are defined in 47 U.S.C. 153, for content provided by another person.
(c) Any person who violates this Code section shall be guilty of a misdemeanor of a high and aggravated nature; provided, however, that upon a second or subsequent violation of this Code section, he or she shall be guilty of a felony and, upon conviction thereof, shall be punished by imprisonment of not less than one nor more than five years, a fine of not more than $100,000.00, or both.
(d) A person shall be subject to prosecution in this state pursuant to Code Section 17-2-1 for any conduct made unlawful by this Code section which the person engages in while:
- (1) Either within or outside of this state if, by such conduct, the person commits a violation of this Code section which involves an individual who resides in this state; or
- (2) Within this state if, by such conduct, the person commits a violation of this Code section which involves an individual who resides within or outside this state.
(e) The provisions of subsection (b) of this Code section shall not apply to:
- (1) The activities of law enforcement and prosecution agencies in the investigation and prosecution of criminal offenses;
- (2) Legitimate medical, scientific, or educational activities;
- (3) Any person who transmits or posts a photograph or video depicting only himself or herself engaged in nudity or sexually explicit conduct;
- (4) The transmission or posting of a photograph or video that was originally made for commercial purposes;
- (5) Any person who transmits or posts a photograph or video depicting a person voluntarily engaged in nudity or sexually explicit conduct in a public setting; or
- (6) A transmission that is made pursuant to or in anticipation of a civil action.
(f) There shall be a rebuttable presumption that an information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet, for content provided by another person, does not know the content of an electronic transmission or post.
(g) Any violation of this Code section shall constitute a separate offense and shall not merge with any other crimes set forth in this title.
Law against synthesis crimes in New York State in 2021
New York State - CHAPTER 6 Civil Rights ARTICLE 5 Right of Privacy SECTION 52-C - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual
Consolidated Laws of New York / CHAPTER 6 Civil Rights ARTICLE 5 Right of Privacy / SECTION 52-C - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual[1st seen in 7]
The law is as follows:
§ 52-c - Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual. (Note: the Consolidated Laws contain two sections numbered § 52-c.)
1. For the purposes of this section:
- a. "depicted individual" means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section.
- b. "digitization" means to realistically depict the nude body parts of another human being as the nude body parts of the depicted individual, computer-generated nude body parts as the nude body parts of the depicted individual or the depicted individual engaging in sexual conduct, as defined in subdivision ten of section 130.00 of the penal law, in which the depicted individual did not engage.
- c. "individual" means a natural person.
- d. "person" means a human being or legal entity.
- e. "sexually explicit material" means any portion of an audio visual work that shows the depicted individual performing in the nude, meaning with an unclothed or exposed intimate part, as defined in section 245.15 of the penal law, or appearing to engage in, or being subjected to, sexual conduct, as defined in subdivision ten of section 130.00 of the penal law.
2. a. A depicted individual shall have a cause of action against a person who discloses, disseminates or publishes sexually explicit material related to the depicted individual, and the person knows or reasonably should have known the depicted individual in that material did not consent to its creation, disclosure, dissemination, or publication.
- b. It shall not be a defense to an action under this section that there is a disclaimer in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.
3. a. A depicted individual may only consent to the creation, disclosure, dissemination, or publication of sexually explicit material by knowingly and voluntarily signing an agreement written in plain language that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated.
- b. A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made, unless one of the following requirements is satisfied:
- i. the depicted individual is given at least three business days to review the terms of the agreement before signing it; or
- ii. if the depicted individual is represented, the attorney, talent agent, or personal manager authorized to represent the depicted individual provides additional written approval of the signed agreement.
4. a. A person is not liable under this section if:
- i. the person discloses, disseminates or publishes the sexually explicit material in the course of reporting unlawful activity, exercising the person's law enforcement duties, or hearings, trials or other legal proceedings; or
- ii. the sexually explicit material is a matter of legitimate public concern, a work of political or newsworthy value or similar work, or commentary, criticism or disclosure that is otherwise protected by the constitution of this state or the United States; provided that sexually explicit material shall not be considered of newsworthy value solely because the depicted individual is a public figure.
5. In any action commenced pursuant to this section, the finder of fact, in its discretion, may award injunctive relief, punitive damages, compensatory damages, and reasonable court costs and attorney's fees.
6. A cause of action or special proceeding under this section shall be commenced the later of either:
- a. three years after the dissemination or publication of sexually explicit material; or
- b. one year from the date a person discovers, or reasonably should have discovered, the dissemination or publication of such sexually explicit material.
7. Nothing in this section shall be read to require a prior criminal complaint, prosecution or conviction to establish the elements of the cause of action provided for in this section.
8. The provisions of this section including the remedies are in addition to, and shall not supersede, any other rights or remedies available in law or equity.
9. If any provision of this section or its application to any person or circumstance is held invalid, the invalidity shall not affect other provisions or applications of this section which can be given effect without the invalid provision or application, and to this end the provisions of this section are severable.
10. Nothing in this section shall be construed to limit, or to enlarge, the protections that 47 U.S.C. § 230 confers on an interactive computer service for content provided by another information content provider, as such terms are defined in 47 U.S.C. § 230.
Current bills in the USA[edit | edit source]
US Senate bill S. 3696 - Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 (DEFIANCE Act of 2024) (118th Congress - 2023-2024)[edit | edit source]
S.3696 - DEFIANCE Act of 2024 at congress.gov, a bipartisan Senate bill against synthetic filth.
- Introductory Statement on S. 3696; Congressional Record Vol. 170, No. 17 at congress.gov 2024-01-30
- Durbin, Graham, Klobuchar, Hawley Introduce DEFIANCE Act to Hold Accountable Those Responsible for the Proliferation of Nonconsensual, Sexually-Explicit “Deepfake” Images and Videos at judiciary.senate.gov[1st seen in 8]
- S. 3696 (IS) - Disrupt Explicit Forged Images And Non-Consensual Edits Act of 2024 at govinfo.gov
- DEFIANCE Act of 2024 at durbin.senate.gov explains that it would create a federal civil remedy for victims who are identifiable in a “digital forgery”.
Reporting and commentary on DEFIANCE Act of 2024 bill
- Taylor Swift AI images prompt US bill to tackle nonconsensual, sexual deepfakes at theguardian.com
- Lawmakers propose anti-nonconsensual AI porn bill after Taylor Swift controversy at theverge.com
US House bill H.R.7123 - Quashing Unwanted and Interruptive Electronic Telecommunications Act (QUIET Act) (118th Congress - 2023-2024)[edit | edit source]
Quashing Unwanted and Interruptive Electronic Telecommunications Act at congress.gov[1st seen in 9]
US House bill H.R.6943 - No AI FRAUD Act (118th Congress - 2023-2024)[edit | edit source]
No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act of 2024 (No AI FRAUD Act) at congress.gov was introduced to the 118th Congress 2nd session on 2024-01-24.[1st seen in 10]
US House bill H.R.5586 - Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023 (118th Congress - 2023-2024)[edit | edit source]
“Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2023” or the “DEEPFAKES Accountability Act” at congress.gov is a reintroduction of the earlier House bill H.R.3230 to the 118th Congress (2023-2024 session).
US House bill H.R. 3106 - Preventing Deepfakes of Intimate Images Act (118th Congress - 2023-2024)[edit | edit source]
Preventing Deepfakes of Intimate Images Act at congress.gov was introduced in the House on 2023-05-05. It was a reintroduction of H.R. 9631 from the 117th Congress.
NY Senate bill S5583 in the 2023-2024 regular session[edit | edit source]
NY Senate bill S5583 in the 2023-2024 regular session at nysenate.gov would establish the crime of aggravated harassment by means of electronic or digital communication and provides for a private right of action for the unlawful dissemination or publication of deep fakes.
Past bills in the USA[edit | edit source]
US Senate bill S.4991 - Preventing Rampant Online Technological Exploitation and Criminal Trafficking Act of 2022 (PROTECT Act) - (117th Congress)[edit | edit source]
The bill known as Senate bill S.4991 - 'PROTECT Act' at congress.gov was read twice by Senator w:Mike Lee on Wednesday 2022-09-28.
New York Senate bill - Unlawful Electronic Transmission of Sexually Explicit Visual Material - in regular session 2021-2022[edit | edit source]
The bill 'Unlawful Electronic Transmission of Sexually Explicit Visual Material' is essentially a bill that aims to ban sending unsolicited nudes.
In the 2021-2022 w:New York State Senate regular sessions, on 2021-01-14 Senator w:James Skoufis (official website) sponsored and Senators w:Brian Benjamin (official website) and w:Todd Kaminsky (official website) of the New York State Senate co-sponsored New York Senate bill S1641 to add section § 250.70 UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL to the Article 250 of the penal law. On 2021-03-19 an identical New York Assembly bill A6517 - Establishes the crime of unlawful electronic transmission of sexually explicit visual material was introduced to the w:New York State Assembly by Assembly Member w:Aileen Gunther (official website).[1st seen in 11]
If this bill passes, it will be codified in the w:Consolidated Laws of New York. View the Consolidated Laws of New York at nysenate.gov.
- Title of bill: An act to amend the penal law, in relation to the creation of the criminal offense of unlawful electronic transmission of sexually explicit visual material
- Purpose: The purpose of this bill is to make it unlawful to send sexually explicit material through electronic means unless the material is sent at the request of, or with the express consent of the recipient.
- Summary of provisions: Adds a new section 250.70 to the penal law making it unlawful to knowingly transmit by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed unless the material is sent at the request of, or with the express consent of the recipient.
- Justification: Currently under New York State law, indecent exposure in person is a crime, but it is not unlawful to send sexually explicit photos to nonconsenting adult recipients through electronic transmission. With the growing modern age of online dating, many individuals are receiving sexually explicit visual content without their consent from strangers. No person should be forced to view sexually explicit material without their consent.
The bill offers a clear deterrent to those considering sending unsolicited sexual pics and similar inappropriate conduct, and protects the unwilling recipients who currently have no legal recourse for such abuses.
What is illegal in the real world must be illegal in the digital world, and this legislation is a first step in the right direction in adding that accountability.
- Legislative history:
- Senate - 2020 - S5949 Referred to Codes
- Assembly - 2020 - A7801 Referred to Codes
- Fiscal implications: Minimal
- Effective date: This act shall take effect on the first of November next succeeding the date on which it shall have become a law.
The text of the bill is, as of 2021-03-24, as follows:
- "Section 1. The penal law is amended by adding a new section 250.70 to read as follows:
- § 250.70 UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL.
- A person is guilty of unlawful electronic transmission of sexually explicit visual material if a person knowingly transmits by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed or depicts the covered genitals of a male person that are in a discernibly turgid state and such visual material is not sent at the request of or with the express consent of the recipient. For purposes of this section the term "intimate parts" means the naked genitals, pubic area, anus, or female postpubescent nipple of the person and the term "sexual conduct" shall have the same meaning as defined in section 130.00 (Sex offenses; definitions of terms) of this chapter. Unlawful electronic transmission of sexually explicit visual material is a class a misdemeanor.
- § 2. This act shall take effect on the first of November next succeeding the date on which it shall have become a law."
US Senate bill - Stop Internet Sexual Exploitation Act - 2019-2020 US Senate session (116th Congress)[edit | edit source]
The Stop Internet Sexual Exploitation Act (SISE) was a bill introduced to the 2019-2020 session of the US Senate.
US House bill - H.R.3230 - DEEP FAKES Accountability Act (116th Congress)[edit | edit source]
- H.R.3230 - DEEP FAKES Accountability Act at congress.gov,[1st seen in 4] also known as the Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2019, aimed to require producers of synthetic human-like fakes to generally comply with certain w:digital watermark and disclosure requirements.[45]
US Senate bill - Malicious Deep Fake Prohibition Act of 2018 (115th Congress)[edit | edit source]
- S.3805 - Malicious Deep Fake Prohibition Act of 2018 at congress.gov[1st seen in 4] aimed to criminalize the synthesis of human-likeness media with the intent of breaking federal, state, local or tribal law.
Law proposals[edit | edit source]
Law proposals against synthetic filth by Juho Kunsola[edit | edit source]
- Audience: Developed with suitability for national, supranational and UN treaty levels.
- Writing context:
- Written from context of inclusion to criminal codes.
- I'm a Finn, so this has been worded to fit into Chapter 24 of the Criminal Code of Finland (in Finnish at finlex.fi), titled "Offences against privacy, public peace and personal reputation"
- Access the English translations of the Finnish Criminal Code at finlex.fi or go straight to the latest .pdf from 2016. Chapter 24 starts on page 107.
- History: This version is an evolution of a Finnish language original written in 2016.
The existing law in Chapter 24 of the Finnish Criminal Code - "Offences against privacy, public peace and personal reputation" - seems to be ineffective against many synthetic human-like fake attacks, and it seems digital sound-alikes could even be used to frame victims for crimes.
The sections affected by or affecting the synthetic filth situation are in bold font:
- Section 1 - Invasion of domestic premises (879/2013)
- Section 1(a) - Harassing communications (879/2013)
- Section 2 - Aggravated invasion of domestic premises (531/2000)
- Section 3 - Invasion of public premises (585/2005)
- Section 4 - Aggravated invasion of public premises (531/2000)
- Section 5 - Eavesdropping (531/2000)
- Section 6 - Illicit observation (531/2000)
- Section 7 - Preparation of eavesdropping or illicit observation (531/2000)
- Section 8 - Dissemination of information violating personal privacy (879/2013)
- Section 8(a) - Aggravated dissemination of information violating personal privacy (879/2013)
- Section 9 - Defamation (879/2013)
- Section 10 - Aggravated defamation (879/2013)
- Section 11 - Definition (531/2000)
- Section 12 - Right to bring charges (879/2013)
- Section 13 - Corporate criminal liability (511/2011)
Law proposal to ban visual synthetic filth[edit | edit source]
§1 Models of human appearance[edit | edit source]
A model of human appearance means
- A realistic 3D model
- A 7D bidirectional reflectance distribution function model
- A direct-to-2D capable w:machine learning model
- Or a model made with any technology whatsoever that looks deceptively like the target person.
§2 Producing synthetic pornography[edit | edit source]
Making projections, still or videographic, where targets are portrayed in a nude or in a sexual situation from models of human appearance defined in §1 without express consent of the targets is illegal.
§3 Distributing synthetic pornography[edit | edit source]
Distributing, making available, public display, purchase, sale, yielding, import and export of non-authorized synthetic pornography defined in §2 are punishable.[footnote 1]
§4 Aggravated producing and distributing synthetic pornography[edit | edit source]
If the media described in §2 or §3 is made or distributed with the intent to frame for a crime or for blackmail, the crime should be judged as aggravated.
Afterwords[edit | edit source]
The original idea I had was to ban both the raw materials i.e. the models to make the visual synthetic filth and also the end product weaponized synthetic pornography, but then in July 2019 it appeared to me that Adequate Porn Watcher AI (concept) could really help in this age of industrial disinformation if it were built, trained and operational. Banning modeling of human appearance was in conflict with the revised plan.
It is safe to assume that collecting permissions to model each pornographic recording is not plausible, so an interesting question is whether we can ban covert modeling from non-pornographic pictures while still retaining the ability to model all porn found on the Internet.
If banning the modeling of people's appearance from non-pornographic images/videos without explicit permission is to be pursued, the ban must be formulated so that it does not make Adequate Porn Watcher AI (concept) illegal / impossible. This would seem to lead to a weird situation where modeling a human from non-pornographic media would be illegal, but modeling from pornography legal.
Law proposal to ban unauthorized modeling of human voice[edit | edit source]
Motivation: The current situation, where criminals can freely trade and grow their libraries of stolen voices, is unwise.
§1 Unauthorized modeling of a human voice[edit | edit source]
Acquiring a model of a human voice that deceptively resembles the voice of some living or dead person, as well as the possession, purchase, sale, yielding, import and export of such a model without the express consent of the target, are punishable.
§2 Application of unauthorized voice models[edit | edit source]
Producing and making available media from covert voice models defined in §1 is punishable.
§3 Aggravated application of unauthorized voice models[edit | edit source]
If the produced media is made for the purpose of
- framing a human target or targets for crimes,
- attempting extortion, or
- defaming the target,
the crime should be judged as aggravated.
Resources and reporting on law[edit | edit source]
AI and law in general[edit | edit source]
Reviews and regulation from the w:Library of Congress:
- 'Regulation of Artificial Intelligence' at loc.gov
- 'Regulation of Artificial Intelligence: Comparative Summary' at loc.gov
- 'Regulation of Artificial Intelligence: International and Regional Approaches' (loc.gov)
- 'Regulation of Artificial Intelligence: The Americas and the Caribbean' (loc.gov)
- 'Regulation of Artificial Intelligence: East/South Asia and the Pacific' (loc.gov)
- 'Regulation of Artificial Intelligence: Europe and Central Asia' loc.gov
- 'Regulation of Artificial Intelligence: Middle East and North Africa' (loc.gov)
- 'Regulation of Artificial Intelligence: Sub-Saharan Africa' (loc.gov)
w:Gibson Dunn & Crutcher (gibsondunn.com) publishes a quarterly legal update on 'Artificial Intelligence and Autonomous Systems'. Gibson Dunn & Crutcher is a global w:law firm, founded in Los Angeles in 1890.
- 'Artificial Intelligence and Autonomous Systems Legal Update' Quarter 4 2018 at Gibson Dunn
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 1 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 2 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 3 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 4 2019'
From Europe
- 'The ethics of artificial intelligence: Issues and initiatives' (.pdf) at europarl.europa.eu, a March 2020 study by the w:European Parliamentary Research Service. Starting from page 37, the .pdf lists organizations in the field.
Synthetic filth in the law and media[edit | edit source]
- '"The New Weapon of Choice": Law's Current Inability to Properly Address Deepfake Pornography' at scholarship.law.vanderbilt.edu, an October 2020 note by Anne Pechenik Gieseke published in the w:Vanderbilt Law Review, the flagship w:academic journal of w:Vanderbilt University Law School.
- 'Deepfakes and Synthetic Media in the Financial System: Assessing Threat Scenarios' at carnegieendowment.org, a 2020-07-08 assessment identifies some types of criminalities that can be made using synthetic human-like fakes.
- 'Don’t Believe Your Eyes (or Ears): The Weaponization of Artificial Intelligence, Machine Learning, and Deepfakes' at ssri.duke.edu, an October 2019 news article by Joe Littell, published by the Social Science Research Institute at w:Duke University
- 'Deepfakes: False Pornography Is Here and the Law Cannot Protect You' at scholarship.law.duke.edu, published in 2019 in the Duke Law Journal, a student-run law review.
- States Are Rushing to Regulate Deepfakes as AI Goes Mainstream at bloomberg.com (paywalled), June 2023 reporting on US legislation against synthetic human-like fakes. It lists CA, WA, WY, MN, TX, GA and VA as having anti-deepfake laws and lists LA, IL, MA and NJ as planning to legislate.
The countries that have unfortunately banned full face veil[edit | edit source]
“There are currently 16 nations that have banned the burqa (not to be confused with the hijab), including w:Tunisia,[46] w:Austria, w:Denmark, w:France, w:Belgium, w:Tajikistan, w:Latvia,[47] w:Bulgaria,[48] w:Cameroon, w:Chad, w:Congo-Brazzaville, w:Gabon, w:Netherlands,[49] w:China,[50] w:Morocco, and w:Switzerland.”
Taking into consideration these times of industrial disinformation, it is vicious and uncivilized to have laws banning wearing the full face veil in public.
Quotes on the current laws and their application[edit | edit source]
“If no-one who wants to hurt you knows what you look like, how could someone malevolent make a covert digital look-alike of you?”
Footnotes[edit | edit source]
- ↑ People who are found in possession of this synthetic pornography should probably not be penalized, but rather advised to get some help.
1st seen in[edit | edit source]
- ↑ 1.0 1.1 https://equalitynow.org/resource/briefing-paper-deepfake-image-based-sexual-abuse-tech-facilitated-sexual-exploitation-and-the-law/
- ↑ Politico AI: Decoded mailing list Wednesday 2022-02-02
- ↑ https://artificialintelligenceact.eu/the-act/ via https://futureoflife.org/ newsletter
- ↑ 4.0 4.1 4.2
Chatting with ChatGPT 2023
- First I asked ChatGPT to "list some legislative approaches against so-called "deep fakes" or "deepfakes"" and it mentioned the Singaporean #Protection from Online Falsehoods and Manipulation Act 2019 and the #Bill - Malicious Deep Fake Prohibition Act of 2018 (115th Congress)
- ↑ https://www.dailymaverick.co.za/article/2021-12-01-not-all-of-the-cogs-in-the-cybercrimes-act-machine-are-turning-at-once-we-still-remain-vulnerable/
- ↑ https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches
- ↑ 7.0 7.1 https://cybercivilrights.org/deep-fake-laws/
- ↑ https://onfido.com/blog/deepfake-law/
- ↑ https://www.newyorker.com/science/annals-of-artificial-intelligence/the-terrifying-ai-scam-that-uses-your-loved-ones-voice
- ↑ https://onfido.com/blog/deepfake-law/
- ↑ First seen in the suggestions for similar bills for Bills similar to CA AB602 by trackbill.com.
References[edit | edit source]
- ↑
"You Won't Believe What Obama Says In This Video!". w:YouTube. w:BuzzFeed. 2018-04-17. Retrieved 2022-01-05.
We're entering an era in which our enemies can make anyone say anything at any point in time.
- ↑ Lawson, Amanda (2023-04-24). "A Look at Global Deepfake Regulation Approaches". responsible.ai. Responsible Artificial Intelligence Institute. Retrieved 2024-02-14.
- ↑ Quirk, Caroline (2023-06-19). "The High Stakes of Deepfakes: The Growing Necessity of Federal Legislation to Regulate This Rapidly Evolving Technology". legaljournal.princeton.edu. Princeton Legal Journal. Retrieved 2024-02-14.
- ↑ Williams, Kaylee (2023-05-15). "Exploring Legal Approaches to Regulating Nonconsensual Deepfake Pornography". techpolicy.press. Retrieved 2024-02-14.
- ↑ Owen, Aled (2024-02-02). "Deepfake laws: is AI outpacing legislation?". onfido.com. Onfido. Retrieved 2024-02-14.
- ↑ 6.0 6.1 Pirius, Rebecca (2024-02-07). "Is Deepfake Pornography Illegal?". Criminaldefenselawyer.com. w:Nolo (publisher). Retrieved 2024-02-22.
- ↑ 7.0 7.1 Rastogi, Janvhi (2023-10-16). "Deepfake Pornography: A Legal and Ethical Menace". tclf.in. The Contemporary Law Forum. Retrieved 2024-02-14.
- ↑ https://equalitynow.org/resource/briefing-paper-deepfake-image-based-sexual-abuse-tech-facilitated-sexual-exploitation-and-the-law/
- ↑ https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches
- ↑ Viersen, Arnold (2022-04-28). "Stopping Internet Sexual Exploitation Act - An Act to amend the Criminal Code (pornographic material)". parl.ca. w:House of Commons of Canada. Retrieved 2022-10-06.
- ↑ Bilingual version of C-270 https://publications.gc.ca/collections/collection_2022/parl/XB441-270-1.pdf
- ↑ "China seeks to root out fake news and deepfakes with new online content rules". w:Reuters.com. w:Reuters. 2019-11-29. Retrieved 2021-01-23.
- ↑ Statt, Nick (2019-11-29). "China makes it a criminal offense to publish deepfakes or fake news without disclosure". w:The Verge. Retrieved 2021-01-23.
- ↑
"Artificial Intelligence Act: MEPs adopt landmark law". europarl.europa.eu. w:European Parliament. 2024-03-13. Retrieved 2024-03-22.
The regulation, agreed in negotiations with member states in December 2023, was endorsed by MEPs with 523 votes in favour, 46 against and 49 abstentions.
- ↑ https://www.politico.eu/article/eu-artificial-intelligence-act-ai-technology-risk-rules/
- ↑ https://www.responsible.ai/post/a-look-at-global-deepfake-regulation-approaches
- ↑ https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation
- ↑ 18.0 18.1 18.2 18.3 18.4 18.5 18.6 18.7
Authoritative up-to-date version of the Criminal Code chapter 20 On sexual offences can always be found at finlex.fi
Translation to English by the Ministry of Justice: Criminal Code (39/1889) - Chapter 20 - Sexual offences (translation) as .pdf at oikeusministerio.fi (subject to possible revisions)
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=76&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=79&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=80&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=81#:~:text=Whoever%2C%20intentionally%20or%20knowingly%20captures,two%20lakh%20rupees%2C%20or%20with
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=83&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=84&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ https://www.indiacode.nic.in/show-data?abv=null&statehandle=null&actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=85&orgactid=AC_CEN_45_76_00001_200021_1517807324077
- ↑ Rana, Vikrant; Gandhi, Anuradha; Thakur, Rachita (2023-11-24). "Deepfakes And Breach Of Personal Data – A Bigger Picture". livelaw.in. Retrieved 2024-02-21.
- ↑
"What Is Deep Fake Cyber Crime? What Does Indian Law Say About It?". cybercert.in. Retrieved 2024-03-23.
At present, India does not have any law specifically for deep fake cybercrime, but various other laws can be combined to deal with it.
- ↑ https://www.police.govt.nz/advice-services/cybercrime-and-internet/online-child-safety
- ↑ https://www.dailymaverick.co.za/article/2021-12-01-not-all-of-the-cogs-in-the-cybercrimes-act-machine-are-turning-at-once-we-still-remain-vulnerable/
- ↑ creating sexually explicit deepfake images to be made offence in UK at theguardian.com
- ↑ https://www.gov.uk/government/publications/online-safety-act-new-criminal-offences-circular/online-safety-act-new-criminal-offences-circular
- ↑ https://revengepornhelpline.org.uk/information-and-advice/need-help-and-advice/threats-to-share-intimate-images/
- ↑
Royle, Sara (2021-01-05). "'Deepfake porn images still give me nightmares'". w:BBC Online. w:BBC. Retrieved 2021-01-31.
She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.
- ↑
Mort, Helen (2020). "Change.org petition: 'Tighten regulation on taking, making and faking explicit images'". w:Change.org. w:Change.org. Retrieved 2021-01-31.
Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.
- ↑ "New state laws go into effect July 1".
- ↑ 36.0 36.1 "§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty". w:Virginia. Retrieved 2021-01-23.
- ↑
"Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election". w:Texas. 2019-06-14. Retrieved 2021-01-23.
In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality
- ↑ https://capitol.texas.gov/BillLookup/History.aspx?LegSess=86R&Bill=SB751
- ↑ Johnson, R.J. (2019-12-30). "Here Are the New California Laws Going Into Effect in 2020". KFI. iHeartMedia. Retrieved 2021-01-23.
- ↑ "AB 602 - California Assembly Bill 2019-2020 Regular Session - Depiction of individual using digital or electronic technology: sexually explicit material: cause of action". openstates.org. openstates.org. Retrieved 2021-03-24.
- ↑ Mihalcik, Carrie (2019-10-04). "California laws seek to crack down on deepfakes in politics and porn". w:cnet.com. w:CNET. Retrieved 2021-01-23.
- ↑ Berman, Marc; Leyva, Connie (2019), "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action.", w:California
- ↑ 43.0 43.1 "Georgia Code Title 16. Crimes and Offenses § 16-11-90". w:FindLaw. w:Georgia (U.S. state). 2021-04-14. Retrieved 2022-01-04.
- ↑ "SECTION 52-C Private right of action for unlawful dissemination or publication of a sexually explicit depiction of an individual". nysenate.gov. w:New York State Legislature. 2021-11-12. Retrieved 2021-01-04.
- ↑ https://www.congress.gov/bill/116th-congress/house-bill/3230
- ↑ "Tunisian PM bans wearing of niqab in public institutions". Reuters. 5 July 2019. Retrieved 2021-03-13.
- ↑ "A European government has banned Islamic face veils despite them being worn by just three women". 21 April 2016. Retrieved 2021-03-13.
- ↑ "Bulgaria the latest European country to ban the burqa and niqab in public places", smh.com.au, accessed 5 December 2016.
- ↑ Halasz, Stephanie; McKenzie, Sheena (27 June 2018). "The Netherlands introduces burqa ban in some public spaces". CNN. Retrieved 2021-03-13.
- ↑ Phillips, Tom (13 January 2015). "China bans burqa in capital of Muslim region of Xinjiang". The Telegraph. Retrieved 2021-03-13.