Laws against synthesis and other related crimes

  | quote = }}


</ref>, as [https://law.lis.virginia.gov/vacode/18.2-386.2/ § 18.2-386.2, titled '''''Unlawful dissemination or sale of images of another; penalty.'''''] became part of the '''[[w:Code of Virginia]]'''.


[https://law.lis.virginia.gov/vacode/ '''Code of Virginia''' (TOC)] » [https://law.lis.virginia.gov/vacode/title18.2/ '''Title 18.2.''' Crimes and Offenses Generally] » [https://law.lis.virginia.gov/vacode/title18.2/chapter8/ '''Chapter 8.''' Crimes Involving Morals and Decency] » [https://law.lis.virginia.gov/vacodefull/title18.2/chapter8/article5/ '''Article 5.''' Obscenity and Related Offenses] » '''Section''' [https://law.lis.virginia.gov/vacode/18.2-386.2/ § '''18.2-386.2.''' Unlawful dissemination or sale of images of another; penalty]


The Virginia law [https://law.lis.virginia.gov/vacode/18.2-386.2/ '''§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty.'''] reads as follows:


'''A.''' ''Any [[w:person]] who, with the [[w:Intention (criminal law)|w:intent]] to [[w:coercion|w:coerce]], [[w:harassment|w:harass]], or [[w:intimidation|w:intimidate]], [[w:Malice_(law)|w:malicious]]ly [[w:dissemination|w:disseminates]] or [[w:sales|w:sells]] any videographic or still image created by any means whatsoever that [[w:Depiction|w:depicts]] another person who is totally [[w:nudity|w:nude]], or in a state of undress so as to expose the [[w:sex organs|w:genitals]], pubic area, [[w:buttocks]], or female [[w:breast]], where such person knows or has reason to know that he is not [[w:license]]d or [[w:authorization|w:authorized]] to disseminate or sell such [[w:Video|w:videographic]] or [[w:Film still|w:still image]] is [[w:Guilt (law)|w:guilty]] of a Class 1 [[w:Misdemeanor#United States|w:misdemeanor]].
::For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's [[w:face]], [[w:Simulacrum|w:likeness]], or other distinguishing characteristic.''


'''B.''' ''If a person uses [[w:Service (economics)|w:services]] of an [[w:Internet service provider]], an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.''


'''C.''' ''Venue for a prosecution under this section may lie in the [[w:jurisdiction]] where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.''


'''D.''' ''The provisions of this section shall not preclude prosecution under any other [[w:statute]].''<ref name="Virginia2019Chapter515"/>


Two identical bills were introduced: [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+HB2678 House Bill 2678], presented by [[w:Delegate (American politics)|w:Delegate]] [[w:Marcus Simon]] to the [[w:Virginia House of Delegates]] on January 14, 2019, and three days later [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+SB1736 Senate Bill 1736], introduced to the [[w:Senate of Virginia]] by Senator [[w:Adam Ebbin]].


<section end=China2020 />
= Action needed =
== EU Law on AI 20?? ==
[[File:Flag of Europe.svg|thumb|left|200px|The EU must address the malicious uses of AI in the law it is planning to regulate AI.]]
The European Union is planning a law on AI:
* Read [https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52021PC0206 Proposal for a '''''REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS''''' at eur-lex.europa.eu]<ref group="1st seen in">https://artificialintelligenceact.eu/the-act/ via https://futureoflife.org/ newsletter</ref> (also contains translations)
* https://artificialintelligenceact.eu/ is a website on the planned law, run by the [https://futureoflife.org/ Future of Life Institute], an American non-profit NGO.
== Law on synthetic filth in the UK 20?? ==
[[File:Flag of the United Kingdom.svg|thumb|left|200px|The UK needs to improve its legislation. Please [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images sign the petition initiated by Helen Mort] in late 2020.]]
[[File:Helen Mort (2014).jpg|thumb|right|245px|[[w:Helen Mort]] is a British poet, novelist and activist against [[synthetic human-like fakes]]. Please sign the petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org] originated by her, to be delivered to the [[w:Law Commission (England and Wales)]] and the prime minister.]]
UK law does not seem up to date on the issue of synthetic filth.
The independent [[w:Law Commission (England and Wales)]] is currently reviewing the law as it applies to taking, making and sharing intimate images without consent. The outcome of the consultation is due to be published later in 2021.<ref name="BBC2021">
{{cite web
|url = https://www.bbc.com/news/technology-55546372
|title = 'Deepfake porn images still give me nightmares'
|last = Royle
|first = Sara
|date = 2021-01-05
|website = [[w:BBC Online]]
|publisher = [[w:BBC]]
|access-date = 2021-01-31
|quote = She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.}}
</ref>
"In 2019, law expert Dr Aislinn O’Connell told [[w:The Independent]] that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The [[w:Women and Equalities Committee]] called on the UK Government to introduce new legislation on image-based sexual abuse in order to '''criminalise ALL''' non-consensual creation and distribution of intimate sexual images."<ref name="MortPetition2020">
{{cite web
|url = https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images
|title = Change.org petition: 'Tighten regulation on taking, making and faking explicit images'
|last = Mort
|first = Helen
|date = 2020
|website = [[w:Change.org]]
|publisher = [[w:Change.org]]
|access-date = 2021-01-31
|quote = Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.}}
</ref> This call is for laws similar to those California put in place on January 1, 2020.
The petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org] by [[w:Helen Mort]] aims to petition the UK government for proper legislation against synthetic filth. See the [[mediatheque]] for a video by Helen Mort on her ordeal of becoming a victim of covert disinformation attacks.
{{#lst:Mediatheque|HelenMort2020}}


= Bills in the works =
== Law on synthetic filth in New York 20?? ==
[[File:Flag of New York (1909–2020).png|thumb|left|200px|[[w:New York State Legislature]] regular session 2021-2022 is contemplating the [https://www.nysenate.gov/legislation/bills/2021/S1641 New York senate bill '''S1641'''] and identical [https://www.nysenate.gov/legislation/bills/2021/A6517 assembly bill '''A6517'''] to ban sending unsolicited pornography.]]
</gallery>




----
* '''History''': This version is an evolution of a Finnish-language original written in 2016.


[[File:Suomen lippu valokuva.png|right|thumb|260px|[[w:Finland]] has very logical and very [https://www.finlex.fi/en/laki/kaannokset/ accessible laws], but here too the laws need updating for this age of industrial disinformation.]]


Existing law in <big>Chapter 24 of the Finnish Criminal Code, "''Offences against privacy, public peace and personal reputation''",</big> seems to be ineffective against many [[synthetic human-like fakes|synthetic human-like fake attacks]], and it seems it could be used to frame victims for crimes with [[synthetic human-like fakes#Digital sound-alikes|digital sound-alikes]].


== Synthetic filth in the law and media ==
* [https://scholarship.law.vanderbilt.edu/cgi/viewcontent.cgi?article=4409&context=vlr '''''"The New Weapon of Choice": Law's Current Inability to Properly Address Deepfake Pornography''''' at scholarship.law.vanderbilt.edu], October 2020 notes by Anne Pechenik Gieseke published in the [[w:Vanderbilt Law Review]], the flagship [[w:academic journal]] of [[w:Vanderbilt University Law School]].
* [https://carnegieendowment.org/2020/07/08/deepfakes-and-synthetic-media-in-financial-system-assessing-threat-scenarios-pub-82237 '''''Deepfakes and Synthetic Media in the Financial System: Assessing Threat Scenarios''''' at carnegieendowment.org], a 2020-07-08 assessment that identifies some types of crime that can be committed using [[synthetic human-like fakes]].
* [https://ssri.duke.edu/news/don%E2%80%99t-believe-your-eyes-or-ears-weaponization-artificial-intelligence-machine-learning-and '''''Don’t Believe Your Eyes (or Ears): The Weaponization of Artificial Intelligence, Machine Learning, and Deepfakes''''' at ssri.duke.edu], an October 2019 news article by Joe Littell, published by the Social Science Research Institute at [[w:Duke University]].
* [https://scholarship.law.duke.edu/dltr/vol17/iss1/4/ '''''Deepfakes: False Pornography Is Here and the Law Cannot Protect You''''' at scholarship.law.duke.edu], published in 2019 in the [[w:Duke Law Journal|Duke Law Journal]], a student-run law review.


== The countries that have unfortunately banned full face veil ==