Laws against synthesis and other related crimes

  | quote = }}


</ref>, as section [https://law.lis.virginia.gov/vacode/18.2-386.2/ § 18.2-386.2 titled '''''Unlawful dissemination or sale of images of another; penalty.'''''] became part of the '''[[w:Code of Virginia]]'''.


[https://law.lis.virginia.gov/vacode/ '''Code of Virginia''' (TOC)] » [https://law.lis.virginia.gov/vacode/title18.2/ '''Title 18.2.''' Crimes and Offenses Generally] » [https://law.lis.virginia.gov/vacode/title18.2/chapter8/ '''Chapter 8.''' Crimes Involving Morals and Decency] » [https://law.lis.virginia.gov/vacodefull/title18.2/chapter8/article5/ '''Article 5.''' Obscenity and Related Offenses] » '''Section''' [https://law.lis.virginia.gov/vacode/18.2-386.2/ § '''18.2-386.2.''' Unlawful dissemination or sale of images of another; penalty]


The [https://law.lis.virginia.gov/vacode/18.2-386.2/ section '''§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty.'''] of the Code of Virginia is as follows:


'''A.''' ''Any [[w:person]] who, with the [[w:Intention (criminal law)|w:intent]] to [[w:coercion|w:coerce]], [[w:harassment|w:harass]], or [[w:intimidation|w:intimidate]], [[w:Malice_(law)|w:malicious]]ly [[w:dissemination|w:disseminates]] or [[w:sales|w:sells]] any videographic or still image created by any means whatsoever that [[w:Depiction|w:depicts]] another person who is totally [[w:nudity|w:nude]], or in a state of undress so as to expose the [[w:sex organs|w:genitals]], pubic area, [[w:buttocks]], or female [[w:breast]], where such person knows or has reason to know that he is not [[w:license]]d or [[w:authorization|w:authorized]] to disseminate or sell such [[w:Video|w:videographic]] or [[w:Film still|w:still image]] is [[w:Guilt (law)|w:guilty]] of a Class 1 [[w:Misdemeanor#United States|w:misdemeanor]].''
::''For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's [[w:face]], [[w:Simulacrum|w:likeness]], or other distinguishing characteristic.''


'''B.''' ''If a person uses [[w:Service (economics)|w:services]] of an [[w:Internet service provider]], an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.''


'''C.''' ''Venue for a prosecution under this section may lie in the [[w:jurisdiction]] where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.''


'''D.''' ''The provisions of this section shall not preclude prosecution under any other [[w:statute]].''<ref name="Virginia2019Chapter515"/>


Two identical bills were introduced: [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+HB2678 House Bill 2678], presented by [[w:Delegate (American politics)|w:Delegate]] [[w:Marcus Simon]] to the [[w:Virginia House of Delegates]] on January 14, 2019, and [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+SB1736 Senate Bill 1736], introduced three days later to the [[w:Senate of Virginia]] by Senator [[w:Adam Ebbin]].
[[File:Flag of Texas.svg|thumb|left|200px|[[w:Texas]], the Lone Star State, has protected political candidates, but not ordinary folk, against synthetic filth.]]


<section begin=Texas2019 />On September 1, 2019, the amendments that [[w:Texas Senate]] bill [https://capitol.texas.gov/tlodocs/86R/billtext/html/SB00751F.htm '''SB 751''' - '''''Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election'''''] made to the election code came into effect in the [[w:Law of Texas]], giving [[w:candidates]] in [[w:elections]] a '''30-day protection period''' before elections during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "''a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality''"<ref name="TexasSB751">


{{cite web
[[File:Flag of California.svg|thumb|left|200px|[[w:California]] moved later than Virginia, but on January 1, 2020 it outlawed the manufacture of synthetic filth as well.]]


<section begin=California2020 />[[File:Marc Berman.jpg|thumb|120px|right|Homie [[w:Marc Berman|w:Marc Berman]], a righteous fighter for our human rights in this age of industrial disinformation filth and a member of the [[w:California State Assembly]], most loved for authoring [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 '''AB-602''' - ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''], which came into effect on January 1, 2020, banning both the manufacture and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted.]] On January 1, 2020,<ref name="KFI2019">


{{cite web
  |quote=}}


</ref> the [[w:California]] [[w:State law (United States)|w:US state law]] '''[https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 "AB-602 ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''."]''' came into effect in the [[w:California Civil Code|civil code]] of the [[w:California Codes]], banning the manufacture and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted. AB-602 provides victims of synthetic pornography with [[w:injunction|w:injunctive relief]] and exposes [[w:criminal]]s who make or distribute synthetic pornography without consent to [[w:statutory damages|w:statutory]] and [[w:punitive damages]]. The bill was signed into law by California [[w:Governor (United States)|w:Governor]] [[w:Gavin Newsom]] on October 3, 2019. It was authored by [[w:California State Assembly]]member [[w:Marc Berman]], and an identical Senate bill was coauthored by [[w:California State Senate|w:California Senator]] [[w:Connie Leyva]].<ref name="OpenStates AB 602">


{{cite web
<section end=California2020 />


[[File:Connie Leyva 2015.jpg|thumb|right|240px|[[w:California]] [[w:California State Senate|w:Senator]] [[w:Connie Leyva]] sponsored [https://leginfo.legislature.ca.gov/faces/billCompareClient.xhtml?bill_id=201920200SB564&showamends=false '''California Senate Bill SB 564''' - ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''] in February 2019 and the Senate Bill was [https://www.sagaftra.org/action-alert-support-california-bill-end-deepfake-porn endorsed by SAG-AFTRA], but the Assembly Bill AB-602 authored by [[w:California State Assembly]]member [[w:Marc Berman]] was the one that became law on January 1, 2020 in the [[w:California Civil Code|w:California Civil Code]] of the [[w:California Codes]].]]

'''Introduction''' by [https://a24.asmdc.org/ Assemblymember Marc Berman]:

'''AB 602''', Berman. '''''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action'''''.

Existing law creates a private [[w:Cause of action|w:right of action]] against a person who intentionally distributes a photograph or recorded image of another that exposes the intimate body parts of that person or of a person engaged in a sexual act '''without the person’s consent''' if specified conditions are met.

This bill would provide that a '''depicted individual''', as defined, has a '''[[w:cause of action]] against''' a person who either
* (1) '''creates''' and intentionally discloses sexually explicit material if the person knows or reasonably should have known the '''depicted''' individual '''did not [[w:consent]]''' to its creation or disclosure or
* (2) '''intentionally discloses''' sexually explicit material that the person did not create if the person knows the '''depicted''' individual '''did not consent''' to its creation.

The bill would specify exceptions to those provisions, including if the material is a matter of legitimate public concern or a work of political or newsworthy value.

The bill would authorize a prevailing [[w:plaintiff]] who suffers harm to seek [[w:Injunction|w:injunctive]] relief and recover reasonable [[w:attorney’s fee]]s and costs as well as specified monetary [[w:damages]], including [[w:Statutory damages|statutory]] and [[w:punitive damages]].

'''The law is as follows''':

SECTION 1. Section 1708.86 is added to the [https://leginfo.legislature.ca.gov/faces/codesTOCSelected.xhtml?tocCode=CIV&tocTitle=+Civil+Code+-+CIV Civil Code of California], to read:

'''1708.86.''' ('''a''') '''For purposes of this section''':
* (1) “''Altered depiction''” means a performance that was actually performed by the depicted individual but was subsequently altered to be in violation of this section.
* (2) “''Authorized Representative''” means an attorney, talent agent, or personal manager authorized to represent a depicted individual if the depicted individual is represented.
* (3) (A) “''Consent''” means an agreement written in plain language signed knowingly and voluntarily by the depicted individual that includes a general description of the sexually explicit material and the audiovisual work in which it will be incorporated.
* (3) (B) A depicted individual may rescind consent by delivering written notice within three business days from the date consent was given to the person in whose favor consent was made, unless one of the following requirements is satisfied:
:** (i) The depicted individual is given at least 72 hours to review the terms of the agreement before signing it.
:** (ii) The depicted individual’s authorized representative provides written approval of the signed agreement.
* (4) “''Depicted individual''” means an individual who appears, as a result of digitization, to be giving a performance they did not actually perform or to be performing in an altered depiction.
* (5) “''Despicable conduct''” means conduct that is so vile, base, or contemptible that it would be looked down on and despised by a reasonable person.
* (6) “''Digitization''” means to realistically depict any of the following:
** (A) The nude body parts of another human being as the nude body parts of the depicted individual.
** (B) Computer-generated nude body parts as the nude body parts of the depicted individual.
** (C) The depicted individual engaging in sexual conduct in which the depicted individual did not engage.
* (7) “''Disclose''” means to publish, make available, or distribute to the public.
* (8) “''Individual''” means a natural person.
* (9) “''Malice''” means that the defendant acted with intent to cause harm to the plaintiff or despicable conduct that was done with a willful and knowing disregard of the rights of the plaintiff. A person acts with knowing disregard within the meaning of this paragraph when they are aware of the probable harmful consequences of their conduct and deliberately fail to avoid those consequences.
* (10) “''Nude''” means visible genitals, pubic area, anus, or a female’s postpubescent nipple or areola.
* (11) “''Person''” means a human being or legal entity.
* (12) “''Plaintiff''” includes cross-plaintiff.
* (13) “''Sexual conduct''” means any of the following:
** (A) Masturbation.
** (B) Sexual intercourse, including genital, oral, or anal, whether between persons regardless of sex or gender or between humans and animals.
** (C) Sexual penetration of the vagina or rectum by, or with, an object.
** (D) The transfer of semen by means of sexual conduct from the penis directly onto the depicted individual as a result of ejaculation.
** (E) Sadomasochistic abuse involving the depicted individual.
* (14) “''Sexually explicit material''” means any portion of an audiovisual work that shows the depicted individual performing in the nude or appearing to engage in, or being subjected to, sexual conduct.
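
The timing rules in paragraph (3) above lend themselves to a worked example. Below is a minimal, non-authoritative sketch in Python of the rescission window in paragraph (3)(B); it assumes a "business day" simply means Monday through Friday and ignores public holidays (the section does not define the term), and all function and variable names are illustrative, not taken from the statute.

<syntaxhighlight lang="python">
from datetime import date, timedelta
from typing import Optional

def add_business_days(start: date, days: int) -> date:
    """Step forward the given number of business days (Mon-Fri).
    Assumption: public holidays are ignored."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 ... Friday=4
            days -= 1
    return current

def consent_stands(consent_given: date,
                   rescission_notice: Optional[date],
                   review_hours_before_signing: float,
                   representative_approved: bool) -> bool:
    """Rough model of paragraph (a)(3)(B): consent may be rescinded by
    written notice within three business days of giving it, unless the
    depicted individual had at least 72 hours to review the agreement
    or an authorized representative approved it in writing."""
    if review_hours_before_signing >= 72 or representative_approved:
        return True  # exceptions (i) and (ii): rescission unavailable
    if rescission_notice is None:
        return True  # no rescission notice was ever delivered
    deadline = add_business_days(consent_given, 3)
    return rescission_notice > deadline  # late notice does not rescind
</syntaxhighlight>

Under these assumptions, <code>consent_stands(date(2020, 1, 3), date(2020, 1, 7), 0, False)</code> returns <code>False</code>: a notice delivered on Tuesday, January 7 falls within three business days of a consent given on Friday, January 3, so the consent is rescinded.
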
('''b''') '''A depicted individual has a cause of action against''' a person who does '''either of the following''':
* (1) '''Creates and intentionally discloses sexually explicit material''' and the person knows or reasonably should have known the '''depicted individual''' in that material '''did not consent''' to its creation or disclosure.
* (2) '''Intentionally discloses''' sexually explicit material that the person did not create and the person knows the depicted individual in that material '''did not consent''' to the creation of the sexually explicit material.


('''c''') (1) A person is '''not liable''' under this section in either of the following circumstances:
* (A) The person discloses the sexually explicit material in the course of any of the following:
:** (i) Reporting unlawful activity.
:** (ii) Exercising the person’s law enforcement duties.
:** (iii) Hearings, trials, or other legal proceedings.
* (B) The material is any of the following:
:** (i) A matter of legitimate public concern.
:** (ii) A work of political or newsworthy value or similar work.
:** (iii) Commentary, criticism, or disclosure that is otherwise protected by the California Constitution or the United States Constitution.
* (2) For purposes of this subdivision, sexually explicit material is not of newsworthy value solely because the depicted individual is a public figure.


('''d''') It shall not be a defense to an action under this section that there is a disclaimer included in the sexually explicit material that communicates that the inclusion of the depicted individual in the sexually explicit material was unauthorized or that the depicted individual did not participate in the creation or development of the material.


('''e''') (1) A prevailing plaintiff who suffers harm as a result of the violation of subdivision (b) may recover any of the following:
* (A) An amount equal to the monetary gain made by the defendant from the creation, development, or disclosure of the sexually explicit material.
* (B) One of the following:
:** (i) Economic and noneconomic damages proximately caused by the disclosure of the sexually explicit material, including damages for emotional distress.
:** (ii) Upon request of the plaintiff at any time before the final judgment is rendered, the plaintiff may instead recover an award of statutory damages for all unauthorized acts involved in the action, with respect to any one work, as follows:
::** (I) A sum of not less than one thousand five hundred dollars ($1,500) but not more than thirty thousand dollars ($30,000).
::** (II) If the unlawful act was committed with malice, the award of statutory damages may be increased to a maximum of one hundred fifty thousand dollars ($150,000).
* (C) Punitive damages.
* (D) Reasonable attorney’s fees and costs.
* (E) Any other available relief, including injunctive relief.
* (2) The remedies provided by this section are cumulative and shall not be construed as restricting a remedy that is available under any other law.
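
The statutory-damages arithmetic in subparagraph (e)(1)(B)(ii) above reduces to a range check: $1,500 to $30,000 per work, with the ceiling raised to $150,000 upon a finding of malice. A minimal sketch, assuming a single per-work award (names are illustrative, not statutory):

<syntaxhighlight lang="python">
def statutory_damages_range(with_malice: bool) -> tuple[int, int]:
    """Per-work statutory damages bounds under (e)(1)(B)(ii):
    $1,500-$30,000, with the ceiling raised to $150,000 if the
    unlawful act was committed with malice as defined in (a)(9)."""
    return (1_500, 150_000 if with_malice else 30_000)

def clamp_award(requested: int, with_malice: bool) -> int:
    """Clamp a requested per-work statutory award into the lawful range."""
    low, high = statutory_damages_range(with_malice)
    return max(low, min(requested, high))

# For example, a $200,000 request is capped at $30,000 without a
# malice finding, and at $150,000 with one.
assert clamp_award(200_000, with_malice=False) == 30_000
assert clamp_award(200_000, with_malice=True) == 150_000
</syntaxhighlight>
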

('''f''') An action under this section shall be commenced no later than three years from the date the unauthorized creation, development, or disclosure was discovered or should have been discovered with the exercise of reasonable diligence.

('''g''') The provisions of this section are severable. If any provision of this section or its application is held invalid, that invalidity shall not affect other provisions.<ref name="CaliforniaStateLaw AB 602">
{{Citation
  | title = "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action."


<section end=China2020 />
= Action needed =
== EU Law on AI 20?? ==
[[File:Flag of Europe.svg|thumb|left|200px|The EU must address the malicious uses of AI in the law with which it is planning to regulate AI.]]
The European Union is planning a law on AI:
* Read [https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52021PC0206 Proposal for a '''''REGULATION OF THE EUROPEAN PARLIAMENT AND OF THE COUNCIL LAYING DOWN HARMONISED RULES ON ARTIFICIAL INTELLIGENCE (ARTIFICIAL INTELLIGENCE ACT) AND AMENDING CERTAIN UNION LEGISLATIVE ACTS''''' at eur-lex.europa.eu]<ref group="1st seen in">https://artificialintelligenceact.eu/the-act/ via https://futureoflife.org/ newsletter</ref> (also contains translations)
* https://artificialintelligenceact.eu/ is a website on the planned law by the [https://futureoflife.org/ Future of Life Institute], an American non-profit NGO.
== Law on synthetic filth in the UK 20?? ==
[[File:Flag of the United Kingdom.svg|thumb|left|200px|The UK needs to improve its legislation. Please [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images sign the petition], initiated by Helen Mort in late 2020.]]
[[File:Helen Mort (2014).jpg|thumb|right|245px|[[w:Helen Mort]] is a British poet, novelist and activist against [[synthetic human-like fakes]]. Please sign the petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org], originated by her and to be delivered to the [[w:Law Commission (England and Wales)]] and the prime minister.]]
The UK law does not seem very up-to-date on the issue of synthetic filth.
The independent [[w:Law Commission (England and Wales)]] is currently reviewing the law as it applies to taking, making and sharing intimate images without consent. The outcome of the consultation is due to be published later in 2021.<ref name="BBC2021">
{{cite web
|url = https://www.bbc.com/news/technology-55546372
|title = 'Deepfake porn images still give me nightmares'
|last = Royle
|first = Sara
|date = 2021-01-05
|website = [[w:BBC Online]]
|publisher = [[w:BBC]]
|access-date = 2021-01-31
|quote = She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.}}
</ref>
"In 2019, law expert Dr Aislinn O’Connell told [[w:The Independent]] that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The [[w:Women and Equalities Committee]] called on the UK Government to introduce new legislation on image-based sexual abuse in order to '''criminalise ALL''' non-consensual creation and distribution of intimate sexual images."<ref name="MortPetition2020">
{{cite web
|url = https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images
|title = Change.org petition: 'Tighten regulation on taking, making and faking explicit images'
|last = Mort
|first = Helen
|date = 2020
|website = [[w:Change.org]]
|publisher = [[w:Change.org]]
|access-date = 2021-01-31
|quote = Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.}}
</ref> This call is for laws similar to those California put in place on January 1, 2020.
The petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org] by [[w:Helen Mort]] aims to petition the UK government for proper legislation against synthetic filth. See the [[mediatheque]] for a video by Helen Mort on her ordeal of becoming the victim of covert disinformation attacks.
{{#lst:Mediatheque|HelenMort2020}}


= Bills in the works =
== Law on synthetic filth in New York 20?? ==
[[File:Flag of New York (1909–2020).png|thumb|left|200px|[[w:New York State Legislature]] regular session 2021-2022 is contemplating the [https://www.nysenate.gov/legislation/bills/2021/S1641 New York senate bill '''S1641'''] and identical [https://www.nysenate.gov/legislation/bills/2021/A6517 assembly bill '''A6517'''] to ban sending unsolicited pornography.]]
</gallery>




----
* '''History''': This version is an evolution of a Finnish language original written in 2016.


[[File:Suomen lippu valokuva.png|right|thumb|260px|[[w:Finland]] has very logical and very [https://www.finlex.fi/en/laki/kaannokset/ accessible laws], but also here the laws need updating for this age of industrial disinformation.]]


Existing law in <big>Chapter 24. of the Finnish Criminal Code - "''Offences against privacy, public peace and personal reputation''"</big> appears ineffective against many [[synthetic human-like fakes|synthetic human-like fake attacks]], and it seems it could even be used to frame victims for crimes committed with [[synthetic human-like fakes#Digital sound-alikes|digital sound-alikes]].
=== Law proposal to ban unauthorized modeling of human voice ===
{{#ev:youtube|0sR1rU3gLzQ|360px|right|[https://www.youtube.com/watch?v=0sR1rU3gLzQ Video 'This AI Clones Your Voice After Listening for 5 Seconds' by '2 minute papers' at YouTube] describes the voice thieving machine by Google Research in [[w:NeurIPS|w:NeurIPS]] 2018.}}
'''Motivation''': The current situation, in which criminals can freely trade and grow their libraries of stolen voices, is unwise.


==== §1 Unauthorized modeling of a human voice ====


== Synthetic filth in the law and media ==
* [https://scholarship.law.vanderbilt.edu/cgi/viewcontent.cgi?article=4409&context=vlr '''''"The New Weapon of Choice": Law's Current Inability to Properly Address Deepfake Pornography''''' at scholarship.law.vanderbilt.edu], an October 2020 note by Anne Pechenik Gieseke published in the [[w:Vanderbilt Law Review]], the flagship [[w:academic journal]] of [[w:Vanderbilt University Law School]].
* [https://carnegieendowment.org/2020/07/08/deepfakes-and-synthetic-media-in-financial-system-assessing-threat-scenarios-pub-82237 '''''Deepfakes and Synthetic Media in the Financial System: Assessing Threat Scenarios''''' at carnegieendowment.org], a 2020-07-08 assessment identifying some types of crime that can be committed using [[synthetic human-like fakes]].
* [https://ssri.duke.edu/news/don%E2%80%99t-believe-your-eyes-or-ears-weaponization-artificial-intelligence-machine-learning-and '''''Don’t Believe Your Eyes (or Ears): The Weaponization of Artificial Intelligence, Machine Learning, and Deepfakes''''' at ssri.duke.edu], an October 2019 news article by Joe Littell, published by the Social Science Research Institute at [[w:Duke University]].
* [https://scholarship.law.duke.edu/dltr/vol17/iss1/4/ '''''Deepfakes: False Pornography Is Here and the Law Cannot Protect You''''' at scholarship.law.duke.edu], published in 2019 in the [[w:Duke Law Journal|Duke Law Journal]], a student-run law review.


== The countries that have unfortunately banned full face veil ==