Laws against synthesis and other related crimes
= Laws and their application =

== Law on synthetic filth in Virginia ==
[[File:Flag of Virginia.svg|thumb|left|200px|[[w:Virginia]], an avant-garde state. Motto "[[w:Sic semper tyrannis]]"]]
<section begin=Virginia2019 />[[File:Marcus Simon.jpeg|thumb|right|108px|Homie [[w:Marcus Simon|w:Marcus Simon]] ([http://marcussimon.com/ marcussimon.com]) is a Member of the [[w:Virginia House of Delegates]] and a true pioneer in legislating against synthetic filth.]] Since July 1, 2019<ref>
{{cite web
| quote = }}
</ref>, [https://law.lis.virginia.gov/vacode/18.2-386.2/ § 18.2-386.2, titled '''''Unlawful dissemination or sale of images of another; penalty.'''''] has been part of the '''[[w:Code of Virginia]]'''.
'''[https://law.lis.virginia.gov/vacode/ Code of Virginia (TOC)]''' » [https://law.lis.virginia.gov/vacode/title18.2/ Title 18.2. Crimes and Offenses Generally] » [https://law.lis.virginia.gov/vacode/title18.2/chapter8/ Chapter 8. Crimes Involving Morals and Decency] » [https://law.lis.virginia.gov/vacodefull/title18.2/chapter8/article5/ Article 5. Obscenity and Related Offenses] » [https://law.lis.virginia.gov/vacode/18.2-386.2/ § 18.2-386.2. Unlawful dissemination or sale of images of another; penalty]
The [https://law.lis.virginia.gov/vacode/18.2-386.2/ law '''§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty.''' of Virginia] is as follows:
A. ''Any [[w:person]] who, with the [[w:Intention (criminal law)|w:intent]] to [[w:coercion|w:coerce]], [[w:harassment|w:harass]], or [[w:intimidation|w:intimidate]], [[w:Malice_(law)|w:malicious]]ly [[w:dissemination|w:disseminates]] or [[w:sales|w:sells]] any videographic or still image created by any means whatsoever that [[w:Depiction|w:depicts]] another person who is totally [[w:nudity|w:nude]], or in a state of undress so as to expose the [[w:sex organs|w:genitals]], pubic area, [[w:buttocks]], or female [[w:breast]], where such person knows or has reason to know that he is not [[w:license]]d or [[w:authorization|w:authorized]] to disseminate or sell such [[w:Video|w:videographic]] or [[w:Film still|w:still image]] is [[w:Guilt (law)|w:guilty]] of a Class 1 [[w:Misdemeanor#United States|w:misdemeanor]]. For purposes of this subsection, "another person" includes a person whose image was used in creating, adapting, or modifying a videographic or still image with the intent to depict an actual person and who is recognizable as an actual person by the person's [[w:face]], [[w:Simulacrum|w:likeness]], or other distinguishing characteristic.''

B. ''If a person uses [[w:Service (economics)|w:services]] of an [[w:Internet service provider]], an electronic mail service provider, or any other information service, system, or access software provider that provides or enables computer access by multiple users to a computer server in committing acts prohibited under this section, such provider shall not be held responsible for violating this section for content provided by another person.''

C. ''Venue for a prosecution under this section may lie in the [[w:jurisdiction]] where the unlawful act occurs or where any videographic or still image created by any means whatsoever is produced, reproduced, found, stored, received, or possessed in violation of this section.''

D. ''The provisions of this section shall not preclude prosecution under any other [[w:statute]].''<ref name="Virginia2019Chapter515"/>
The identical bills were [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+HB2678 House Bill 2678], presented by [[w:Delegate (American politics)|w:Delegate]] [[w:Marcus Simon]] to the [[w:Virginia House of Delegates]] on January 14, 2019, and [https://lis.virginia.gov/cgi-bin/legp604.exe?191+sum+SB1736 Senate bill 1736], introduced three days later to the [[w:Senate of Virginia]] by Senator [[w:Adam Ebbin]].
<section end=Virginia2019 />
== Law on synthetic filth in Texas ==
[[File:Flag of Texas.svg|thumb|left|200px|[[w:Texas]], the Lone Star State, has protected political candidates, but not ordinary folk, against synthetic filth.]]
<section begin=Texas2019 />On September 1, 2019 the [[w:Texas]] senate bill [https://capitol.texas.gov/tlodocs/86R/billtext/html/SB00751F.htm '''SB 751''' - '''''Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election'''''] [[w:amendment]]s to the election code came into effect in the [[w:Law of Texas]], giving [[w:candidates]] in [[w:elections]] a 30-day protection period before the election, during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "''a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality''"<ref name="TexasSB751">
{{cite web
|quote= In this section, "deep fake video" means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality}}
</ref>
<section end=Texas2019 />
== Law on synthetic filth in California ==
[[File:Flag of California.svg|thumb|left|200px|[[w:California]] moved later than Virginia, but on January 1, 2020 it too outlawed the distribution of synthetic filth.]]
<section begin=California2020 />[[File:Marc Berman.jpg|thumb|120px|right|Homie [[w:Marc Berman|w:Marc Berman]], a righteous fighter for our human rights in this age of industrial disinformation filth and a member of the [[w:California State Assembly]], most loved for authoring [https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 AB-602], which came into effect on Jan 1 2020, banning both the manufacturing and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted.]] On January 1, 2020<ref name="KFI2019">
{{cite web
|quote=}}
</ref> the [[w:California]] [[w:State law (United States)|w:US state law]] '''[https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201920200AB602 "AB-602 ''Depiction of individual using digital or electronic technology: sexually explicit material: cause of action''."]''' came into effect in the [[w:California Civil Code|civil code]] of the [[w:California Codes]], banning the manufacturing and [[w:digital distribution]] of synthetic pornography without the [[w:consent]] of the people depicted. AB-602 provides victims of synthetic pornography with [[w:injunction|w:injunctive relief]] and poses legal threats of [[w:statutory damages|w:statutory]] and [[w:punitive damages]] on [[w:criminal]]s making or distributing synthetic pornography without consent.

The bill AB-602 was signed into law by California [[w:Governor (United States)|w:Governor]] [[w:Gavin Newsom]] on October 3, 2019. It was authored by [[w:California State Assembly]] member [[w:Marc Berman]], and an identical Senate bill was coauthored by [[w:California State Senate|w:California Senator]] [[w:Connie Leyva]].<ref name="OpenStates AB 602">
{{cite web
| quote = }}
</ref>
<section end=California2020 />
The law of California is as follows:<ref name="CaliforniaStateLaw AB 602">
{{Citation
| title = "AB-602 Depiction of individual using digital or electronic technology: sexually explicit material: cause of action."
}}
</ref>
* [[w:California State Legislature]]
== Law on synthetic filth in New York ==
[[File:Flag of New York (1909–2020).png|thumb|left|200px|[[w:New York State]] is contemplating the [https://www.nysenate.gov/legislation/bills/2021/S1641 senate bill S1641] to ban sending unsolicited pornography.]]
In the 2021-2022 [[w:New York State Senate]] regular session, on 2021-01-14, Senators [[w:James Skoufis]], [[w:Brian Benjamin]] and [[w:Todd Kaminsky]] introduced [https://www.nysenate.gov/legislation/bills/2021/S1641 New York Senate bill '''S1641'''] to add '''§ 250.70 ''UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL''''' to the penal law. On 2021-03-19 an identical [https://www.nysenate.gov/legislation/bills/2021/A6517 New York Assembly bill '''A6517''' - ''Establishes the crime of unlawful electronic transmission of sexually explicit visual material''] was introduced to the [[w:New York State Assembly]] by Assembly Member [[w:Aileen Gunther]].<ref group="1st seen in">
First seen in the [https://trackbill.com/search/#/related=%7B%22id%22:%221690501%22,%22state%22:%22CA%22,%22session%22:%222019%22,%22billId%22:%22AB602%22%7D&direction=desc&page=1&resultsPerPage=25&sort=relevancy&tracked&upcoming_hearings&type=bills&state=all&session suggestions for bills similar to CA AB602 at trackbill.com].
</ref> The bill text is as follows:
:'''§ 250.70 ''UNLAWFUL ELECTRONIC TRANSMISSION OF SEXUALLY EXPLICIT VISUAL MATERIAL'''''.
:A person is guilty of unlawful electronic transmission of sexually explicit visual material if a person knowingly transmits by electronic means visual material that depicts any person engaging in sexual conduct or with a person's intimate parts exposed or depicts the covered genitals of a male person that are in a discernibly turgid state and such visual material is not sent at the request of or with the express consent of the recipient. For purposes of this section the term "intimate parts" means the naked genitals, pubic area, anus, or female postpubescent nipple of the person and the term "sexual conduct" shall have the same meaning as defined in section 130.00 of this chapter. Unlawful electronic transmission of sexually explicit visual material is a class a misdemeanor.
:'''§ 2'''. This act shall take effect on the first of November next succeeding the date on which it shall have become a law."
== Law on synthetic filth in China ==
<section begin=China2020 />On January 1, 2020 the Chinese law requiring that synthetically faked footage bear a clear notice about its fakeness came into effect. Failure to comply could be considered a [[w:crime]], the [[w:Cyberspace Administration of China]] ([http://www.cac.gov.cn/ cac.gov.cn]) stated on its website. China announced this new law in November 2019.<ref name="Reuters2019">
{{cite web
| url = https://www.reuters.com/article/us-china-technology/china-seeks-to-root-out-fake-news-and-deepfakes-with-new-online-content-rules-idUSKBN1Y30VU
| title = China seeks to root out fake news and deepfakes with new online content rules
| last =
| first =
| date = 2019-11-29
| website = [[w:Reuters.com]]
| publisher = [[w:Reuters]]
| access-date = 2021-01-23
| quote = }}
</ref> The Chinese government seems to be reserving the right to prosecute both users and [[w:online video platform]]s failing to abide by the rules.<ref name="TheVerge2019">
{{cite web
| url = https://www.theverge.com/2019/11/29/20988363/china-deepfakes-ban-internet-rules-fake-news-disclosure-virtual-reality
| title = China makes it a criminal offense to publish deepfakes or fake news without disclosure
| last = Statt
| first = Nick
| date = 2019-11-29
| website =
| publisher = [[w:The Verge]]
| access-date = 2021-01-23
| quote = }}
</ref>
<section end=China2020 />
== Law on synthetic filth in the UK ==
[[File:Flag of the United Kingdom.svg|thumb|left|150px|The UK needs to improve its legislation. Please [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images sign the petition] initiated by Helen Mort in late 2020.]]
[[File:Helen Mort (2014).jpg|thumb|right|245px|[[w:Helen Mort]] is a British poet, novelist and activist against [[synthetic human-like fakes]]. Please sign the petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org], originated by her, to be delivered to the [[w:Law Commission (England and Wales)]] and the prime minister.]]
The UK law does not seem very up-to-date on the issue of synthetic filth.
The independent [[w:Law Commission (England and Wales)]] is currently reviewing the law as it applies to taking, making and sharing intimate images without consent. The outcome of the consultation is due to be published later in 2021.<ref name="BBC2021">
{{cite web
|url = https://www.bbc.com/news/technology-55546372
|title = 'Deepfake porn images still give me nightmares'
|last = Royle
|first = Sara
|date = 2021-01-05
|website = [[w:BBC Online]]
|publisher = [[w:BBC]]
|access-date = 2021-01-31
|quote = She alerted the police to the images but was told that no action could be taken. Dr Aislinn O'Connell, a lecturer in law at Royal Holloway University of London, explained that Helen's case fell outside the current law.}}
</ref>
"In 2019, law expert Dr Aislinn O’Connell told [[w:The Independent]] that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The [[w:Women and Equalities Committee]] called on the UK Government to introduce new legislation on image-based sexual abuse in order to '''criminalise ALL''' non-consensual creation and distribution of intimate sexual images."<ref name="MortPetition2020"> | |||
{{cite web
|url = https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images
|title = Change.org petition: 'Tighten regulation on taking, making and faking explicit images'
|last = Mort
|first = Helen
|date = 2020
|website = [[w:Change.org]]
|publisher = [[w:Change.org]]
|access-date = 2021-01-31
|quote = Unlike other forms of revenge porn, creating pictures or videos like this is not yet illegal in the UK, though it is in some places in the US. The police were unable to help me.}}
</ref> This calls for laws similar to those California put in place on January 1, 2020.
The petition [https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images '''''Tighten regulation on taking, making and faking explicit images''''' at Change.org] by [[w:Helen Mort]] aims to petition the UK government for proper legislation against synthetic filth. See the [[mediatheque]] for a video by Helen Mort on her ordeal of becoming the victim of covert disinformation attacks.
{{#lst:Mediatheque|HelenMort2020}}
= Resources and reporting on law =
== AI and law in general ==
[[File:LOC_Main_Reading_Room_Highsmith.jpg|thumb|right|340px|[[w:Library of Congress]] ([https://loc.gov/ loc.gov]) reading room]]
'''From Europe'''
[[File:Flag of Europe.svg|thumb|left|180px|The [[w:European Union]]'s [[w:European Parliament|Parliament]]'s [[w:European Parliamentary Research Service]] on [https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf The ethics of artificial intelligence: Issues and initiatives].]]
* [https://www.europarl.europa.eu/RegData/etudes/STUD/2020/634452/EPRS_STU(2020)634452_EN.pdf '''''The ethics of artificial intelligence: Issues and initiatives''''' (.pdf) at europarl.europa.eu], a March 2020 study by the [[w:European Parliamentary Research Service]]. Starting from page 37, the .pdf lists organizations in the field.
== Synthetic filth in the law and media ==
* [https://carnegieendowment.org/2020/07/08/deepfakes-and-synthetic-media-in-financial-system-assessing-threat-scenarios-pub-82237 '''''Deepfakes and Synthetic Media in the Financial System: Assessing Threat Scenarios''''' at carnegieendowment.org], a 2020-07-08 assessment that identifies some types of crime that could be committed using [[synthetic human-like fakes]].
* [https://scholarship.law.duke.edu/dltr/vol17/iss1/4/ 'Deepfakes: False Pornography Is Here and the Law Cannot Protect You'], published in 2019 in the [[w:Duke Law Journal|Duke Law Journal]], a student-run law review.
== The countries that have unfortunately banned full face veil ==
{{#lst:Quotes|FinalLineOfDefenseForTheTimeBeing}}
= Law proposals =

== Law proposals to ban covert modeling by [[User:Juho Kunsola|Juho Kunsola]] ==
* '''Audience''': Developed with suitability for national, supranational and UN treaty levels.
* '''Writing context''':
** Written from the context of inclusion in '''criminal codes'''.
** I'm a Finn, so this has been worded to fit in [https://www.finlex.fi/fi/laki/ajantasa/1889/18890039001#L24 Chapter 24 of the Criminal Code of Finland (in Finnish at finlex.fi)], titled "''Offences against privacy, public peace and personal reputation''"
** [https://www.finlex.fi/fi/laki/kaannokset/1889/en18890039 Access the English translations of the Finnish Criminal Code at finlex.fi] or [https://www.finlex.fi/fi/laki/kaannokset/1889/en18890039_20150766.pdf go straight to the latest .pdf from 2016]. Chapter 24 starts on page 107.
* '''History''': This version is an evolution of a Finnish-language original written in 2016.
Existing law in <big>Chapter 24. of the Finnish Criminal Code - "''Offences against privacy, public peace and personal reputation''"</big> seems to be ineffective against many [[synthetic human-like fakes|synthetic human-like fake attacks]], and it seems that [[synthetic human-like fakes#Digital sound-alikes|digital sound-alikes]] could be used to frame victims for crimes.

The sections affected by or affecting the synthetic filth situation are in bold:
* Section 1 - Invasion of domestic premises (879/2013)
* Section 1(a) - '''Harassing communications''' (879/2013)
* Section 2 - Aggravated invasion of domestic premises (531/2000)
* Section 3 - Invasion of public premises (585/2005)
* Section 4 - Aggravated invasion of public premises (531/2000)
* Section 5 - '''Eavesdropping''' (531/2000)
* Section 6 - '''Illicit observation''' (531/2000)
* Section 7 - '''Preparation of eavesdropping or illicit observation''' (531/2000)
* Section 8 - '''Dissemination of information violating personal privacy''' (879/2013)
* Section 8(a) - '''Aggravated dissemination of information violating personal privacy''' (879/2013)
* Section 9 - '''Defamation''' (879/2013)
* Section 10 - '''Aggravated defamation''' (879/2013)
* Section 11 - Definition (531/2000)
* Section 12 - '''Right to bring charges''' (879/2013)
* Section 13 - '''Corporate criminal liability''' (511/2011)
[[File:The-diffuse-reflection-deducted-from-the-specular-reflection-Debevec-2000.png|thumb|right|260px|Subtraction of the diffuse reflection from the specular reflection. The image is scaled for luminosity. The diffuse reflection is acquired by placing the polarizers at a 90-degree angle, and the specular reflection at a 0-degree angle.
<br /><br />
[[:File:Deb-2000-reflectance-separation.png|Original picture]] by [[w:Paul Debevec|Debevec]] et al. - Copyright ACM 2000 https://dl.acm.org/citation.cfm?doid=311779.344855]]
=== Law proposal to ban covert modeling of human voice ===

==== §1 Covert modeling of a human voice ====
'''Acquiring''' a '''model of a human voice''' that '''deceptively resembles''' some '''dead''' or '''living''' person's voice, as well as its possession, purchase, sale, yielding, import and export, '''without the express consent of the target''' is '''punishable'''.
==== §2 Application of covert voice models ====
'''Producing''' and '''making available''' media from a covert voice model is '''punishable'''.
==== §3 Aggravated application of covert voice models ====
If the produced media is made with the '''purpose''' to
::* '''frame''' a '''human''' target or targets for '''crimes''',
::* attempt '''extortion''', or
::* '''defame the target''',
the crime should be judged as '''aggravated'''.
=== Law proposal to ban covert modeling of human appearance ===
<font color="red">'''Obs.'''</font> Should banning the modeling of people's appearance without explicit permission be pursued, it must be <font color="green">formulated</font> so that this <font color="red">'''does not make'''</font> <big>'''[[Adequate Porn Watcher AI (concept)]]'''</big> <font color="red">illegal</font> / <font color="red">impossible</font>.

One would assume that collecting permission to model each piece of porn is not plausible, so the question is whether we can ban covert modeling from non-pornographic pictures while still retaining the ability to model all porn found on the Internet.
==== §1 Covert modeling of human appearance ====
'''Covertly acquiring'''
::* a '''3D model''',
::* a '''[[Glossary#Bidirectional reflectance distribution function|7D bidirectional reflectance distribution function]] model''', or a model similar in results but technically different, or
::* a '''direct-to-2D''' capable model, or a model similar in results but technically different,
'''without consent''' is '''illegal'''. Also '''possession''', '''purchase''', '''sale''', '''yielding''', '''import''' and '''export''' of '''covert models''' are '''punishable'''.
==== §2 Aggravated covert modeling of human appearance ====
If a '''covert model''' of the '''head''' or the '''face''' is '''attached''' to a '''look-alike of a naked body''', regardless of whether that body is '''synthetic''' or '''of a human''', the crime should be judged as '''aggravated'''.

==== §3 Application of covert appearance models ====
'''Projection''' and '''making available''' media from covert models defined in §1 is '''punishable'''.
==== §4 Aggravated application of covert appearance models ====
If the projection is portrayed in a '''nude''' or '''sexual situation''' or used with the '''intent to frame for a crime''' or for '''blackmail''', the crime should be judged as '''aggravated'''.
----
= 1st seen in =
<references group="1st seen in" />
= References =
<references />