Laws against synthesis and other related crimes
Laws and their application
Law on synthetic filth in Virginia
Since July 1 2019[1] w:Virginia has criminalized the sale and dissemination of unauthorized synthetic pornography, but not its manufacture,[2] as § 18.2-386.2, titled 'Unlawful dissemination or sale of images of another; penalty.', became part of the w:Code of Virginia. The law text states: "Any person who, with the w:intent to w:coerce, w:harass, or w:intimidate, w:maliciously w:disseminates or w:sells any videographic or still image created by any means whatsoever that depicts another person who is totally w:nude, or in a state of undress so as to expose the w:genitals, pubic area, w:buttocks, or female w:breast, where such person knows or has reason to know that he is not w:licensed or w:authorized to disseminate or sell such videographic or still image is guilty of a Class 1 w:misdemeanor."[2] The identical bills were House Bill 2678, presented by w:Delegate w:Marcus Simon to the w:Virginia House of Delegates on January 14 2019, and Senate bill 1736, introduced to the w:Senate of Virginia three days later by Senator w:Adam Ebbin.
Law on synthetic filth in Texas
Since September 1 2019 the w:Texas senate bill SB 751 w:amendments to the election code have been in effect, giving w:candidates in w:elections a 30-day protection period before the election, during which making and distributing digital look-alikes or synthetic fakes of the candidates is an offense. The law text defines the subject of the law as "a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality".[3]
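The 30-day protection period is, at bottom, simple date arithmetic. Below is a minimal sketch in Python, under the assumption (mine, not the statute's exact wording) that the period covers the 30 days immediately preceding election day; the dates are made up for illustration.

```python
from datetime import date, timedelta

def in_protection_period(publication_date: date, election_date: date,
                         window_days: int = 30) -> bool:
    """Return True if publication_date falls within the window_days
    immediately preceding election_date (illustrative reading of SB 751)."""
    window_start = election_date - timedelta(days=window_days)
    return window_start <= publication_date <= election_date

# Hypothetical example: a fabricated video distributed 12 days before election day
print(in_protection_period(date(2020, 10, 22), date(2020, 11, 3)))  # True
```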
Law on synthetic filth in California
On January 1 2020[4] the w:California w:US state law AB-602 came into effect, banning the manufacture and w:digital distribution of synthetic pornography without the w:consent of the people depicted. AB-602 provides victims of synthetic pornography with w:injunctive relief and poses legal threats of w:statutory and w:punitive damages on w:criminals making or distributing synthetic pornography without consent. The bill AB-602 was signed into law by California w:Governor w:Gavin Newsom on October 3 2019 and was authored by w:California State Assembly member w:Marc Berman.[5]
Law on synthetic filth in China
On January 1 2020 a Chinese law requiring that synthetically faked footage bear a clear notice about its fakeness came into effect. Failure to comply could be considered a w:crime, the w:Cyberspace Administration of China stated on its website. China announced this new law in November 2019.[6] The Chinese government seems to be reserving the right to prosecute both users and w:online video platforms failing to abide by the rules.[7]
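For illustration of what complying with such a disclosure requirement could look like in practice, here is a minimal, hypothetical sketch that writes a machine-readable notice next to a video file. The sidecar approach and the field names are my own illustrative assumptions, not anything prescribed by the Chinese rules.

```python
import json
from pathlib import Path

def write_synthetic_disclosure(video_path: str, generator: str, statement: str) -> Path:
    """Write a JSON sidecar file next to the video declaring it as synthetic.
    The schema is illustrative only, not prescribed by any regulation."""
    sidecar = Path(video_path).with_suffix(".disclosure.json")
    sidecar.write_text(json.dumps({
        "synthetic": True,
        "generator": generator,
        "disclosure_statement": statement,
    }, ensure_ascii=False, indent=2), encoding="utf-8")
    return sidecar

# Hypothetical usage with made-up values
write_synthetic_disclosure("clip.mp4", "example-model-v1",
                           "This video has been synthetically generated.")
```

A visible on-screen notice would presumably also be needed; the sidecar merely shows the idea of an explicit, checkable declaration.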
Law on synthetic filth in the UK
The UK law does not seem very up-to-date on the issue of synthetic filth.
The independent w:Law Commission (England and Wales) is currently reviewing the law as it applies to taking, making and sharing intimate images without consent. The outcome of the consultation is due to be published later in 2021.[8]
"In 2019, law expert Dr Aislinn O’Connell told w:The Independent that our current laws on image sharing are piecemeal and not fit for purpose. In October 2018 The w:Women and Equalities Committee called on the UK Government to introduce new legislation on image-based sexual abuse in order to criminalise ALL non-consensual creation and distribution of intimate sexual images."[9]
The petition 'Tighten regulation on taking, making and faking explicit images' at Change.org by w:Helen Mort urges the UK government to adopt proper legislation against synthetic filth. See the mediatheque for a video by Helen Mort on her ordeal of becoming the victim of covert disinformation attacks.
Resources and reporting on law
AI and law in general
Reviews and regulation from the w:Library of Congress:
- 'Regulation of Artificial Intelligence' at loc.gov
- 'Regulation of Artificial Intelligence: Comparative Summary' at loc.gov
- 'Regulation of Artificial Intelligence: International and Regional Approaches' at loc.gov
- 'Regulation of Artificial Intelligence: The Americas and the Caribbean' at loc.gov
- 'Regulation of Artificial Intelligence: East/South Asia and the Pacific' at loc.gov
- 'Regulation of Artificial Intelligence: Europe and Central Asia' at loc.gov
- 'Regulation of Artificial Intelligence: Middle East and North Africa' at loc.gov
- 'Regulation of Artificial Intelligence: Sub-Saharan Africa' at loc.gov
w:Gibson Dunn & Crutcher (gibsondunn.com) publishes a quarterly legal update on 'Artificial Intelligence and Autonomous Systems'. Gibson Dunn & Crutcher is a global w:law firm, founded in Los Angeles in 1890.
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 4 2018'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 1 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 2 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 3 2019'
- 'Artificial Intelligence and Autonomous Systems Legal Update Quarter 4 2019'
From Europe
- 'The ethics of artificial intelligence: Issues and initiatives' (.pdf) at europarl.europa.eu, a March 2020 study by the w:European Parliamentary Research Service. Starting from page 37, the .pdf lists organizations in the field.
Synthetic filth and the law
- 'Deepfakes: False Pornography Is Here and the Law Cannot Protect You', published in 2019 in the Duke Law Journal, a student-run law review.
The unfortunate countries that have banned the full face veil
France and Denmark are known to have taken this uncivilized step and have laws in place banning the wearing of the full face veil in public.
Quotes on the current laws and their application
“If no-one who wants to hurt you knows what you look like, how could someone malevolent make a covert digital look-alike of you?”
Law proposals
- Audience: developed to be suitable for the national, supranational and UN treaty levels.
- Writing context:
- Written from context of inclusion to criminal codes.
- I'm a Finn, so this has been worded to fit into Chapter 24 of the Criminal Code of Finland (in Finnish at finlex.fi), titled "Offences against privacy, public peace and personal reputation"
- Access the English translations of the Finnish Criminal Code at finlex.fi or go straight to the latest .pdf from 2016. Chapter 24 starts on page 107.
- History: This version is an evolution of a Finnish language original written in 2016.
The existing law in Chapter 24 of the Finnish Criminal Code - "Offences against privacy, public peace and personal reputation" - seems to be ineffective against many synthetic human-like fake attacks, and it seems it could be used to frame victims for crimes with digital sound-alikes.
The portions affected by or affecting the synthetic filth situation are shown in bold font:
- Section 1 - Invasion of domestic premises (879/2013)
- Section 1(a) - Harassing communications (879/2013)
- Section 2 - Aggravated invasion of domestic premises (531/2000)
- Section 3 - Invasion of public premises (585/2005)
- Section 4 - Aggravated invasion of public premises (531/2000)
- Section 5 - Eavesdropping (531/2000)
- Section 6 - Illicit observation (531/2000)
- Section 7 - Preparation of eavesdropping or illicit observation (531/2000)
- Section 8 - Dissemination of information violating personal privacy (879/2013)
- Section 8(a) - Aggravated dissemination of information violating personal privacy (879/2013)
- Section 9 - Defamation (879/2013)
- Section 10 - Aggravated defamation (879/2013)
- Section 11 - Definition (531/2000)
- Section 12 - Right to bring charges (879/2013)
- Section 13 - Corporate criminal liability (511/2011)
Law proposals to ban covert modeling by Juho Kunsola
Law proposal to ban covert modeling of human voice
§1 Covert modeling of a human voice
Acquiring a model of a human voice that deceptively resembles some dead or living person's voice, as well as possession, purchase, sale, yielding, import and export of such a model, without the express consent of the target, is punishable.
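Purely to illustrate what kind of technical object "a model of a human voice" refers to, the sketch below computes a very crude voice fingerprint, a time-averaged w:mel-frequency cepstral coefficients (MFCC) vector, using the librosa audio library. Real voice-cloning models are vastly more capable than this; the file path is hypothetical and the code is a sketch, not anyone's actual method.

```python
import numpy as np
import librosa  # audio analysis library

def crude_voice_fingerprint(audio_path: str) -> np.ndarray:
    """Return a crude 'voice fingerprint': the time-averaged MFCC vector of a
    recording. Illustrative only; real voice models are far richer than this."""
    y, sr = librosa.load(audio_path, sr=16000)          # load recording as mono waveform
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)  # 20 coefficients per frame
    return mfcc.mean(axis=1)                            # average over time

# Hypothetical usage:
# fingerprint = crude_voice_fingerprint("recording_of_target.wav")
```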
§2 Application of covert voice models
Producing and making available media from a covert voice model is punishable.
§3 Aggravated application of covert voice models
If the produced media is made for the purpose of
- framing a human target or targets for crimes,
- attempting extortion, or
- defaming the target,
the crime should be judged as aggravated.
Law proposal to ban covert modeling of human appearance
N.B. Should a ban on modeling people's appearance without explicit permission be pursued, it must be formulated so that it does not make Adequate Porn Watcher AI (concept) illegal / impossible.
One would assume that collecting permissions to model each piece of pornography is not plausible, so the question is whether we can ban covert modeling from non-pornographic pictures while still retaining the ability to model all porn found on the Internet.
§1 Covert modeling of human appearance
Covertly acquiring
- A 3D model
- A 7D bidirectional reflectance distribution function model, or a model that is similar in results but technically different (see the sketch below)
- A direct-to-2D capable model, or a model that is similar in results but technically different
of a human's appearance without consent is illegal. Also possession, purchase, sale, yielding, import and export of covert models are punishable.
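To clarify the 7D terminology in §1: one common way to arrive at seven dimensions is surface position (2D), incoming light direction (2D), outgoing light direction (2D) and wavelength (1D); other parameterizations exist. The sketch below only shows what the inputs of such a function look like; the constant return value is a placeholder, since a real model of a person's face would be fitted to measurements.

```python
def brdf_7d(u: float, v: float,
            theta_in: float, phi_in: float,
            theta_out: float, phi_out: float,
            wavelength_nm: float) -> float:
    """Illustrative 7D reflectance function: surface position (u, v), incoming
    direction, outgoing direction and wavelength. A real model would be fitted
    to measurements of the subject; this placeholder returns a constant."""
    return 0.5

# Example query with made-up values: point (0.2, 0.7), light from (0.1, 1.0) rad,
# viewer at (0.4, 2.0) rad, green light at 550 nm
print(brdf_7d(0.2, 0.7, 0.1, 1.0, 0.4, 2.0, 550.0))
```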
§2 Aggravated covert modeling of human appearance
If a covert model of the head or the face is attached to a look-alike of a naked body, regardless of whether that body is synthetic or of a human, the crime should be judged as aggravated.
§3 Application of covert appearance models
Projecting and making available media from the covert models defined in §1 is punishable.
§4 Aggravated application of covert appearance models
If the projection is portrayed in a nude or sexual situation or used with the intent to frame for a crime or for blackmail, the crime should be judged as aggravated.
References
- ↑ "New state laws go into effect July 1".
- ↑ 2.0 2.1 "§ 18.2-386.2. Unlawful dissemination or sale of images of another; penalty". w:Virginia. Retrieved 2021-01-23.
- ↑ "Relating to the creation of a criminal offense for fabricating a deceptive video with intent to influence the outcome of an election". w:Texas. 2019-06-14. Retrieved 2021-01-23. Quote: "In this section, 'deep fake video' means a video, created with the intent to deceive, that appears to depict a real person performing an action that did not occur in reality"
- ↑ Johnson, R.J. (2019-12-30). "Here Are the New California Laws Going Into Effect in 2020". KFI. iHeartMedia. Retrieved 2021-01-23.
- ↑ Mihalcik, Carrie (2019-10-04). "California laws seek to crack down on deepfakes in politics and porn". w:cnet.com. w:CNET. Retrieved 2021-01-23.
- ↑ "China seeks to root out fake news and deepfakes with new online content rules". w:Reuters.com. w:Reuters. 2019-11-29. Retrieved 2021-01-23.
- ↑ Statt, Nick (2019-11-29). "China makes it a criminal offense to publish deepfakes or fake news without disclosure". w:The Verge. Retrieved 2021-01-23.
- ↑ https://www.bbc.com/news/technology-55546372
- ↑ https://www.change.org/p/the-law-comission-tighten-regulation-on-taking-making-and-faking-explicit-images