In [[w:natural language processing]], advances in [[w:natural-language understanding]] lead to more capable [[w:natural-language generation]] AI.
[[w:OpenAI]]'s [[w:OpenAI#GPT|Generative Pre-trained Transformer]] ('''GPT''') is a left-to-right [[w:Transformer (machine learning model)|transformer]]-based [[w:Natural-language generation|text generation]] model, succeeded by [[w:OpenAI#GPT-2|GPT-2]] and [[w:OpenAI#GPT-3|GPT-3]].
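"Left-to-right" generation means the model emits one token at a time, each conditioned only on the tokens already produced. A minimal sketch of that decoding loop, using a toy hard-coded bigram table as a stand-in for a trained transformer (the table and function names here are illustrative, not from any real GPT implementation):

```python
import random

# Toy next-token table standing in for a trained language model.
# A real GPT computes a probability distribution over its whole
# vocabulary with a transformer; the left-to-right loop is the same.
BIGRAMS = {
    "<s>": ["the"],
    "the": ["model", "text"],
    "model": ["generates"],
    "generates": ["text"],
    "text": ["</s>"],
}

def generate(max_tokens=10, seed=0):
    rng = random.Random(seed)
    tokens = ["<s>"]
    for _ in range(max_tokens):
        # Each step conditions only on the prefix emitted so far
        # (here, just the last token, since the toy model is a bigram).
        candidates = BIGRAMS.get(tokens[-1], ["</s>"])
        nxt = rng.choice(candidates)  # a real model samples from softmax logits
        if nxt == "</s>":
            break
        tokens.append(nxt)
    return " ".join(tokens[1:])

print(generate())
```

The key property this sketch shows is that generation is strictly sequential: token ''n'' cannot depend on token ''n''+1, which is what the "left-to-right" qualifier in the text above refers to.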
== Countermeasures against synthetic human-like fakes ==