An Acronym That Says It All

The Right Click – January 2024 Edition

by Richard Lee

Generative, pre-trained, and transformer are the three words behind GPT, as in ChatGPT. Let’s break down each one, including what the word “chat” means.

Starting with the word “chat,” ChatGPT is designed to support interactive conversations, much as earlier chat systems and chatbots did: you type what you want to say or ask into a field, and the responses, along with your previous inputs and outputs, appear in a vertical flow. Because it is an interactive conversation, you do not need to keep repeating the main gist of your question the way you do with a Google search. If you’re interested in geoglyphs and you initially ask, “What is a geoglyph?”, your follow-up questions on the same topic do not need to include the word “geoglyph”; ChatGPT remembers what the topic is, and, like a conversation with a person, you can keep conversing around a topic without being redundant.
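Under the hood, that “memory” typically works by sending the whole conversation back to the model with each new message. Here is a minimal sketch of the idea using the openai Python package; the model name and variable names are illustrative assumptions, not anything official from OpenAI, and it assumes an API key is already set up.

    from openai import OpenAI

    client = OpenAI()  # assumes an API key in the OPENAI_API_KEY environment variable

    # The running conversation: each turn, the earlier messages are sent back along
    # with the new one, which is what lets the model "remember" the topic.
    messages = [{"role": "user", "content": "What is a geoglyph?"}]

    reply = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
    messages.append({"role": "assistant", "content": reply.choices[0].message.content})

    # The follow-up question does not need to repeat the word "geoglyph."
    messages.append({"role": "user", "content": "Where are the most famous ones located?"})
    reply = client.chat.completions.create(model="gpt-3.5-turbo", messages=messages)
    print(reply.choices[0].message.content)

The key point is that nothing is stored on the model’s side between turns; the context travels along with each request.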

As for the term generative, it means that ChatGPT is designed to generate text as its answer. The response to a question like “What is a geoglyph?” doesn’t yield pages from the web the way a Google search does; ChatGPT instead generates the answer for you. If you were to copy any of that generated text and search the web for it, you would almost certainly not find an exact match, because the text was newly generated rather than copied from an existing page. This is also where an evolving legal battle is playing out over whether anyone can copyright content produced by an AI tool that lacks human authorship yet was directed by a human to generate it.
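“Generating” here means producing the answer one small piece of text at a time rather than looking it up. The toy sketch below only shows the shape of that loop; the tiny vocabulary and the random word chooser are stand-ins I made up, nothing like the real model, which works with word pieces called tokens and far more sophisticated machinery.

    import random

    # A toy stand-in for the real model: it picks a next word at random from a tiny
    # vocabulary, purely to show that an answer is built up word by word until done.
    def predict_next_word(prompt, words_so_far):
        vocabulary = ["a", "geoglyph", "is", "large", "design", "on", "the", "ground", "<end>"]
        return random.choice(vocabulary)

    def generate_answer(prompt, max_words=20):
        answer = []
        while len(answer) < max_words:
            next_word = predict_next_word(prompt, answer)
            if next_word == "<end>":   # the model decides when the answer is complete
                break
            answer.append(next_word)
        return " ".join(answer)

    print(generate_answer("What is a geoglyph?"))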

Pre-trained means that ChatGPT was trained on real-world information as the foundation of its intelligence: not only the facts behind “What is a geoglyph?” but also a broad grasp of language, context, and how to understand and produce human language. The size of the training data, in gigabytes or terabytes, has never been released by OpenAI, but from what I have read it did not exceed hundreds of terabytes (a terabyte is one thousand gigabytes), let alone hundreds of yottabytes. There is controversy surrounding this training data. Recently, the New York Times sued both OpenAI and Microsoft, the largest investor behind OpenAI, for unlawful use of its work. Prior to that lawsuit, various authors, including the comedian Sarah Silverman, sued OpenAI for the same reason. Even though the content ChatGPT creates is generated, it was learned from web-based information that did not appear on the web by itself; it was written by human authors.
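For a sense of scale, each storage unit mentioned above is a factor of one thousand larger than the one before it. This quick sketch just prints that ladder (using decimal units):

    # Each step up the ladder is a factor of 1,000.
    units = ["byte", "kilobyte", "megabyte", "gigabyte", "terabyte",
             "petabyte", "exabyte", "zettabyte", "yottabyte"]
    for power, name in enumerate(units):
        print(f"1 {name} = 10^{power * 3} bytes")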

The transformer is the part that weighs the importance of each word in your query, or prompt, in order to understand the context and the relationships between the words in a sentence. Whether your prompt is in broken English or in another language entirely, the transformer works out the intent. This is a major difference from Google Search, which is built around keywords, and it is why the results from Google and from ChatGPT differ so much. With Google, you have to scan the results and decide which link is the best use of your time before you click. With ChatGPT, a direct answer is right in front of you, provided you asked with a clear prompt. The transformer architecture provides powerful language processing and is the cornerstone of the advances in AI-driven natural language understanding and generation.
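That “weighing the importance of each word” is done by a mechanism called attention. The sketch below shows the core arithmetic with made-up numbers; it is a bare-bones illustration of the idea, not how OpenAI’s models are actually built.

    import numpy as np

    def attention(queries, keys, values):
        # Score how strongly each word relates to every other word in the sentence.
        scores = queries @ keys.T / np.sqrt(keys.shape[-1])
        # Turn the scores into weights between 0 and 1 that sum to 1 for each word.
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = weights / weights.sum(axis=-1, keepdims=True)
        # Each word's new representation is a weighted blend of all the words' values.
        return weights @ values, weights

    # Four made-up word vectors standing in for the sentence "What is a geoglyph".
    rng = np.random.default_rng(0)
    words = rng.normal(size=(4, 8))
    blended, weights = attention(words, words, words)
    print(np.round(weights, 2))  # each row shows how much one word "attends" to the others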