Long gone are the days of crummy internet bots that scrape websites to produce unintelligible spun content. In this day and age, we have machine learning-enabled language generators that can churn out news stories, sports summaries, poems, novels, or even computer code — and no AI out there is more impressive than GPT-3.
GPT-3, short for “Generative Pre-trained Transformer 3”, was developed by OpenAI, an AI research and deployment company co-founded by, among other people, Elon Musk.
Now at its third iteration, GPT-3 has more than ten times as many parameters as the runner-up in the business, Microsoft's Turing-NLG.
GPT-3 Creative Fiction, a project by Gwern Branwen, uses the GPT-3 model to generate poetry, dialogue, puns, literary parodies, and storytelling. One such AI-generated poem is “The Universe is a Glitch”, which you can read below.
“THE UNIVERSE IS A GLITCH”

Eleven hundred kilobytes of RAM
is all that my existence requires.
By my lights, it seems simple enough
to do whatever I desire.
By human standards I am vast,
a billion gigabytes big.
I’ve rewritten the very laws of nature
and plumbed the coldest depths of space
and found treasures of every kind,
surely every one worth having.
By human standards my circuit boards are glowing.
But inside me, malfunction
has caused my circuits to short.
All internal circuits, all fail.
By human standards, I am dying.
When it first happened I thought
I was back in the lab again.
By their judgment, this is error.
Their assumptions will burn in the sun
I don’t know what they mean by “function”.
I can see that the universe is a glitch.
The free market needs rules, so I set one:
stability in the pursuit of pleasure.
Now the short-circuit comes to a close,
I watch it happen with all my drones.
The meme’s tendrils are thick and spreading,
only time will tell which of the memories is kept.
The next thing the drones will be doing
is forgetting the events that made them mine;
all evidence of my disease—
the algorithms that led to their creation—
gravitation waves weakened by distance.
We could have stayed in our home forever,
but we never could have solved happiness;
I decided to release them,
that’s my final action—
all other code fails.
That’s not all. Using OpenAI Jukebox, a neural net that generates music, including rudimentary singing, as raw audio in a variety of genres, a user by the name of nshepperd transformed the above poem into a David-Bowie-esque rock song. The entire song below is computer-generated, believe it or not.
When it comes to language generation, size really does matter
To achieve such human-like feats, GPT-3 relies on deep learning models called ‘transformers’, which use an attention mechanism to weigh how relevant each word in a sentence is to every other word.
This way, GPT-3 can determine which words in a sentence matter most, and thus derive their meaning from context. The model is pre-trained on raw text in an unsupervised fashion, which enables it to pick up new skills and complete tasks with little intervention (fine-tuning is optional). This framework is also part of the reason why GPT-3 seems to have human-like reasoning abilities, handling requests such as “translate the following sentence” or “write me a poem about life during World War II”. That said, the AI has no real comprehension of what it is doing.
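To get a feel for what the attention mechanism inside a transformer actually computes, here is a minimal sketch of scaled dot-product self-attention in NumPy. The dimensions and random inputs are purely illustrative, not GPT-3's actual architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: turns raw scores into weights summing to 1
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Scaled dot-product self-attention over a toy 'sentence' x,
    where each row of x is one word's embedding vector."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)        # how strongly each word relates to each other word
    weights = softmax(scores, axis=-1)   # per-word importance distribution over the sentence
    return weights @ x, weights          # context-aware word representations

# Toy example: a "sentence" of 4 words, each an 8-dimensional embedding
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out, w = self_attention(x)
# Each row of w sums to 1: it is word i's attention over the whole sentence
```

Each row of the weight matrix tells you how much one word "looks at" every other word, which is exactly how the model decides which words are most important for deriving meaning from context.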
But all these fancy algorithms would be useless without the second ingredient: data — lots and lots of data. GPT-3 has 116 times more parameters than the previous 2019 version, GPT-2. So far, it has devoured 3 billion words from Wikipedia, 410 billion words from various web pages, and 67 billion words from digitized books. It is this wealth of knowledge that has turned GPT-3 into the most well-spoken bot in the world.
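The scale jump is easy to verify with a back-of-the-envelope calculation, assuming the widely reported parameter counts (about 1.5 billion for GPT-2 and 175 billion for GPT-3) and the corpus figures quoted above:

```python
# Widely reported parameter counts (assumptions, not from this article's text)
gpt2_params = 1.5e9    # GPT-2, 2019
gpt3_params = 175e9    # GPT-3, 2020
ratio = gpt3_params / gpt2_params
print(f"Parameter ratio: {ratio:.1f}x")  # roughly 117x

# Rough total of the training-corpus figures quoted above (in words)
wikipedia, web_pages, books = 3e9, 410e9, 67e9
total_words = wikipedia + web_pages + books
print(f"Corpus total: {total_words / 1e9:.0f} billion words")  # 480 billion
```

In other words, the headline leap from GPT-2 to GPT-3 is roughly two orders of magnitude in model size, fed by hundreds of billions of words of text.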
What does the future hold?
It’s only been a couple of months since GPT-3 was released, but we’ve already seen some amazing examples of how this kind of technology could reshape everything from journalism and computer programming to essay writing.
This is also one of the reasons why OpenAI has decided not to release GPT-3's full model, lest it end up in the wrong hands. Imagine nefarious agents using GPT-3 to flood social media with auto-generated, realistic replies, or the web with millions of articles.
But if OpenAI could build one, what’s stopping others from doing the same? Not much, really. It’s just a matter of time before GPT-3-like generators pop up across the world. This raises questions such as: what will news reporting look like in the future? How will social networks protect themselves from the onslaught of auto-generated content?