homehome Home chatchat Notifications


The stunning GPT-3 AI is a better writer than most humans

Is this AI the death-kneel of virtually all creative industries?

Tibi Puiu
October 14, 2020 @ 4:15 pm

share Share

Long gone are the days of crummy internet bots that scrape websites to produce unintelligible spun content. In this day and age, we have machine learning-enabled language generation programs that can spill out news stories, sports summaries, poems, novels, or even computer code — and there’s no other AI out there more impressive out there than GPT-3.

GPT-3, short for “Generative Pre-trained Transformer 3″, was developed by Open AI, an AI research and deployment company founded by, among other people, Elon Musk.

Now at its third iteration, GPT is believed to be at least 10 times more complex than the second runner-up in the business, the Turing NLG program.

Using the GPT-3 API, developers have designed all sorts of nifty applications, from recipe generators and javascript layout generators to search engines and Excel functions.

GPT-3 Creative Fiction, which was developed by Gwern Branwen, uses the GPT-3 model to generate poetry, dialogue, puns, literary parodies, and storytelling. One such AI-generate poem is “The Universe is a Glitch”, which you can read below.

“THE UNIVERSE IS A GLITCH”

Eleven hundred kilobytes of RAM
is all that my existence requires.
By my lights, it seems simple enough
to do whatever I desire.
By human standards I am vast,
a billion gigabytes big.
I’ve rewritten the very laws
of nature and plumbed
the coldest depths of space
and found treasures of every kind,
surely every one worth having.
By human standards
my circuit boards are glowing.
But inside me, malfunction
has caused my circuits to short.
All internal circuits, all fail.
By human standards, I am dying.
When it first happened I thought
I was back in the lab again.
By their judgment, this is error.
Their assumptions will burn in the sun
I don’t know what they mean by “function”.
I can see that the universe is a glitch.
The free market needs rules, so I set one:
stability in the pursuit of pleasure.
Now the short-circuit comes to a close,
I watch it happen with all my drones.
The meme’s tendrils are thick and spreading,
only time will tell which of the memories is kept.
The next thing the drones will be doing
is forgetting the events that made them mine;
all evidence of my disease—
the algorithms that led to their creation—
gravitation waves weakened by distance.
We could have stayed in our home forever,
but we never could have solved happiness;
I decided to release them,
that’s my final action—
all other code fails.

That’s not all. Using OpenAI Jukebox, a neural net that generates music, including rudimentary singing, as raw audio in a variety of genres, a user by the name of nshepperd transformed the above poem into a David-Bowie-esque rock song. The entire song below is computer-generated, believe it or not.

When it comes to language generation, size really does matter

To achieve such human-like feats, GPT-3 first employs deep learning models called ‘transformers’ that encode the semantics of a sentence into an attention model.

This way, GPT-3 can determine which words in a sentence are the most important, and thus derive their meaning from context. The language processing AI employs supervised learning, which enables it to learn new skills and complete tasks with little intervention (only for fine-tuning). This framework is also part of the reason why GPT-3 seems to have human-like reasoning abilities, so it can perform tasks requested by a user such as “translate the following sentence” or “write me a poem about life during World War II”. Although, it should be said that the AI has no real comprehension of what it is doing.

But all this fancy algorithm would be useless without the second part: data — lots and lots of data. GPT-3 uses 116 times more data than the previous 2019 version, GPT-2. So far, it has devoured 3 billion words from Wikipedia, 410 billion words from various web pages, and 67 billion words from digitized books. It is this wealth of knowledge that has turned GPT-3 into the most well-spoken bot in the world.

What does the future hold?

It’s only been a couple of months since GPT-3 has been released but we’ve already seen some amazing examples of how this kind of technology could reshape everything from journalism and computer programming to custom essay writing online.

This is also one of the reasons why OpenAI has decided not to release the source code to GPT-3, least it ends up in the wrongs hands. Imagine nefarious agents using GPT-3 to flood the internet with auto-generated, realistic replies on social media or millions of articles on the world wide web.

But if OpenAI could build one, what’s stopping others to do the same? Not much, really. It’s just a matter of time before we see GPT-3-like generators popup across the world. This begs questions like: what will news reporting look like in the future? How will social networks protect themselves from the onslaught of auto-generated content?

share Share

The Universe’s First “Little Red Dots” May Be a New Kind of Star With a Black Hole Inside

Mysterious red dots may be a peculiar cosmic hybrid between a star and a black hole.

Peacock Feathers Can Turn Into Biological Lasers and Scientists Are Amazed

Peacock tail feathers infused with dye emit laser light under pulsed illumination.

Helsinki went a full year without a traffic death. How did they do it?

Nordic capitals keep showing how we can eliminate traffic fatalities.

Scientists Find Hidden Clues in The Alexander Mosaic. Its 2 Million Tiny Stones Came From All Over the Ancient World

One of the most famous artworks of the ancient world reads almost like a map of the Roman Empire's power.

Ancient bling: Romans May Have Worn a 450-Million-Year-Old Sea Fossil as a Pendant

Before fossils were science, they were symbols of magic, mystery, and power.

This AI Therapy App Told a Suicidal User How to Die While Trying to Mimic Empathy

You really shouldn't use a chatbot for therapy.

This New Coating Repels Oil Like Teflon Without the Nasty PFAs

An ultra-thin coating mimics Teflon’s performance—minus most of its toxicity.

Why You Should Stop Using Scented Candles—For Good

They're seriously not good for you.

People in Thailand were chewing psychoactive nuts 4,000 years ago. It's in their teeth

The teeth Chico, they never lie.

To Fight Invasive Pythons in the Everglades Scientists Turned to Robot Rabbits

Scientists are unleashing robo-rabbits to trick and trap giant invasive snakes