

OpenAI will use Reddit and a new supercomputer to teach artificial intelligence how to speak

There's no "forum" in "AI", but Musk thinks there should be.

Alexandru Micu
August 19, 2016 @ 4:31 pm


OpenAI, Elon Musk’s artificial intelligence research company, just became the proud owner of the first ever DGX-1 supercomputer. Made by NVIDIA, the rig boasts a whopping 170 teraflops of computing power, equivalent to roughly 250 conventional servers — and OpenAI is gonna use it all to read Reddit comments.

OpenAI’s researchers gather around the first AI supercomputer in a box, NVIDIA DGX-1.
Image credits NVIDIA.

OpenAI is a non-profit AI research company whose purpose is to “advance digital intelligence in the way that is most likely to benefit humanity as a whole, unconstrained by a need to generate financial return.” And now, NVIDIA CEO Jen-Hsun Huang just delivered the most powerful tool the company has ever had at its disposal: a supercomputer representing some US$2 billion in research and development.

This “AI supercomputer in a box” isn’t much bigger than a large-ish desktop PC, but it packs a huge punch. Its 170 teraflops of computing power make it roughly equivalent to 250 conventional servers working together, and all that oomph is being put to good use for a worthy cause.

“The world’s leading non-profit artificial intelligence research team needs the world’s fastest AI system,” NVIDIA said in a statement.

“I thought it was incredibly appropriate that the world’s first supercomputer dedicated to artificial intelligence would go to the laboratory that was dedicated to open artificial intelligence,” Huang added.

But an AI needs to do more than just process things very fast. It needs to learn, and Musk, unlike my parents, believes that the best place to do so is on the Internet — specifically, on Reddit. The forum’s huge size makes it an ideal training ground for DGX-1, which will spend the next few months processing nearly two billion comments to learn how to better chat with human beings.

“Deep learning is a very special class of models because as you scale up, they always work better,” says OpenAI researcher Andrej Karpathy.

The $129,000 supercomputer relies on eight NVIDIA Tesla P100 GPUs (graphics processing units), 7 terabytes of SSD storage, and two Xeon processors (apart from the aforementioned 170 teraflops of performance) to go through this data and make sense of it all.

“You can take a large amount of data that would help people talk to each other on the internet, and you can train, basically, a chatbot, but you can do it in a way that the computer learns how language works and how people interact,” Karpathy added.
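The idea Karpathy describes — a model picking up how language works from conversational data — can be caricatured with a toy bigram model. This is purely an illustration under our own assumptions, not OpenAI's actual code; real deep-learning chatbots use neural networks, not word-frequency tables:

```python
import random
from collections import defaultdict

# Hypothetical stand-in for a pile of Reddit comments.
comments = [
    "deep learning models work better as you scale up",
    "you can train a chatbot on comment data",
    "the computer learns how language works",
]

# "Train": record which word has been seen following which.
follows = defaultdict(list)
for comment in comments:
    words = comment.split()
    for a, b in zip(words, words[1:]):
        follows[a].append(b)

def next_word(word, rng=random):
    """Pick a word that followed `word` somewhere in the training data."""
    options = follows.get(word)
    return rng.choice(options) if options else None

print(next_word("language"))  # prints "works", the only continuation seen
```

Scaling this naive approach up is exactly where it breaks down — which is why the deep-learning models Karpathy mentions, whose quality keeps improving with more data and compute, are used instead.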

Even better, the new supercomputer is designed to function with OpenAI’s existing software. All that’s needed is to scale it up.

“We won’t need to write any new code, we’ll take our existing code and we’ll just increase the size of the model,” says OpenAI scientist Ilya Sutskever. “And we’ll get much better results than we have right now.”

NVIDIA CEO Jen-Hsun Huang slides open the DGX-1’s GPU tray at OpenAI’s headquarters in San Francisco.
Image credits NVIDIA.
