

Google AI predicts over 2 million new crystals. Is this the future of material science?

DeepMind's AI GNoME has predicted 2.2 million new crystal structures, vastly expanding potential materials for advanced technology development.

Tibi Puiu
November 30, 2023 @ 4:42 pm


Illustration of Google AI predicting new materials. Credit: AI-generated, DALL·E 3.

Researchers at Google DeepMind have made a groundbreaking leap in materials science, unveiling 2.2 million new crystal structures with potential applications across numerous industries. To put that in perspective: before Google's deep learning program made these predictions, scientists knew of fewer than 50,000 different stable crystals.

This monumental discovery not only showcases the power of AI in materials exploration but also represents a leap past a body of knowledge accumulated over centuries of experimentation.

AI and material innovation

The huge trove of new crystals was identified by GNoME, a deep learning AI that DeepMind designed specifically for this purpose. Trained on data from the Materials Project, GNoME suggested structures likely to be stable; these candidates were then verified with established computational techniques.
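In materials science, "stable" typically means a structure sits at or near the convex hull of formation energies, so it has no energetically favorable way to decompose into other compounds. As a rough illustration of that screening step, here is a minimal sketch that filters candidates by their energy above the hull. The formulas, energies, and cutoff below are invented for illustration; this is not the actual GNoME dataset or pipeline.

```python
# Toy illustration of stability screening: keep only candidate crystals
# whose energy above the convex hull falls within a small tolerance.
# All data here are invented; the real GNoME dataset is published separately.

candidates = [
    {"formula": "Li3PS4",  "e_above_hull": 0.000},  # on the hull: stable
    {"formula": "Na2MnO3", "e_above_hull": 0.012},  # slightly metastable
    {"formula": "K4SiTe4", "e_above_hull": 0.210},  # likely unstable
]

# A commonly used metastability tolerance is a few tens of meV/atom;
# 0.025 eV/atom is an assumed cutoff for this sketch.
STABILITY_CUTOFF = 0.025

stable = [c["formula"] for c in candidates
          if c["e_above_hull"] <= STABILITY_CUTOFF]
print(stable)  # ['Li3PS4', 'Na2MnO3']
```

In the real pipeline, the energies would come from quantum-mechanical calculations rather than a hand-written list, but the filtering logic is the same idea.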

Of the more than two million predicted crystalline structures, 381,000 of the most promising candidates are being openly shared with scientists worldwide for further exploration. This means the number of known stable materials could jump roughly tenfold almost overnight.

“While materials play a very critical role in almost any technology, we as humanity know only a few tens of thousands of stable materials,” said Dogus Cubuk, materials discovery lead at Google DeepMind, during a recent press briefing. 

GNoME uses two methods to discover millions of potentially new materials: modifying known crystal structures by substituting similar elements, and generating new structures from scratch in a more or less random fashion. Credit: Google DeepMind.

Until now, discovering new materials has mostly been a slow, costly process of trial and error. The time-honored approach involved making incremental changes to known materials or combining elements based on principles of solid-state chemistry. This labor-intensive method has produced tens of thousands of stable materials over many years.

DeepMind's latest development could dramatically widen that search. Although these materials will still require synthesis and testing, a lengthy process in its own right, the AI's predictions are expected to hasten the discovery of materials vital for next-generation technologies like energy storage, solar cells, and high-density batteries.

For instance, among the predicted materials are potential lithium-ion conductors and new layered compounds similar to graphene that hold promise for superconductors. Superconductors can conduct electrical current with zero resistance, greatly boosting efficiency.

In parallel, there are initiatives designed to speed up materials synthesis. The experimental A-Lab at the Lawrence Berkeley National Laboratory can synthesize materials autonomously around the clock. In one 17-day run, the lab synthesized 41 materials, a task that typically takes months or years. Coupled with AI-predicted crystal structures, such robotic labs could compress the path from prediction to physical material from years to days.

“This is the future—to design materials autonomously using computers, but also then to make them autonomously using these robotic labs and learn from the process,” Kristin Persson of the Lawrence Berkeley National Laboratory said in a media briefing.

A new era for material science?

The research’s potential applications are vast, ranging from developing new layered materials to advancing neuromorphic computing. Scientists from the University of California, Berkeley, and the Lawrence Berkeley National Laboratory have already utilized these findings, creating new materials with a success rate of over 70%, according to DeepMind.

What’s particularly exciting is that this is just the latest in a string of AI breakthroughs from DeepMind. Previously, Google’s artificial intelligence arm unveiled the extremely powerful AlphaFold, which cracked the code for 200 million protein structures, or virtually all proteins known to science.

The findings were reported in the journal Nature.
