ZME Science
AI passes math test like an average high-school student

Researchers from the University of Washington and the Allen Institute for Artificial Intelligence (AI2) have developed software that scored 49% on high-school geometry SAT tests — an average score for a human, but a great one for current AIs.

by Mihai Andrei
September 22, 2015
in News, Technology


Image via Pixabay.

Considering how computers work, you'd think they should ace any test — especially a math test — but the key difference is the way the test was presented. The questions weren't given in binary or in a form the AI would naturally understand and perform well on. They were presented as actual text, just as a regular student would receive them. This means the software had to understand not only the written questions, but also the accompanying diagrams and charts.

“Unlike the Turing Test, standardized tests such as the SAT provide us today with a way to measure a machine’s ability to reason and to compare its abilities with that of a human,” said Oren Etzioni, CEO of AI2. “Much of what we understand from text and graphics is not explicitly stated, and requires far more knowledge than we appreciate. Creating a system to successfully take these tests is challenging, and we are proud to achieve these unprecedented results.”

The AI is called GeoS, and its breakthrough is indeed significant. While programmers have no trouble expressing problems in a form software can readily crunch, they struggle considerably when they have to make a computer understand things the way a human does.


GeoS works by reading and interpreting the question's text and diagrams, matching them against possible logical interpretations, and running those through its geometry solver. It then compares its solution with the multiple-choice options given in the paper.
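The pipeline described above — parse the question into logical facts, solve, then pick the closest multiple-choice option — can be sketched in miniature. This is a toy illustration only: the function names and the keyword-based parsing are invented for this example, and the real GeoS uses far more sophisticated text and diagram interpretation.

```python
# Toy sketch of a GeoS-style pipeline (all names are hypothetical):
# 1) parse the question text into numeric facts,
# 2) run a tiny geometry solver,
# 3) pick the multiple-choice option closest to the computed answer.
import math
import re


def parse_question(text):
    """Extract numeric facts from the question text (simple keyword matching)."""
    facts = {}
    m = re.search(r"legs? of (\d+) and (\d+)", text)
    if m:
        facts["legs"] = (int(m.group(1)), int(m.group(2)))
    return facts


def geometry_solver(facts):
    """Solve for the hypotenuse from two legs (Pythagorean theorem)."""
    a, b = facts["legs"]
    return math.hypot(a, b)


def answer(text, choices):
    """Compare the solver's result against the multiple-choice options."""
    result = geometry_solver(parse_question(text))
    return min(choices, key=lambda c: abs(c - result))


question = "A right triangle has legs of 3 and 4. What is its hypotenuse?"
print(answer(question, [5, 6, 7, 12]))  # picks 5
```

The hard part in practice is step 1 — as the researchers note below, converting free-form text and diagrams into something a solver can consume is the core challenge.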


“We’re excited about GeoS’s performance on real-world tasks,” said Ali Farhadi, a senior research manager at AI2. “Our biggest challenge was converting the question to a computer-understandable language. One needs to go beyond standard pattern-matching approaches for problems like solving geometry questions that require in-depth understanding of text, diagrams, and reasoning.”

GeoS is just one of many projects currently taking on human exams. The Allen Institute’s Project Aristo is trying to master fourth-grade science, while Fujitsu and IBM are working on passing the University of Tokyo entrance exam.


Tags: artificial intelligence, SAT, University of Washington
Mihai Andrei

Andrei's background is in geophysics, and he's been fascinated by it ever since he was a child. Feeling that there is a gap between scientists and the general audience, he started ZME Science -- and the results are what you see today.

© 2007-2019 ZME Science - Not exactly rocket science. All Rights Reserved.
