Squishy computers now enable the first fully soft robots

Harvard researchers devised a rubber computer that could lead to all sorts of wacky soft robotics.

Thermal diode could allow computers to one day function on heat alone

No more fans for us!

Emotional computers really freak people out — a new take on the uncanny valley

Be like us, but not us.

AI can write new code by borrowing lines from other programs

So it begins.

NASA creates computers that can survive on Venus, 30 years after the last landings

They’ll have to face boiling acids and extreme pressures.

Scientists develop memory chips from egg shells

Omelette du RAM.

Atomic-sandwich material could make computers 100 times more energy efficient

Don’t need a calculator to know that’s a lot.

New method developed to encode huge quantity of data in diamonds

Data is a girl’s best friend.

Here’s why there was no Twitter on Friday — it’s way scarier than you think

Hordes of zombie gadgets had something to do with it.

IBM scientists make phase-changing artificial neurons to mimic the computing power of the human brain

Science is getting closer to a computer that mimics the human brain.

By 2040 our computers will use more power than we can produce

I didn’t know they could do that.

People pick up and use discarded USB drives they find almost half the time

Portable data storage devices such as USB drives might not be quite as useful or sought after as they once were, but they remain an undeniably handy way to carry your data around.

Researchers devise AI that allows machines to learn just as fast as humans

From its first try, a computer can now draw handwritten characters from an unfamiliar language just as well as humans can.

Hard to crack and easy to remember password? Try a poem

“Please enter a strong password” is now a ubiquitous greeting whenever we try to register online. Security experts advise using long passwords, at least 12 characters in length, which should include numbers, symbols, capital letters, and lower-case letters. Most websites nowadays force you to enter a password that meets some or all of these conditions. Moreover, the password shouldn’t contain dictionary words or combinations of dictionary words. Common substitutions like “h0use” instead of “house” are also not recommended – these naive tricks will fool no automated hacking algorithm. So what we end up with is a very strong password, just like the website kindly asked (or forced) us to create. At the same time, it’s damn difficult if not impossible to remember. People end up endlessly hitting “recover password” or, far worse, writing their passwords down in emails or other notes on their computer, which can easily be recovered by any novice hacker.
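The rules described above can be sketched as a small checker. This is a minimal, hypothetical illustration of the kind of policy websites enforce, not any site's actual validator; the word list and substitution table are made-up examples.

```python
import re

# Illustrative stand-ins, not a real dictionary or a complete leetspeak map.
COMMON_WORDS = {"house", "password", "dragon"}
SUBSTITUTIONS = str.maketrans("0134$5@", "oleassa")  # e.g. "h0use" -> "house"

def is_strong(password: str) -> bool:
    """Sketch of the rules above: 12+ chars, mixed case, digits,
    symbols, and no naive substitutions of common dictionary words."""
    if len(password) < 12:
        return False
    if not re.search(r"[a-z]", password):
        return False
    if not re.search(r"[A-Z]", password):
        return False
    if not re.search(r"\d", password):
        return False
    if not re.search(r"[^A-Za-z0-9]", password):  # at least one symbol
        return False
    # Undo naive substitutions so "h0use" is caught just like "house".
    normalized = password.lower().translate(SUBSTITUTIONS)
    return not any(word in normalized for word in COMMON_WORDS)
```

A password like “MyH0usePass!!99” fails this check despite ticking every character-class box, which is exactly the point the article makes: substitutions alone don’t buy you strength.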

Celebrating Ada Lovelace: the first computer programmer (19th century)

In 1843, at the tender age of 27, Ada Lovelace became the world’s first programmer, more than a hundred years before the first computer was actually introduced.

First ever optical chip to permanently store data developed

Materials scientists at Oxford University, collaborating with experts from Karlsruhe, Munster and Exeter, have developed the world’s first light-based memory banks that can store data permanently. The device is built from simple materials already used in CDs and DVDs today, and promises to dramatically improve the speed of modern computing.

Prism-like bar code pattern might help make computers that use light instead of wires

A breakthrough in optical communications has been reported by Stanford engineers, who used a complex algorithm to design a prism-like device that splits light into different colours (frequencies) at right angles. This is the very first step towards building a circuit, and ultimately a computer, that uses light instead of wires to relay signals – a route to much more compact and efficient devices.

First computer made out of carbon nanotubes spells silicon’s demise in electronics

In an inspiring breakthrough, Stanford researchers have created the first ever working computer made entirely out of carbon nanotubes. The technology is still in its infancy, as the computer operates on just one bit of information and can only count to 32. Theoretically, however, it can be scaled up to perform billions of operations given enough memory.