For the first time, my colleagues and I have built a single electronic device capable of mimicking the functions of a neuron in the brain. We then connected 20 of these devices together to perform a complex calculation. This work shows that it is scientifically feasible to build an advanced computer that does not rely on transistors to calculate and that uses far less electrical power than today’s data centers. Our research, which I began in 2004, was motivated by two questions. Can we build a single electronic element – the equivalent of a transistor or switch – that… This story continues at The Next Web
Analog computing with neuron-like devices could efficiently solve problems traditional computers struggle with
(Texas A&M University) In the September issue of the journal Nature, scientists from Texas A&M University, Hewlett Packard Labs and Stanford University have described a new nanodevice that acts almost identically to a brain cell. Furthermore, they have shown that these synthetic brain cells can be joined together to form intricate networks that can then solve problems in a brain-like manner.
Neuroscientists say your brain operates in a regime termed the “edge of chaos,” and it’s actually a good thing. It’s a state that allows for fast, efficient analog computation of the kind that can solve problems that grow vastly more difficult as they become bigger in size. “No one had been able to show chaotic dynamics in a single scalable electronic device,” says Suhas Kumar, a researcher at Hewlett Packard Labs in Palo Alto, Calif. Until now, that is. He, John Paul Strachan, and R. Stanley Williams recently showed that a particular configuration of a certain type of memristor contains that seed of controlled chaos. What’s more, when they simulated wiring these up into a type of circuit called a Hopfield neural network, the circuit was capable of solving a ridiculously difficult problem – 1,000 instances of the traveling salesman problem – at a rate of 10 trillion operations per second per watt. (It’s not an apples-to-apples comparison, but the world’s most powerful supercomputer as of June 2017 managed 93,015 trillion floating-point operations per second while consuming 15 megawatts to do it.)
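To put those figures side by side: 93,015 trillion FLOP/s at 15 megawatts works out to roughly 6.2 billion floating-point operations per second per watt, about three orders of magnitude below the 10 trillion (analog) operations per second per watt reported for the simulated memristor circuit, though the operations being counted are very different in kind. The sketch below is a minimal, conventional software Hopfield network – not the memristor hardware or the authors’ traveling-salesman formulation – meant only to illustrate the principle such a circuit exploits: the state evolves downhill on an energy landscape until it settles into a stable answer. All names and parameters here are illustrative.

    # Minimal classical binary Hopfield network (software sketch, not the
    # memristor circuit from the article). Patterns are stored with the
    # Hebbian rule; asynchronous updates never increase the network energy,
    # so the state relaxes toward a stored pattern.
    import numpy as np

    rng = np.random.default_rng(0)

    n = 64  # number of neurons (illustrative size)
    patterns = rng.choice([-1, 1], size=(2, n))  # two bipolar memories

    # Hebbian weight matrix, no self-connections.
    W = (patterns.T @ patterns) / n
    np.fill_diagonal(W, 0)

    def energy(state):
        """Hopfield energy; each asynchronous flip can only lower it."""
        return -0.5 * state @ W @ state

    # Start from a corrupted copy of pattern 0: flip a quarter of the bits.
    state = patterns[0].copy()
    flip = rng.choice(n, size=n // 4, replace=False)
    state[flip] *= -1

    # A few asynchronous sweeps: update neurons one at a time in random order.
    for _ in range(5):
        for i in rng.permutation(n):
            state[i] = 1 if W[i] @ state >= 0 else -1

    print("recovered pattern 0:", np.array_equal(state, patterns[0]))
    print("final energy:", energy(state))

Solving the traveling salesman problem with such a network works the same way in spirit, but the energy function is engineered so that its low-energy states encode valid, short tours; the memristors’ controlled chaos helps the analog circuit avoid getting stuck in poor local minima.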
Dumping Moore's Law is perhaps the best thing that could happen to computers, as it will hasten the move away from an aging computer architecture that is holding back hardware innovation. That's the view of prominent scientist R. Stanley Williams, a senior fellow at Hewlett Packard Labs. Moore's Law is an observation made by Intel co-founder Gordon Moore in 1965 that has helped make devices smaller and faster. It predicts that the density of transistors will double every 18 to 24 months while the cost of making chips falls. Every year, computers and mobile devices that are significantly faster can be bought with the same amount of money, thanks in part to guidance from Moore's Law. But keeping pace with that prediction is getting harder as transistors approach their physical limits – a challenge facing all top chip makers, including Intel, which is changing the way it interprets Moore's Law as it tries to cling to it for dear life.