Neuroscientists say your brain operates in a regime termed the “edge of chaos,” and it’s actually a good thing.
It’s a state that allows fast, efficient analog computation of the kind that can tackle problems whose difficulty balloons as they grow in size.
“No one had been able to show chaotic dynamics in a single scalable electronic device,” says Suhas Kumar, a researcher at Hewlett Packard Labs, in Palo Alto, Calif. Until now, that is.
He, John Paul Strachan, and R. Stanley Williams recently showed that a particular configuration of a certain type of memristor contains the seed of controlled chaos.
What’s more, when they simulated wiring these up into a type of circuit called a Hopfield neural network, the circuit could chew through a notoriously hard computation, solving 1,000 instances of the traveling salesman problem at an efficiency of 10 trillion operations per second per watt.
(It’s not an apples-to-apples comparison, but the world’s most powerful supercomputer as of June 2017 managed 93,015 trillion floating-point operations per second while consuming 15 megawatts to do it.)
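To see what a Hopfield network actually does with the traveling salesman problem, here is a minimal software sketch of the classic Hopfield-Tank encoding: a grid of binary neurons where neuron (x, i) being “on” means city x sits in slot i of the tour, and a symmetric weight matrix whose energy function penalizes invalid tours and long routes. The penalty weights A, B, C, and D below are illustrative values, not parameters from the researchers’ circuit, and this digital sketch only demonstrates the energy-descent dynamics, not the analog memristor hardware.

```python
import numpy as np

rng = np.random.default_rng(0)

# Four random cities; d[x, y] = Euclidean distance between cities x and y.
n = 4
pts = rng.random((n, 2))
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

# Illustrative penalty weights (not tuned, not from the paper):
# A: one tour slot per city, B: one city per slot,
# C: exactly n neurons on overall, D: total tour length.
A, B, C, D = 500.0, 500.0, 200.0, 100.0

N = n * n
def idx(x, i):
    return x * n + i

# Build a symmetric weight matrix W and bias b so that the network energy
# E(v) = -0.5 v^T W v - b^T v matches the Hopfield-Tank TSP energy
# (up to an additive constant) for binary states v.
W = np.zeros((N, N))
for x in range(n):
    for i in range(n):
        for y in range(n):
            for j in range(n):
                w = -C  # global "exactly n neurons on" term
                if x == y and i != j:
                    w -= A  # same city in two slots
                if i == j and x != y:
                    w -= B  # two cities in one slot
                if x != y and (j == (i + 1) % n or j == (i - 1) % n):
                    w -= D * d[x, y]  # distance between adjacent slots
                W[idx(x, i), idx(y, j)] = w
b = np.full(N, C * (n - 0.5))
np.fill_diagonal(W, 0.0)  # zero self-coupling guarantees energy descent

def energy(v):
    return -0.5 * v @ W @ v - b @ v

# Asynchronous binary updates: with symmetric weights and zero diagonal,
# each single-neuron flip can only lower (or keep) the energy, so the
# network settles into a local minimum.
v = rng.integers(0, 2, N).astype(float)
energies = [energy(v)]
for _ in range(50):
    for k in rng.permutation(N):
        v[k] = 1.0 if W[k] @ v + b[k] > 0 else 0.0
    energies.append(energy(v))

# Energy never increases from sweep to sweep.
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
```

The catch, and the reason the problem gets “vastly more difficult” at scale, is that deterministic descent like this gets stuck in the nearest local minimum; injecting a controlled amount of chaotic noise, which is what the memristor provides physically, helps the network shake loose and find better tours.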