A Computer Breakthrough Helps Solve a Complex Math Problem 1 Million Times Faster

Reservoir computing, a machine-learning technique that mimics the workings of the human brain, is revolutionizing how scientists tackle the most complex data-processing challenges. Now, researchers have discovered a new approach that can make it up to a million times faster on certain tasks while using far fewer computing resources and requiring far less input data.

With the next-generation technique, the researchers were able to solve a complex computing problem in less than a second on a desktop computer. Problems of exactly this kind, such as forecasting how dynamical systems like the weather evolve over time, are why reservoir computing was developed in the early 2000s.

These systems can be extremely difficult to predict, with the “butterfly effect” being a well-known example. The concept, closely associated with the work of mathematician and meteorologist Edward Lorenz, describes how a butterfly fluttering its wings can influence the weather weeks later. Reservoir computing is well suited to learning such dynamical systems and can provide accurate projections of how they will behave in the future; however, the larger and more complex the system, the more computing resources, the larger the network of artificial neurons, and the more time are required to obtain accurate forecasts.
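For a concrete picture of the butterfly effect, the sketch below (plain Python written for illustration, using the textbook Lorenz parameters rather than anything specified in the study) integrates the classic Lorenz equations for two starting points that differ by one part in a billion and prints how quickly they drift apart.

```python
# Classic Lorenz '63 system: a minimal illustration of the "butterfly effect".
# Two trajectories that start almost identically diverge until they are as far
# apart as the attractor allows, which is what makes long-range forecasts hard.

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one explicit Euler step."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dx * dt, y + dy * dt, z + dz * dt)

a = (1.0, 1.0, 1.0)
b = (1.0, 1.0, 1.0 + 1e-9)   # perturbed by one part in a billion

for step in range(5001):
    if step % 1000 == 0:
        gap = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
        print(f"step {step:5d}  separation {gap:.3e}")
    a, b = lorenz_step(a), lorenz_step(b)
```

The tiny initial difference grows by roughly ten orders of magnitude over the run, which is exactly the sensitivity that makes long-range forecasting of such systems so demanding.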

Until now, researchers knew only that reservoir computing worked, not what was going on inside it. The artificial neural networks at the core of reservoir computing are built on mathematics, and it turns out that all the system needed to run more efficiently was a simplification. A team of researchers led by Daniel Gauthier, lead author of the study and professor of physics at The Ohio State University, was able to do just that, dramatically reducing the need for computing resources and saving significant time.
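To see what is being simplified, it helps to know what a conventional reservoir computer looks like: a large, fixed, randomly connected recurrent network of artificial neurons is driven by the input signal, and only a simple linear readout is trained on the resulting network states. The sketch below is a minimal illustration of that standard construction, not code from the study; the reservoir size, scaling constants, and ridge-regression readout are illustrative choices.

```python
import numpy as np

# A minimal conventional reservoir computer (echo state network style):
# a fixed random recurrent "reservoir" is driven by the input sequence and
# only the linear readout is trained, here with ridge regression.
# All sizes and constants are illustrative, not taken from the study.

rng = np.random.default_rng(0)
n_res, n_in = 400, 3                        # reservoir and input dimensions
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # keep the spectral radius below 1

def run_reservoir(inputs):
    """Drive the reservoir with a sequence of input vectors; return all states."""
    r = np.zeros(n_res)
    states = []
    for u in inputs:
        r = np.tanh(W @ r + W_in @ u)
        states.append(r.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Fit the linear readout with ridge regression."""
    S = states
    return np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ targets).T

# Toy usage: learn to predict the next sample of a 3-variable time series.
data = rng.standard_normal((2000, n_in))           # stand-in for real measurements
states = run_reservoir(data[:-1])
W_out = train_readout(states[200:], data[201:])    # discard 200 warm-up steps
one_step_forecast = W_out @ states[-1]
```

Note the 200 discarded warm-up steps: the reservoir's internal state must first "forget" its arbitrary starting point before its outputs are useful, and that thrown-away data is exactly what Gauthier's quote below refers to.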

When the concept was put to the test on a forecasting task, the next-generation reservoir computing technique proved clearly superior to existing approaches, according to the study published in the journal Nature Communications.

Depending on the data, the new approach proved to be 33 to 163 times faster. When the objective was shifted to favor accuracy, however, the new model was 1 million times faster. This speedup was possible because next-generation reservoir computing requires far less warmup and training than previous generations.

“For our next-generation reservoir computing, there is almost no warming time needed,” explained Gauthier, in a press release. “Currently, scientists have to put in 1,000 or 10,000 data points or more to warm it up. And that’s all data that is lost, that is not needed for the actual work. We only have to put in one or two or three data points.”

Furthermore, the new technique was able to attain the same accuracy with only 28 neurons, as opposed to the 4,000 required by the current-generation model.
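The simplification behind those numbers, as described in the team's paper, replaces the big random network with a short list of time-delayed copies of the input and their products, keeping only the same kind of trainable linear readout. The sketch below is a hedged, minimal version of that idea; the number of delays, the choice of quadratic features, and the ridge constant are illustrative assumptions rather than the study's exact configuration.

```python
import numpy as np
from itertools import combinations_with_replacement

# A minimal sketch of the next-generation idea: instead of a large random
# reservoir, build a small feature vector from a few time-delayed copies of
# the input plus their quadratic products, then train a linear readout.
# Delays, feature choices, and the ridge constant are illustrative assumptions.

def ngrc_features(series, k=2):
    """Features from k consecutive samples: constant, linear, and quadratic terms."""
    feats = []
    for t in range(k - 1, len(series)):
        lin = np.concatenate([series[t - i] for i in range(k)])
        quad = [a * b for a, b in combinations_with_replacement(lin, 2)]
        feats.append(np.concatenate(([1.0], lin, quad)))
    return np.array(feats)

def fit_readout(features, targets, ridge=1e-6):
    """Ridge-regression readout, just as in the conventional reservoir case."""
    F = features
    return np.linalg.solve(F.T @ F + ridge * np.eye(F.shape[1]), F.T @ targets).T

# Toy usage with a 3-variable time series (a stand-in for Lorenz-style data):
rng = np.random.default_rng(1)
series = rng.standard_normal((500, 3))
X = ngrc_features(series[:-1])       # each feature vector uses only 2 past samples
Y = series[2:]                       # next-step targets
W_out = fit_readout(X, Y)
one_step_forecast = W_out @ X[-1]
```

With two delays of a three-variable signal, this construction produces 1 + 6 + 21 = 28 features, consistent with the 28 quoted above, and it needs almost no warmup because each feature vector draws on only a couple of past samples.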

“What’s exciting is that this next generation of reservoir computing takes what was already very good and makes it significantly more efficient,” Gauthier stated. And it looks like this is only the beginning. The researchers plan to test the super-efficient neural network against more difficult tasks in the future, expanding the work to even more complex computer issues, such as fluid dynamics forecasting.

“That’s an incredibly challenging problem to solve,” Gauthier said. “We want to see if we can speed up the process of solving that problem using our simplified model of reservoir computing.”