A computer breakthrough helps solve a complex math problem 1 million times faster

Reservoir computing, a machine learning approach that mimics the workings of the human brain, has revolutionized how scientists handle the most complex data-processing challenges. Now researchers have discovered a new technique that can be up to a million times faster on certain tasks while using far fewer computing resources and less input data.

With the next-generation technique, researchers were able to solve a complex computational problem in less than a second on a desktop computer. Demanding problems of this kind, such as forecasting the evolution of dynamical systems that change over time, are exactly what reservoir computing was developed for in the early 2000s.

These systems can be extremely difficult to predict; the "butterfly effect" is a well-known example. The concept, closely associated with the work of mathematician and meteorologist Edward Lorenz, describes how a butterfly flapping its wings can influence the weather weeks later. Reservoir computing is well suited to studying the dynamics of such systems and can accurately predict how they will behave in the future. However, the larger and more complex the system, the more computing resources (a larger network of artificial neurons) and the more time it takes to obtain accurate predictions.
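The sensitivity Lorenz described can be seen directly in his famous three-variable system. As a minimal sketch (my own illustration, using the standard Lorenz parameters, not code from the study), the following integrates two trajectories that start a billionth apart and watches them diverge:

```python
import numpy as np

def lorenz_step(s, dt=0.01):
    """One 4th-order Runge-Kutta step of the Lorenz equations
    (classic parameters sigma=10, rho=28, beta=8/3)."""
    def f(s):
        x, y, z = s
        return np.array([10.0 * (y - x), x * (28.0 - z) - y, x * y - (8.0 / 3.0) * z])
    k1 = f(s)
    k2 = f(s + 0.5 * dt * k1)
    k3 = f(s + 0.5 * dt * k2)
    k4 = f(s + dt * k3)
    return s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Two trajectories that start one part in a billion apart
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])
max_sep = 0.0
for _ in range(5000):                      # 50 time units
    a, b = lorenz_step(a), lorenz_step(b)
    max_sep = max(max_sep, float(np.linalg.norm(a - b)))
print(f"maximum separation: {max_sep:.2f}")
```

The tiny initial difference grows exponentially until the two trajectories are as far apart as the attractor allows, which is why long-range forecasts of such systems fail no matter how precisely you measure the starting state.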

Until now, however, researchers knew only that reservoir computing works, not what was happening inside it. The artificial neural networks at its core are built on mathematics, and it turned out that all the system needed to run more efficiently was a simplification. A team led by Daniel Gauthier, lead author of the study and professor of physics at The Ohio State University, managed to do just that, drastically reducing the computing resources required and saving significant time.

When the concept was tested on a forecasting task, the next-generation reservoir computing technique proved clearly superior to the alternatives, according to the study published in the journal Nature Communications.

Depending on the data, the new approach turned out to be 33 to 163 times faster. When the goal was shifted to favor accuracy, however, the new model was a million times faster. This speed-up was made possible because next-generation reservoir computing requires less warm-up and training than previous generations.
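The simplification behind the speed-up is that the large random neural network is replaced by a short list of explicit features: the current and time-delayed states plus their pairwise products, fitted with ordinary ridge regression. A minimal sketch of that idea in Python with NumPy (my own illustration under those assumptions, not the authors' code), learning one-step increments of the Lorenz system:

```python
import numpy as np

def lorenz_traj(n, dt=0.025, s0=(1.0, 1.0, 1.0)):
    """Integrate the Lorenz system with RK4; returns an (n, 3) array."""
    def f(s):
        x, y, z = s
        return np.array([10.0 * (y - x), x * (28.0 - z) - y, x * y - (8.0 / 3.0) * z])
    out = np.empty((n, 3))
    s = np.array(s0, dtype=float)
    for i in range(n):
        out[i] = s
        k1 = f(s); k2 = f(s + 0.5 * dt * k1)
        k3 = f(s + 0.5 * dt * k2); k4 = f(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
    return out

def features(traj, i, k=2):
    """Feature vector at step i: a constant, k time-delayed states,
    and every unique pairwise product of those delayed states."""
    lin = np.concatenate([traj[i - j] for j in range(k)])
    quad = np.array([lin[a] * lin[b]
                     for a in range(len(lin)) for b in range(a, len(lin))])
    return np.concatenate([[1.0], lin, quad])

traj = lorenz_traj(3000)
k, ridge = 2, 1e-6
X = np.stack([features(traj, i, k) for i in range(k - 1, len(traj) - 1)])
Y = traj[k:] - traj[k - 1:-1]          # learn the one-step increment
W = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)

# In-sample one-step prediction near the end of the training data
i = len(traj) - 2
pred = traj[i] + features(traj, i, k) @ W
err = float(np.linalg.norm(pred - traj[i + 1]))
print(f"{X.shape[1]} features, one-step error {err:.2e}")
```

With three Lorenz variables and two time delays this gives 1 + 6 + 21 = 28 features in total, the same count as the 28 "neurons" cited below, and the only training step is a single small linear solve rather than iterative optimization of a large random network.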

"Our next-generation reservoir computing requires almost no warm-up time," Gauthier explained in a press release. "Currently, scientists have to put in 1,000 or 10,000 data points or more to warm it up. And that's all data that's lost, that isn't needed for the actual work. We only have to put in one or two or three data points."

In addition, the new technique was able to achieve the same accuracy with only 28 neurons, as opposed to the 4,000 required by the current-generation model.

"What's exciting is that this next generation of reservoir computing takes what was already very good and makes it significantly more efficient," Gauthier said. And this appears to be just the beginning. The researchers plan to pit the streamlined neural network against harder tasks in the future, extending the work to ever more complex computing problems, such as forecasting fluid dynamics.

"That's an extremely challenging problem to solve," Gauthier said. "We want to see if we can speed up the process of solving that problem using our simplified model of reservoir computing."
