AlphaTensor: Artificial Intelligence Discovers Optimized Algorithms


The sought-after matrix multiplication algorithms are also used to process images on smartphones


Source: DeepMind


Whether it’s weather forecasting or speech recognition, algorithms play a central role in many computer applications. Researchers now present an artificial intelligence that independently finds and optimizes them. AlphaTensor has already discovered more than 70 improved algorithms.

AlphaZero’s record hunt continues. But the artificial intelligence (AI) based software is no longer trying to figure out the best ways to play chess or Go. Instead, the slightly modified program, now called AlphaTensor, searches for the shortest algorithms for a particular computer operation: the multiplication of matrices, which is used, among other things, in weather forecasting. AlphaTensor has already discovered more than 70 improved algorithms, a group from Google-owned DeepMind in London reported in the journal Nature.

Algorithms are mathematical calculation methods to solve a problem. Such methods have been used for thousands of years, the researchers say, and now play a central role in many computer functions such as image processing.

“Improving the efficiency of fundamental computation algorithms can have far-reaching implications, as it can affect the overall speed of a large number of computations,” write the authors, led by Alhussein Fawzi. For example, the matrix multiplication algorithms that AlphaTensor searches for are used to process images on smartphones, recognize voice commands, generate images for computer games, compress data and videos, and much more.


AlphaTensor’s task was not only to rediscover known algorithms, but also to actively search for the shortest possible ones – and thus make the calculations more efficient. The AI did indeed independently find many algorithms that are currently considered the shortest for multiplying two matrices of a given size. Beyond that, AlphaTensor discovered computational methods that were better than those previously devised by humans.

A look back: In 1969, the German mathematician Volker Strassen showed that a seemingly simple arithmetic operation, the multiplication of two two-by-two matrices, can be performed with seven multiplications instead of eight. This caused a stir among mathematicians. Other algorithms at the time also refined Strassen’s approach.
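Strassen’s scheme can be written out directly. The following sketch (in Python, purely illustrative and not taken from the Nature paper) shows the seven products and how the four entries of the result are reassembled from them using only additions and subtractions:

```python
# Strassen's 1969 scheme: multiply two 2x2 matrices with 7 multiplications
# instead of the 8 used by the schoolbook method.
def strassen_2x2(A, B):
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B

    # The seven products
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)

    # Reassemble the result using only additions and subtractions
    return ((m1 + m4 - m5 + m7, m3 + m5),
            (m2 + m4,           m1 - m2 + m3 + m6))

# Quick check against the schoolbook result
A = ((1, 2), (3, 4))
B = ((5, 6), (7, 8))
print(strassen_2x2(A, B))  # ((19, 22), (43, 50))
```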

Although attempts have been made since then to achieve further improvements, they had so far been unsuccessful, according to the researchers: “To our knowledge, this is the first improvement on Strassen’s two-level algorithm for multiplying four-by-four matrices since its introduction in 1969,” the researchers write. The “two-level” algorithm applies Strassen’s trick recursively: a four-by-four matrix is treated as a two-by-two matrix of two-by-two blocks, which reduces the number of multiplications from 64 to 7 · 7 = 49.


For example, multiplying a four-by-five matrix by a five-by-five matrix with the traditional algorithm takes 100 multiplication steps. Mathematicians had managed to reduce this number to 80, but AlphaTensor found an algorithm that needs only 76. With larger matrices, the potential for improvement is usually much greater, the team writes.
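The figure of 100 follows directly from the schoolbook method: each of the 4 · 5 = 20 entries of the result is a sum of 5 products, giving 4 · 5 · 5 = 100 scalar multiplications. A minimal sketch that counts them (illustrative Python, not from the paper):

```python
# Schoolbook multiplication of an n x k matrix by a k x m matrix,
# counting the scalar multiplications it performs.
def schoolbook_multiply(A, B):
    n, k, m = len(A), len(B), len(B[0])
    mults = 0
    C = [[0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for p in range(k):
                C[i][j] += A[i][p] * B[p][j]
                mults += 1
    return C, mults

A = [[1] * 5 for _ in range(4)]   # 4 x 5 matrix
B = [[1] * 5 for _ in range(5)]   # 5 x 5 matrix
_, mults = schoolbook_multiply(A, B)
print(mults)  # 100 = 4 * 5 * 5 scalar multiplications
```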

“Building on our research, we hope to advance a larger body of work — the application of AI to help society solve some of the most pressing challenges in math and science,” the authors say in a DeepMind press release.

Holger Hoos of the Rheinisch-Westfälische Technische Hochschule Aachen (RWTH) considers the work methodologically “interesting without a doubt”, but not groundbreaking. According to the expert, the approach to matrix multiplication could be very interesting for algorithm researchers and mathematicians working in this field. “But I don’t see any signs of a breakthrough in automatic algorithm construction.”


