Algorithms are sort of like a parent to a computer: they tell the computer how to make sense of information so that it can, in turn, make something useful out of it.
The more efficient the algorithm, the less work the computer has to do. For all the technological progress in computing hardware, and the much-debated lifespan of Moore's Law, computer performance is only one side of the picture.
Behind the scenes, a second trend is happening: algorithms are being improved, so less computing power is needed. While algorithmic efficiency may attract less of a spotlight, you would certainly notice if your trusted search engine suddenly became one-tenth as fast, or if moving through big datasets felt like wading through sludge.
This led scientists from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) to ask: how quickly do algorithms improve?
Existing data on this question were largely anecdotal, consisting of case studies of particular algorithms that were assumed to be representative of the broader field. Faced with this dearth of evidence, the team set out to mine data from 57 textbooks and more than 1,110 research papers to trace the history of when algorithms got better. Some papers directly reported how good the new algorithms were; others had to be reconstructed by the authors using "pseudocode," shorthand descriptions that capture an algorithm's basic details.
In total, the team looked at 113 "algorithm families," sets of algorithms solving the same problem that computer science textbooks had highlighted as most important. For each of the 113, the team reconstructed its history, tracking each time a new algorithm was proposed for the problem and making special note of those that were more efficient. Spanning from the 1940s to the present, and often separated by decades, the team found an average of eight algorithms per family, of which a couple improved the family's efficiency. To share this assembled database of knowledge, the team created Algorithm-Wiki.org.
The scientists charted how quickly these families had improved, focusing on the most-analyzed property of algorithms: how fast they can guarantee to solve the problem (in computer-science terms, their "worst-case time complexity"). What emerged was enormous variability, but also important insights into how transformative algorithmic improvement has been for computer science.
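To make "worst-case time complexity" concrete, here is a small illustrative sketch (the example is mine, not from the study). It counts the comparisons two classic search algorithms make in their worst case on a sorted list of a million items: linear search may need all n comparisons, while binary search needs only about log2(n), the kind of efficiency gap the researchers tracked.

    def linear_search(items, target):
        """Return (index, comparisons); worst case is len(items) comparisons."""
        steps = 0
        for i, item in enumerate(items):
            steps += 1
            if item == target:
                return i, steps
        return -1, steps

    def binary_search(items, target):
        """Return (index, comparisons) on a sorted list; worst case ~log2(n)."""
        steps = 0
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            steps += 1
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid, steps
            elif items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1, steps

    data = list(range(1_000_000))
    # Searching for a missing value triggers the worst case for both algorithms.
    _, linear_steps = linear_search(data, -1)
    _, binary_steps = binary_search(data, -1)
    print(f"linear search: {linear_steps:,} comparisons")  # 1,000,000
    print(f"binary search: {binary_steps:,} comparisons")  # ~20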
For large computing problems, 43 percent of algorithm families had year-on-year improvements that were equal to or larger than the much-touted gains from Moore's Law. In 14 percent of problems, the performance improvement from algorithms vastly outpaced the gains that came from improved hardware. The gains from algorithmic improvement were particularly large for big-data problems, so the importance of those advances has grown in recent decades.
The single biggest change the authors observed came when an algorithm family transitioned from exponential to polynomial complexity. The amount of effort it takes to solve an exponential problem is like a person trying to guess the combination on a lock. If you only have a single 10-digit dial, the task is easy. With four dials, like a bicycle lock, it is hard enough that no one steals your bike, but still conceivable that you could try every combination. With 50 dials, it is almost impossible; it would simply take too many steps. Problems with exponential complexity are like that for computers: as they grow, they quickly outpace the computer's ability to handle them. Finding a polynomial algorithm often solves that, making it possible to tackle problems in a way that no amount of hardware improvement can.
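To see why that transition matters, here is a short illustrative sketch (the polynomial cost d**3 is an assumption chosen for the analogy, not a figure from the paper). A lock with d ten-digit dials has 10**d combinations, which grows exponentially, while a polynomial cost stays manageable even at 50 dials.

    # Exponential vs. polynomial growth, using the combination-lock analogy.
    # A d-dial lock with 10 digits per dial has 10**d combinations
    # (exponential in d), while a polynomial cost such as d**3 grows slowly.
    for dials in [1, 4, 10, 50]:
        exponential = 10 ** dials   # brute-forcing every combination
        polynomial = dials ** 3     # a hypothetical polynomial-cost alternative
        print(f"{dials:>2} dials: exponential = {exponential:.3e} steps, "
              f"polynomial = {polynomial:,} steps")

At 50 dials the exponential count reaches 10^50 steps, far beyond what any hardware improvement could brute-force, while the polynomial cost is only 125,000.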
As rumblings of Moore's Law coming to an end increasingly permeate global conversations, the researchers say computing users will need to turn more and more to algorithms for performance improvements. The team says the findings confirm that, historically, the gains from algorithms have been enormous, so the potential is there. But if gains come from algorithms instead of hardware, they will look different: hardware improvement under Moore's Law happens smoothly over time, while algorithmic gains tend to come in steps that are large but infrequent.
"This is the first paper to show how fast algorithms are improving across a broad range of examples," says Neil Thompson, an MIT research scientist at CSAIL and the Sloan School of Management and senior author of the new paper. "Through our analysis, we were able to say how many more tasks could be done using the same amount of computing power after an algorithm improved. As problems increase to billions or trillions of data points, algorithmic improvement becomes substantially more important than hardware improvement. In an era where the environmental footprint of computing is increasingly worrisome, this is a way to improve businesses and other organizations without the downside."
Thompson wrote the paper alongside MIT visiting student Yash Sherry. The paper is published in the Proceedings of the IEEE. The work was funded by the Tides Foundation and the MIT Initiative on the Digital Economy.