We know, for example, that Moore's law states that the number of transistors on a chip doubles roughly every 1.8-2 years (and hence computing power has been increasing at approximately that rate). This got me thinking about compiler optimizations. Are compilers getting better at making code run faster as time goes on? If they are, is there any theory as to how this performance increase scales? If I were to take a piece of code written in 1970 and compiled with 1970 compiler optimizations, would that same code run faster on the same machine when compiled with today's optimizations? Can I expect a piece of code written today to run faster in, say, 100 years solely as a result of better optimizations/compilers (independent, obviously, of improvements in hardware and algorithms)?
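To make concrete the kind of comparison I have in mind (using a modern compiler as a stand-in rather than anything from 1970), here is a rough sketch: compile the same source once with optimizations disabled and once with them enabled, and time both on the same machine. The `-O0`/`-O2` flags below are the usual GCC/Clang conventions, and the loop is just an arbitrary workload I made up for illustration.

```c
/* sum.c -- an arbitrary workload used only to compare the same
 * source under different optimization settings.
 *
 * Build and time both variants on the same machine, e.g.:
 *   cc -O0 sum.c -o sum_O0 && time ./sum_O0
 *   cc -O2 sum.c -o sum_O2 && time ./sum_O2
 *
 * Any speedup of sum_O2 over sum_O0 comes purely from the
 * optimizer, since the hardware and source code are held fixed.
 */
#include <stdio.h>

int main(void) {
    double total = 0.0;
    /* A simple numeric loop the optimizer can unroll or vectorize. */
    for (long i = 1; i <= 200000000L; i++) {
        total += 1.0 / (double)i;
    }
    /* Print the result so the loop is not eliminated as dead code. */
    printf("harmonic sum: %f\n", total);
    return 0;
}
```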
Are compilers getting better at optimizing code over time, and if so, at what rate?
This is a complex, multi-faceted question, so let me try to hit on a few key points: