In a 1965 paper, Gordon E. Moore, co-founder of Intel Corporation, made an interesting observation. Looking at the trend in the number of components per integrated circuit (at minimum cost per component) between 1959 and 1964, Moore initially projected that the number of components would double every 12 months.
Although he later revised this rate to “a doubling every two years, rather than every year,” his observation became what is known today as Moore’s Law: the rule that the number of transistors in a dense integrated circuit doubles approximately every two years. In layman’s terms, the overall processing power of computers should continue to double every two years.
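The doubling rule is easy to express as a formula. As a minimal sketch (the function name and numbers here are purely illustrative, not from Moore’s paper), a count that doubles every two years grows as count × 2^(years / 2):

```python
def projected_transistor_count(initial_count: int, years: float,
                               doubling_period: float = 2.0) -> float:
    """Project a transistor count forward under a fixed doubling period.

    Illustrative only: models Moore's Law as simple exponential growth,
    count(t) = count(0) * 2 ** (years / doubling_period).
    """
    return initial_count * 2 ** (years / doubling_period)


# Example: a hypothetical chip with 1 million transistors, projected
# 10 years out at a two-year doubling period, reaches 32 million.
print(projected_transistor_count(1_000_000, 10))  # 32000000.0
```

Ten years is five doubling periods, so the count multiplies by 2^5 = 32, which is why sustained exponential growth of this kind is so dramatic over even modest time spans.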
But according to a recent article in Wired, we may be witnessing the end of Moore’s Law.
As Wired notes, “every six months, a team of supercomputing academics compiles a list of the most powerful computers on the planet.” That ranking, known as the Top500 list, shows that impressive new machines are still being built, but they’re appearing less frequently, and the smaller systems at the bottom of the list aren’t catching up as fast as they used to.
The big conclusion from Wired is that the chip-making engine “may be sputtering,” which could cause a slowdown in the continual release of faster and cheaper computing power that we’ve become so used to, both at the consumer level and in the enterprise.
Technology progresses so quickly that it’s hard to say with complete accuracy exactly where computing will be in the next two, ten, or even fifty years. What we can say is that we’re very excited to see where computing goes even just over the next ten years. Here at Telx, we’re always on the cutting edge of data center services, and you can absolutely expect that to continue to be true—no matter how much faster computers continue to become.