Moore’s Law and its importance – Know The Basics

Moore’s Law is perhaps the most important law in the world of computing and technology today. It is an observation made by Gordon Moore, the co-founder of Intel, in 1965. Moore observed that the number of transistors engineers could fit on a square inch of integrated circuit had effectively doubled every year since the technology was first developed. He predicted that this doubling would continue for at least another decade, but that eventually we would hit the limits of what was physically possible.

Most people believed that Moore’s Law would hold true for a decade or two, but it has continued for much longer than that. Even now, engineers and scientists have managed to double transistor density roughly every eighteen months, which means that while progress has slowed, it has not halted. Gordon Moore himself has endorsed a revised version of Moore’s Law, which sets eighteen months as the doubling period. It is now widely believed that, at the current pace of advancement in miniaturization, data densities will continue to double every eighteen months for another couple of decades.
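The doubling described above is simple exponential growth, which a short sketch can make concrete. This assumes the eighteen-month doubling period cited above; the starting transistor count and time spans are illustrative, not historical data.

```python
def transistors_after(years, start=2300, doubling_months=18):
    """Project a transistor count forward by `years`,
    doubling once every `doubling_months` months."""
    doublings = (years * 12) / doubling_months
    return start * 2 ** doublings

# After 18 months the count has doubled; after 15 years
# (ten doublings) it has grown by a factor of 1024.
print(transistors_after(1.5) / 2300)   # 2.0
print(transistors_after(15) / 2300)    # 1024.0
```

The factor-of-1024 figure over fifteen years is what makes the law so consequential: small, steady doublings compound into enormous gains.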

Why Does Moore’s Law Matter?

Moore’s Law matters because integrated circuits are used heavily in computers. The central processing unit, or “brain,” of a desktop or laptop is the most important component when it comes to computational power. Early CPUs ran at speeds measured in megahertz, and as more transistors were added to those chips, speeds increased from 4–8 MHz, to tens of megahertz, then hundreds, and then thousands. A thousand megahertz is a gigahertz, and current processors run at speeds of around 4 GHz. Modern processors also have several cores, each running at such high speeds, and each core can perform computing tasks independently.
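The idea that each core can work on its own task, as described above, can be sketched with Python's standard process pool. The task function and inputs here are illustrative assumptions, not anything from the original article.

```python
from concurrent.futures import ProcessPoolExecutor

def sum_of_squares(n):
    # A simple CPU-bound task to keep one core busy.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [100_000, 200_000, 300_000, 400_000]
    # Each submitted task can be scheduled on a separate core,
    # so the four sums may be computed in parallel.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(sum_of_squares, inputs))
    print(results)
```

On a four-core machine, the four sums can run simultaneously, which is why core counts matter as much as raw clock speed for many workloads.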

Moore’s Law is what allows chip makers to produce faster and faster chips. As chips get faster, the computing tasks they can take on become more complex. This means better-looking video games, faster image and video processing tools, more complex data-crunching for projects such as SETI and the analysis of the human genome for cancer research, and more productive office tools. If CPU and GPU progress stalls, that will hold back progress in the world of software too.

Some scientists believe Moore’s Law will soon come to an end because it is becoming increasingly difficult to manage the heat produced when transistor density on a die increases. Those scientists and engineers are now hunting for other ways to make chips faster, and they are also investigating more sophisticated cooling options. Even if Moore’s Law does come to an end, however, there is no reason to despair. Once the current transistor paradigm hits its limit, we can be confident that engineers will develop new, even more exciting and higher-performance ways of solving computational problems.
