Moore’s Law: Are the Current Alternatives Enough?
Semiconductor chips continue to shrink and run faster, a trend predicted by Moore’s Law. Taiwanese chip maker TSMC has announced plans to manufacture a 1-nanometer chip in 2027. However, continuing to shrink chips is becoming increasingly difficult, both technically and from a cost perspective.
As Moore’s Law comes to an end, scientists and researchers are investigating alternatives. One is the use of Application-Specific Integrated Circuits (ASICs), chips architected to optimize processing for specific types of problems; an ASIC might, for example, be dedicated to image processing or to running the computations for an AI algorithm. Another alternative is 3D ICs, in which chip wafers are layered and interconnected vertically. Migrating from silicon to other materials, such as carbon nanotubes and graphene, is also being considered. And finally, there are alternatives to binary digital computers, including neuromorphic systems and quantum computing.
But what if these alternatives don’t pan out?
In 2012, Bill Dally, chief scientist at Nvidia, said that “I drive a 1964 car. I also have a 2010. There’s not that much difference — gross performance indicators like top speed and miles per gallon aren’t that different. It’s safer, and there are a lot of creature comforts in the interior. We’ll start to look like the auto industry.”
Neil Thompson, an MIT research scientist at the Computer Science and Artificial Intelligence Laboratory (CSAIL), told MIT writer Adam Zewe in an interview that “the implications are quite worrisome. As computing improves, it powers better weather prediction and the other areas we studied, but it also improves countless other areas… that are critical parts of our economy and society. If that engine of improvement slows down, it means that all those follow-on effects also slow down. Some [might argue that] if one pathway slows down, other ones will compensate… For example, we are already seeing increased interest in designing specialized computer chips as a way to compensate for the end of Moore’s Law. But the problem is the magnitude of these effects. The gains from Moore’s Law were so large that, in many application areas, other sources of innovation will not be able to compensate.”
It’s clear Moore’s Law has been slowing since around 2005, when CPU frequency stalled because we couldn’t keep pushing power per chip much beyond ~200 watts. Around 1980 a CPU package drew closer to 2-5 watts, so that ~50x growth in power wasn’t sustainable. Then instructions per clock slowed. Then running 100% of the transistors at speed became an issue. And finally, it’s clear that even the pace of doubling transistor density every 2 years is slowing.
It shouldn’t be surprising; no exponential trend lasts forever, and Moore’s Law has lasted well beyond what most experts predicted in the ’80s. I remember when EE Times surveyed the top technologists at the various foundries on when transistor scaling would end: the average prediction was the early 2000s. It went way longer than that, so perhaps we are lucky. But it has finally happened, slowly and in stages, and it’s clear now.
The concept of application-specific ICs isn’t new, of course. I worked on teams that built imaging-specific pipelines for products in the late ’90s, and you do get 10-100x the performance. But most high-volume products have already gone that route, and general-purpose GPUs and TPUs have done the same for much of parallelizable computing, so I think a lot of blood has been squeezed out of that turnip already. We’ll do more, and it will push things along a bit, if more slowly, for at least a decade. Then there’s system optimization; there’s always more there. But there’s no more sitting back and watching your software get 2x faster every 18 months, for free, while storage costs drop by 2x every 18 months.
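To put a number on that “free 2x every 18 months,” here is a minimal sketch (the function name and the slower cadence chosen for comparison are mine, purely illustrative) of how steady doubling compounds over a decade, and how much is lost when the cadence stretches:

```python
def speedup(years, months_per_doubling):
    """Cumulative performance multiple after `years` of steady doubling."""
    return 2 ** (years * 12 / months_per_doubling)

# A decade at the classic cadence: doubling every 18 months.
fast = speedup(10, 18)   # ~100x over ten years

# The same decade if doubling stretches to every 5 years (hypothetical).
slow = speedup(10, 60)   # 4x over ten years

print(f"18-month cadence: {fast:.0f}x   5-year cadence: {slow:.0f}x")
```

The gap between the two curves is the point: exponentials are dominated by their doubling period, so even a modest stretch in cadence erases most of the decade’s gains.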
However, I think people won’t really notice, because the slowdown will take years and it started over a decade ago. People don’t tend to notice these slow shifts, and they believe the hype of nano-this, quantum-that, and 3D-whatever, so they get confused between hype and reality.
I agree that the implications for society may be large. So many recent advancements are built on this exponential growth in computing, storage, and communication… we will have to get used to a world of tomorrow that looks more like today than we expected. For someone who grew up with the Jetsons, Star Trek, and 2001, it’s an adjustment.