
AI targets, rather than Moore's Law, to drive chip industry

by Mark Tyson on 16 July 2018, 12:11

Tags: IBM (NYSE:IBM), Intel (NASDAQ:INTC), NVIDIA (NASDAQ:NVDA)

Quick Link: HEXUS.net/qadvks


At a daylong symposium late last week, part of SEMICON West 2018 in San Francisco, high-level engineers, academics and execs talked about the end of Moore's Law as a guiding light for the semiconductor industry. Instead, they argued, there should be greater collaboration between materials scientists, hardware engineers, and device, software and systems makers to advance in directions beyond Moore's Law. EE Times attended the symposium and shared numerous quotes from speakers on the day.

Of course, Moore's Law isn't dead for everyone, as big hitters such as Nvidia, Qualcomm and IBM can all afford to write cheques for the $100 million necessary to tape out a 7nm chip. However, Berkeley professor emeritus David Patterson claimed that "Ninety-five percent of architects think the future is about special-purpose processors," like the Google TPU that Patterson had a hand in developing. Looking at the current semiconductor industry landscape, Patterson said "I think this is what the end of Moore's Law looks like".

Veteran lithographer Yan Borodovsky, recently retired from Intel, seemed to agree, adding that he thinks AI will be the new guiding light of the semiconductor industry. "I think something beyond today's von Neumann architectures will be helped by something more than Moore. For example, memristor crossbars may become a fundamental component for neuromorphic computing… the world beyond Moore's Law may be about how many kinds of synapses you can put in a given area and how complex they are," pondered the lithography expert.

Bill Dally, chief scientist at Nvidia, also took part in the symposium and thought that AI will serve as a performance driver across many industries. Automotive intelligence, an area of particular business interest to Nvidia, requires a lot of processing power for its computer vision. AI not only sets performance targets but also suggests new technological directions to reach them. Dally also talked about how neural nets can be radically simplified to require less compute muscle (e.g. SqueezeNet) - that's advancement on the software side of the equation.
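As a rough illustration of the kind of simplification Dally alluded to (a hedged sketch, not figures from his talk): SqueezeNet's "fire" module replaces a standard 3x3 convolution with a narrow 1x1 "squeeze" layer feeding parallel 1x1 and 3x3 "expand" layers, slashing the number of weights. The channel sizes below are hypothetical examples, chosen only to show the arithmetic.

```python
# Rough parameter-count comparison: a plain 3x3 convolution vs a
# SqueezeNet-style "fire" module (1x1 squeeze -> parallel 1x1 + 3x3 expands).
# Channel sizes are illustrative assumptions, not from the article.

def conv_params(c_in, c_out, k):
    """Weights in a k x k convolution (biases ignored for simplicity)."""
    return c_in * c_out * k * k

def fire_params(c_in, squeeze, expand1, expand3):
    """Weights in a fire module: squeeze 1x1, then parallel 1x1/3x3 expands."""
    return (conv_params(c_in, squeeze, 1)
            + conv_params(squeeze, expand1, 1)
            + conv_params(squeeze, expand3, 3))

standard = conv_params(128, 128, 3)      # plain 3x3 conv, 128 -> 128 channels
fire = fire_params(128, 16, 64, 64)      # fire module with the same output width

print(standard, fire, round(standard / fire, 1))  # roughly a 12x reduction
```

The saving comes from doing most of the channel mixing with cheap 1x1 convolutions and reserving the expensive 3x3 kernels for a fraction of the output channels.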

Great architectural potential is also seen in packaging technology - think logic-and-memory stacking and multi-chip technologies. Furthermore, new materials could hold huge promise.

John Kelly III, IBM's evangelist for a new cognitive era, added to the debate by talking about the potential of quantum computing. If new materials and the AI approach don't take over as the guiding light of the industry, there is still a good chance that quantum computers could "do computations in seconds that today's computers could never do… beyond AI," he enthused.

In related industry news from Friday, Facebook hired Shahriar Rabii to be a vice president and its head of silicon. Rabii previously worked at Google, where he helped lead the team in charge of building chips for the company's devices, including the Pixel smartphone's custom Visual Core chip, noted Bloomberg. The source speculated that Rabii's new role could be aimed at an Oculus-related processor, and/or silicon to power Facebook's AI efforts.



HEXUS Forums :: 3 Comments

Moore's law kinda became a self-fulfilling prophecy for the longest time, it pretty much became the design goal rather than a mere observation. Now they're running up into hard physics limitations they have to change tack.
aidanjt
Moore's law kinda became a self-fulfilling prophecy for the longest time, it pretty much became the design goal rather than a mere observation.

Agreed, it definitely looks that way. There is always going to be a finite limit to how many transistors they can pack into an area though, electrons have their own unique set of limitations. What is going to be interesting is how quantum computing is going to work, which isn't limited by the behaviour of electrons. Until they get the right materials though, it's simply not going to happen on any meaningful scale.

Nice to see they made note of software though, that definitely is an area where performance gains could be had. Still, it's been nice being around for most of this…

Agreed, software has become unnecessarily monstrously bloated.