NSF Workshop Summary
The second day of the NSF workshop on the role of theory in design automation
Lynn Conway, together with Carver Mead, played a major role in the VLSI revolution: she is the other half of the Mead-Conway revolution. This workshop would not be happening if it were not for her wonderful work.
Today I want to summarize the last half day of the NSF workshop. We had multiple breakout groups to discuss what NSF should do in this area.
I once stayed at the Hilton next to NSF for another panel. As many hotels like to do, this Hilton had a computer monitor with the events of the day displayed. I looked up and saw the following on the monitor:
Bureau of Prisons: Breakout Sessions 10am-11am.
I cracked up and pointed this out to my colleagues. I think they smiled, but did not find it as funny as I did. Oh well.
Our Group Discussions
There was a discussion of one of the big puzzles in theory: Why do SAT solvers of various kinds work so well in practice? It was pointed out that real-world problems seem to be easy to solve with various SAT solvers. So what does this mean for P=NP?
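To make the puzzle concrete, here is a toy sketch of what a SAT solver actually does: the classic DPLL procedure of unit propagation plus backtracking. Real solvers such as MiniSat add clause learning, watched literals, and restarts on top of this skeleton; nothing below is specific to any particular solver.

```python
def dpll(clauses):
    """Return a satisfying assignment {var: bool}, or None if unsatisfiable.

    Clauses are lists of nonzero ints, DIMACS style: 3 means x3, -3 means NOT x3.
    """
    def simplify(clauses, lit):
        # Set lit true: drop clauses it satisfies, strip its negation elsewhere.
        out = []
        for clause in clauses:
            if lit in clause:
                continue
            reduced = [l for l in clause if l != -lit]
            if not reduced:            # clause became empty: contradiction
                return None
            out.append(reduced)
        return out

    def solve(clauses, assignment):
        while True:                    # unit propagation
            unit = next((c[0] for c in clauses if len(c) == 1), None)
            if unit is None:
                break
            assignment = {**assignment, abs(unit): unit > 0}
            clauses = simplify(clauses, unit)
            if clauses is None:
                return None
        if not clauses:                # every clause is satisfied
            return assignment
        lit = clauses[0][0]            # branch on some remaining literal
        for choice in (lit, -lit):
            reduced = simplify(clauses, choice)
            if reduced is not None:
                result = solve(reduced, {**assignment, abs(choice): choice > 0})
                if result is not None:
                    return result
        return None

    return solve(clauses, {})

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
print(dpll([[1, -2], [2, 3], [-1, -3]]))   # e.g. {1: True, 3: False, 2: True}
```

On a tiny instance like this one the answer is instant; the puzzle is why industrial instances with millions of variables so often behave the same way.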
There was a long discussion on stochastic systems. Perhaps this can be linked with the theorists who work on rapidly mixing Markov processes. Many future design problems are stochastic for a variety of reasons. How do we check the properties of such systems? Applications include reasoning about the power consumption of chips, or about biological systems, since such systems are not deterministic.
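As a concrete, if invented, example of the kind of question involved: model a chip's power mode as a tiny Markov chain and ask how likely the average power is to exceed a budget. Every number below, the states, the transition probabilities, the wattages, is made up for illustration; a Monte Carlo estimate like this is only trustworthy when the chain mixes rapidly, which is exactly where the rapid-mixing theory enters.

```python
import random

# A toy two-state Markov chain for a chip's power mode. The transition
# probabilities and wattages are hypothetical numbers for illustration only.
TRANSITIONS = {"idle": [("idle", 0.9), ("busy", 0.1)],
               "busy": [("idle", 0.3), ("busy", 0.7)]}
POWER_WATTS = {"idle": 0.5, "busy": 3.0}

def sample_average_power(steps, rng):
    """Simulate one run of the chain and return its average power."""
    state, total = "idle", 0.0
    for _ in range(steps):
        total += POWER_WATTS[state]
        r, acc = rng.random(), 0.0
        for nxt, p in TRANSITIONS[state]:
            acc += p
            if r < acc:
                state = nxt
                break
    return total / steps

def estimate_pr_over_budget(budget, trials=2000, steps=500, seed=0):
    """Monte Carlo estimate of Pr[average power exceeds the budget]."""
    rng = random.Random(seed)
    hits = sum(sample_average_power(steps, rng) > budget for _ in range(trials))
    return hits / trials

print(estimate_pr_over_budget(budget=1.5))  # small: stationary mean is ~1.1 W
```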
Vijaya Ramachandran and Suresh Venkatasubramanian and I were charged with putting together a half-hour report on what role theory can play in design automation. We divided the report like a bad version of Charles Dickens's A Christmas Carol: I did the past, Suresh the present, and Vijaya the future.
I spoke on why theory played a major role in design automation in the 1980's. I claimed that it was because theorists actually designed chips, had them fabricated, and tested them. I said that we need to do something like this today if we want to get theorists re-involved. There was at best mixed support. The major objection was: but chips are very complex. This was the same comment that we got in the 1980's. As Yogi Berra said, "It's déjà vu all over again."
Suresh spoke on the present state of theory. He pointed out that theory now has many ideas that could possibly help, including randomness, approximation algorithms, streaming, sub-linear methods, and others. I think the group liked his presentation much more than mine.
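To give one small illustration of the streaming idea: the Misra-Gries sketch finds all heavy hitters of a stream in a single pass, using memory far smaller than the stream itself. The signal trace below is a made-up stand-in for the kind of data an EDA tool might scan.

```python
def misra_gries(stream, k):
    """One-pass heavy-hitters sketch using at most k-1 counters.

    Guarantee: any item occurring more than len(stream)/k times in the
    stream is certain to survive in the returned counters (Misra-Gries, 1982).
    """
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif len(counters) < k - 1:
            counters[x] = 1
        else:
            # Table is full: decrement every counter, dropping those at zero.
            counters = {y: c - 1 for y, c in counters.items() if c > 1}
    return counters

# A hypothetical signal trace; "clk" occurs 5 of 8 times, so with k=3 it
# must survive (5 > 8/3). Surviving counts are lower bounds, not exact.
trace = ["clk", "rst", "clk", "d0", "clk", "d1", "clk", "clk"]
print(misra_gries(trace, k=3))
```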
Vijaya spoke last, on the future of theory. Well, actually she spoke mostly on her own work on multicore algorithms, which is definitely part of the future of theory. She, and others, are working hard to build a powerful theory of how to use multicore processors for real algorithms.
All Hands Discussions
Here are the reports from the four group leaders of the breakout groups. Well, not their full reports, but a high-level view of what they had to say.
Pinaki Mazumder gave an overview of the theory issues in design automation. Over the years the theoretical questions in design automation have moved from discrete mathematics toward more continuous mathematics. The major areas he discussed were: verification methods based on SAT and other tools; analog design methods based on stochastic differential equations; simulation based on many different tools; and finally the new area of design of biological systems.
Sharad Malik spoke on the key research challenges. Scalable design methods are needed, and they require new kinds of algorithms: sub-linear methods, incremental algorithms, and parallel algorithms. A surprising issue the group raised, unusual from a theory standpoint, is that they want all their algorithms to be deterministic. The reason is that if one tool is nondeterministic, then the design may change from run to run and break some later tool in the flow. I think this is a unique challenge for theory, since randomized algorithms are so basic to us.
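One standard compromise, and this is only my suggestion here, not something the group endorsed, is to keep the randomized algorithm but fix its seed, so every run is bit-for-bit reproducible and downstream tools always see the same design. A toy sketch, with an invented `randomized_placement` standing in for a real tool:

```python
import random

def randomized_placement(cells, seed=42):
    """A toy 'randomized' tool made deterministic by a fixed seed:
    the same input always yields the same placement."""
    rng = random.Random(seed)   # private generator, never the global one
    order = list(cells)
    rng.shuffle(order)          # stand-in for a real randomized heuristic
    return {cell: slot for slot, cell in enumerate(order)}

# Repeated runs agree, so a later tool in the flow sees a stable design.
assert randomized_placement(["a", "b", "c", "d"]) == \
       randomized_placement(["a", "b", "c", "d"])
```

Whether a seeded generator counts as "deterministic enough" for an EDA flow is exactly the kind of question theorists and tool builders would need to settle together.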
A very hot topic, and yes that is a bad pun, is power viruses: programs that use power to destroy a chip. The idea is that there might be a strange program which, if executed, drives the power consumption way beyond the planned levels. If this can happen, then a virus could destroy your chip. Quite an interesting security question. Can you say cyber-warfare?
Jason Cong spoke on what EDA, electronic design automation, actually is. The main idea is that it is a set of tools that go from a high-level description of circuits down to an actual physical implementation.
Bob Brayton spoke on curricula for design automation. The number of students interested in this area has dropped sharply in the last few years. Few positions are available in either academia or industry. The future is not clear.
One of the recurrent themes of the discussion was: we have SAT solvers that seem to work on all the examples we try them on. How does this fit with the conventional wisdom that P≠NP? This is a very interesting open problem that we should work on in the future.
Last post on NSF. Back to usual for next post.