Posted by Frank Schirrmeister on August 23, 2008
Well, who doesn’t believe in coincidence? First, I meet with CriticalBlue’s Skip Hovsmith and we have an interesting discussion about multicore software development practices (or the lack thereof), and on the same day I run across Glenn Perry’s article “Art Imitating Life: Hardware Development Imitating Software Development”. It made me think again about which discipline is really ahead in technology and methodology – hardware design or software design? My answer once again is: It depends!
Skip Hovsmith has been a system-level expert for years. I actually met him back in 1998, when he was at National Semiconductor and worked closely with Cadence on the Felix initiative, Cadence’s attempt at the time to crack the system-level tools market. Skip is now at CriticalBlue and heavily involved in the Multicore Association Best Practices working group. This working group is on a quite important mission, trying to define best practices for programming multicore devices. This will be important going forward. My interest in that group stems from the need to make sure our virtual platforms are modeled with enough fidelity to enable debugging of issues expressed at the multicore programming level. An example would be semaphores, where one best practice is to always acquire locks in the same order, to avoid processes locking up indefinitely. When discussing techniques for entering software more efficiently, we ultimately arrived at UML, and Skip’s comment struck me as perfectly correct: “UML is great to understand how things work [read: “to reason about things”] but once issues are understood programmers just get on with it and start hacking the code”. This goes back directly to my last post, in which I argued – among other things – that a development phase will only find adoption if its output can be reused directly in the next.
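To make that lock-ordering best practice concrete, here is a minimal sketch in Python (the worker names and shared resources are illustrative, not from any real platform model). Both threads need both locks; because each one acquires them in the same fixed global order, the circular-wait condition for deadlock can never arise:

```python
import threading

# Two shared resources, each guarded by its own lock.
lock_a = threading.Lock()
lock_b = threading.Lock()
results = []

def worker_1():
    # Acquire locks in the agreed global order: A first, then B.
    with lock_a:
        with lock_b:
            results.append("worker_1 done")

def worker_2():
    # If this worker took B first and A second, the two threads could
    # each hold one lock while waiting forever on the other. Following
    # the same A-then-B order rules that cycle out.
    with lock_a:
        with lock_b:
            results.append("worker_2 done")

threads = [threading.Thread(target=worker_1),
           threading.Thread(target=worker_2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Run it as many times as you like: both threads always complete. Swap the acquisition order in one worker and the program can hang intermittently, which is exactly the kind of behavior a virtual platform needs enough fidelity to reproduce and debug.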
Then I found Glenn Perry’s article “Art Imitating Life: Hardware Development Imitating Software Development”, in which he argues quite convincingly that software development is ahead because of technologies like object-oriented programming. I am a true believer that hardware development has something to learn from software development. Hey, I even wrote articles like “Transferring software engineering methods to VLSI-design: a statistical approach” way back in 1994 as part of my never-finished Ph.D. thesis (real-life development projects got too interesting; bad excuse, I know).
However, I then went off and contacted people like Tom DeMarco and Fred Brooks, the latter the author of the classics “The Mythical Man-Month” and “No Silver Bullet”. I thought they would be thrilled that somebody was trying to adopt function points from software engineering into the hardware world. Their response was interesting: “Why are you trying to do this? Isn’t hardware engineering perfect, because you have a tape-out and cannot fix issues with a new release of the software?”
That’s where these things tie together in my mind. While I agree with Glenn that the technology in software engineering may be more advanced, for example around languages, I am certain that the required methodologies are fundamentally incompatible. The reason? In hardware engineering the project team always has the “Sword of Damocles” hanging over its head. Mess up the tape-out and you will cost the company several millions in NRE (non-recurring engineering, or “N”ever “R”eturn “E”ver). In addition, you have to consider the lost product revenue from the several months by which the project now delays production.
In contrast, in software engineering there is always Service Pack 2. The requirement to get everything right is not as deadly as it is in hardware engineering. And as a result of all that, Skip Hovsmith is perfectly right – “just get on with it” is unfortunately often the approach taken in software engineering. It looks to me like things have to get a lot worse before this approach changes.