Posted by Frank Schirrmeister on November 23, 2009
I have been exposed to embedded software and its testing a lot again lately. An article from Jack Ganssle made me think of the old “United Colors of Benetton” commercial. Are software and hardware developers perhaps not so different after all? BTW, before proceeding, if you are not signed up for Jack’s newsletter “The Embedded Muse”, then I’d highly recommend doing so. It’s a great source of information about the embedded world.
In his article on “The Use of Assertions” he refers to a Microsoft paper called “Assessing the Relationship between Software Assertions and Code Quality: An Empirical Investigation”. The paper nicely outlines, with empirical proof, that the use of assertions increases code quality. I include a graphic from page 10 of the paper below. The example nicely shows a high fault density for code with zero assertion density, and a lower fault density for code with higher assertion density.
So far so good.
The article also compares dynamic and static tools. There is a graph on page 11 which empirically shows that the bug percentages found dynamically via assertions are actually higher than the bug percentages found using static analysis tools. Well, at first I was flinching and wondering – wouldn’t it be great if this could all be done statically? There are different embedded software companies out there focusing on static vs. dynamic testing, and it reminds me a bit of the hardware world, where we find bugs using both formal and dynamic verification. The immediate conclusion could be: hey, they are not so different after all, those hardware and software verification engineers.
However, after a great read of Microsoft’s James Larus et al., called “Righting Software”, and some of the references from the original Microsoft paper, it turns out there are fundamental differences as well. In my mind there is always a difference between verifying “form” and verifying “function”. Verifying function means to me making sure that the code is doing what it is supposed to do, e.g. “wait until the USB connection is established and then start some other functions”. Verifying form means to me making sure that the function is expressed the right way and correctly, i.e. all memory allocated to do a task is indeed freed at the end, and all internal and external coding guidelines are met.
It seems to me that dynamic assertions are used in both worlds to make sure both function and form are met. I can use an assertion to make sure that a specific state is hit – or never hit – as well as to indicate that at the end of a major process not all memory has been released, or that semaphores have been taken in the wrong order, which may result in a risk of deadlocks in a multicore system.
Static verification – at least the kind I have encountered – is in the hardware world mostly used to verify function: certain states have to be hit, or are never allowed to become true. Given that there are so many states, “state explosion” has always been an issue on the hardware side. From what I can tell, the static verification tools in the software world seem to be focused mostly on verification of form. Hence I have not seen any issues around state explosion in the software world.
Some differences, some similarities. Perhaps assertions can be a bridge between HW and SW verification, as my colleague Tom Borgstrom (read his blog “On Verification” here) put it in a discussion recently. Or I may be completely wrong on all this … so I am looking forward to your comments!