You may be concerned about the widely publicized Spectre and Meltdown vulnerabilities affecting most processors, and wondering whether your phone and computer are OK. Or, more importantly, if you are designing or verifying SoCs: do you have a specter in your design? Let’s first look at what these two vulnerabilities are and how they affect your system.
Have you ever witnessed two passionate industry experts debate fundamental approaches to verification? They bring their decades of experience, hundreds of bugs uncovered, and countless successes and failures in order to establish intellectual dominance. Unfortunately for most of us observers, these debates are usually interrupted prematurely by a reality check: a pointy-haired manager who urges them to “take it offline,” or we all get “kicked out” of the conference room.
Imagine the scene. It’s Friday night, and you’ve decided to relax and watch a movie. Given the overwhelming number of choices, you’ve already spent over an hour watching trailers to pick one, and you’re finally almost ready to go. All that’s left is the popcorn. You go over to the microwave and get it going. For a little while nothing happens, but then you start to hear the pops, slowly at first, then much more rapidly, before they begin to taper off. You then have to ask yourself: When do I stop? This is a big question. Take it out too soon and you’re going to break your teeth on those unpopped kernels. Leave it in too long and you risk burning it. How can you know if it’s popped long enough?
As we have discussed in several of the blogs on this forum, successful deployment of Formal verification requires knowing where and how to use it. Building up an arsenal of techniques for dealing with complexity, and knowing how to apply them safely, is a necessity for every expert Formal engineer.
Interconnect on a System on Chip (SoC) is like a road network. There is a lot of it, but it still doesn’t go everywhere, and traffic jams mean that even if there is a road, you may not be able to reach your destination.
Cooking can be a necessity, a hobby, or calming therapy, depending on whom you talk to. Personally speaking, I cook occasionally, but even when I am not cooking and am just a mere silent admirer of this amazing process, onion peeling/cutting/chopping brings tears to my eyes 😀 There are a few tricks one can use to avoid or minimize tears while peeling onions. Some of these tricks work, some don’t.
I recently returned from a very exciting Asia trip, where I took the opportunity to visit some of our customers. While I made the mistake of combining too many cities in too few days, and had to deal with a stubborn typhoon that did not respect my aggressive travel plan, I noticed a significant change in customers’ behavior compared with previous trips.
Someday, in the not-too-distant future, I will be able to fall asleep, play computer games, or write a bestselling novel at the wheel (well, two out of three isn’t bad). Until then, however, I have just one option: concentrate deeply and blast the speakers with my classic rock and punk collection.
When running Formal Property Verification, we often see goals that are neither proven nor failing, especially on complex properties. These inconclusive goals are also referred to as bounded proofs. In these scenarios, what we have at hand is the Formal bounded depth (in terms of clock cycles) associated with each inconclusive property.
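To make the idea concrete, here is a toy Python sketch (my own illustration, not any production Formal tool) of how a bounded result arises: a small state machine is explored exhaustively up to a depth bound, and if no violation of the property is found within that many cycles, all we can claim is a proof up to that bound.

```python
# Toy illustration of a bounded proof: explore a small state machine
# breadth-first up to a depth bound (in "clock cycles"). If no property
# violation is reachable within the bound, the result is inconclusive:
# the property is only "proven up to bound N".

def bounded_check(init_states, next_states, prop, bound):
    """Explore reachable states up to `bound` cycles.

    init_states: iterable of initial states
    next_states: function state -> iterable of successor states
    prop:        function state -> bool (safety property to hold)
    bound:       maximum number of clock cycles to explore

    Returns ("fail", d) if a violating state is reachable in d cycles,
    ("proven", d) if a fixpoint is reached with no violation (full proof),
    or ("bounded_proof", bound) if the bound runs out first.
    """
    frontier = set(init_states)
    visited = set(frontier)
    for depth in range(bound + 1):
        # Check the property on every state reachable in `depth` cycles.
        for s in frontier:
            if not prop(s):
                return ("fail", depth)
        # Advance one clock cycle: collect unseen successor states.
        nxt = set()
        for s in frontier:
            for t in next_states(s):
                if t not in visited:
                    visited.add(t)
                    nxt.add(t)
        if not nxt:  # no new states: a full proof, not just a bounded one
            return ("proven", depth)
        frontier = nxt
    return ("bounded_proof", bound)

# Hypothetical example: a mod-8 counter with the (false) property
# "counter never reaches 5". With a shallow bound the failure is
# beyond the explored depth, so the goal looks inconclusive.
step = lambda s: {(s + 1) % 8}
print(bounded_check({0}, step, lambda s: s != 5, 3))   # ("bounded_proof", 3)
print(bounded_check({0}, step, lambda s: s != 5, 10))  # ("fail", 5)
```

The second call shows why a bounded proof must be interpreted carefully: the very same property that looked safe at depth 3 fails once the tool can reach cycle 5.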
It is commonly believed that Formal property verification is the realm of PhDs and experts with many years of experience, who have the magical solution and intellect to solve complex verification problems. I see the application of Formal verification solutions to design bugs very much like the way supercomputers and artificial intelligence evolved to beat human intellect and eventually become mainstream. In 1997, Garry Kasparov, the chess grandmaster, lost to IBM’s Deep Blue supercomputer. Kasparov and other chess masters blamed the defeat on a single move made by the IBM machine, in which the computer made a sacrifice that seemed to hint at a long-term strategy. Kasparov thought the move was too sophisticated for a computer and got sidetracked. Indeed, fifteen years later, the designers of Deep Blue acknowledged that the move stemmed from a software bug, and attributed Kasparov’s loss to being misled by it. Either way, the lesson is that, as humans, we tend to blow things way out of proportion and can sometimes make wrong assumptions.