Posted by patsheridan on March 8, 2017
Closing the Loop with Key Performance Indicators
In Software Modeling Goes Mainstream, Ed Sperling recently wrote about how chipmakers are applying use case modeling techniques to better understand the interactions between software and hardware, and how those interactions impact system performance and energy efficiency.
As the software content for multicore SoCs grows, these interactions are becoming increasingly complex. For system designers and SoC architects, the challenge is to predict, as early as possible, how well their next generation product will meet the demanding requirements of the application. In addition, system-level goals must be expressed in metrics that can be tracked throughout the development process. We call these metrics “Key Performance Indicators”, or KPI.
KPI enable system designers to specify system requirements in a clear and concise manner, from the perspective of the application use case (the workload). Here are a few simple KPI examples:
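As a rough sketch of the idea (the use cases, metric names, and target numbers below are hypothetical illustrations, not taken from the examples in the post), a KPI of this kind pairs a use case with a measurable deadline budget:

```python
# Hypothetical KPI examples: each pairs a workload (use case) with a
# measurable metric and a target budget. Names and numbers are illustrative.
kpis = [
    {"use_case": "browser_page_load", "metric": "time_to_first_frame_ms", "target": 500},
    {"use_case": "camera_capture",    "metric": "shot_to_shot_latency_ms", "target": 250},
    {"use_case": "video_playback",    "metric": "dropped_frames_per_min",  "target": 0},
]

def meets_kpi(measured, kpi):
    """A KPI is met when the measured value stays within the target budget."""
    return measured <= kpi["target"]

print(meets_kpi(450, kpis[0]))  # True: 450 ms is within the 500 ms budget
print(meets_kpi(600, kpis[0]))  # False: the deadline is missed
```

Expressing each KPI as data like this makes it straightforward to check simulation results against every target automatically.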
Well-written KPI are clear expressions of system-level deadlines that must be met for the end-product to deliver the desired user experience. The graphic below provides a more detailed illustration for a common mobile application processor use case and KPI:
Are software modeling techniques available today to enable critical use cases and their corresponding KPI to be completely executable, for early analysis, without having to run the actual software? The point of today’s blog, of course, is to answer this question with an emphatic YES.
Application workload models, such as task graphs, capture the processing and communication requirements of the use case, enabling architecture simulation results to be compared to the target KPI in a highly productive and automated fashion. This enables system designers and SoC architects to close the loop on their specifications much earlier in the development cycle.
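To make the task-graph idea concrete, here is a minimal sketch (the task names, costs, and dependencies are hypothetical, and this is not Platform Architect's actual model format). Each task carries a processing cost and a list of dependencies; the workload's earliest possible finish time is the length of the longest dependency chain, which can be checked directly against a KPI deadline:

```python
from functools import lru_cache

# Hypothetical task graph for a page-load workload: each task has a
# processing cost (ms) and the tasks it depends on.
tasks = {
    "parse_html": {"cost": 40, "deps": []},
    "build_dom":  {"cost": 60, "deps": ["parse_html"]},
    "run_js":     {"cost": 80, "deps": ["build_dom"]},
    "layout":     {"cost": 50, "deps": ["build_dom", "run_js"]},
    "paint":      {"cost": 30, "deps": ["layout"]},
}

@lru_cache(maxsize=None)
def finish_time(task):
    """Earliest finish time: own cost plus the latest-finishing dependency."""
    t = tasks[task]
    return t["cost"] + max((finish_time(d) for d in t["deps"]), default=0)

kpi_deadline_ms = 300
critical_path_ms = max(finish_time(t) for t in tasks)
print(critical_path_ms, critical_path_ms <= kpi_deadline_ms)  # 260 True
```

Because the model is data rather than real software, this check can run long before any code exists, which is exactly the "close the loop early" point.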
For example, the graphic below shows the results of simulating 3 different SoC architecture configurations in Synopsys Platform Architect. In each case the application workload model is the same, a task graph representation of the Chrome Browser in Android use case:
The 3 charts show the CPU load imposed by the browser use case over time, where each color represents the contribution of one Android process in the browser application. As processing resources are added to the architecture, the system’s ability to execute the browser use case improves.
For 2 of the 3 scenarios the KPI deadline is clearly met. However, the speed-up is not simply a linear function of the number of cores: the trough in the CPU utilization indicates that the dependencies between the processes limit the available task-level parallelism of the browser application. This analysis reveals clues about where further system optimization is possible, to reduce power and cost while still achieving the overall KPI performance goal.
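The diminishing returns from adding cores can be sketched with a simple work/span model (the numbers here are hypothetical, not from the simulations above): total work can be divided across cores, but the critical path of dependent tasks cannot, so runtime bottoms out at the span no matter how many cores are added.

```python
# Work/span sketch of why speed-up is not linear in the number of cores.
# Numbers are hypothetical, chosen only to illustrate the effect.
work_ms = 1000   # total CPU time summed over all tasks in the workload
span_ms = 400    # critical path: dependent tasks that cannot run in parallel

def best_case_runtime(cores):
    # Lower bound on runtime: the work spread over all cores,
    # but never faster than the dependency chain allows.
    return max(work_ms / cores, span_ms)

for cores in (1, 2, 4, 8):
    print(cores, best_case_runtime(cores))
# 1 core: 1000 ms; 2 cores: 500 ms; 4 and 8 cores: 400 ms (span-limited)
```

Once the span dominates, extra cores no longer shorten the use case, which is why the analysis in the charts points toward reducing inter-process dependencies rather than simply adding hardware.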
After the SoC architecture is finalized, critical application use cases and their KPI can be tracked throughout the hardware and software development process to ensure system specifications are met. And, because task graphs are workload models and not the actual software, systems design teams can more easily share them with their semiconductor suppliers as executable specifications of their use cases (and corresponding KPI), benefiting collaboration in the supply chain.
So set yourself a deadline! Use Key Performance Indicators (KPI), use case workload modeling, and early architecture analysis to close the loop on your next generation architecture.
Patrick Sheridan is responsible for Synopsys' system-level solution for virtual prototyping. In addition to his responsibilities at Synopsys, from 2005 through 2011 he served as the Executive Director of the Open SystemC Initiative (now part of the Accellera Systems Initiative). Mr. Sheridan has 30 years of experience in the marketing and business development of high technology hardware and software products for Silicon Valley companies.
Malte Doerper is responsible for driving the software-oriented virtual prototyping business at Synopsys. Today he is based in Mountain View, California. Malte also spent over 7 years in Tokyo, Japan, where he led the customer-facing program management practice for the Synopsys system-level products. Malte has over 12 years of experience in all aspects of system-level design, ranging from research and engineering to product management and business development. Malte joined Synopsys through the CoWare acquisition; before CoWare, he worked as a researcher at the Institute for Integrated Signal Processing Systems at the Aachen University of Technology, Germany.
Tom De Schutter is responsible for driving the physical prototyping business at Synopsys. He joined Synopsys through the acquisition of CoWare where he was the product marketing manager for transaction-level models. Tom has over 10 years of experience in system-level design through different marketing and engineering roles. Before joining the marketing team he led the transaction-level modeling team at CoWare.