From Silicon To Software

Top 4 Takeaways from ARC Processor Virtual Summit 2020

Last week marked one of our most popular events of the year: the annual ARC Processor Virtual Summit, where Synopsys experts, users, and ecosystem partners discussed the latest technologies and trends in processor IP, software development, and programming tools for developing modern ARC processor-based SoCs.

As the name implies, this year the free two-day event was held virtually and featured in-depth sessions from industry leaders on the latest processor IP solutions optimized for embedded designs targeting automotive, artificial intelligence and internet of things (AIoT), and high-end embedded applications.

Here are some of our biggest takeaways:

  1. By 2025, every car will likely have connected features in it.

Frank McCleary, Associate Partner at Porsche Consulting, Inc., kicked off the event with a keynote on how automotive development organizations can accelerate silicon chip design and address functional safety and reliability throughout the development life cycle. McCleary touched on the four megatrends influencing the automotive industry (autonomous vehicles, increased connectivity, electrification, and services) that will drive how OEMs invest resources, bring products to customers, and focus future development. Ultimately, vehicles will have to become integrated into their environment through vehicle-to-grid, vehicle-to-network, and vehicle-to-vehicle infrastructure. In a traditional automotive development environment, each system domain creates requirements for its specific system and function (e.g., anti-lock braking, traffic jam pilot, lane assist) in isolation; the domains then come together at integration, where numerous issues must be resolved. The complexity of future vehicle design will require OEMs to modify this approach: simplifying their electronic architecture by decreasing the number of electronic control units per vehicle, taking software development in-house so they are developers rather than pure integrators, and adopting one simplified architecture across the entire product portfolio.

  2. We’re living in a unique era for edge AI and computer vision.

Jeff Bier, President of Berkeley Design Technology, Inc. (BDTI) and Founder of the Edge AI and Vision Alliance, presented the Day 2 keynote, sharing his views on the most important edge AI and vision trends: deep learning and new algorithms; data at scale to train deep neural networks; fast, cheap, energy-efficient processors for widespread deployment; and cloud computing that simplifies development and scaling. Opportunities abound today for computer vision because of these trends as well as the influx of capital (global revenues are set to grow to $33.5 billion by 2025, according to Omdia) and talent fueling innovation in this space. Computer vision isn’t without its challenges, however: in the Edge AI and Vision Alliance’s Computer Vision Developer Survey (November 2019), developers increasingly ranked dataset curation, dataset annotation, and dataset sourcing among the most challenging areas of computer vision product development.

  3. AIoT will fundamentally transform how we interact with our homes, offices and even cities.

This was the first year that the Synopsys ARC Processor Virtual Summit featured a dedicated AIoT track. AIoT is a relatively new acronym that merges artificial intelligence (AI) and the internet of things (IoT) to describe a smart, connected network of devices that communicate over 5G networks and utilize volumes of data too large for traditional processing methods. Relevant for segments such as wearables, smart home, and smart city, AIoT is leading to a more connected future with applications such as autonomous vehicles, natural language processing, 8K super resolution, and more. At this year’s ARC Processor Virtual Summit, attendees had the chance to hear from experts on AIoT topics such as power- and cost-efficient geolocation solutions, implementing machine learning to make sense of sensor data, and using Google’s TensorFlow Lite to run machine learning models on microcontrollers for ultra-low-power endpoint AI applications.
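Running machine learning models on microcontrollers, as with TensorFlow Lite, typically depends on 8-bit integer quantization to fit models into tiny memory and power budgets. As a rough illustration of the underlying idea (plain Python with hypothetical helper names, not the TensorFlow Lite API), the affine scheme maps each float value to an int8 code via real_value = scale × (quantized − zero_point):

```python
# Illustrative sketch of affine int8 quantization, the scheme TensorFlow
# Lite uses for microcontroller deployments. Helper names are hypothetical.

def quantize(real_values, scale, zero_point):
    """Map float values to int8 codes: q = round(v / scale) + zero_point."""
    q = [round(v / scale) + zero_point for v in real_values]
    # Clamp to the int8 range used on microcontrollers.
    return [max(-128, min(127, x)) for x in q]

def dequantize(q_values, scale, zero_point):
    """Recover approximate floats: v = scale * (q - zero_point)."""
    return [scale * (x - zero_point) for x in q_values]

weights = [0.5, -1.2, 0.0, 2.4]
scale, zero_point = 0.02, 0
codes = quantize(weights, scale, zero_point)        # int8 codes
approx = dequantize(codes, scale, zero_point)       # ~= original weights
```

In an actual TensorFlow Lite deployment, the scale and zero point are produced by the converter and stored with the model; the sketch above only illustrates the arithmetic that lets inference run in integer math on small cores.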

  4. ADAS ICs are integrating more ASIL-compliant safety solutions.

While autonomous vehicles have the potential to save almost 300,000 lives each decade in the United States, driverless cars still seem a long way off despite decades of development. Fergus Casey, Synopsys Director of R&D for Processor IP, described the challenges that SoC designers and OEMs face when developing self-driving vehicles, from teaching software and silicon to recognize a pedestrian to processing an entire scene (panoptic segmentation). Casey discussed how an integrated ASIL-D compliant safety manager lowers system cost, reduces power and area, and improves real-time response compared to discrete safety solutions, a win-win for the semiconductor SoC provider, Tier 1 suppliers, and OEMs. The Synopsys DesignWare® ARC® HS46FS Processor, one of the many products in the ARC Functional Safety Processor portfolio, simplifies development of high-performance safety-critical applications and accelerates ISO 26262 certification of automotive SoCs.

You can view the on-demand presentations from the ARC Processor Virtual Summit here.

As we wind down from the ARC Processor Virtual Summit, we are gearing up for our next virtual event, Tech Insights 2020, which will focus on critical aspects for state-of-the-art designs at emerging and established nodes. Find more information and register here.