ARC Processor Virtual Summit

Why Attend?

Join us for the ARC® Processor Virtual Summit to hear our experts, users and ecosystem partners discuss the most recent trends and product developments in ARC-based processor solutions. This multi-day event will provide you with in-depth information from industry leaders on the latest processor IP solutions.

Whether you are a developer of chips, systems or software, the ARC Processor Virtual Summit will give you practical information to help you create more differentiated products in the shortest amount of time. 

Automotive

Comprehensive solutions that help drive security, safety & reliability into automotive systems

AIoT

Power-efficient HW/SW solutions to implement a combination of artificial intelligence (AI) & internet of things (IoT) technologies in next-gen SoCs

High-Performance Embedded

Solutions to meet performance and data protection requirements of high-end embedded designs

Keynote Spotlight

Frank McCleary

Associate Partner, Porsche Consulting, Inc.

Accelerating Development of Functionally Safe Automotive Systems

The increasingly complex electronics hardware and software architectures of next-generation autonomous, connected, and electric vehicles represent new and daunting challenges for automotive engineering teams. This keynote by Porsche Consulting, Inc. will discuss how automotive development organizations can accelerate silicon chip design with automotive-grade IP, speed software development with virtual prototypes, and address functional safety and reliability throughout the development lifecycle.

Key Trends in the Deployment of Edge AI and Computer Vision

With edge AI and computer vision technologies advancing rapidly, it can be difficult to see the big picture. This keynote will describe the four most important edge AI and vision trends that are influencing the future of the industry: deep learning; streamlining edge development; fast, cheap, energy-efficient processors; and new sensors. Bier will highlight key implications for technology suppliers, solution developers and end-users. In addition, he will illustrate each of these trends with technology and application examples.

 

Jeff Bier

Founder, Edge AI & Vision Alliance / President, BDTI

Breakout Sessions

Day One

Automotive

10:00 a.m. – 10:40 a.m. PT

A Single SoC Architecture for Managing the Varying Performance Requirements of Multiple Automotive Applications

Presenter: Konrad Walluszik, Concept Engineer, Infineon

A key trend in the automotive industry is to develop safer, smarter and more eco-friendly cars. Accomplishing this requires innovative semiconductor products that can address a variety of automotive use cases such as domain controllers, e-mobility and advanced driver assistance systems (ADAS).
This presentation describes the challenges of addressing varying performance workloads with a homogeneous SoC family targeting different automotive application domains. Leveraging the capabilities of a highly configurable ARC Vector DSP solution allows scalable SIMD performance, supplemented by a uniform ecosystem to support the family concept. Based on application examples, the presentation will show how these challenges are solved with optimized ARC EV processors.
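As a generic illustration of the scalability argument (a hypothetical kernel, not Infineon or Synopsys code): when family members differ mainly in vector width, a portable scalar kernel such as the fixed-point dot product below can be recompiled for each SoC variant, letting the vectorizing compiler fill whatever SIMD lanes that particular configuration provides, without source changes.

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical, generic kernel: a Q15 fixed-point dot product such as might
// appear in a RADAR or ADAS signal chain. It is written as plain scalar C++
// so that the vectorizing compiler for each SoC family member can map it onto
// the SIMD width that variant provides, keeping one code base across the family.
int64_t dot_q15(const int16_t* a, const int16_t* b, std::size_t n) {
    int64_t acc = 0;
    for (std::size_t i = 0; i < n; ++i) {
        acc += static_cast<int32_t>(a[i]) * static_cast<int32_t>(b[i]);
    }
    return acc;
}
```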

10:40 a.m. – 11:20 a.m. PT

Safe & Secure SoC Architectures for Autonomous Vehicles

Presenter: Fergus Casey, R&D Director, Synopsys

Let's face it: people are bad drivers. The driver is the biggest uncertainty factor in cars, and computer vision is helping to eliminate human error and make the roads safer. Autonomous vehicles are expected to save almost 300K lives each decade in the United States, but after 4-5 decades of autonomous car proofs of concept and years of development, driverless cars still seem a long way off. This presentation will describe the challenges that SoC designers and OEMs face when developing self-driving vehicles, from understanding how a pedestrian looks to software and silicon, to understanding an entire scene. It will then describe the key milestones that the industry, and each chip design, must reach on the road to autonomous driving, and how to know when you've reached them.

 

11:20 a.m. – 12:00 p.m. PT

How to Execute AUTOSAR Classic Projects from a Tooling Perspective on the ARC Functional Safety Processor IP

Presenter: Chris Thibeault, Head of Partner Management – Americas, Elektrobit

Electronic control units (ECUs) enable vehicle functionality, and tooling is an essential aspect of AUTomotive Open System ARchitecture (AUTOSAR) Classic for ECU development. In this presentation, attendees will learn how the AUTOSAR standard has evolved. It will also cover best practices for setting up a typical configuration and integration workflow, including tooling aspects of the AUTOSAR methodology using Elektrobit's solution for AUTOSAR Classic, configuring AUTOSAR basic software modules, and translating the code that runs on Synopsys' Functional Safety Processor IP.


AIoT

10:00 a.m. – 10:40 a.m. PT

Using TensorFlow Lite for Microcontrollers for Ultra-Low-Power Endpoint AI Applications

Presenter: Samuel Kuo, Senior R&D Director for ASIC Development, Himax Technologies, Inc.

Endpoint AI applications are exploding. Always-on, battery-constrained endpoint devices are now processing vision, voice, and environmental sensor data locally. Himax's WE-I Plus ASIC provides a complete solution for ultra-low-power endpoint AI applications. The WE-I Plus ASIC integrates a 400MHz ARC EM9D processor with FPU, internal 2MB SRAM and 2MB Flash. It can manage larger neural networks for vision applications, or multiple neural networks for sensor fusion applications. Synopsys' embARC Machine Learning Inference (MLI) library boosts machine learning performance and power efficiency by utilizing the EM9D's signal processing hardware features, such as XY memory and Address Generation Units (AGUs).

TensorFlow Lite for Microcontrollers (TFLM) is a popular open-source framework designed to run machine learning models on microcontrollers. The WE-I Plus ASIC supports TFLM and MLI-optimized kernels to provide an end-to-end solution. TFLM support enables Himax's customers to leverage re-trainable vision, voice and vibration examples to quickly and easily build endpoint AI applications.
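As a rough sketch of the application-level TFLM flow (illustrative only, not Himax production code; the model array, arena size, and function name are placeholders), the interpreter is built over a statically allocated tensor arena, kernels are registered through an op resolver, and inference is a single Invoke() call. Per the abstract, the MLI-optimized kernels slot in underneath this same application code on the EM9D.

```cpp
#include <cstdint>
// Header paths follow the TensorFlow Lite Micro source tree.
#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model[];      // placeholder: flatbuffer exported from TF Lite

constexpr int kArenaSize = 100 * 1024;     // placeholder: sized to the model's needs
static uint8_t tensor_arena[kArenaSize];   // all tensors live in this static buffer

int8_t run_inference(const int8_t* samples, int count) {
  static tflite::MicroErrorReporter error_reporter;
  static tflite::AllOpsResolver resolver;  // in practice, register only the ops the model uses
  const tflite::Model* model = tflite::GetModel(g_model);

  static tflite::MicroInterpreter interpreter(model, resolver, tensor_arena,
                                              kArenaSize, &error_reporter);
  interpreter.AllocateTensors();           // plans tensor placement inside the arena

  TfLiteTensor* input = interpreter.input(0);
  for (int i = 0; i < count; ++i) {
    input->data.int8[i] = samples[i];      // quantized int8 input, e.g., image or audio features
  }

  interpreter.Invoke();                    // runs the whole graph on the device
  return interpreter.output(0)->data.int8[0];  // e.g., a "person detected" score
}
```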

 

10:40 a.m. – 11:20 a.m. PT

Designing a Geolocation Solution that Addresses IoT Power and Cost Challenges

Presenter: Rabih Chrabieh, Co-founder and CTO, Nestwave

Many view location tracking to be one of the killer apps for IoT.  But advanced logistics, transportation, smart city and smart factory applications strain the current class of solutions, often designed with smartphone or vehicle navigation in mind.  These solutions fall short in terms of power consumption, cost and coverage indoors.  Based on advanced signal processing algorithms and a hybrid device/cloud architecture, Nestwave has developed a low-power geolocation solution that eliminates the need for a dedicated positioning chipset.  When combined with the efficiency of the ARC EM DSP core and its extensions, geolocation performance is improved with substantially reduced frequency requirements, lowering overall power consumption and providing additional processor headroom for other tasks. 

 

11:20 a.m. – 12:00 p.m. PT

Smart Sensors Using Machine Learning to Embed Intelligence into the World

Speaker: Pete Warden, Technical Lead of TensorFlow Micro Project, Google

In this talk Pete Warden will discuss how machine learning can drive embedded products by making sense of sensor data. With practical examples of applications using neural network models, he'll show how you can recognize speech, gestures, and even people, all using low-power devices like ARC processors. With an overview of Google's TensorFlow Micro framework, he'll demonstrate how easy it is to get started with embedded ML, and how to customize it to your own requirements.

 

 


High-Performance Embedded

10:00 a.m. – 10:40 a.m. PT

Estimating Power Early & Accurately for Smart Vision SoCs

Speaker: Derya Eker, Engineering Manager, Synopsys

Today's high-end SoCs need to handle increasingly compute-intensive workloads but must carefully balance power-to-performance tradeoffs. The demand for wide deployment of artificial intelligence (AI) and deep learning is surging. Face recognition is paramount in mobile phones and extending to smart wearables. Identifying objects and surroundings in augmented- and virtual-reality headsets further push the envelope. Self-driving cars apply deep learning to interpret, predict and respond to data coming from surroundings for safer, smarter autonomous driving.

 

10:40 a.m. – 11:20 a.m. PT

Breaking the Tera FLOP Barrier with Synopsys ARC VPX DSP Processor

Speaker: Graham Wilson, Sr Product Marketing Manager, Synopsys

More and more applications are using floating-point computation, and some of these applications, especially automotive ADAS RADAR and LiDAR, are pushing very high levels of computation and data throughput. These applications require a DSP system capable of computing signal processing algorithms at up to and over 1 Tera FLOP. This presentation details the level of computation available from the VPX5 DSP processor for floating-point algorithms and shows how the core offers SoC developers scalable performance to reach Tera FLOP levels.
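For orientation only (the numbers below are illustrative assumptions, not published VPX5 specifications), peak floating-point throughput follows directly from the core count, the number of single-precision lanes per core, the use of fused multiply-accumulate, and the clock frequency:

\[
\mathrm{FLOPS_{peak}} = N_{\mathrm{cores}} \times N_{\mathrm{lanes}} \times 2 \times f_{\mathrm{clk}}
\]

where the factor of 2 counts the multiply and the add of a fused multiply-accumulate. As a hypothetical example, 8 cores with 32 FP32 lanes each at a 2 GHz clock give 8 × 32 × 2 × 2 GHz ≈ 1.0 Tera FLOP per second, which is the kind of arithmetic behind the scalability claim in this session.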

 

11:20 a.m. – 12:00 p.m. PT

Implementing High-Performance Real-time Designs 

Speaker: Carlos Basto, Principal Engineer, Synopsys

Real-time operation is common in embedded applications. Most of these functions are not obvious to the user of an electronic product, but they are essential to its proper operation. The requirements can vary from hard real-time to firm and soft real-time, depending on the application. The capabilities and functionality of embedded processors are fundamental to the implementation and operation of real-time applications. This presentation will look at the requirements of real-time applications and how they can be addressed with embedded processors, from both a hardware and a software perspective. Focus will be given to how processors should be configured and to the capabilities and instructions that can and cannot be used depending on the level of real-time operation.


Day Two

Automotive

10:00 a.m. – 10:40 a.m. PT

SoC-Level Safety Management: A Software View

Speaker: Anatoly Savchenkov, Software Engineering Manager, Synopsys

The increasing complexity of automotive ICs, which combine multiple heterogeneous processors and accelerators, I/O interfaces, and custom hardware blocks, raises unprecedented challenges for safety architects. Safety management tasks, including safe boot, periodic safety testing, and handling of runtime safety escalations, are traditionally done in hardware, making them expensive and not easily reusable across designs. This presentation describes how Synopsys safety hardware architectures are enabled by ARC embedded safety software to deliver increased usability, extensibility, and robustness for SoC-level safety management tasks.
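As a purely illustrative sketch (placeholder stubs, not the Synopsys safety software): moving safety management into software typically means a periodic task that runs the diagnostics, escalates to a safe state on any failure, and only then refreshes the watchdog, so a hung or faulty system can never keep the watchdog alive.

```cpp
#include <cstdio>

// Purely illustrative stubs: a real system would call the SoC's safety
// manager, BIST controller, and windowed watchdog hardware instead.
static bool run_logic_bist()   { return true; }                    // placeholder diagnostic
static bool check_memory_ecc() { return true; }                    // placeholder ECC status read
static void enter_safe_state() { std::puts("entering safe state"); }
static void service_watchdog() { /* refresh windowed watchdog here */ }

// Periodic safety task: called at a fixed rate (e.g., from a timer tick) so
// that faults are detected and handled within the fault-tolerant time
// interval required by the safety goal.
static void periodic_safety_task() {
    const bool ok = run_logic_bist() && check_memory_ecc();
    if (!ok) {
        enter_safe_state();    // escalation path on any failed check
        return;
    }
    service_watchdog();        // watchdog is refreshed only when all checks pass
}

int main() {
    for (int tick = 0; tick < 10; ++tick) {   // stand-in for a periodic timer
        periodic_safety_task();
    }
    return 0;
}
```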

 

10:40 a.m. – 11:20 a.m. PT

Implement ASIL D-Compliant ARC Processor IP Using Synopsys¡¯ Native Automotive Design Solution

Speaker: Shiv Chonnad, Sr. Staff Functional Safety Engineer, Synopsys

Next-generation autonomous driving and advanced driver-assistance systems (ADAS) applications require complex safety-critical electronic components. The SoC designs used in these electronics should adhere to the ISO 26262 functional safety (FuSa) standard to achieve the highest automotive safety integrity level (ASIL). Synopsys offers the broadest portfolio of silicon-proven automotive-grade IP, which is ISO 26262 certified up to ASIL D, for use when developing safety-critical SoCs. Synopsys' new native automotive RTL-to-GDSII solution, driven by FuSa intent, enables designers to efficiently implement and verify FuSa mechanisms in order to achieve the target ASIL with improved quality of results and ease of use. The Solutions Group IP team has successfully leveraged the native automotive RTL-to-GDSII solution. This presentation will describe the ASIL D-compliant ARC processor IP and the new native RTL-to-GDSII solution, the resulting implementation flow with an ARC HS46 dual-core lock-step (DCLS) processor, and the benefits for the SoC designer.

 

11:20 a.m. – 12:00 p.m. PT

Addressing the Challenges of RADAR, LiDAR & Vision Sensor Fusion for Next-Generation Automotive ADAS Systems

Speaker: Pieter van der Wolf, Principal R&D Engineer, Synopsys

Automotive ADAS systems use multiple sensing technologies, RADAR, LiDAR and imaging, to create a 360-degree view of the surroundings. Each sensing technology has its own advantages under different environmental conditions. Complexity is increasing in ADAS systems, even as component cost must be reduced. This presentation will go through the computation capabilities of various Synopsys ARC processors and discuss the use of cross computation and sensor fusion functionality to improve the quality of sensor-detected object data in automotive ADAS systems.
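As a minimal, generic illustration of the fusion idea (not the ARC implementation): combining a RADAR range estimate with a vision-derived estimate by inverse-variance weighting already yields an object estimate with lower variance than either sensor alone, which is the effect the session describes at system scale.

```cpp
#include <cstdio>

// Inverse-variance (Kalman-style) fusion of two independent estimates of the
// same quantity, e.g., distance to an object from RADAR and from vision.
// The fused variance is always smaller than either input variance.
struct Estimate {
    double value;
    double variance;
};

Estimate fuse(const Estimate& a, const Estimate& b) {
    const double wa = 1.0 / a.variance;
    const double wb = 1.0 / b.variance;
    return { (wa * a.value + wb * b.value) / (wa + wb), 1.0 / (wa + wb) };
}

int main() {
    Estimate radar  = { 42.3, 0.25 };  // hypothetical range [m] and variance [m^2]
    Estimate vision = { 41.8, 1.00 };  // vision is less precise in depth
    Estimate fused  = fuse(radar, vision);
    std::printf("fused range: %.2f m, variance: %.3f m^2\n",
                fused.value, fused.variance);   // 42.20 m, 0.200 m^2
    return 0;
}
```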


AIoT

10:00 a.m. – 10:40 a.m. PT

Solving the Challenges of the Evolution from Bluetooth LC3 to LC3Plus Codecs

Speakers: Michael Rougeux, Director of Software Engineering, T2 Labs, and Graham Wilson, Sr. Product Marketing Manager, Synopsys

In January 2020, the Bluetooth SIG announced that the Bluetooth LE Audio standard will include the Low Complexity Communications Codec (LC3), with several companies announcing available solutions for Bluetooth LC3. The Bluetooth LE Audio standard with LC3 is expected to gain significant traction in the industry, enabling the next generation of Bluetooth audio solutions. Technology leaders in the industry are also looking at the next evolutionary step of Bluetooth LC3, the LC3Plus codec specification. LC3Plus comprises all features of Bluetooth LC3 plus an advanced packet loss concealment (PLC) for transmission robustness, lower-delay modes down to 5 ms at a 2.5 ms packet size, and higher-resolution audio modes. LC3Plus enables a new range of applications, but also brings new technical challenges. This joint presentation between T2 Labs and Synopsys details the new target applications of LC3 and LC3Plus and the technical challenges for proposed solutions of LC3 and LC3Plus codecs.
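To put the quoted frame sizes in perspective (simple arithmetic on the numbers in the abstract, not additional codec specifications), the samples per frame and the resulting bitrate follow from the sample rate, the frame interval, and the encoded payload size:

\[
N_{\mathrm{samples}} = f_s \times T_{\mathrm{frame}}, \qquad \mathrm{bitrate} = \frac{8 \times B_{\mathrm{payload}}}{T_{\mathrm{frame}}}
\]

At 48 kHz, a typical 10 ms frame carries 480 samples, while a 2.5 ms LC3Plus low-delay frame carries only 120; the shorter frame cuts buffering delay but means the encoder runs four times as often for the same stream, which is one reason codec efficiency on the underlying DSP matters for the lower-delay modes.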

 

10:40 a.m. – 11:20 a.m. PT

Trends in Machine Learning for Edge Applications

Speaker: Pierre Paulin, Director of R&D, Embedded Vision, Synopsys

Embedding computer vision and deep learning at the edge remains challenging today because of 1) huge computational and memory requirements, and 2) the accelerating pace of innovation in the algorithms that perform modern vision and sensing tasks. CNN graphs in particular are rapidly evolving to improve the accuracy and speed of learning and inference. Mapping vision and deep learning algorithms to low-power embedded platforms is challenging because they are often very demanding in terms of compute, bandwidth, and accuracy.

In this presentation, we will discuss the latest trends in machine learning for edge applications, including emerging models like EfficientNet and new techniques like Transformers. We will review the key challenges and opportunities for embedded implementation with a focus on new techniques for bandwidth optimization. This presentation will also explain how the latest trends are shaping enhancements to the DesignWare EV Embedded Vision Processor IP family.

 

11:20 a.m. – 12:00 p.m. PT

A Brief History of Time – of a Bluetooth Controller

Speaker: David Peavey, System Engineer, Intel

The Bluetooth Core Specification has undergone several rapid changes over the past two decades. The addition of Bluetooth Low Energy in 2010 further accelerated the evolution of the technology. Bluetooth is widely used today in PCs, mobile phones, headsets, wearables, and a myriad of IoT devices. As expected with any immensely successful technology in an increasingly competitive landscape, the bar is being raised on performance and low power consumption requirements for each successive product generation. The Bluetooth controller – considered the "brains" of the Bluetooth system – has constantly adapted and kept up with the requirements. This paper traces how the growth of the Bluetooth specification, use cases, and Key Performance Indicators (KPIs) evolved the controller hardware architecture over the years. Despite the ever-increasing performance and use case demands, the march towards lower and lower power consumption is relentless.


High-Performance Embedded

10:00 a.m. – 10:40 a.m. PT

The Future of High-Performance Embedded Processing: How You Get from Here to There

Speaker: Paul Stravers, Principal R&D Engineer, Synopsys

The requirements for embedded applications are changing as high-performance infrastructure spreads from the cloud to the edge and end points of the internet.  High-end performance requirements are rapidly increasing but there are limits on power and area in embedded applications that must be accounted for.  This is restricting what can be done to increase processor performance and is driving the use of configurable multicore solutions and specialized hardware accelerators.  This presentation will look at the future structure of embedded processors that deliver the performance, scalability and flexibility needed to address the ever-increasing performance requirements for storage, automotive, networking, mobile and other high-end embedded applications.

 

10:40 a.m. – 11:20 a.m. PT

What's New in Zephyr for ARC Processors

Speaker: Alexey Brodkin, Software Engineering Manager, Synopsys

"The Zephyr Project strives to deliver the best-in-class RTOS for connected resource-constrained devices, built to be secure and safe" is the project's vision statement. And this is being achieved with continuous improvements made by the huge community of developers from different companies and locations around the globe. These improvements are not just fixes for particular platforms or use-cases, but they also include a well-planned development of higher-level substances such as an approach to safety certification, as well as  build & test infrastructure enhancements. During this presentation we will discuss the most important and interesting changes that were made in the Zephyr project last year and we will look at how mutual efforts between Synopsys engineering and the Zephyr community can help designers of ARC processor-based systems build complex and secure products based on Zephyr RTOS.

 

11:20 a.m. – 12:00 p.m. PT

High Bandwidth Data Protection Case Study for the Supercomputing Era

Speakers: Andrew Elias, Sr. Security Architect, Synopsys & Craig Forward, Senior Security Products Development Lead, Synopsys

Cloud computing is going through a significant overhaul and continues to grow globally with the increasing presence of hyperscale cloud providers for big data and analytics. In-house data centers are increasingly moving off premises, resulting in co-located data centers that manage and store data for companies and application developers to improve scalability and reduce IT costs. This huge and growing amount of data, containing confidential and critical information, must be protected. New laws and regulations to comply with data privacy rules put additional pressure on solution providers to secure their systems and data.

This presentation will highlight a complete, high-grade security use case that addresses the performance and latency needs of supercomputing applications, starting with the SoC. Attendees will learn how to protect data at high speed using AES-GCM encryption while managing the keys and other security services with a hardware secure module (HSM) with root of trust.
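As a software-level sketch of the AES-GCM flow mentioned above (shown with OpenSSL's EVP API purely for illustration; the SoC solution discussed in the session performs this in dedicated hardware, with keys provisioned and used inside the HSM rather than held in application memory):

```cpp
#include <openssl/evp.h>
#include <cstdint>
#include <vector>

// Illustrative AES-256-GCM encryption: authenticated encryption produces both
// ciphertext and a 16-byte tag that lets the receiver detect any tampering
// before trusting the decrypted data.
bool aes_gcm_encrypt(const uint8_t key[32], const uint8_t iv[12],
                     const std::vector<uint8_t>& aad,
                     const std::vector<uint8_t>& plaintext,
                     std::vector<uint8_t>& ciphertext, uint8_t tag[16]) {
    EVP_CIPHER_CTX* ctx = EVP_CIPHER_CTX_new();
    if (!ctx) return false;

    // GCM's default IV length is 12 bytes, so the IV can be passed directly.
    bool ok = EVP_EncryptInit_ex(ctx, EVP_aes_256_gcm(), nullptr, key, iv) == 1;

    int len = 0;
    if (ok && !aad.empty())   // authenticated-but-not-encrypted header data
        ok = EVP_EncryptUpdate(ctx, nullptr, &len, aad.data(), (int)aad.size()) == 1;

    ciphertext.resize(plaintext.size());   // GCM ciphertext is the same length as the plaintext
    if (ok)
        ok = EVP_EncryptUpdate(ctx, ciphertext.data(), &len,
                               plaintext.data(), (int)plaintext.size()) == 1;

    int final_len = 0;
    if (ok)
        ok = EVP_EncryptFinal_ex(ctx, ciphertext.data() + len, &final_len) == 1;

    if (ok)   // retrieve the authentication tag, sent alongside the ciphertext
        ok = EVP_CIPHER_CTX_ctrl(ctx, EVP_CTRL_GCM_GET_TAG, 16, tag) == 1;

    EVP_CIPHER_CTX_free(ctx);
    return ok;
}
```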


Your Embedded Edge Starts Here!

Get in-depth knowledge on the latest Processor IP Trends:

  • Automotive Security, Safety and Reliability
  • Artificial Intelligence
  • Machine Learning
  • High-Performance Embedded
  • IoT