
Low Power Q&A with Dr. Renu Mehra, R&D Group Director

Synopsys Editorial Team

May 12, 2021 / 9 min read

As an R&D group director in the Synopsys Silicon Realization Group, Dr. Renu Mehra is responsible for one of the most enduring solutions in the EDA industry: Design Compiler, the leading RTL synthesis product. Her innovations in this area are the result of a deep passion for technology and her commitment to customer success. Indeed, Dr. Mehra is a pioneer in design automation for low-power design and power management, and has made considerable contributions to Synopsys, to the industry, and to advancing other women in STEM fields. For her work thus far, she was recognized with the YWCA Silicon Valley Tribute to Women Award in 2020. At this December's Design Automation Conference in San Francisco, she'll be presented with the 2021 Marie R. Pistilli Women in Engineering Achievement Award. We recently talked with Dr. Mehra about her journey so far and how power management needs to continue evolving.

Q&A with Dr. Renu Mehra

Q: What has inspired your work in power management?

A: I started working on low-power technologies in the early '90s as a Ph.D. student at UC Berkeley. My research group designed a computational tablet featuring human interfaces like speech and handwriting recognition. We recognized that incorporating so much computation into this device would not be viable without a serious low-power strategy. A large part of our team focused on hardware, software, and EDA techniques for low power. That was my start in the area.

Eventually, I joined the Synopsys Power Compiler team and started working on clock gating, which resulted in the industry's first automated clock gating solution. Clock gating still endures as a leading power optimization technique after decades of use. Fast forward a couple of decades and we see that the power problem has only become worse. The mobile revolution of the 2000s put computers inside our phones with limited battery lives, keeping the pressure on for ever-lower-power technologies. In the last decade or so, we've seen an explosion of compute-intensive applications and a ubiquity of electronic IoT devices in our lives, making the power problem even bigger. Indeed, the computational resources needed to train a best-in-class AI model have doubled every 3.4 months, on average, according to OpenAI, an AI R&D company.
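
As a rough back-of-envelope illustration of why clock gating saves power (a minimal sketch with invented numbers, not a description of any Power Compiler internals): when a register bank's clock is gated off during idle cycles, its clock pins stop toggling, so the clock-pin dynamic power scales roughly with the fraction of cycles in which the enable is active.

```python
# Back-of-envelope estimate of dynamic power saved by clock gating.
# All parameter values below are invented for illustration only.

def clock_pin_power(num_regs, c_clk_pin_f, vdd, freq_hz, active_fraction):
    """Dynamic power (W) burned on register clock pins.

    Without gating, the clock toggles every cycle (active_fraction = 1.0);
    with gating, it only toggles in the cycles where the enable is active.
    """
    c_total = num_regs * c_clk_pin_f          # total clock-pin capacitance (F)
    return active_fraction * c_total * vdd**2 * freq_hz

ungated = clock_pin_power(num_regs=50_000, c_clk_pin_f=1e-15,
                          vdd=0.75, freq_hz=1e9, active_fraction=1.0)
gated   = clock_pin_power(num_regs=50_000, c_clk_pin_f=1e-15,
                          vdd=0.75, freq_hz=1e9, active_fraction=0.2)
print(f"ungated: {ungated*1e3:.1f} mW, gated: {gated*1e3:.1f} mW")
# Real savings also depend on the inserted clock-gating (ICG) cells and the
# clock tree, which this sketch ignores.
```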

As the computation capabilities of our CPUs, GPUs, FPGAs, and ASICs increase, so do the possibilities of what we can achieve with this technology. We can now use the power of AI and machine learning to create immense change. Our biggest social media platforms are built on huge data centers with a foundation of AI-based algorithms. Thanks to the computation capabilities of today's computers, along with AI-based algorithms, we now have achievements that we previously could not have envisioned:

  • Image recognition
  • Computer vision, a key technology for self-driving cars
  • Speech recognition and natural language processing
  • Intelligent robots and drones

We need to continue to focus on the power optimization challenge in order to fuel these advancements as well as emerging innovations such as blockchain technology and the non-fungible tokens (NFTs) that are starting to take off. My feeling is that if we continue to solve and mitigate the power problem, we will enable these huge advancements, making life truly better for the world.

Technologies ranging from blockchains and NFTs to mobile devices and data centers are all driving demands for power optimization.

Q: How are today's hyper-convergent IC designs impacting the power equation?

A: With smaller and smaller node geometries at 7nm and beyond, designs are becoming more wire-dominant, and physical effects such as wiring delay, voltage drop, crosstalk, and process variability have become much stronger and now have a significant impact on the design.

What is the impact on power?

First, wire-dominated design results in dynamic power becoming more dominant than leakage. Dynamic power is intrinsically hard to optimize for because of its linear dependence on switching activity, which is difficult to capture accurately. For one, it takes long simulations to capture, and importing switching activity from simulation into the synthesis tool significantly complicates the design flow. Second, even if this whole process is set up, it's not clear what type of workloads should be used to capture switching activity. Typical workloads may not account for worst-case situations, and the chip may burn out in that case. But if we use worst-case loads, this might result in overdesigning the chip.
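
The workload sensitivity follows directly from the classic dynamic power relation, P ≈ α·C·V²·f, where the switching activity α is the one term the workload controls. A minimal Python sketch, with invented numbers, shows how far apart a typical-activity and a worst-case-activity estimate can land:

```python
# Dynamic power P = alpha * C * V^2 * f, where alpha is the average switching
# activity per cycle. Numbers are invented; only the linear dependence on
# alpha matters here.

def dynamic_power(alpha, c_switched_f, vdd, freq_hz):
    return alpha * c_switched_f * vdd**2 * freq_hz

C_F, VDD, FREQ = 2e-9, 0.75, 1e9   # assumed switched capacitance, supply, clock

typical    = dynamic_power(alpha=0.10, c_switched_f=C_F, vdd=VDD, freq_hz=FREQ)
worst_case = dynamic_power(alpha=0.35, c_switched_f=C_F, vdd=VDD, freq_hz=FREQ)

print(f"typical workload:    {typical:.2f} W")     # ~0.11 W
print(f"worst-case workload: {worst_case:.2f} W")  # ~0.39 W
# Optimizing (or sizing the power delivery) for the typical number risks the
# worst case; optimizing for the worst case risks overdesign, which is exactly
# the trade-off described above.
```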

Also, factors like process variability and voltage drops make power calculations more sensitive, as well as harder to estimate and optimize early in the design flow. They also lead to delay changes that can increase the impact of glitches on the total power of the design. Glitch power is highly elusive and difficult to optimize for, but with smaller geometries and more variability in delay, it is becoming a bigger part of the overall design power.

Another impact of hyper-convergence is that many back-end effects are being brought up into the front end (left-shift), and we need to start thinking about their impact on power up-front. A classic example is concurrent clock and data (CCD), or useful skew, technology that deliberately skews the clocks to registers to optimize timing. This impacts power, since it can help us use lower-drive-strength cells, and it also spreads out the peak current spike at the clock edge. So, should you use CCD on a register to reduce power, or use multibit cells that might have a lower-power footprint but will not allow clock skewing?
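
As a heavily simplified illustration of what useful skew does (a sketch with invented delays, not tool behavior): delaying the clock at one register gives the path into it more time and the path launched from it less time, and the freed-up slack is what can let synthesis use lower-drive, lower-power cells.

```python
# Simplified single-register useful-skew (CCD) example; all delays are invented.
# Delaying the clock at register R by `skew_ns` gives the path INTO R more time
# and the path launched FROM R less time.

def setup_slacks(period_ns, delay_into_r_ns, delay_from_r_ns, skew_ns):
    slack_in = (period_ns + skew_ns) - delay_into_r_ns    # slack of the capturing path
    slack_out = (period_ns - skew_ns) - delay_from_r_ns   # slack of the launching path
    return slack_in, slack_out

T = 1.0  # 1 ns clock period

si, so = setup_slacks(T, delay_into_r_ns=1.05, delay_from_r_ns=0.70, skew_ns=0.0)
print(f"no skew:      slack_in={si:+.2f} ns, slack_out={so:+.2f} ns")  # -0.05 / +0.30

si, so = setup_slacks(T, delay_into_r_ns=1.05, delay_from_r_ns=0.70, skew_ns=0.10)
print(f"0.10 ns skew: slack_in={si:+.2f} ns, slack_out={so:+.2f} ns")  # +0.05 / +0.20
# The borrowed slack can fix the violation or let the tool downsize cells on
# the incoming path, trading timing margin on the outgoing path for power.
```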

With hyper-convergence, different optimizations impact each other a lot more, and we need to account for multiple effects at the same time.

Q: What's the next area for innovation in RTL synthesis?

A: Power optimization will continue to be an important piece of RTL synthesis. In addition, at advanced nodes, we need to account for more and more physical effects and address them earlier in the flow. As designs become wire-dominated, our optimization must be aware of wire layers, as well as via, routing, and floorplan impacts. We need to focus on new metrics such as congestion. However, that does not mean that logic optimization is a solved problem. As we see larger and larger blocks, the traditional logic synthesis algorithms will hit a wall and require significant new innovations. This is what is great about the Design Compiler team: we have consistently been able to bring new technology to market while still providing the stability of a mature tool.

Q: What are your proudest achievements at Synopsys so far?

A: I've had several roles within Synopsys, and the one I'm proudest of is my role in designing an EDA solution for power management of world-class chips. Most complex chips today do not operate at full strength all the time, so different parts of a chip can be shut down when not in use. Also, the different parts may not all need to operate at the same speed, so, to save power, the voltage can be dropped for the blocks that have low speed requirements.

The methodology to achieve this type of optimization was mostly manual and ad hoc in the early 2000s, and there was no automation. We developed the concept of well-defined power domains on a design, which can operate at different voltages and can be shut down independently if needed. Along with other experts in the industry, I was a founding member of the IEEE 1801 committee, which created the Unified Power Format, or UPF. I collaborated with people across the industry who felt that, in order to take power-managed designs to the next level, it was important to have a common way to specify intent and an automated approach to designing such chips. We donated the initial ideas we developed at Synopsys and helped make UPF into an industry standard.
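
For readers new to power intent, the core idea can be sketched conceptually: each power domain has a supply that may be switchable, signals leaving a domain that can be shut down need isolation, and crossings between different supply voltages need level shifters. The Python model below is purely illustrative, with made-up domain names; it is not UPF syntax (UPF itself is a Tcl-based format defined by IEEE 1801).

```python
# Conceptual model of power domains and the cells their crossings require.
# Domain names and voltages are invented; this is not UPF syntax.

domains = {
    "PD_TOP":   {"voltage": 0.75, "switchable": False},  # always-on
    "PD_CPU":   {"voltage": 0.75, "switchable": True},   # can be shut down
    "PD_MODEM": {"voltage": 0.60, "switchable": True},   # lower voltage, shut-down capable
}

# (source_domain, sink_domain) pairs for signals crossing domain boundaries
crossings = [("PD_CPU", "PD_TOP"), ("PD_TOP", "PD_MODEM"), ("PD_MODEM", "PD_CPU")]

def needs_isolation(src, dst):
    # A signal driven from a switchable domain must be isolated so the sink
    # sees a defined value while the source is powered down.
    return domains[src]["switchable"]

def needs_level_shifter(src, dst):
    # Crossings between different supply voltages need level shifters.
    return domains[src]["voltage"] != domains[dst]["voltage"]

for src, dst in crossings:
    print(f"{src} -> {dst}: isolation={needs_isolation(src, dst)}, "
          f"level_shifter={needs_level_shifter(src, dst)}")
```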

Within Synopsys, I led the initiative for UPF implementation and worked with different product teams across the company to come up with a truly consistent methodology that would work across many tools, including simulation, static checking, formal verification, synthesis, and place and route. With many hours of meetings under our belts, along with many friends made along the way, we now have comprehensive support for UPF across the major products from Synopsys. It's been quite a journey.

Q: Considering all that you've accomplished already in your career, what has been the most rewarding?

A: For me, the most rewarding part of my journey so far has been working on technically challenging problems and seeing them deployed in real customer designs. The other thing I really like is the opportunity to work with people. Our environment is truly collaborative, and I am lucky to work with some of the brightest people in the industry. We are solving some very challenging problems together, and it's really motivating. With newer and more junior folks on my team, I have a chance to set an example, to explain to them why their work matters, and to motivate them. Their energy always inspires me. The different perspectives that the diverse workforce at Synopsys fosters are highly motivating.

Q: What is driving your passion toward advancing women in STEM fields?

A: My parents had three daughters, and they never let us feel that being a woman was a barrier to achieving our dreams. My dad had a Ph.D. in mechanical engineering and an avid interest in physics, and his interest in science really rubbed off on me. This was what I wanted to do! I am thankful for the inspiration in my own home, but I know that other women do not have this type of support in their families. Over thousands of years, women have had the primary responsibility to build and mold the next generation. This capacity to think further and to work for the overall social good, along with the unique level of empathy required, makes women truly suitable as leaders of tomorrow, where fewer jobs will be based on physical labor and more will call for technical and leadership skills. Women can and must have a seat at the decision-making table for the future of technology.

Q: What advice do you have for women who want to make a difference in a technical field?

A: My most important learning has been to not take "no" for an answer but to use it as a starting point for a discussion, a chance to rethink your strategy and come back with better arguments. In her book "Lean In," Sheryl Sandberg describes how, if men are told not to do something they are passionate about, a large percentage will give it another try. Women, however, tend to "do what they are told" and may abandon those pursuits at much higher rates. It's important to understand that if you are passionate about something, and you are told it's not a good idea, then pursuing it is not being pushy, it's being persistent. We must be persistent as we strive toward ideas that we believe in.

Q: What do you aspire to accomplish next?

A: I have the honor of leading the R&D team for the world's most advanced and enduring RTL synthesis product, Design Compiler. As we go forward and prepare Design Compiler for 3nm and beyond, there are many interesting and challenging problems to address. With wire-dominated designs, congestion will be a big issue. We are also seeing that, as customers move to smaller geometries, they're able to fit more gates into the same area. But many of these designs then become pin-access- and wire-limited. This is an important challenge to address.

Also, as customers cram more gates into a chip, we are handling much larger designs with each new generation, so runtime and turnaround time (TAT) for the overall design become a big issue. For hyper-convergent designs, this becomes even more important, since the more metrics and effects we try to account for, the more we add to the complexity of the optimization algorithms.

The traditional metrics of area and timing continue to be important care-abouts, but power is becoming a more significant part of the overall equation. There are so many opportunities to optimize power using synthesis, restructuring, sequential optimization, multibit cells, CCD, and placement, to name a few techniques. In addition, the challenges around glitches and switching activity calculation still need to be resolved in order to produce chips that are optimal for the workloads they run.

So, life will definitely continue to be quite interesting in the EDA industry.
