Cloud native EDA tools & pre-optimized hardware platforms
As chip designers strive to find new ways to extract superior performance, more functionality, and the lowest power consumption from their advanced node designs (under tight timeframes and budgets, no less), they are increasingly turning to cloud-based electronic design automation (EDA) solutions. For many designers, the cloud is a path to drive continued innovation and productivity as the benefits of Moore's law begin to wane.
While in-house compute resources are limited, the cloud provides the flexibility to scale design and verification capabilities as needed, when needed, in a secure environment, facilitating faster time to results, enhanced quality of results, and better cost of results. Chip design teams get access to the most advanced compute and storage resources, reduce their own system maintenance costs (or eliminate them if they go all-in on the cloud), and benefit from fast ramp-up and flexible pay-as-you-go models that support the burst usage periods common in some phases of the chip design process.
Migrating design and verification tasks to the cloud involves careful examination of the underlying technologies and the cloud infrastructure to achieve the desired results, including better power, performance, and area (PPA) and reduced verification times. Here are four key considerations to keep in mind while evaluating cloud-based EDA technologies:

1. Scalability of compute resources
2. Efficiency of data transfer and storage
3. Data security
4. Customer experience
In this blog post, I'll discuss each of these key considerations in more detail, providing some insights as well as questions to help guide your evaluation and decision-making process. I'll also share how Synopsys, as a provider of cloud-optimized and cloud-native EDA solutions, embarked on our own innovation journey to deliver design and verification technologies optimized for the cloud.
The four key considerations that will play a significant role in shaping your cloud journey are also areas of opportunity for innovation in EDA. By addressing each of these points, we can mitigate concerns about migrating EDA workloads to the cloud. Whether solutions are natively designed for the cloud or optimized for the cloud, vendors who take a thoughtful approach to these issues while collaborating closely with cloud infrastructure vendors can provide confidence that cloud-based design and verification will yield optimal results in terms of faster, better, and cheaper semiconductors. Here's a look at the four key considerations in greater detail.
One of the key reasons to migrate EDA workloads to the cloud is to take advantage of the ability to scale compute infrastructure up and down as needed. Some compute-intensive tasks, such as power estimation, noise analysis, and formal verification, are better suited than others to being broken up into smaller parts across massively distributed compute and storage resources, which is a native cloud approach.
It's also important to evaluate the compute resources available: How would you access them? What kinds of storage technologies are used? What kinds of cloud instances are available? What kinds of file system options are provided? Where are the data centers located, and what is the latency?
To effectively manage distributed workloads, EDA flows are ideally supported by robust job scheduling with streamlined storage. Design and verification solutions that accommodate the addition of cores on-the-fly provide the elasticity for such burst tasks.
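As a rough illustration of that elasticity (not any particular vendor's scheduler), the Python sketch below grows a worker pool as jobs queue up and lets idle workers retire; the `provision_core` and `release_core` hooks are hypothetical stand-ins for a real cloud or batch-scheduler API.

```python
import queue
import threading
import time

# Hypothetical hooks; a real flow would call a cloud or batch-scheduler API here.
def provision_core(core_id):
    print(f"provisioning core {core_id}")

def release_core(core_id):
    print(f"releasing core {core_id}")

class ElasticPool:
    """Grow a pool of workers to match the pending-job backlog; idle workers retire."""

    def __init__(self, max_cores=8):
        self.jobs = queue.Queue()
        self.max_cores = max_cores
        self.workers = []

    def submit(self, job):
        self.jobs.put(job)
        self._rescale()

    def _worker(self, core_id):
        provision_core(core_id)
        while True:
            try:
                job = self.jobs.get(timeout=2)   # no work for a while: shrink the pool
            except queue.Empty:
                break
            job()
            self.jobs.task_done()
        release_core(core_id)

    def _rescale(self):
        # Add workers (up to max_cores) while jobs are backing up.
        while len(self.workers) < min(self.max_cores, self.jobs.qsize()):
            core_id = len(self.workers)
            worker = threading.Thread(target=self._worker, args=(core_id,))
            worker.start()
            self.workers.append(worker)

# A burst of short tasks triggers scale-out; workers retire once the queue drains.
pool = ElasticPool(max_cores=4)
for _ in range(10):
    pool.submit(lambda: time.sleep(0.1))
pool.jobs.join()
```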
An example of a highly scalable, cloud-based solution for signoff physical verification is Synopsys IC Validator. While physical verification has long relied on a single-CPU approach, in the last decade we've seen a need emerge for a distributed architecture for this task. Our IC Validator engineering team has been innovating, improving its architecture to scale to the thousands of cores available on the cloud and implementing a distributed scheduler to make the most efficient use of these compute resources. With its elastic CPU management technology, you can add or remove resources on-the-fly. IC Validator can complete the largest advanced-node physical verification jobs within hours, compared with runs that previously could take weeks.
The availability of hybrid scaling is another cloud advantage, providing the flexibility to run certain workloads on premises or on the cloud based on the needs of the task at hand. For example, consider library characterization, a highly parallelizable task that requires a substantial amount of compute resources. Designers often struggle with resource planning for library characterization because demand for compute comes in unpredictable bursts. Utilizing the cloud and a continuous deployment pipeline for library characterization can reduce turnaround time from weeks to days via on-demand access to as much compute capacity as needed, when needed. Synopsys PrimeLib provides a library characterization solution that scales beyond 10,000 parallel jobs on the cloud, delivering 10x faster turnaround time.
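Library characterization parallelizes so well because each cell/corner combination is an independent job. Below is a minimal sketch of that fan-out using Python's standard `concurrent.futures`, with a hypothetical `characterize()` function in place of a real characterization engine; the same pattern extends to thousands of cloud workers behind a batch scheduler.

```python
from concurrent.futures import ProcessPoolExecutor, as_completed
from itertools import product

# Hypothetical stand-in for one characterization run; a real flow would launch
# the characterization engine for this cell at this PVT corner.
def characterize(cell, corner):
    return {"cell": cell, "corner": corner, "status": "ok"}

if __name__ == "__main__":
    cells = [f"cell_{i}" for i in range(100)]
    corners = ["ss_0p72v_125c", "tt_0p80v_25c", "ff_0p88v_m40c"]

    # Every (cell, corner) pair is an independent job, so the fan-out scales
    # from a few local processes to thousands of workers on the cloud.
    with ProcessPoolExecutor(max_workers=8) as pool:
        futures = [pool.submit(characterize, c, k) for c, k in product(cells, corners)]
        results = [f.result() for f in as_completed(futures)]

    print(f"characterized {len(results)} cell/corner combinations")
```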
Re-architecting EDA solutions to leverage cloud architectures natively is very important. This is no different from how EDA products embraced multi-threading and multi-processing opportunities in the past. Embracing new technologies like distributed computing, distributed storage, and micro-service architectures, while keeping the cost of results in check with techniques like checkpointing to take advantage of spot pricing, goes a long way in moving from cloud-optimized to cloud-native solutions.
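Checkpointing is what makes interruptible (spot) capacity practical for long runs: if the instance is reclaimed, the job resumes from its last saved state instead of starting over. Here is a generic sketch, assuming a simple task list and a local pickle file; a production flow would checkpoint to durable cloud storage.

```python
import os
import pickle

CHECKPOINT = "run_state.pkl"

def load_state():
    # Resume from the last checkpoint if a previous (possibly preempted) run left one.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            return pickle.load(f)
    return {"next_task": 0, "results": []}

def save_state(state):
    # Write atomically so a preemption mid-write cannot corrupt the checkpoint.
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump(state, f)
    os.replace(tmp, CHECKPOINT)

def run(tasks):
    state = load_state()
    for i in range(state["next_task"], len(tasks)):
        state["results"].append(tasks[i]())   # one unit of work
        state["next_task"] = i + 1
        save_state(state)                     # cheap insurance against spot reclaim
    return state["results"]

if __name__ == "__main__":
    tasks = [lambda i=i: i * i for i in range(1000)]
    print(len(run(tasks)))
```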
The efficiency with which data is moved in and out of the cloud, as well as determining what data is moved, are key considerations as you plan to migrate EDA workloads to the cloud. Uploading data is typically free, but downloading it (egress) can be very expensive. How should data transfer be managed? Can it be done incrementally? Where does the decision-making lie? What are the tradeoffs of different types of data storage models? The answers to these questions will impact not only project costs, but also time to market. After all, solutions that minimize data transfer will foster increased productivity and, ultimately, faster time to results.
There are many models for managing data between on-premises and cloud environments, depending on the use model. The simplest and most performant model is to migrate the required data to the cloud. Other models involve keeping the data synchronized between on-prem and cloud through caching appliances. However, the first step of this data management process is to determine and catalog all the dependencies for a design to ensure that the cloud environment can seamlessly replicate the on-prem environment.
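One simple way to keep transfers incremental is to hash each file in the dependency catalog and upload only content that has changed since the last sync. A rough sketch of that idea follows, with a hypothetical `upload()` standing in for the actual transfer mechanism.

```python
import hashlib
import json
import os

MANIFEST = "sync_manifest.json"

def file_digest(path):
    # Content hash so renames/timestamp churn don't trigger re-uploads.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def upload(path):
    # Hypothetical: replace with the real cloud transfer call.
    print(f"uploading {path}")

def incremental_sync(root):
    previous = {}
    if os.path.exists(MANIFEST):
        with open(MANIFEST) as f:
            previous = json.load(f)

    current = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            current[path] = file_digest(path)
            if previous.get(path) != current[path]:   # new or modified files only
                upload(path)

    with open(MANIFEST, "w") as f:
        json.dump(current, f, indent=2)

incremental_sync("design_data")
```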
Ideally, cloud data transfer should be fast and resilient, resulting in no data loss when the information is migrated from on-premises storage to the cloud. Cloud storage, in turn, must offer the elasticity to scale in and out based on the demands of the design and verification tasks. Synopsys has architected seamless data transfer technology and processes that negate the need to do any brute force work to move data in and out, resulting in accelerated data transfer. Several of our products have already been rearchitected to leverage these fast data transfer options to ease customer migration.
When it comes to storage, customers have various options to choose from. Cloud providers allow flexibility on the type of storage, and cost is a significant factor in the decision-making process. EDA solutions need to be architected with storage efficiency in mind. They need to tap into in-memory database options, cold storage versus hot storage techniques, and more, while keeping the customer's total cost of ownership in mind. Many Synopsys products have been re-architected to leverage distributed storage, block storage, and in-memory compute, providing order-of-magnitude turnaround time improvements.
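Tiering decisions often reduce to access recency: data nobody has touched in a while is cheaper to keep in cold storage. Here is a toy sketch of such a policy, with an assumed 30-day window and a hypothetical `move_to_cold()` in place of a real archival API or bucket lifecycle rule.

```python
import os
import time

COLD_AFTER_DAYS = 30  # assumption: archive anything not read in a month

def move_to_cold(path):
    # Hypothetical: a real flow would call the cloud provider's archival API
    # or rely on a bucket lifecycle rule rather than a script.
    print(f"archiving {path}")

def tier_directory(root):
    cutoff = time.time() - COLD_AFTER_DAYS * 86400
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getatime(path) < cutoff:   # last access older than the cutoff
                move_to_cold(path)

tier_directory("scratch")
```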
One reason the semiconductor industry has been somewhat slow to adopt cloud technologies is valid concern over data security. A robust data governance plan that specifies who gets access to what kinds of data, complemented by strong identity and access management measures, is the first step to ensuring that sensitive data doesn't fall into the wrong hands.
Cloud infrastructure vendors should protect their data centers with modern cloud security, as well as cloud-native processes and technologies (not to mention redundancy for high system uptime). Vendors should be expected to build security into their infrastructures and applications from the ground up, while also ensuring operational security. For their part, EDA vendors can ensure that their workloads will run securely by working closely with cloud security vendors to adapt their technologies in a way that prevents data leakage. Look for EDA vendors that are employing measures such as encryption, next-gen monitoring, and troubleshooting tools that are adapted to cloud environments.
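Cloud providers handle encryption in transit and at rest, but sensitive design data can also be encrypted before it ever leaves the premises. The small illustration below uses the third-party `cryptography` package, one option among many; in practice, keys would come from a key-management service rather than being generated in a script.

```python
from cryptography.fernet import Fernet

# In practice the key would come from a key-management service, never from code.
key = Fernet.generate_key()
cipher = Fernet(key)

design_blob = b"top-level netlist or other sensitive design data"

token = cipher.encrypt(design_blob)      # what actually gets uploaded
restored = cipher.decrypt(token)         # done back on a trusted machine

assert restored == design_blob
```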
As a leader in software integrity solutions, Synopsys can help you migrate to the cloud securely. Our proven security solutions include static application security testing, interactive application security testing, software composition analysis, and threat modeling services.
EDA workloads are defined by high-performance computing and NFS-heavy storage, with heavy dependency on the hardware, system libraries, and tools environment. The environment is typically heterogeneous, with varied combinations of operating systems, tools, and library versions. Different products require different sets of shared libraries.
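One common way to tame that heterogeneity is to pin the expected tool and library versions in a manifest and verify each cloud machine image against it before jobs land there. Below is a minimal sketch with made-up component names and a hypothetical `installed_version()` lookup.

```python
# Expected environment, typically checked into version control alongside the flow.
MANIFEST = {
    "os": "rhel8",
    "glibc": "2.28",
    "tool_a": "2024.06",   # hypothetical tool names and versions
    "tool_b": "5.1",
}

def installed_version(component):
    # Hypothetical: query the machine image (package manager, module files, tool --version, ...).
    probe = {"os": "rhel8", "glibc": "2.28", "tool_a": "2024.06", "tool_b": "5.0"}
    return probe.get(component)

def validate(manifest):
    mismatches = {}
    for component, want in manifest.items():
        have = installed_version(component)
        if have != want:
            mismatches[component] = (want, have)
    return mismatches

problems = validate(MANIFEST)
if problems:
    for component, (want, have) in problems.items():
        print(f"{component}: expected {want}, found {have}")
else:
    print("environment matches manifest")
```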
But for the customer, these parameters should not negatively impact the cloud user experience. Chip designers accustomed to how their EDA flows work on-premises should instead have a much better user experience in the cloud, with much faster turnaround times thanks to the advantages of on-demand, real-time provisioning. Design teams should also factor in the investment of time that it takes to get set up on the cloud, from establishing the network and connectivity to managing their firewall.
Questions to consider include: How easy is it for the design team to access the cloud-based tools and the associated data? Are there robust visualization and analytics tools to provide a clear picture of which resources are being used, how they are being used, and the associated costs? How much faster can certain tasks be completed?
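Even a simple roll-up of per-job core-hours against an hourly rate goes a long way toward answering the cost question. Here is a toy sketch with made-up usage records and an assumed flat price per core-hour.

```python
from collections import defaultdict

PRICE_PER_CORE_HOUR = 0.05  # assumption; real rates vary by instance type and region

# Made-up usage records: (project, cores, runtime in hours)
usage = [
    ("cpu_subsystem", 512, 3.5),
    ("cpu_subsystem", 1024, 1.0),
    ("io_block", 128, 6.0),
]

# Aggregate estimated spend per project from core-hours consumed.
cost_by_project = defaultdict(float)
for project, cores, hours in usage:
    cost_by_project[project] += cores * hours * PRICE_PER_CORE_HOUR

for project, cost in sorted(cost_by_project.items()):
    print(f"{project}: ${cost:,.2f}")
```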
Synopsys works closely with cloud providers to ensure our solutions can scale up to large capacities, maintain high data processing efficiency, and perform well on newer hardware. Understanding that design requirements are unique, we also provide our customers with flexibility in business models, licensing, and use cases.
Another boon to the cloud customer experience is the emergence of artificial intelligence (AI) in the chip design process, which complements design expertise to provide a substantial productivity boost. Synopsys DSO.ai, the industry's first autonomous AI application for chip design, massively scales the exploration of options in chip design workflows, while automating less consequential decisions. The technology increases the efficiency of on-premises compute environments and scales compute on-demand in the cloud to further enhance PPA.
Fast, secure, and efficient EDA cloud solutions: that's what it takes to accelerate your cloud journey, and that's what Synopsys is committed to providing. Our cloud-optimized flows are designed to enhance the silicon design and verification process. In addition, Synopsys works closely with public cloud vendors (Microsoft Azure, Google Cloud Platform, and Amazon Web Services) to optimize your experience on their platforms. And, since we play an integral role in facilitating the development of high-performance computing (HPC) SoCs for cloud vendors, we understand what these designers require from an EDA perspective to produce chips for HPC workloads.
Increasingly large computational demands, coupled with an unwavering push for shorter design and verification cycles, are placing a huge amount of pressure on the semiconductor industry. EDA cloud solutions from Synopsys offer a way forward, driving continued innovation toward faster, cheaper, and better chips.