New tools are required to verify SoCs with 100+ billion transistors.
As technology nodes shrink, the number of transistors on chips continues to grow at an exponential rate. For example, Apple’s M1 Ultra, with its 64-core GPU, packs in a whopping 114 billion transistors, roughly 100 times more than the Apple A7 of a decade earlier.
With advancements in SerDes, computer architectures (ARM, RISC-V, x86), memory, and network-on-chip (NoC) interconnect, along with artificial intelligence/machine learning (AI/ML) algorithms, today’s system-on-chip (SoC) devices offer capabilities far beyond those of SoCs built only ten years ago. As part of this, performance and power requirements are more aggressive than in the past, further adding to architectural and design complexity.
How about verification? Has it scaled the same way as design? The languages employed for verification have evolved from structural, hardware-oriented hardware description languages (HDLs) like VHDL and Verilog to object-oriented languages and methodologies like SystemVerilog with UVM and SystemC/C++. Meanwhile, verification platforms have evolved from software simulation alone to a combination of simulation and formal verification along with hardware-accelerated verification (HAV), which includes emulation and FPGA prototyping. Given the rapid increase in design size and complexity, verification can no longer be a “one size fits all” approach.
HDL simulation still forms the crux of IP and subsystem verification, and even full-SoC verification provided the design is small enough. Another aspect of verification is co-simulation, which means running embedded software code alongside the simulated hardware. In this case, verification engineers need to act like firmware developers and write pseudo firmware in C or C++ to exercise the design together with a conventional UVM testbench. Although these simulations are slow, they must be performed before tape-out.
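To make the co-simulation flow concrete, the sketch below shows the kind of pseudo firmware a verification engineer might write in C. The DMA block, register addresses, and bit fields are hypothetical and serve purely as illustration: the firmware programs a transfer through memory-mapped registers and polls for completion, while the UVM testbench drives the surrounding interfaces and checks the data path.

```c
#include <stdint.h>

/* Hypothetical memory-mapped registers of a DMA block under test;
 * the addresses and bit fields below are illustrative only. */
#define DMA_BASE        0x40001000u
#define DMA_SRC_ADDR    (*(volatile uint32_t *)(DMA_BASE + 0x00u))
#define DMA_DST_ADDR    (*(volatile uint32_t *)(DMA_BASE + 0x04u))
#define DMA_LEN         (*(volatile uint32_t *)(DMA_BASE + 0x08u))
#define DMA_CTRL        (*(volatile uint32_t *)(DMA_BASE + 0x0Cu))
#define DMA_STATUS      (*(volatile uint32_t *)(DMA_BASE + 0x10u))
#define DMA_CTRL_START  (1u << 0)
#define DMA_STATUS_DONE (1u << 0)

/* Pseudo-firmware test: program a transfer and poll for completion.
 * Data checking is left to the scoreboard in the UVM testbench. */
static int dma_copy_test(uint32_t src, uint32_t dst, uint32_t len)
{
    DMA_SRC_ADDR = src;
    DMA_DST_ADDR = dst;
    DMA_LEN      = len;
    DMA_CTRL     = DMA_CTRL_START;

    while ((DMA_STATUS & DMA_STATUS_DONE) == 0u)
        ;  /* spin until the hardware signals completion */

    return 0;
}

int main(void)
{
    /* Example directed test: copy 256 bytes between two on-chip buffers. */
    return dma_copy_test(0x80000000u, 0x80001000u, 256u);
}
```

Code like this typically runs on the embedded processor model or through a bus functional model, so the same test can later be reused unchanged on an emulator or FPGA prototype.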
The term “shift left” refers to the practice of moving activities like performance evaluation and verification as early as possible in the development process. A shift left in SoC verification is achieved by employing HAV in the form of emulation and/or FPGA prototypes to create pre-silicon versions of the design under test (DUT). These environments allow real-world software workloads and industry benchmarks to be run prior to silicon availability, thereby enabling more testing and increasing the quality of the design. In the not-so-distant past, installing, integrating, and managing these platforms required a large team. More recently, new tools, new flows, and simplified integration have greatly facilitated the adoption of these techniques.
All of these HAV platforms have issues, including modeling and integrating analog modules and synthesizing and mapping the design into the emulator or FPGA prototype. Despite these challenges, HAV is the only practical way forward for today’s high-capacity, high-performance SoCs because it allows firmware and software development to begin earlier in the development cycle, including elements that are critical to the tape-out of the design, such as BootROM code validation.
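As an illustration of why BootROM code deserves this early attention, the sketch below shows a stripped-down boot flow of the sort that must be proven correct before tape-out, since the ROM cannot be patched afterward. The boot-mode register, memory map, and image header format are hypothetical, and running such code against the full SoC model is exactly the kind of workload an emulator or FPGA prototype handles well.

```c
#include <stdint.h>

/* Illustrative BootROM flow; the addresses, register layout, and image
 * header format are hypothetical and exist only to show the kind of
 * code that is validated on an emulator before tape-out. */
#define BOOT_MODE_REG  (*(volatile uint32_t *)0x10000000u)
#define FLASH_BASE     ((const uint32_t *)0x20000000u)
#define SRAM_BASE      ((uint32_t *)0x30000000u)

typedef void (*entry_fn)(void);

void boot_rom_main(void)
{
    /* 1. Decide the boot source from strap pins latched at reset. */
    uint32_t mode = BOOT_MODE_REG & 0x3u;

    if (mode == 0u) {
        /* 2. Copy the second-stage loader from flash into on-chip SRAM.
         *    Word 0 of the (hypothetical) image header holds its length. */
        uint32_t words = FLASH_BASE[0];
        for (uint32_t i = 0; i < words; i++)
            SRAM_BASE[i] = FLASH_BASE[1u + i];

        /* 3. Transfer control to the loaded image. */
        ((entry_fn)(uintptr_t)SRAM_BASE)();
    }

    /* Other boot modes (UART/USB recovery, etc.) omitted for brevity. */
    for (;;)
        ;  /* hang if no valid boot path was taken */
}
```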
Verification has evolved over the years, albeit at a slower pace than design. The latest and greatest verification environments take full advantage of developments like Portable Stimulus, which allows the same stimulus to be reused across all platforms; advances in formal algorithms that converge on the different paths in the design sooner; simulators that use AI algorithms to generate random stimulus based on coverage feedback; and the ability to run simulations in the cloud using virtual licenses for verification IP (VIP) on an “as-needed” basis, among others. As a result of all this, SoC verification is no longer a one-dimensional game.
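As a loose illustration of coverage-driven stimulus generation, the sketch below biases a random transaction generator toward coverage bins that have not yet been hit. Commercial simulators apply far more sophisticated, often ML-based, heuristics; the bins, weights, and feedback loop here are purely conceptual.

```c
#include <stdio.h>
#include <stdlib.h>

/* Conceptual coverage-driven stimulus loop: unhit coverage bins are
 * weighted more heavily when choosing the next random transaction.
 * The bin definitions and the 4x weighting are illustrative only. */
#define NUM_BINS 8

static unsigned hit_count[NUM_BINS];  /* coverage feedback from the run */

/* Pick the next burst-length bin, weighting unhit bins 4x more heavily. */
static int pick_bin(void)
{
    unsigned weights[NUM_BINS], total = 0;

    for (int i = 0; i < NUM_BINS; i++) {
        weights[i] = (hit_count[i] == 0u) ? 4u : 1u;
        total += weights[i];
    }

    unsigned r = (unsigned)rand() % total;
    for (int i = 0; i < NUM_BINS; i++) {
        if (r < weights[i])
            return i;
        r -= weights[i];
    }
    return NUM_BINS - 1;
}

int main(void)
{
    for (int txn = 0; txn < 100; txn++) {
        int bin = pick_bin();
        hit_count[bin]++;  /* in practice, fed back by the simulator */
        printf("txn %3d: burst-length bin %d\n", txn, bin);
    }
    return 0;
}
```

The same feedback loop generalizes to any coverage model: the generator reads the coverage results after each batch of tests and reshapes its constraints so that effort concentrates on the scenarios that remain unexercised.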