Choosing the Right RISC-V Core

This article is compiled from SemiEngineering by Semiconductor Industry Observation (WeChat ID: ICVIEWS).


There is a long way to go to improve the RISC-V ecosystem.

As more companies take an interest in devices based on the RISC-V ISA, and more cores, accelerators, and infrastructure components are offered commercially or as open source, end users face an increasingly difficult challenge: how to make sure they are making the best choice.

Each user is likely to have a set of needs and concerns almost as varied as the flexibility of RISC-V itself, extending well beyond traditional PPA metrics to safety, security, and quality considerations. This may include the adaptation of verification collateral, so that architecture extensions can be accompanied by the verification they require.

Traditionally, three levels of prototyping have been deployed: virtual prototyping, emulation, and FPGA prototyping, including hybrids between them. Each platform is then used for various purposes, including software verification, architectural verification, hardware functional verification, and performance analysis.

The design and software ecosystem for RISC-V is growing, but the configuration and verification ecosystem is lagging behind, and the industry needs to build new technologies to close the gap. The flexibility of RISC-V presents verification challenges well beyond anything required for a fixed processor. RISC-V not only makes the co-development of hardware and software possible, it makes it necessary.


Co-development of hardware and software

In the past, the hardware was selected first and the software to run on it was developed afterward. With RISC-V, the hardware is typically driven by the software. "The first thing you have to choose is which standard RISC-V options you want," said Simon Davidmann, founder and CEO of Imperas Software. "The RISC-V feature set currently has 200 or 300 options. How do you know if an algorithm will benefit from a floating-point unit, SIMD, a hardware multiplier, or even a vector engine? You have to work that out for the type of application or workload you want the processor to run: which hardware capabilities will be needed, and which can be afforded. That becomes a challenge in itself."
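One way to ground that kind of decision, sketched below under stated assumptions, is to time a representative kernel and compare builds with different extension sets (for example, `-march=rv32imac` against `-march=rv32imafc` to judge the value of hardware floating point). The kernel, build flags, and counter usage here are illustrative, not taken from the article:

```cpp
// Minimal sketch: time a candidate kernel via the RISC-V cycle counter CSR.
// Build the same file twice with different -march strings and compare the
// reported cycle counts. Assumes a RISC-V target with user-mode counter
// access enabled; newer toolchains may need Zicntr listed in -march.
#include <cstdio>

static inline unsigned long read_cycles() {
    unsigned long c;
    asm volatile("rdcycle %0" : "=r"(c));  // XLEN-wide cycle counter
    return c;
}

// Stand-in workload: does this loop benefit from an FPU or SIMD unit?
float dot(const float* a, const float* b, int n) {
    float acc = 0.0f;
    for (int i = 0; i < n; ++i) acc += a[i] * b[i];
    return acc;
}

int main() {
    static float a[1024], b[1024];
    for (int i = 0; i < 1024; ++i) { a[i] = i * 0.5f; b[i] = 1024.0f - i; }

    unsigned long t0 = read_cycles();
    volatile float r = dot(a, b, 1024);    // volatile keeps the result live
    unsigned long t1 = read_cycles();

    std::printf("result=%f cycles=%lu\n", (double)r, t1 - t0);
    return 0;
}
```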

Prototypes are needed to make these types of tradeoffs. "If the designer's goal is to evaluate performance and fitness for purpose, then virtual prototyping is the only viable option," said Steve Roddy, Quadric's chief marketing officer. "Building a hardware prototype takes 10 to 50 times longer than creating a SystemC model of a subsystem or an entire SoC. A SystemC virtual prototype often runs fast enough to answer, within acceptable accuracy, performance questions such as what frames-per-second throughput can be achieved with this processor core, or what the peak and average bandwidth requirements are for feature X."
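As a rough picture of what such a model looks like, here is a minimal, loosely timed SystemC sketch that answers a frames-per-second question from an assumed cycles-per-frame budget. The module, clock frequency, and workload numbers are invented placeholders, not measurements from any real core:

```cpp
// Loosely timed SystemC sketch: estimate sustainable frame rate from an
// assumed per-frame cycle budget. All numbers are illustrative placeholders.
#include <systemc.h>

SC_MODULE(FrameProcessor) {
    static constexpr double kClockMHz       = 800.0;  // assumed core clock
    static constexpr double kCyclesPerFrame = 6.0e6;  // assumed workload
    unsigned frames_done = 0;

    SC_CTOR(FrameProcessor) { SC_THREAD(run); }

    void run() {
        // Loosely timed: one wait() per frame, not per instruction.
        sc_time per_frame(kCyclesPerFrame / kClockMHz, SC_US);
        while (true) {
            wait(per_frame);
            ++frames_done;
        }
    }
};

int sc_main(int, char*[]) {
    FrameProcessor proc("proc");
    sc_start(sc_time(1, SC_SEC));   // simulate one second of model time
    std::cout << "Estimated throughput: " << proc.frames_done
              << " frames/sec" << std::endl;
    return 0;
}
```

Because time advances in frame-sized steps rather than cycle by cycle, a model like this runs orders of magnitude faster than RTL simulation while still answering the throughput question.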

Getting the accuracy right can be difficult. "It's all about accuracy and the ability to build models quickly," said Frank Schirrmeister, vice president of IP solutions and business development at Arteris. "The correct accuracy depends on the question being asked, and deriving those requirements is not trivial. If you are an ASIP provider, you will be able to generate models from whatever template you have. Depending on the problem, you may need pipeline accuracy, or you may need memory accuracy. It doesn't need to be completely accurate, but CAD departments are always afraid of giving the wrong answers to these questions."

But accuracy is a tradeoff with speed. "While some virtual prototypes are cycle accurate, they often run too slowly to achieve the necessary software throughput," said Imperas' Davidmann. "The highest-performance virtual prototypes are not performance engines, because they don't model the processor pipeline. They look at it from a software perspective, where you can compile code and run it on the modeled hardware, and you can look at instruction counts or approximate timing estimates to get an idea of approximate performance. That should be enough to make this architectural decision."
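The arithmetic behind such an estimate is simple enough to show directly. The sketch below assumes you take an instruction count from the virtual prototype and supply a guessed CPI and clock frequency yourself, since the instruction-accurate model knows neither:

```cpp
// Back-of-envelope timing estimate from an instruction-accurate model:
// runtime ~= instructions * CPI / clock. CPI and clock are assumptions.
#include <cstdio>

double estimate_seconds(double instructions, double cpi, double clock_hz) {
    return instructions * cpi / clock_hz;
}

int main() {
    // Hypothetical numbers: 2e9 retired instructions, CPI 1.3, 1 GHz core.
    std::printf("~%.2f s\n", estimate_seconds(2e9, 1.3, 1e9));
    return 0;
}
```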

It usually requires several prototypes. "We typically do prototypes for two reasons," said Venki Narayanan, senior director of software and systems engineering for Microchip Technology's FPGA business unit. "One is architectural and functional verification, to make sure we meet all performance metrics and requirements. The other is embedded software and firmware development. Verification requires different levels of prototyping techniques, most commonly emulation platforms built on our own FPGAs for architectural and functional verification. We also use architectural models such as QEMU to build virtual platforms for performance verification and embedded software development."

The number of possibilities is increasing. "There are a number of ways companies can use RISC-V for prototyping today," said Mark Himelstein, chief technology officer at RISC-V International. "These range from manufacturer-grade single-board computers to enterprise Linux-capable motherboards. Emulation environments such as QEMU allow developers to develop software before the hardware is complete. Options range from embedded SoCs (from companies like Espressif and Telink) to FPGAs (from companies like Microsemi), with off-the-shelf parts everywhere in between, to the forthcoming Horse Creek development boards from Intel and SiFive."

Back to the performance/accuracy tradeoff. "Physical prototypes require more design work because they connect and synthesize true RTL, but they offer higher accuracy and throughput," said Quadric's Roddy. "Physical prototyping in an FPGA system, whether developed in-house or bought from a large EDA company, takes effort to implement. But it runs an order of magnitude faster than a SystemC model and several orders of magnitude faster than a full gate-level simulation. Design teams typically move from a C-based model during the IP selection process to a physical model after IP selection, both to validate the actual design and as a platform for system software development."

Once you know which feature sets you want in your hardware, you can see if someone has already created a solution that meets most of your needs. "With all the vendors out there, it's quite possible there will be a commercial solution that has the kind of thing users are looking for," Davidmann said. "But with RISC-V, that solution doesn't have to be accepted as-is. A big part of the value of RISC-V is the freedom to change, modify, and add as many different things as you want."


Choose an implementation

There are many ways to implement a set of functions, such as the number of pipeline stages or speculative execution. Each makes a different trade-off between power, performance, and area. "The ISA flavor, whether it's RISC-V, Arm, Cadence's Xtensa, or Synopsys' ARC, doesn't really affect the modeling and prototyping goals and tradeoffs," Roddy said. "Regardless of processor brand, system architects need to answer questions about SoC design goals. On a technical level, the RISC-V trend is still stabilizing in the market relative to modeling and profiling tool support. There are many competing core suppliers, each with different implementations and processor characteristics. As a main system CPU, RISC-V does not have as long a history as Arm, so there are fewer well-proven ecosystem players in the EDA space, and less out-of-the-box modeling support for the off-the-shelf cores from the various RISC-V vendors. As a configurable, modifiable core, the RISC-V world lags behind the level of instruction-set automation that Tensilica spent 25 years building. Therefore, RISC-V has less modeling support as an off-the-shelf building block and less automation as a platform for instruction-set experimentation."

This is only one aspect of the implementation that needs to be evaluated. What is its quality? How do you re-verify it if you want to modify it?

Performance is the easiest of these to evaluate. "It's no different than choosing any traditional processor vendor," Davidmann said. "They'll tell you how many Dhrystones per watt this core delivers, and they'll give you typical processor benchmarking numbers that show how fast this microarchitecture is. People familiar with the data will talk to the vendors and get that information, and many of the options may be documented in the datasheet on the vendor's website."
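For readers comparing datasheets, the usual reduction of a Dhrystone figure to DMIPS and DMIPS/MHz is shown below. The constant 1757 Dhrystones per second is the standard VAX 11/780 baseline; the score and clock used here are made up for illustration:

```cpp
// Convert a datasheet Dhrystone score to DMIPS and DMIPS/MHz.
// 1757 Dhrystones/second is the VAX 11/780 reference machine's score.
#include <cstdio>

int main() {
    double dhrystones_per_sec = 2.6e6;   // hypothetical datasheet figure
    double clock_mhz          = 1000.0;  // hypothetical core clock

    double dmips = dhrystones_per_sec / 1757.0;
    std::printf("DMIPS: %.1f, DMIPS/MHz: %.2f\n", dmips, dmips / clock_mhz);
    return 0;
}
```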

At this level, you probably need cycle accuracy. "Most people put it into an emulator and run enough data through it to make a reasonable decision," Schirrmeister said. "I don't think there's going to be an upgrade to virtual prototyping any time soon. Some companies are talking about FPGA prototyping, where they can have their own single-board solution. Depending on the question you need to answer, you can decide to configure it, generate it, then pump it into the FPGA to run more data through it, with the appropriate software routines on top of it. The industry has quick enough access to emulation and prototyping to make this possible. The basic problem is to base the decision on data that is as accurate as possible, but at the point where you are trying to make that decision, data that accurate may not exist."

Many of these prototypes must contain more than just a processor. "Virtual platforms provide the ability to integrate with other external physical hardware functions, such as memory and sensors, operating in a real environment," Microchip's Narayanan said. "Hybrid systems can combine a virtual platform with physical prototyping of the other external functions. FPGA emulation and prototyping can help find timing-related bugs, such as race conditions, because they are more accurate and the external functions run at speed."


Verification

Because processor design has long been done in-house, there is no public verification ecosystem for building processors, and the flexibility of RISC-V demands more adaptable verification solutions than ever before. That creation has only just begun.

"There are industry metrics like Dhrystones or CoreMark, so people can compare performance," Davidmann said. "But how can you compare verification quality? There needs to be a level playing field, and there needs to be some quality metrics for verification."

This is where the open-source movement can help. "If you look at the RISC-V ecosystem, there are a lot of very experienced processor developers," Schirrmeister said. "There are two extremes. At one, I get a core from a vendor, and if it doesn't work, it's their fault. At the other, I have complete freedom to do everything myself. The balance develops somewhere between those two extremes. What you get is a certain amount of verification from your provider, and anything beyond that is your own responsibility."

This is where metrics come in. "ISA compatibility is only the first rung on a ladder of complexity that only a few companies have climbed," said Dave Kelf, CEO of Breker Verification Systems. "Prototyping may be the only way to fully ensure reliable operation of a processor, but driving these prototypes with real workloads only scratches the surface of real processor coverage, and verification efforts across competing vendors are inconsistent."

What are these metrics? "In the OpenHW quality group, we're trying to figure out what those metrics should be," Davidmann said. "This includes things like functional coverage, because it's more than a simple statement. For a high-quality processor, you need a lot more than that. There needs to be a way of verifying it so that you can be confident your comparison against the reference model covers everything. Functional coverage just shows that you have tests in place, but it has to be combined with some method of comparison against a known reference. We will be adding fault-injection techniques to be able to determine whether your testbench actually detects a problem."
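The comparison against a reference that Davidmann describes is commonly structured as lock-step checking: after every retired instruction, the device under test's architectural state is compared with a reference simulator's. The sketch below is a generic, invented harness (the types and hooks are hypothetical, not OpenHW's actual infrastructure), with a deliberately injected fault to show how fault injection qualifies the checker itself:

```cpp
// Generic step-and-compare sketch: after each retired instruction, compare
// the DUT's architectural state with a reference model's. All types and
// hooks are invented for illustration; real flows define their own interfaces.
#include <cstdint>
#include <cstdio>

struct ArchState {        // minimal architectural state to compare
    uint64_t pc;
    uint64_t x[32];       // integer register file
};

bool states_match(const ArchState& dut, const ArchState& ref) {
    if (dut.pc != ref.pc) return false;
    for (int i = 1; i < 32; ++i)       // x0 is hard-wired to zero
        if (dut.x[i] != ref.x[i]) return false;
    return true;
}

// Hypothetical per-instruction hook called by the testbench.
bool check_step(const ArchState& dut, const ArchState& ref, uint64_t step) {
    if (!states_match(dut, ref)) {
        std::fprintf(stderr, "mismatch at step %llu\n",
                     (unsigned long long)step);
        return false;
    }
    return true;
}

int main() {
    ArchState ref{0x80000000u, {}};
    ArchState dut = ref;
    dut.x[5] ^= 1;        // injected fault: flip one register bit
    // A healthy testbench must flag this; silence would mean the checker
    // itself is broken, which is exactly what fault injection exposes.
    return check_step(dut, ref, 0) ? 1 : 0;
}
```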


This requires a set of tools. "As the RISC-V ecosystem matures, commercial implementations are starting to support defined market segments," said Ashish Darbari, founder and CEO of Axiomise. "We see support for markets that require functional-safety compliance, such as automotive. We see support for IoT, which requires security. RISC-V vendors are investing in advanced verification technologies, including virtual prototyping for architectural modeling and performance. Tools are now available for early adoption of formal methods to eliminate bugs, and to avoid introducing bugs early in the design process, because corner-case bugs on the processor-memory interface are difficult for designers to catch through simulation."
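As one concrete example of the processor-memory interface properties Darbari alludes to, consider the invariant that every response must correspond to an outstanding request. In a formal flow this would be written as an assertion and proven exhaustively; purely for illustration, the hypothetical sketch below states the same invariant as a runtime checker:

```cpp
// Processor-memory interface invariant, expressed as a runtime checker for
// illustration only (a formal tool would prove the equivalent assertion
// exhaustively). Interface fields and the in-order assumption are invented.
#include <cstdint>
#include <deque>
#include <cassert>

struct MemTxn { uint64_t addr; uint32_t id; };

class InterfaceChecker {
    std::deque<MemTxn> outstanding_;   // in-flight requests, oldest first
public:
    void on_request(const MemTxn& req) { outstanding_.push_back(req); }

    void on_response(uint32_t id) {
        // No response without a matching outstanding request, and (for an
        // in-order interface) responses must arrive oldest-first.
        assert(!outstanding_.empty() && "response with no outstanding request");
        assert(outstanding_.front().id == id && "out-of-order response");
        outstanding_.pop_front();
    }
};
```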

One of the necessary tools is the ability to generate test cases from a feature list or set of functions. "Automatically generating test content to drive these prototypes, while keeping the verification complexities in mind, is key," said Breker's Kelf. "These generation mechanisms are now starting to appear on the market."
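A toy illustration of feature-aware generation, assuming an invented configuration structure rather than any commercial tool's interface, is to constrain the instruction mix by the extensions the core actually implements:

```cpp
// Toy feature-aware test generation: only emit instruction forms that the
// configured core implements. Real constrained-random generators are far
// richer; this shows only the idea of driving generation from a feature list.
#include <cstdio>
#include <random>
#include <string>
#include <vector>

struct CoreConfig {    // which standard extensions are enabled (invented)
    bool M = true;     // integer multiply/divide
    bool F = false;    // single-precision floating point
    bool V = false;    // vector extension
};

std::vector<std::string> legal_ops(const CoreConfig& cfg) {
    std::vector<std::string> ops = {"add", "sub", "lw", "sw", "beq"};
    if (cfg.M) { ops.push_back("mul");    ops.push_back("div"); }
    if (cfg.F) { ops.push_back("fadd.s"); ops.push_back("fmul.s"); }
    if (cfg.V) { ops.push_back("vadd.vv"); }
    return ops;
}

int main() {
    CoreConfig cfg;    // hypothetical RV32IM target
    std::mt19937 rng(42);
    auto ops = legal_ops(cfg);
    std::uniform_int_distribution<size_t> pick(0, ops.size() - 1);
    for (int i = 0; i < 10; ++i)   // emit a 10-instruction stream skeleton
        std::printf("%s ...\n", ops[pick(rng)].c_str());
    return 0;
}
```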


Conclusion

An ecosystem is only as strong as its weakest part, which for RISC-V is the EDA toolchain. This is the case for two reasons. First, until recently there was no commercial market for processor verification tools. Where such tools existed in the past, they either disappeared or were absorbed into traditional processor companies. Second, the flexibility of the RISC-V ISA creates a new approach to system-level optimization that requires a new set of tools. It will take time for the industry to understand this opportunity, and for commercial tools that properly address it to emerge.

