Experts At The Table: ESL Standards
SLD: System-level design goes faster with standards. Where are we today?
Schirrmeister: In chip design there is a difference between designing the IP and integrating the IP. There are standards for designing the IP, and going toward implementation there is the SystemC synthesizable subset, which feeds into high-level synthesis. That's one standard. When it comes to integration, you need to be able to simulate. That's where transaction-level simulation based on TLM 2.0 comes in. You need to be able to express topology, which is where the SPIRIT standard comes in. Then there are other standards for data capture. Then there is the whole embedded software world, where there are 10 or more standards for debugger integration.
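To make the interoperability point concrete, here is a minimal sketch of what TLM 2.0 communication looks like in SystemC: one initiator writing to one memory-like target through the standard generic payload and blocking transport call. The module names, the address map, and the 10 ns latency are illustrative assumptions, not anything from the roundtable.

```cpp
// Minimal TLM 2.0 sketch: an initiator and a memory-like target connected
// through the standard generic payload and blocking transport interface.
// Assumes an Accellera/OSCI SystemC installation; build flags will vary.
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_initiator_socket.h>
#include <tlm_utils/simple_target_socket.h>
#include <cstring>
#include <iostream>

struct Memory : sc_core::sc_module {
    tlm_utils::simple_target_socket<Memory> socket;
    unsigned char storage[256];

    SC_CTOR(Memory) : socket("socket") {
        socket.register_b_transport(this, &Memory::b_transport);
        std::memset(storage, 0, sizeof(storage));
    }

    // b_transport is the standard interoperability point in TLM 2.0.
    void b_transport(tlm::tlm_generic_payload& trans, sc_core::sc_time& delay) {
        sc_dt::uint64  addr = trans.get_address();
        unsigned char* ptr  = trans.get_data_ptr();
        unsigned int   len  = trans.get_data_length();

        if (addr + len > sizeof(storage)) {
            trans.set_response_status(tlm::TLM_ADDRESS_ERROR_RESPONSE);
            return;
        }
        if (trans.is_write()) std::memcpy(&storage[addr], ptr, len);
        else                  std::memcpy(ptr, &storage[addr], len);

        delay += sc_core::sc_time(10, sc_core::SC_NS);  // illustrative latency
        trans.set_response_status(tlm::TLM_OK_RESPONSE);
    }
};

struct Initiator : sc_core::sc_module {
    tlm_utils::simple_initiator_socket<Initiator> socket;

    SC_CTOR(Initiator) : socket("socket") { SC_THREAD(run); }

    void run() {
        tlm::tlm_generic_payload trans;
        sc_core::sc_time delay = sc_core::SC_ZERO_TIME;
        unsigned int data = 0xCAFEF00D;

        // Populate the standard generic payload for a 4-byte write.
        trans.set_command(tlm::TLM_WRITE_COMMAND);
        trans.set_address(0x10);
        trans.set_data_ptr(reinterpret_cast<unsigned char*>(&data));
        trans.set_data_length(sizeof(data));
        trans.set_streaming_width(sizeof(data));
        trans.set_byte_enable_ptr(nullptr);
        trans.set_response_status(tlm::TLM_INCOMPLETE_RESPONSE);

        socket->b_transport(trans, delay);  // standard call, any compliant target
        std::cout << "write status: " << trans.get_response_string() << std::endl;
    }
};

int sc_main(int, char*[]) {
    Initiator init("init");
    Memory    mem("mem");
    init.socket.bind(mem.socket);
    sc_core::sc_start();
    return 0;
}
```

Because both sides speak the standard generic payload, the initiator can be rebound to any compliant target without a custom bridge, which is the integration problem the panelists say TLM 2.0 solves.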
Perrier: For us, the standard that is really important is TLM 2.0. It has given very good results.
SLD: How is TLM 2.0 working out?
Sanguinetti: There's always a tension between current practice and standardization. If a standard is going to be successful, there's always some precursor work that existed to set it in that direction. A standard then coalesces. It's not that common to have a case like Verilog, where it became a standard. In the case of SystemC, there were several things that did the same things in the same way. TLM 2.0 is the same. People have been doing transaction-level modeling with SystemC in a non-standard way. They made their own. OSCI gets involved and comes out with a standard, and now everyone has to move to that. In a lot of cases, you can argue that the standard du jour is inferior to some of the technology.
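Sanguinetti's migration point can be illustrated with a small adapter sketch: a hypothetical pre-standard model with its own home-grown transaction API is wrapped behind a standard TLM 2.0 target socket, so the legacy code keeps working while new tools see only the standard interface. LegacyMemory and its method names are invented for illustration.

```cpp
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_target_socket.h>
#include <cstring>

// Hypothetical pre-standard model with its own, non-TLM transaction API.
struct LegacyMemory {
    unsigned char mem[256] = {};
    void write_bytes(unsigned addr, const unsigned char* d, unsigned n) {
        std::memcpy(&mem[addr], d, n);   // bounds checking omitted for brevity
    }
    void read_bytes(unsigned addr, unsigned char* d, unsigned n) {
        std::memcpy(d, &mem[addr], n);
    }
};

// Adapter: new tools bind to the standard TLM 2.0 socket; the legacy
// model keeps its proprietary interface underneath.
struct LegacyAdapter : sc_core::sc_module {
    tlm_utils::simple_target_socket<LegacyAdapter> socket;
    LegacyMemory legacy;

    SC_CTOR(LegacyAdapter) : socket("socket") {
        socket.register_b_transport(this, &LegacyAdapter::b_transport);
    }

    // Translate the standard generic payload into legacy calls.
    void b_transport(tlm::tlm_generic_payload& t, sc_core::sc_time& /*delay*/) {
        if (t.is_write())
            legacy.write_bytes(t.get_address(), t.get_data_ptr(), t.get_data_length());
        else
            legacy.read_bytes(t.get_address(), t.get_data_ptr(), t.get_data_length());
        t.set_response_status(tlm::TLM_OK_RESPONSE);
    }
};
```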
Kaiser: It takes a long time from when a new standard is introduced to when it is implemented. Last year we saw a lot of new standards, with IP-XACT and TLM 2.0, and UPF and CPF. Now they will be implemented by design teams. It seems that after this period some of these standards will converge, as we have seen with Accellera and SPIRIT. UPF and CPF will likely converge. So will some of the other ESL standards.
Schirrmeister: This is like driving an automobile. Standards bring the car from second to third gear. Proprietary techniques take you from first gear to second gear. With Synopsys there were tools in C that were merged into SystemC. It's really the second to third gear where you want to make sure the proprietary tools don't remain a niche market. With TLM, for 10 years we had the tools to do virtual platforms, but it turned out we always needed a custom interface to go from one tool to the next. We're finally moving that from second to third gear, and hopefully it's like a Ferrari, so it happens really fast. SystemC is the platform, but all the interfaces today may not have found a working group yet. We are trying to gauge user interest with proprietary work.
SLD: Frank (Schirrmeister) said the best way to get to a standard is from a collaborative base. John (Sanguinetti) said the best way is from a proprietary base. Who's right?
Sanguinetti: Most standards evolve from initial efforts that are proprietary.
Perrier: Some company has to invest at the beginning to do something. Companies don't get together and say, 'We're going to create a standard today.'
Schirrmeister: If you have enough user pressure and proprietary technology, that's when standards work. You need to get over the niche market. That's why we did SystemC back in 1998. Later we got together with companies like CoWare to drive TLM interoperability platforms. Those proprietary tools are good for the very early adopters, but once you hit the plateau you need to make sure standardization kicks in to get it into the mainstream.
Kaiser: The market doesn't like a too-perfect world. There needs to be competition between standards. That is the case with CPF and UPF, and between SystemVerilog and SystemC. Competition is required.