Significant changes ahead in analog IC design

Recently, the technology publication Semiconductor Engineering sat down with Mo Faisal, CEO of Movellus; Hany Elhak, executive director of product management at Synopsys; Cedric Pujol, product manager at Keysight Technologies; and Pradeep Thiagarajan, principal product manager for custom IC verification at Siemens EDA, for an in-depth discussion of how heterogeneous integration affects analog simulation tools and how it is changing the design process. What follows are excerpts of that conversation.

SE: With heterogeneous integration, co-design is no longer just about analog and digital; it also encompasses packaging, interconnect, and data movement. What impact does this have on analog design?

Elhak: Analog and digital co-design is still a key element, letting digital place-and-route and analog layout talk to each other. But you also need to extend that to the package and the interposer.

Pujol: That's why flexibility is key across all workflows. But most importantly, we are now dealing with a new generation of engineers trained in Python and other languages that may be less common in traditional flows, and they want to bring that Python knowledge into chip design. Some customers have told us, 'We have 100 EDA engineers who can develop flows; you need to adapt to them so we can optimize on top.' As an EDA vendor, we therefore need to provide flexibility so we can fit into their workflow. They want to own it. They will rely on our optimizer to do roughly 80% of the work, but for certain critical problems they want to apply their own knowledge, because they know that is where they can make a difference in reducing heat and lowering power.

Thiagarajan: There is another, larger dependency we haven't mentioned yet. During development, analog designers face the challenge of an evolving foundry PDK (process design kit). A designer may start with PDK v0.5 and design a perfect power amplifier, validated across every corner with great performance, and then receive PDK v0.6 or v0.7, in which something in the device models has changed. The next time they simulate, everything is off. Design companies are trying to accelerate time to production, so they start designing early. But on advanced nodes the technology can change even after v1.0, and the lost yield shows up directly in the analog design, where it looks like an analog failure. There are huge dependencies in the ecosystem, and I don't know how to solve them.

SE: In the past, analog designers resisted EDA tools because the tools didn't bring them significant benefits. Has that changed?

Elhak: There has indeed been a significant change, and it is happening at traditional analog IC companies. Many readers of this publication may ask, 'Is analog really modernizing?' but it is a fact. These companies are moving from in-house simulators, reliability analysis environments, and variation analysis tools to commercial EDA. The reason is that advanced nodes introduce problems the legacy tools cannot handle, including new device models, variation, and the different kinds of reliability issues that come with finFETs and advanced nodes. The cost of updating and maintaining those in-house tools keeps rising. I have personally witnessed several of these transitions.

Thiagarajan: That is precisely why partnerships between the large EDA companies and the foundries early in the development cycle are so helpful.

Pujol: That's exactly right. In-house tools have been used for a long time. Ten years ago, most of the problems we saw in RF involved extracting critical paths. Back then we were talking about three or four nodes and fewer than 10 ports. Five years ago that number jumped to perhaps 60 ports, then 200. Now we need to extract more than 1,000 ports for ground. And we haven't even talked about truly high frequencies yet; that might be just 28 GHz. Frequencies will quickly reach 300 GHz, even 1 THz, and the situation will be worse still. In-house flows cannot handle these problems effectively. You need to rely on databases with traceability and other capabilities, and that is where the difficulty lies. Teams want Python-based optimizers but still rely on EDA tools. So they ask vendors to provide APIs in their tools so they can put their own pieces into the GUI. But because everything is changing so quickly, they still need to rely on the EDA tools, since it has become too hard to keep using their in-house tools.

Faisal: We have a talent shortage in analog design. Not enough new analog designers are being trained or graduating, while demand for analog design keeps growing. So how do we solve this? There aren't enough analog engineers in the world, and not enough newcomers are even interested in becoming electrical engineers and analog designers. On the other hand, good analog designers do hand calculations, know the expected result, and then validate it with the tools. That can be done at the sub-block level, though not at the system level. At the critical-block level, they typically know the expected result within an error range and then simulate to validate it. If they can't do that, you end up with fresh engineers who only know how to run sweeps and then iterate by trial and error. In the lab that can be a virtue, but trial and error during design, without knowing the direction, can waste a lot of time and resources.
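The "hand calculation first" discipline Faisal describes can be sketched as a toy example. Everything below is illustrative and not from the conversation: the gain formula is the textbook single-stage estimate, and the "simulated" value is invented.

```python
import math

def expected_gain_db(gm_s: float, rout_ohm: float) -> float:
    """Hand calculation: single-stage amplifier gain |Av| = gm * Rout, in dB."""
    return 20 * math.log10(gm_s * rout_ohm)

def within_band(simulated_db: float, expected_db: float, tol_db: float) -> bool:
    """Validate the simulator against the hand estimate, not the other way around."""
    return abs(simulated_db - expected_db) <= tol_db

# Hand estimate: gm = 2 mS, Rout = 20 kOhm -> |Av| = 40, i.e. about 32 dB.
target = expected_gain_db(2e-3, 20e3)
# Suppose the simulator reports 31.1 dB (a made-up number).
print(within_band(31.1, target, tol_db=2.0))  # → True
```

The point is the direction of trust: the sweep confirms the estimate, rather than the engineer fishing through sweeps for a number that looks plausible.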

SE: So where does most of the time go in analog design? Is it up front, or is it still in verification? And how does that compare with digital design?

Elhak: With the changes we are seeing today, the workload at each stage has shifted. Traditionally, analog design moved quickly through the pre-layout stage: I design the schematic, simulate it to verify the design is correct, then do the layout and extract parasitics. If there are problems, I fix them, run more full simulations, and I'm done. On advanced nodes today, the design parameters are on the same order of magnitude as the parasitics. It's not only that the number of parasitics has exploded; their importance has grown as well. It's no longer a matter of the result shifting by 5% or 10%; the parasitics can change the behavior of the circuit, because the transistors are so small that the design parameters and the parasitics are comparable. So there is a huge difference between pre-layout and post-layout simulation, and the design cannot be done the traditional way. You cannot verify the circuit before you have the layout, and that changes how layout should proceed. It must be done incrementally. You need to estimate parasitics. You need to verify as you design. Traditionally, design went quickly and then a great deal of time was spent on verification. That is changing now: design time is growing, and verification happens as the design progresses.
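Elhak's point that parasitics now change behavior rather than just nudging results can be illustrated with a back-of-the-envelope pole calculation. All component values here are invented for illustration; the idea is simply that when an extracted wire capacitance is the same order as the intended load, the post-layout bandwidth lands nowhere near the pre-layout estimate.

```python
import math

def pole_mhz(r_ohm: float, c_farad: float) -> float:
    """Single-pole bandwidth f = 1 / (2 * pi * R * C), in MHz."""
    return 1.0 / (2 * math.pi * r_ohm * c_farad) / 1e6

R = 10e3           # 10 kOhm output resistance (design parameter, illustrative)
C_design = 20e-15  # 20 fF intended load
C_par = 15e-15     # 15 fF extracted wire parasitic, same order as the design value

pre = pole_mhz(R, C_design)           # pre-layout estimate
post = pole_mhz(R, C_design + C_par)  # post-layout reality

print(f"pre-layout: {pre:.0f} MHz, post-layout: {post:.0f} MHz")
# The pole drops by roughly 43%, a behavioral change, not a 5-10% correction.
```

A shift of that size moves poles past zeros and can flip stability margins, which is why estimated parasitics have to be in the loop during design rather than checked once at the end.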

Thiagarajan: It has to go to post-layout, and you need to go further. At post-layout you must do a full power and ground extraction. That is essential, because in this era the voltages in the circuit truly matter. Say you are designing a circuit on a 1V supply, and you have done a perfect design with full block-level simulation. Then you move up to the next level of hierarchy, and guess what? Your C4 bump may be at a completely different point. There is so much IR drop that by the time you get to that circuit, it no longer sees 1V. So block-level layout validation is necessary, but you must pull EM and IR analysis much earlier. Usually in the design cycle people finish the schematic, pass, do the layout, pass, and then run EM/IR analysis as they approach tapeout. At that point they discover problems, and it becomes a race against time. You must build EM/IR analysis into your schedule to make sure your circuit's supply voltage is really there, and that nothing breaks because of an interconnection with another block the designer doesn't own.
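The IR-drop arithmetic behind Thiagarajan's 1V example reduces to Ohm's law across the power grid. The numbers below are invented for illustration; real grids are extracted networks, not a single resistor, but the headroom loss works the same way.

```python
def supply_at_block(vdd: float, i_block_a: float, r_grid_ohm: float) -> float:
    """Effective supply after the ohmic drop V = I * R from the bump to the block."""
    return vdd - i_block_a * r_grid_ohm

vdd = 1.0       # nominal 1 V at the C4 bump
i_block = 0.15  # 150 mA drawn by the block (illustrative)
r_grid = 0.4    # 0.4 ohm effective grid resistance to the block (illustrative)

v_eff = supply_at_block(vdd, i_block, r_grid)
print(f"{v_eff:.2f} V at the block")  # 0.94 V: 6% of the supply gone before signoff
```

A block verified standalone at exactly 1V never saw that 60 mV, which is why the EM/IR run has to happen early enough to leave time to fix the grid.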

Elhak: That's a great point. Power delivery networks (PDNs), for example, are getting larger and therefore harder to simulate. But it's not just that. As you said, the PDN is not merely a source of electromigration and IR-drop problems. It is actually changing the behavior of the design. It is a very large parasitic network, and you have to account for it even in the circuit's functionality. So it's not just that it's bigger and we need to simulate it longer; we need to simulate it more often than before. Previously this was a signoff matter: I run my power-integrity simulation, and that's when I need the PDN. Now it is part of the design and must be simulated with the design. How to accelerate traditional transient simulation in the presence of large PDNs, and do it accurately together with the design rather than with the two-step approach typical of EM/IR, is a key technology shift today. You can use GPUs to simulate the PDN, for example. All these technologies are about accelerating simulation, not just because the PDN is bigger. A larger PDN is a given, since both the number of parasitics and the circuit itself are larger. But we must simulate it from the start of design all the way to signoff.
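Why simulating the circuit together with the PDN is expensive can be sketched with a toy model. This is my own illustration, not any vendor's engine: a 3-node RC ladder stands in for the grid, a current step stands in for the circuit, and backward Euler requires a linear solve at every timestep; a real PDN has millions of such nodes, which is what the GPU acceleration Elhak mentions targets.

```python
import numpy as np

n = 3                  # PDN nodes between the supply pin and the load (toy size)
R, C = 0.2, 1e-9       # 0.2 ohm per segment, 1 nF decap per node (illustrative)
vdd, i_load = 1.0, 0.2 # 1 V supply; 200 mA load step drawn at the far node
dt, steps = 1e-10, 400

g = 1.0 / R
# Conductance matrix of the ladder; node 0 ties back to the supply pin.
G = np.zeros((n, n))
for k in range(n):
    G[k, k] += g              # segment toward the supply (or the previous node)
    if k + 1 < n:
        G[k, k] += g          # segment toward the next node
        G[k, k + 1] -= g
        G[k + 1, k] -= g

A = G + (C / dt) * np.eye(n)  # backward-Euler system matrix (constant here)
v = np.full(n, vdd)           # start with the grid fully charged

for _ in range(steps):        # one linear solve per timestep
    b = (C / dt) * v          # capacitor history term
    b[0] += g * vdd           # supply pin drives node 0
    b[-1] -= i_load           # load current drawn at the far node
    v = np.linalg.solve(A, b)

# DC steady state: i_load * R = 40 mV drop per segment, three segments deep.
print(f"far-node supply settles near {v[-1]:.3f} V")
```

Even in this toy, the far node ends up around 0.88 V rather than 1 V, and every transient point of the circuit simulation has to carry the grid solve along with it.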

Pujol: Design is now layout-driven. The schematic may be fine, but in many cases it is almost useless on its own. We used to have some voltage margin; now we don't. Supply voltages keep dropping, and you have to account for that. If you only draw schematics, it's nearly a dead end. We talked about the need for excellent analog designers. They need to think about the layout and how it will be built. If all you have is a small schematic, what it predicts will be completely different from what you ultimately get. One thing that's missing is knowledge transfer. We have plenty of EDA tools, but no knowledge-transfer tools. You don't just take a pile of schematics and layouts and port them node to node. There are great tools today that can build that node-to-node migration and optimize it with AI, but the knowledge itself isn't transferred. This will get worse. As voltage headroom and timing margin shrink, everything becomes more complex. We have dealt with this in RF for years. RF engineers know what to avoid when placing things in a schematic, because of coupling and all those issues. That is the approach we need to adopt.

Faisal: This is a very big problem. I would broaden 'knowledge transfer' to 'experience transfer.' Much of our design intuition comes from pain. It comes from the struggle of things not converging in the lab and in simulation. The more automation we have, the further removed we are from the real problems. So there is a danger that a new generation of engineers grows up in a world of social media and automation believing everything works, while in the real world the results turn out completely different. Once you live through a chip failure, you know your schematic simulation was actually lying to you. Some of this is hard to teach. People have to experience it firsthand.

Pujol: We need that kind of environment for co-design. Without knowledge transfer, it's a black box. And time to market isn't getting longer; it only keeps shrinking.