As Chips Scale Down, Leakage Current Goes Up. How Are Developers Responding?
Leakage current is yet another hurdle as Moore's law marches on. Manufacturing techniques, design methods, and research projects are taking on the challenge.
Scaling the transistor down to the next-smallest node often sounds like a foolproof way to improve IC performance. In reality, however, scaling introduces many hardships for IC designers. One of the most notable challenges is the increased prominence of leakage current, which significantly contributes to overall chip power consumption.
Subthreshold leakage current in an NMOS device. Image used courtesy of ResearchGate and Udaiyakumar et al.
For this reason, low-leakage techniques are becoming increasingly sought after across digital design. As a recent example, Spectral Design & Test (SDT) claimed last week that its low-leakage SRAM yielded high performance for mmWave applications, specifically addressing the large beamforming data storage needs of mmWave antenna company Mixcomm's 5G beamforming front-end ICs.
Built on a 45 nm RFSOI process, SDT's SRAM also uses a proprietary memory methodology that includes low-power retention modes based on source-biasing design techniques to minimize leakage.
How else do designers address the modern challenges of increased leakage current?
The Rise of Leakage Current
As chips scale down, leakage current (specifically subthreshold leakage) becomes a prominent source of power consumption in ICs. When developers scale down transistors, they also tend to scale down supply voltages to minimize dynamic power consumption.
However, scaling down the supply voltage hurts circuit speed. A common way to recover that speed is to lower the threshold voltage as well. The trade-off is that subthreshold leakage grows exponentially as the threshold voltage decreases, because the transistor is never fully turned off.
This is why developers have mostly stopped scaling down supply voltages: at a certain point, the increase in subthreshold leakage is not worth the decrease in dynamic power consumption.
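The exponential sensitivity of leakage to threshold voltage can be sketched with a first-order model: subthreshold current scales roughly as 10^(−Vth/S), where S is the subthreshold swing. The function name and all numeric values below are illustrative assumptions, not figures from the article.

```python
def leakage_ratio(vth_old, vth_new, swing=0.080):
    """Factor by which subthreshold leakage grows when the threshold
    voltage drops from vth_old to vth_new (both in volts).

    Assumes I_leak ~ 10^(-Vth / S) with a subthreshold swing S of
    80 mV/decade, a hypothetical but plausible room-temperature value.
    """
    return 10 ** ((vth_old - vth_new) / swing)

# Dropping Vth from 0.45 V to 0.30 V multiplies leakage by roughly 75x
# in this simple model, illustrating why Vth cannot keep shrinking.
print(f"{leakage_ratio(0.45, 0.30):.0f}x")
```

Even a modest 150 mV reduction in threshold voltage costs nearly two orders of magnitude in leakage under these assumptions, which is why the dynamic-power savings eventually stop being worth it.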
Static power consumption is becoming increasingly relevant as chips scale down. Image used courtesy of Actel
Increased integration is another reason subthreshold leakage has become so prominent. A single transistor's subthreshold leakage is often on the order of picoamps (10⁻¹² A). Once billions of transistors are integrated on one chip, however, those individually tiny currents add up significantly.
For example, if each transistor exhibited 10 pA of leakage current, and there were 10 billion transistors in the IC, the overall consumption from leakage alone would come out to 100 mA.
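The arithmetic above is a straightforward multiplication, sketched here as a quick sanity check:

```python
# Back-of-the-envelope aggregate leakage: 10 pA per transistor
# across 10 billion transistors, per the example in the text.
per_transistor_leakage = 10e-12   # 10 pA, in amps
transistor_count = 10e9           # 10 billion transistors

total_leakage = per_transistor_leakage * transistor_count
print(f"{total_leakage * 1e3:.0f} mA")   # -> 100 mA
```

A tenth of an amp drawn continuously, even with the chip idle, is a meaningful drain on any battery-powered device.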
Another technique that semiconductor engineers have developed to minimize the effects of leakage current is silicon-on-insulator (SOI) technology.
Conventional bulk MOSFET (left) vs. fully-depleted SOI (right). Image used courtesy of STMicroelectronics
In SOI designs, a buried insulator layer sits directly beneath the transistor's thin silicon body, isolating it from the bulk substrate. This isolation turns out to have many benefits, including lower parasitic capacitance between the source and the drain.
Most relevant to this conversation, SOI techniques also significantly reduce leakage current: the buried insulator confines conduction to the channel between source and drain, cutting off leakage paths through the substrate.
Low Leakage as a Popular Value Proposition
Given the relationship between power consumption and leakage current, it's no surprise that many companies and researchers across the industry tout low leakage as a key value proposition in product releases. For instance, Magnachip Semiconductor recently announced an LDO linear regulator that is said to draw a low leakage current of 75 µA in deep sleep mode, extending battery life and improving efficiency.
The new Magnachip LDO linear regulator is built for UFS-based multi-chip packages in smartphone designs. Image used courtesy of Magnachip Semiconductor
Additionally, researchers at the École Polytechnique Fédérale de Lausanne (EPFL) and IBM Research Europe recently created what they describe as the first silicon-based hybrid device merging conventional MOSFETs with III-V tunnel FETs. Participating researcher Clarissa Convertino reported to TechXplore, "Tunnel FETs provide lower leakage and good performance at low voltage levels, while MOSFETs are faster (at the same dimension and bias) and provide greater current drive."
She continues, "The developed fabrication flow is identical for both devices except for a single masking and epitaxy step, opening up for manufacturing of truly hybrid logic blocks."
How do you safeguard against leakage current in your designs? Have you seen this well-established design principle become a more pressing conversation in recent years? Share your thoughts in the comments below.