Abstract:
System-on-chip (SoC) processor cores experience high-frequency supply voltage (VDD) droops when the current in the power delivery network abruptly changes in response to workload variations, thus degrading performance and energy efficiency. Previous adaptive circuit techniques aim to reduce the effects of VDD droops by sensing the VDD variation with an on-die monitor and adjusting the clock frequency (FCLK) [1-2] or by directly modulating the phase-locked loop (PLL) clock output with changes in the core VDD to implicitly adapt FCLK [3]. The adaptive response time and complex analog circuits limit the benefits of these techniques across a wide range of FCLK and VDD operating conditions. The adaptive clock distribution (ACD) [4-5] exploits path clock-data delay compensation during a VDD droop to provide sufficient response time to proactively adapt FCLK. Although the ACD mitigates the impact of VDD droops on performance and energy efficiency, the previous designs require extensive post-silicon tester calibration of the dynamic variation monitor (DVM) to accurately detect the onset of a VDD droop. Since SoC cores operate across a wide range of FCLK, VDD, temperature, and process conditions, the DVM requires a unique calibration for each operating point, resulting in prohibitively expensive test time for high-volume products. This paper describes an ACD design in a 16nm [6] test chip with an auto-calibration circuit that enables in-field, low-latency tuning of the DVM across a wide range of operating conditions to maximize the ACD benefits, while eliminating the costly overhead of tester calibration.
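To make the calibration problem concrete, the sketch below models, in software, the kind of per-operating-point DVM tuning that would otherwise require a tester: sweep a tunable delay code at the current FCLK/VDD point until the monitor first flags a violation, then back off by a guard band. This is a minimal illustration only; the names (dvm_error_flag, DVM_CODE_MAX, GUARD_BAND) and the slack model are hypothetical assumptions, not the paper's hardware auto-calibration circuit, which performs this tuning on-die.

/*
 * Illustrative sketch: software stand-in for per-operating-point DVM tuning.
 * All identifiers and numeric values are assumptions for illustration only.
 */
#include <stdio.h>
#include <stdbool.h>

#define DVM_CODE_MAX   63   /* assumed 6-bit tunable-delay code range       */
#define GUARD_BAND      2   /* assumed safety margin, in delay-code steps   */

/* Crude model of the on-die monitor: the error flag asserts once the      */
/* programmed delay exceeds the timing slack at the current FCLK/VDD point. */
static int slack_at_operating_point = 41;   /* arbitrary model value        */

static bool dvm_error_flag(int code)
{
    return code > slack_at_operating_point;
}

/* Sweep the delay code upward until the DVM first flags a violation, then */
/* back off by a guard band; the result approximates the per-operating-    */
/* point setting that tester calibration would otherwise have to supply.   */
static int dvm_auto_calibrate(void)
{
    int code;
    for (code = 0; code <= DVM_CODE_MAX; code++) {
        if (dvm_error_flag(code))
            break;                          /* first failing code found     */
    }
    int calibrated = code - 1 - GUARD_BAND;
    return calibrated < 0 ? 0 : calibrated;
}

int main(void)
{
    printf("calibrated DVM code: %d\n", dvm_auto_calibrate());
    return 0;
}

Because the slack changes with FCLK, VDD, temperature, and process, a tester would have to repeat this search for every operating point of every die; running the equivalent loop in-field with on-die hardware is what removes that test-time cost.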
Published in: 2015 IEEE International Solid-State Circuits Conference - (ISSCC) Digest of Technical Papers
Date of Conference: 22-26 February 2015
Date Added to IEEE Xplore: 19 March 2015