Abstract:
We propose a novel scalable Bayesian optimization method with truncated subspace sampling (tSS-BO) to tackle high-dimensional optimization challenges in large-scale analog circuit sizing. To address these challenges, we propose subspace sampling subject to a truncated Gaussian distribution. This approach limits the effective sampling dimensionality to a constant upper bound, independent of the original dimensionality, significantly reducing the complexity associated with the curse of dimensionality. The distribution covariance is iteratively updated using a truncated flow, in which approximate gradients and center steps are integrated with decaying prior subspace features. We introduce gradient sketching and local Gaussian process (GP) models to approximate gradients without additional simulations and to mitigate systematic errors. To enhance efficiency and ensure compatibility with constraints, we use local GP models to select promising candidates, avoiding the cost of acquisition function optimization. The proposed tSS-BO method exhibits clear advantages over state-of-the-art methods in experimental comparisons. On synthetic benchmark functions, tSS-BO achieves up to 4.93× evaluation speedups and over 30× reduction in algorithm complexity compared with the Bayesian baseline. On real-world analog circuits, our method achieves up to 2× speedups in both simulation count and runtime.
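The core idea described in the abstract, drawing candidate perturbations from a truncated Gaussian restricted to a low-dimensional random subspace, can be sketched as follows. This is an illustrative sketch only: the function name, the fixed box truncation, and the use of a random orthonormal basis are assumptions for demonstration, not the paper's actual covariance-adapted sampler.

```python
import numpy as np

def truncated_subspace_sample(x, d=4, trunc=2.0, rng=None):
    """Illustrative sketch: perturb x along a random d-dimensional
    subspace of R^D using a standard Gaussian truncated to [-trunc, trunc].
    The effective sampling dimensionality is d, independent of D."""
    rng = np.random.default_rng(rng)
    D = x.shape[0]
    # Random orthonormal basis for a d-dimensional subspace (via QR).
    A = rng.standard_normal((D, d))
    Q, _ = np.linalg.qr(A)  # Q has shape (D, d), orthonormal columns
    # Rejection-sample a d-dim standard normal truncated to a box.
    z = rng.standard_normal(d)
    while np.any(np.abs(z) > trunc):
        z = rng.standard_normal(d)
    # Map the low-dimensional sample back into the full space.
    return x + Q @ z

x0 = np.zeros(100)
cand = truncated_subspace_sample(x0, d=4, trunc=2.0, rng=0)
```

Because the perturbation is confined to a `d`-dimensional subspace with each coordinate bounded by `trunc`, its norm is at most `sqrt(d) * trunc` regardless of the ambient dimension `D`; the paper's method additionally adapts the sampling covariance over iterations, which this sketch omits.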
Date of Conference: 25-27 March 2024
Date Added to IEEE Xplore: 10 June 2024