
Pruning Based Training-Free Neural Architecture Search


Abstract:

Neural Architecture Search (NAS) plays an important role in searching for high-performance neural networks. However, NAS algorithms are slow and require an enormous amount of computing resources, because they need to train a supernet or dense candidate networks to obtain information for evaluation. If a high-performance network architecture could be selected without training, a significant part of the computational cost would be eliminated. Therefore, we propose a zero-cost metric called EX-score, which represents the expressivity of a network and can rank untrained architectures. To further reduce cost, we design a pruning based zero-cost neural architecture search framework (PZ-NAS) using EX-score. PZ-NAS can rapidly prune the initialised supernet, running hundreds of times faster whilst achieving comparable accuracy on CIFAR-10 and ImageNet.
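The abstract does not define how EX-score is computed, so the sketch below uses a generic activation-pattern expressivity proxy (in the spirit of NASWOT-style zero-cost metrics) purely as a stand-in, together with a simple rank-and-prune loop over untrained candidates. The names proxy_score and prune_search are hypothetical illustrations, not the paper's actual method.

Illustrative sketch (Python / PyTorch):

# Hypothetical sketch of a zero-cost score-and-prune workflow like the one the
# abstract describes. EX-score itself is not defined in the abstract; the proxy
# below scores an untrained network by the diversity of its post-ReLU
# activation patterns on a single random batch (a NASWOT-style stand-in).
import torch
import torch.nn as nn

def proxy_score(model: nn.Module, inputs: torch.Tensor) -> float:
    """Score an *untrained* model: higher means more distinct activation
    patterns across the batch, taken here as a crude expressivity proxy."""
    codes = []

    def hook(_module, _inp, out):
        # Record the binary on/off pattern of each ReLU unit per sample.
        codes.append((out > 0).flatten(1).float())

    handles = [m.register_forward_hook(hook)
               for m in model.modules() if isinstance(m, nn.ReLU)]
    with torch.no_grad():
        model(inputs)
    for h in handles:
        h.remove()
    c = torch.cat(codes, dim=1)            # [batch, total_relu_units]
    # Pattern-agreement kernel between samples; more diverse patterns give a
    # better-conditioned kernel and hence a larger log-determinant.
    k = c @ c.t() + (1 - c) @ (1 - c.t())
    return torch.logdet(k + 1e-3 * torch.eye(len(k))).item()

def prune_search(candidates, inputs, keep: int):
    """Rank untrained candidates by the proxy and keep the top `keep`,
    standing in for PZ-NAS's supernet-pruning step."""
    scored = [(proxy_score(m, inputs), i) for i, m in enumerate(candidates)]
    scored.sort(reverse=True)
    return [candidates[i] for _, i in scored[:keep]]

if __name__ == "__main__":
    # Usage: rank a few random untrained MLPs without any training at all.
    torch.manual_seed(0)
    x = torch.randn(32, 64)
    pool = [nn.Sequential(nn.Linear(64, w), nn.ReLU(),
                          nn.Linear(w, w), nn.ReLU(),
                          nn.Linear(w, 10)) for w in (16, 64, 256)]
    best = prune_search(pool, x, keep=1)

Because no candidate is ever trained, the whole loop costs only a handful of forward passes, which is where the speedups the abstract claims come from.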
Date of Conference: 18-22 July 2022
Date Added to IEEE Xplore: 26 August 2022

Conference Location: Taipei, Taiwan

