SCARLET-NAS: Bridging the Gap between Stability and Scalability in Weight-sharing Neural Architecture Search [article]

Xiangxiang Chu, Bo Zhang, Qingyuan Li, Ruijun Xu, Xudong Li
2021 arXiv   pre-print
To discover powerful yet compact models is an important goal of neural architecture search. Previous two-stage one-shot approaches are limited by search space with a fixed depth. ... With an evolutionary search backend that incorporates the stabilized supernet as an evaluator, we derive a family of state-of-the-art architectures, the SCARLET series of several depths, especially SCARLET-A ... As we are focusing on the two-stage weight-sharing neural architecture search method, we rely on the supernet to evaluate models. ...
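The snippet describes a two-stage pipeline: a trained supernet serves as a cheap evaluator, and an evolutionary search over architectures of varying depth picks candidates by their supernet-predicted score. A minimal sketch of that loop, with an invented operation set and a stand-in scoring function (a real evaluator would run validation batches through the shared supernet weights; all names here are illustrative, not from the paper):

```python
import random

random.seed(0)

# Hypothetical search space: an architecture is a list of per-layer choices;
# a "skip" choice yields a shallower network, so depth itself is searchable.
OPS = ["mb3_k3", "mb6_k5", "mb6_k7", "skip"]
MAX_DEPTH = 8

def random_arch():
    return [random.choice(OPS) for _ in range(MAX_DEPTH)]

def mutate(arch, p=0.2):
    # Resample each layer choice with probability p.
    return [random.choice(OPS) if random.random() < p else op for op in arch]

def supernet_score(arch):
    # Stand-in for the supernet evaluator: here a toy score that favors
    # non-skip layers plus noise, NOT the paper's actual metric.
    return sum(1.0 for op in arch if op != "skip") + random.random()

def evolutionary_search(pop_size=16, generations=10, n_parents=4):
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=supernet_score, reverse=True)
        parents = ranked[:n_parents]
        # Keep the elite and refill the population with mutated parents.
        population = parents + [
            mutate(random.choice(parents)) for _ in range(pop_size - n_parents)
        ]
    return max(population, key=supernet_score)

best = evolutionary_search()
print(best)
```

Only the top-scoring architectures found this way would then be retrained from scratch, which is the expensive second stage the supernet evaluation is meant to avoid during search.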
arXiv:1908.06022v6