Title | KNAS: Green Neural Architecture Search
Authors | Xu, Jingjing; Zhao, Liang; Lin, Junyang; Gao, Rundong; Sun, Xu; Yang, Hongxia
Affiliation | Peking Univ, Sch EECS, MOE Key Lab Computat Linguist, Beijing, Peoples R China; Peking Univ, Ctr Data Sci, Beijing, Peoples R China; Alibaba Grp, Hangzhou, Peoples R China
Issue Date | 2021 |
Publisher | International Conference on Machine Learning (ICML), Vol. 139
Abstract | Many existing neural architecture search (NAS) solutions rely on downstream training for architecture evaluation, which requires enormous computation. Since this computation carries a large carbon footprint, this paper explores a green (i.e., environmentally friendly) NAS solution that evaluates architectures without training. Intuitively, the gradients induced by an architecture itself directly determine its convergence and generalization. This motivates the gradient kernel hypothesis: gradients can serve as a coarse-grained proxy for downstream training when evaluating randomly initialized networks. To support the hypothesis, we conduct a theoretical analysis and find a practical gradient kernel that correlates well with training loss and validation performance. Based on this hypothesis, we propose a new kernel-based architecture search approach, KNAS. Experiments show that KNAS achieves competitive results on image classification tasks while being orders of magnitude faster than "train-then-test" paradigms. Furthermore, its extremely low search cost enables wide application. The searched network also outperforms the strong baseline RoBERTa-large on two text classification tasks. Code is available at https://github.com/Jingjing-NLP/KNAS.
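Note | To make the gradient kernel hypothesis concrete, the sketch below shows one plausible way to score a randomly initialized network without training: compute per-sample loss gradients, form their Gram (kernel) matrix, and take its mean as the score. This is a minimal illustration, not the paper's exact formulation; the function name `gradient_kernel_score` and the specific choice of loss gradients and mean-of-Gram scoring are assumptions here, and the linked repository should be treated as the reference implementation.

```python
import torch

def gradient_kernel_score(model, loss_fn, inputs, targets):
    """Score a randomly initialized model without any training.

    Sketch of the gradient-kernel idea (an assumption, not KNAS's
    exact kernel): stack per-sample loss gradients into a matrix G,
    form the Gram matrix G @ G.T, and return its mean. A higher
    score is taken to indicate an architecture that should train
    and generalize better.
    """
    grads = []
    for x, y in zip(inputs, targets):
        model.zero_grad()
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        loss.backward()
        # Flatten all parameter gradients into one vector per sample.
        g = torch.cat([p.grad.flatten() for p in model.parameters()
                       if p.grad is not None])
        grads.append(g.detach())
    G = torch.stack(grads)      # shape: (n_samples, n_params)
    gram = G @ G.T              # gradient Gram (kernel) matrix
    return gram.mean().item()
```

A search loop would then score every candidate architecture at initialization and send only the top-ranked ones to actual training, e.g. `scores = {name: gradient_kernel_score(net, loss_fn, xs, ys) for name, net in candidates.items()}` (hypothetical usage; `candidates`, `xs`, and `ys` are assumed to be defined by the caller).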
URI | http://hdl.handle.net/20.500.11897/641458 |
ISSN | 2640-3498 |
Indexed | EI; CPCI-S (ISTP)
Appears in Collections: | School of Electronics Engineering and Computer Science; MOE Key Laboratory of Computational Linguistics; Other Research Institutes