A framework for exploring and modeling neural architecture search methods
Date
2020-05-14
Authors
Radiuk, Pavlo
Hrypynska, Nadiia
Journal title
ISSN
Volume title
Publisher
CEUR-WS
Abstract
Over the past years, many researchers and engineers have been developing and optimising deep neural networks (DNNs). The process of designing a neural architecture and tuning its hyperparameters remains monotonous and time-consuming, and does not always ensure optimal results. In this regard, automated machine learning (AutoML) has been widely utilised, and neural architecture search (NAS) has been actively developing in recent years. Despite meaningful advances in the field of NAS, a unified, systematic approach to exploring and comparing search methods has not been established yet. In this paper, we aim to close this knowledge gap by summarising search decisions and strategies, and we propose a schematic framework. It applies quantitative and qualitative metrics for prototyping, comparing, and benchmarking NAS methods. Moreover, our framework enables categorising critical areas in the search for better neural architectures.
Description
http://ceur-ws.org/Vol-2604/paper70.pdf
Keywords
deep neural network, AutoML, neural architecture search, scheme modelling, efficient neural network
Bibliographic citation
Radiuk P. M., Hrypynska N. V. A framework for exploring and modeling neural architecture search methods // CEUR Workshop Proceedings. 2020. Vol. 2604. P. 1060-1074.