Neural Architecture Search
Successful Examples
- Successful NAS example in image classification: Barret Zoph, Vijay Vasudevan, Jonathon Shlens, and Quoc V. Le. Learning transferable architectures for scalable image recognition. In Conference on Computer Vision and Pattern Recognition, 2018.
- Successful NAS example in image classification: Esteban Real, Alok Aggarwal, Yanping Huang, and Quoc V. Le. Regularized Evolution for Image Classifier Architecture Search. In AAAI, 2019.
- Successful NAS example in object detection: Barret Zoph, Vijay Vasudevan, Jonathon Shlens, and Quoc V. Le. Learning transferable architectures for scalable image recognition. In Conference on Computer Vision and Pattern Recognition, 2018.
- Successful NAS example in semantic segmentation: Liang-Chieh Chen, Maxwell Collins, Yukun Zhu, George Papandreou, Barret Zoph, Florian Schroff, Hartwig Adam, and Jonathon Shlens. Searching for efficient multi-scale architectures for dense image prediction. In Advances in Neural Information Processing Systems 31, pages 8713–8724. Curran Associates, Inc., 2018.
Search Space
- A predefined search space defines which neural network architectures can be represented in principle
- The size of the search space can be reduced to simplify the search, e.g. by incorporating prior knowledge about typical properties of neural network architectures that are well-suited for the particular task in question (see the sketch after this list)
- Incorporating prior knowledge introduces a human bias that may prevent finding novel architectural building blocks that go beyond current human knowledge
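To make the notion of a search space concrete, below is a minimal, hypothetical sketch of a chain-structured search space in Python; the names CANDIDATE_OPS, MAX_DEPTH, and sample_architecture are illustrative assumptions, not taken from any of the cited papers.

```python
# Hypothetical sketch of a predefined, chain-structured NAS search space.
# CANDIDATE_OPS, MAX_DEPTH and sample_architecture are illustrative names,
# not drawn from the cited papers.
import random

# Restricting the candidate operations (e.g. to operations known to work
# well on vision tasks) encodes prior knowledge and shrinks the search
# space, at the risk of excluding novel building blocks.
CANDIDATE_OPS = ["conv3x3", "conv5x5", "maxpool3x3", "identity"]
MAX_DEPTH = 8  # prior knowledge: cap the network depth

def sample_architecture(rng: random.Random) -> list[str]:
    """Draw one architecture (a sequence of layer operations) at random."""
    depth = rng.randint(1, MAX_DEPTH)
    return [rng.choice(CANDIDATE_OPS) for _ in range(depth)]

if __name__ == "__main__":
    print(sample_architecture(random.Random(0)))  # prints one sampled op sequence
```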
Related Work
- Thomas Elsken, Jan Hendrik Metzen, and Frank Hutter. Neural Architecture Search: A Survey. Journal of Machine Learning Research, 2019.
- Novel neural architectures enabled the success of deep learning in many fields
- Employed neural network architectures are often developed manually by human experts, which is time-consuming and error-prone
- The success of deep learning has been accompanied by a rising demand for architecture engineering, where increasingly complex neural architectures are designed manually
- Neural Architecture Search (NAS) methods aim to solve this problem by automating architecture engineering
- NAS methods can be categorized by (a) search space, (b) search strategy, and (c) performance estimation strategy (a sketch of how these three fit together follows after this list)
- NAS methods have outperformed manually designed architectures on some tasks
- NAS can be seen as a subfield of AutoML and has significant overlap with hyperparameter optimization
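As a hedged illustration of how these three dimensions interact, the sketch below uses plain random search as the search strategy and stubs out performance estimation with a dummy score; in a real system the stub would be replaced by, e.g., short proxy training on validation data. All names here are hypothetical, not taken from the survey.

```python
# Hypothetical sketch tying the three NAS dimensions together, with random
# search as the search strategy; the performance estimator is a stub so the
# loop stays runnable.
import random

CANDIDATE_OPS = ["conv3x3", "conv5x5", "maxpool3x3", "identity"]

def sample_architecture(rng: random.Random) -> list[str]:
    # (a) search space: chain-structured, as in the sketch further above
    return [rng.choice(CANDIDATE_OPS) for _ in range(rng.randint(1, 8))]

def estimate_performance(arch: list[str], rng: random.Random) -> float:
    # (c) performance estimation: a dummy random score; a real strategy
    # would train `arch` briefly and report validation accuracy
    return rng.random()

def random_search(n_trials: int = 20, seed: int = 0) -> tuple[list[str], float]:
    # (b) search strategy: plain random search, keeping the best candidate
    rng = random.Random(seed)
    best_arch, best_score = [], float("-inf")
    for _ in range(n_trials):
        arch = sample_architecture(rng)
        score = estimate_performance(arch, rng)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search()
    print("best architecture:", arch, "score:", score)
```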
Social Media
Interesting discussions at our BigBrain Workshop today about #AI methods & #HPC processing in #neuroscience & Neural Architecture Search & modular supercomputing by @fz_juelich @fzj_jsc @Haskoli_Islands @uni_iceland @uisens @DEEPprojects @helmholtz_ai – https://t.co/xGUtQsWuC1 pic.twitter.com/r8p34BPBdP
— Morris Riedel (@MorrisRiedel) August 20, 2019