AutoML is an emerging topic in machine learning that aims to automate the end-to-end process of applying machine learning in the real world, and neural architecture search (NAS) is one of its key challenges. Mathematically, NAS can be naturally cast as a large-scale discrete optimization problem with an exponential search space and enormous evaluation cost. Interestingly, heuristic methods such as ENAS [ICML'18], DARTS [ICLR'19], and NAO [NeurIPS'18] have found neural architectures that outperform human designs in both computer vision and natural language processing. However, the underlying optimization remains hard: how to relax the discrete architecture space into a continuous one, and how to handle the second-order derivatives that arise in the resulting bilevel optimization, are still difficult questions in NAS. I will give a brief introduction to the AutoML project at our company and highlight some of the challenges we are facing.
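The continuous relaxation mentioned above can be sketched in a few lines. This is a toy illustration of the DARTS-style idea (a softmax over architecture parameters mixes candidate operations on an edge); the scalar inputs and trivial candidate ops here are stand-ins for real network operations, not the actual DARTS implementation.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))
    return e / e.sum()

# Candidate operations on one edge of the search cell
# (hypothetical stand-ins for conv, skip, zero, etc.).
ops = [
    lambda x: x,        # identity / skip connection
    lambda x: 2.0 * x,  # stand-in for a parameterized op
    lambda x: 0.0 * x,  # "zero" op, which effectively prunes the edge
]

def mixed_op(x, alpha):
    """Continuous relaxation: a softmax-weighted sum of candidate ops.

    alpha holds the architecture parameters. Because the mixture is
    differentiable in alpha, architecture search becomes gradient-based;
    after search, the edge is discretized by keeping the op with the
    largest weight.
    """
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

alpha = np.array([0.1, 2.0, -1.0])       # learned architecture parameters
y = mixed_op(3.0, alpha)                 # relaxed (soft) forward pass
chosen = int(np.argmax(softmax(alpha)))  # discretization step: argmax op
```

In the full bilevel formulation, the network weights are optimized on training data in the inner loop while alpha is optimized on validation data in the outer loop; differentiating through the inner step is what introduces the second-order derivatives referred to above.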