dc.description.abstract | In recent years, machine learning has found a wide range of applications across many fields and has achieved outstanding results. Among machine learning approaches, deep learning has received the most attention: it can process vast amounts of information quickly and has permeated and changed our daily lives. In deep learning, designing an excellent neural network architecture is crucial; however, doing so requires not only expertise in deep learning and the relevant domain but also sufficient experience with the target task. Consequently, much research has focused on generating neural network architectures automatically, but such search methods consume large amounts of computing resources. In this paper, we therefore propose a new approach, called GANAS, which extends conditional GANs into the realm of neural architecture search (NAS), with the goal of generating neural network architectures using a well-trained generator. The key feature of this method is that different architectures are generated according to the data, saving the time needed to design network architectures by hand while requiring only a small amount of computational resources to accomplish our task. | en_US |