Architecture Search (Neural Architecture Search/NAS)

Architecture search, often called neural architecture search (NAS), is a method for automatically designing neural network structures instead of relying on guesswork or manual experimentation. The idea is to define a set of possible building blocks (such as layer types, connections, and widths) and let an algorithm explore different combinations to find what works best.
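The idea of a search space built from combinable building blocks can be sketched concretely. The specific layer types, widths, and options below are illustrative assumptions, not the space from any particular NAS system:

```python
from itertools import product

# A toy NAS search space: each architecture is one concrete choice per
# design dimension. These building blocks are illustrative examples.
SEARCH_SPACE = {
    "layer_type": ["conv3x3", "conv5x5", "depthwise"],
    "width": [32, 64, 128],
    "skip_connection": [True, False],
}

def enumerate_architectures(space):
    """Yield every candidate architecture as a dict of concrete choices."""
    keys = list(space)
    for values in product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

architectures = list(enumerate_architectures(SEARCH_SPACE))
print(len(architectures))  # 3 layer types x 3 widths x 2 skip options = 18
```

Even this tiny space yields 18 candidates; realistic spaces grow combinatorially with each added dimension, which is why an automated search strategy is needed at all.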

NAS explores candidate architectures with search strategies such as reinforcement learning, evolutionary algorithms, and gradient-based methods. Because training every candidate to convergence would be prohibitively slow, NAS typically estimates performance with shortcuts such as partial training or weight sharing across candidates. The goal is to discover models that are more accurate or more efficient than hand-designed networks.
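The search loop itself can be sketched with the simplest strategy, random search, paired with a cheap stand-in for the performance estimate. The `proxy_score` heuristic below is a hypothetical placeholder for what partial training or shared-weight evaluation would measure; the search space is the same illustrative one as above:

```python
import random

# Illustrative search space (same toy example of layer types, widths,
# and skip connections).
SEARCH_SPACE = {
    "layer_type": ["conv3x3", "conv5x5", "depthwise"],
    "width": [32, 64, 128],
    "skip_connection": [True, False],
}

def proxy_score(arch):
    """Hypothetical stand-in for a cheap performance estimate.

    In real NAS this would come from partial training or shared weights;
    here it is an arbitrary heuristic so the loop is runnable.
    """
    score = arch["width"] / 128
    score += 0.5 if arch["skip_connection"] else 0.0
    score -= 0.2 if arch["layer_type"] == "conv5x5" else 0.0
    return score

def random_search(space, n_trials=20, seed=0):
    """Sample random candidates and keep the best-scoring one."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(n_trials):
        arch = {k: rng.choice(v) for k, v in space.items()}
        score = proxy_score(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search(SEARCH_SPACE)
print(best, round(score, 2))
```

More sophisticated strategies replace the uniform sampling step: an evolutionary algorithm mutates the current best candidates, and a reinforcement-learning controller learns a distribution over choices; the outer evaluate-and-keep-the-best loop stays the same.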
