One of the latest trends in machine learning is using algorithms to automatically design neural networks, a type of machine learning system, according to a post from MIT News. The networks these algorithms produce can be more accurate than human-designed ones, but the search itself is very "computationally expensive." So is it worth the cost?
One of these "neural architecture search (NAS)" algorithms, developed by Google, took 48,000 GPU hours to produce a single neural network, the MIT story says. That network was then used for image classification tasks. And while Google has the means to run such a search, the necessary hardware is beyond the reach of most developers.
A NAS Algorithm That Can ‘Beat’ Google’s
The MIT article references a soon-to-be-presented research paper describing an algorithm that can directly learn specialized neural networks in only about 200 GPU hours, rather than 48,000.
This could enable smaller developers to more easily design neural network architectures that run fast on specific hardware, the article says.
“In their work, the researchers developed ways to delete unnecessary neural network design components, to cut computing times and use only a fraction of hardware memory to run a NAS algorithm. An additional innovation ensures each outputted CNN runs more efficiently on specific hardware platforms — CPUs, GPUs, and mobile devices — than those designed by traditional approaches. In tests, the researchers’ CNNs were 1.8 times faster measured on a mobile phone than traditional gold-standard models with similar accuracy.” — MIT News
For each network layer, the NAS algorithm uses information sampled from the candidate architecture to produce a new architecture that runs as quickly as possible on the target hardware, the article says. In the researchers' experiments, the resulting CNNs ran roughly 1.8 times faster than a gold-standard model on mobile devices, with similar accuracy.
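The article does not include code, and the paper's actual method is far more sophisticated than what fits here. Still, the core idea it describes, searching over per-layer design choices while penalizing hardware latency, can be illustrated with a minimal, hypothetical sketch. Everything below is an assumption for illustration: the candidate operations, their "scores," and their "latencies" are made-up stand-ins, and the search is plain random sampling rather than the paper's learned approach.

```python
import random

# Hypothetical per-layer search space. The "score" and "latency" values are
# invented stand-ins for measured accuracy contribution and hardware cost.
OPS = {
    "conv3x3": {"latency": 3.0, "score": 0.90},
    "conv5x5": {"latency": 5.0, "score": 0.93},
    "skip":    {"latency": 0.5, "score": 0.70},
}

def evaluate(arch, lam=0.05):
    """Toy objective: average op score minus a latency penalty.

    lam trades accuracy against speed on the (imagined) target device.
    """
    score = sum(OPS[op]["score"] for op in arch) / len(arch)
    latency = sum(OPS[op]["latency"] for op in arch)
    return score - lam * latency

def random_search(n_layers=4, n_trials=200, seed=0):
    """Sample architectures at random and keep the best under the objective."""
    rng = random.Random(seed)
    best_arch, best_val = None, float("-inf")
    for _ in range(n_trials):
        arch = [rng.choice(list(OPS)) for _ in range(n_layers)]
        val = evaluate(arch)
        if val > best_val:
            best_arch, best_val = arch, val
    return best_arch, best_val

if __name__ == "__main__":
    arch, val = random_search()
    print("best architecture:", arch)
    print("objective value:", round(val, 3))
```

Raising `lam` pushes the search toward cheaper operations like `skip`, which mirrors, in miniature, how a latency-aware objective steers a NAS algorithm toward networks that run fast on a given device.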