Note — May 19, 2019

A New Way to Build Tiny Neural Networks Could Create Powerful AI on Your Phone

Seen in → No.80

Source → technologyreview.com/s/613514/a-new-way-to-...

Overview of a research paper proposing that neural networks could be a lot smaller than they are now. An interesting read for better understanding these networks and the whole pruning concept. Not part of the results, but it’s a bit disturbing that, in creating these neural networks, the term “lottery ticket” is used and we still don’t know exactly how things work. The possibility of smaller networks, paired with advances in synthetic data and the fact that some AIs can be trained on much smaller data sets, points the way (I reckon) to AIs requiring smaller investments and feasible in more places / companies. Promising as a way to get out from under the influence of Big Tech, but also very disquieting for the same ethical fears and problems we are already dealing with.

[T]he researchers have made a simple but dramatic discovery: we’ve been using neural networks far bigger than we actually need. In some cases they’re 10—even 100—times bigger, so training them costs us orders of magnitude more time and computational power than necessary. […]

Whereas a tiny neural network may be trainable in only one of every five initializations, a larger network may be trainable in four of every five. Again, why this happens had been a mystery, but that’s why researchers typically use very large networks for their deep-learning tasks. They want to increase their chances of achieving a successful model. […]
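For reference, the “lottery ticket” procedure the quotes allude to boils down to: train a large network, prune away the smallest-magnitude weights, rewind the surviving weights to their original initial values, and retrain the small subnetwork. Here is a minimal sketch of one such round in PyTorch; it assumes a user-supplied `train` loop, and the helper names, `fraction` value, and toy model are illustrative, not taken from the paper:

```python
import copy
import torch
import torch.nn as nn

def magnitude_mask(model, fraction=0.2):
    """Build per-layer masks that drop the `fraction` smallest-magnitude weights."""
    masks = {}
    for name, param in model.named_parameters():
        if param.dim() < 2:  # skip biases, keep only weight matrices
            continue
        k = max(1, int(fraction * param.numel()))
        threshold = param.abs().flatten().kthvalue(k).values
        masks[name] = (param.abs() > threshold).float()
    return masks

def apply_masks(model, masks):
    """Zero out the pruned weights in place."""
    with torch.no_grad():
        for name, param in model.named_parameters():
            if name in masks:
                param.mul_(masks[name])

# Toy model, purely for illustration
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
initial_state = copy.deepcopy(model.state_dict())  # save the original init

# train(model)                        # 1. train the full network (your own loop)
masks = magnitude_mask(model)         # 2. find the "winning ticket" weights
model.load_state_dict(initial_state)  # 3. rewind survivors to their initial values
apply_masks(model, masks)             # 4. zero the rest, then retrain the subnetwork,
                                      #    reapplying the masks after each optimizer step
```

The paper’s actual experiments iterate this prune-rewind-retrain loop several times with more careful schedules; the sketch only shows the shape of the idea.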

He believes it would dramatically accelerate and democratize AI research by lowering the cost and [raising the] speed of training, and by allowing people without giant data servers to do this work directly on small laptops or even mobile phones.