Large Scale Distributed Deep Networks

Paper: https://www.cs.toronto.edu/~ranzato/publications/DistBeliefNIPS2012_withAppendix.pdf

Title translation: Large Scale Distributed Deep Networks

Summary:

Recent work in unsupervised feature learning and deep learning has shown that training large models can significantly improve performance. In this paper, the authors consider the problem of training a deep network with billions of parameters using tens of thousands of CPU cores. They developed a software framework called DistBelief that can use computing clusters of thousands of machines to train such large models.
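A core idea in the paper is asynchronous data-parallel training (Downpour SGD): model replicas pull the current parameters from a sharded parameter server, compute gradients on their own data shard, and push updates back without waiting for one another. Below is a minimal sketch of that parameter-server idea, not the paper's implementation: threads stand in for machines, a single object stands in for the sharded parameter server, and all class and function names are illustrative assumptions.

```python
import threading
import numpy as np

class ParameterServer:
    """Holds the global parameters and applies pushed gradients (hypothetical stand-in)."""
    def __init__(self, dim, lr=0.1):
        self.w = np.zeros(dim)
        self.lr = lr
        self.lock = threading.Lock()

    def pull(self):
        # Replicas fetch a snapshot of the current (possibly stale) parameters.
        with self.lock:
            return self.w.copy()

    def push(self, grad):
        # Apply a gradient update as soon as it arrives; no synchronization across replicas.
        with self.lock:
            self.w -= self.lr * grad

def replica(server, X, y, steps):
    """One model replica: repeatedly pull parameters, compute a gradient on its shard, push."""
    for _ in range(steps):
        w = server.pull()
        grad = 2 * X.T @ (X @ w - y) / len(y)   # least-squares gradient on this data shard
        server.push(grad)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([1.0, -2.0, 3.0])
    X = rng.normal(size=(600, 3))
    y = X @ true_w

    server = ParameterServer(dim=3)
    shards = np.array_split(np.arange(600), 4)   # one data shard per replica
    threads = [threading.Thread(target=replica,
                                args=(server, X[idx], y[idx], 200))
               for idx in shards]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print("learned:", np.round(server.pull(), 3))  # should be close to [1, -2, 3]
```

The asynchrony means each replica may compute gradients against slightly stale parameters, which the paper tolerates in exchange for fault tolerance and throughput; the toy least-squares problem here is only for demonstration.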

Origin www.cnblogs.com/yangwenhuan/p/11319711.html