[Artificial Intelligence] Deep learning frameworks

Foreword

Excerpt from "Deep Learning with PyTorch".

In the early days of deep learning, every researcher had to write a great deal of duplicated code. To improve efficiency, researchers packaged this common code into frameworks and published them on the Internet so that everyone could share them, and so different frameworks appeared. Over time, the most useful frameworks attracted large numbers of users and became popular. This article introduces the most popular deep learning frameworks in the world today.

TensorFlow

First is TensorFlow, open-sourced by Google. It is an open-source library for mathematical computation, developed in C++, that performs its computation in the form of dataflow graphs (Data Flow Graphs). Nodes in the graph represent mathematical operations, while the edges represent the multidimensional data arrays (tensors) flowing between them. TensorFlow's flexible architecture lets you deploy computation to one or more CPUs or GPUs on desktops and servers, or run it on mobile devices, all through a single API. TensorFlow was originally developed by researchers and engineers on the Google Brain team for machine learning and deep neural network research; since being open-sourced, it has been applied in almost every field.
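The dataflow idea described above can be illustrated with a toy sketch in plain Python (this is an illustration of the concept, not TensorFlow's actual API): each node is an operation, and the edges carry arrays between nodes.

```python
import numpy as np

class Node:
    """A node in a toy dataflow graph: an operation plus its input nodes."""
    def __init__(self, op, *inputs):
        self.op = op          # function applied to the incoming arrays
        self.inputs = inputs  # upstream nodes whose outputs flow in as edges

    def run(self):
        # Evaluate upstream nodes first, then apply this node's operation.
        return self.op(*(n.run() for n in self.inputs))

def constant(value):
    # A source node that simply emits a fixed tensor.
    return Node(lambda: np.asarray(value))

# Build the graph for (a + b) * c, then evaluate it.
a, b, c = constant([1.0, 2.0]), constant([3.0, 4.0]), constant(2.0)
result = Node(np.multiply, Node(np.add, a, b), c).run()
print(result)  # [ 8. 12.]
```

Note that building the graph and running it are separate steps; TensorFlow's original design made the same separation, which is what allows the same graph to be deployed to CPUs, GPUs, or mobile devices.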

TensorFlow currently has the largest number of users and the most extensive community of any framework. Because it is produced by Google, maintenance and updates are frequent; it offers both Python and C++ interfaces, and its tutorials are excellent. At the same time, the first reproductions of many papers are written in TensorFlow, so it is the de facto leader among deep learning frameworks.

Because its low-level interface is verbose, many third-party libraries wrap TensorFlow's functions into simpler abstractions. The best-known of these are Keras, TFLearn, TF-Slim, and TensorLayer.

Caffe

Caffe, a deep learning framework as famous as TensorFlow, was developed by Jia Yangqing, then a PhD student at the University of California, Berkeley. Its name stands for Convolutional Architecture for Fast Feature Embedding. It is a clear and efficient open-source deep learning framework, currently maintained by the Berkeley Vision and Learning Center.

As its name suggests, its support for convolutional networks is particularly good. It is also written in C++, but it provides only a C++ interface, not a Python one.

Caffe became so popular because many of the networks used in earlier ImageNet competitions were written in Caffe, so anyone who wanted to use those competition models had no choice but to use Caffe, which in turn drew many people directly to the Caffe framework.

Caffe's drawbacks are that it is not flexible enough, its memory usage is high, and it provides only a C++ interface. Caffe2, an upgraded version of Caffe, is already open source; it fixes some of these issues and further improves the engineering quality of the project.

Theano

Theano was born in 2008 at the University of Montreal, and many deep learning Python packages have been derived from it, most famously Blocks and Keras. The core of Theano is a mathematical expression compiler: it takes the expression you have structured and turns it into efficient code that uses NumPy, efficient native libraries such as BLAS, and native (C++) code, all of which can run fast on the CPU or GPU. It was specifically designed to handle the computations required by large-scale neural network algorithms; it is one of the first libraries of its kind (development started in 2007) and is considered an industry standard for deep learning research and development.

However, most of the researchers who developed Theano later went to Google to work on TensorFlow, so to some extent TensorFlow is like a child of Theano.

Torch

Torch is a scientific computing framework that supports a large number of machine learning algorithms. It was born a decade ago, but its real rise came after Facebook open-sourced a large number of deep learning modules and extensions for Torch. Torch's defining characteristic is its flexibility, but another is that it uses the relatively uncommon language Lua. In an environment where most deep learning work today is done in Python, a Lua-based framework is at a clear disadvantage; this niche language raises the cost of learning to use Torch.

PyTorch's predecessor is Torch. Its underlying framework is the same as Torch's, but much of it has been rewritten in Python; it is not only more flexible, supporting dynamic graphs, but also provides a Python interface.

PyTorch

PyTorch was developed by the Torch7 team. As the name suggests, it differs from Torch in that PyTorch uses Python as its development language. Its slogan, "Python first," likewise declares it to be a Python-first deep learning framework: it not only provides powerful GPU acceleration but also supports dynamic neural networks, something that many mainstream frameworks such as TensorFlow did not support at the time.

PyTorch can be seen either as NumPy with GPU support added, or as a powerful deep neural network library with automatic differentiation.
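The "automatic differentiation" mentioned above can be illustrated with a minimal reverse-mode sketch in plain Python. This is a toy model of the idea only, not PyTorch's actual implementation: each value records how it was computed, and gradients flow backward through that record via the chain rule.

```python
class Var:
    """A scalar that records how it was computed, so gradients can be
    propagated backward through the recorded graph (reverse mode)."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # (parent_var, local_gradient) pairs
        self.grad = 0.0

    def __add__(self, other):
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, upstream=1.0):
        # Accumulate d(output)/d(self), then push gradients to parents
        # via the chain rule.
        self.grad += upstream
        for parent, local in self.parents:
            parent.backward(upstream * local)

# y = x * x + x, so dy/dx = 2x + 1; at x = 3 the gradient is 7.
x = Var(3.0)
y = x * x + x
y.backward()
print(x.grad)  # 7.0
```

In real PyTorch the same pattern appears as `y.backward()` followed by reading `x.grad`, except that the values are multidimensional tensors and the operations can run on the GPU.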

Features

PyTorch has the following characteristics:

  • GPU support
  • Dynamic neural networks
  • Python first
  • Imperative experience
  • Easy extensibility

MXNet

MXNet's lead author is Li Mu. It began as a project built purely out of the enthusiasm and interest of a handful of people, and it has now become Amazon's official framework. It has excellent support for distributed computing, particularly good performance, and a low memory footprint. Its interfaces cover not only Python and C++ but also R, MATLAB, Scala, JavaScript, and more; it can fairly be said to serve users of almost any language.

But MXNet's drawbacks are also obvious: its tutorials are incomplete, and it has too few users, which leaves its community small. Meanwhile, few competition entries and papers each year are implemented in MXNet, so MXNet's reach and name recognition remain low.

Origin www.cnblogs.com/lianshuiwuyi/p/11011917.html