Caffe Complete Installation Guide (GPU)

0. Foreword


This article starts from Caffe's dependency libraries and builds the entire environment required by the Caffe framework from source, explaining each step of the process. It aims to be easy to follow, so that more people can pick up Caffe.
Note that compiling from source is relatively tedious work. The author has tested repeatedly to make sure the library versions used in this article do not conflict with one another. If you need to change a version, weigh the consequences yourself.

The version of the compilation tool used in this article is:

make 3.82
cmake 3.14.7
gcc/g++ 6.5.0
cuda 10.1

1. Caffe dependency package installation


1.1. ProtoBuffer


ProtoBuffer is a protocol interface developed by Google that can serialize in-memory structures to non-volatile storage media (such as files on disk) and back. It is used widely in the Caffe source code as the carrier for weights and model parameters. Developers generally have their own preferences for parameter management: some like TXT files because they are easy to modify, some like binary files because they are efficient to read and write, and some like the intuitiveness of graphical configuration. Such inconsistent parameter handling causes many problems; for example, the members of a project team must agree on a unified parameter scheme, or communication protocol, before their modules can be integrated. ProtoBuffer solves this neatly: users only need to write a unified parameter description file (.proto), and compiling it with protoc automatically generates the protocol-handling code, saving a great deal of development and debugging time. ProtoBuffer also lets the same data structure be shared across languages (C++/Java/Python), making team collaboration more efficient.
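As a sketch of this workflow, the hypothetical .proto file below defines a small message in the proto2 syntax used by ProtoBuffer 2.5.0; running protoc on it (commented out, since it assumes protoc is already installed) would generate the corresponding C++ serialization code. The message and field names here are illustrative only.

```shell
# A minimal, hypothetical parameter description file in proto2 syntax
cat > solver_example.proto <<'EOF'
message SolverExample {
  required string net = 1;                       // path to the network definition
  optional float base_lr = 2 [default = 0.01];   // base learning rate
  optional int32 max_iter = 3;                   // maximum training iterations
}
EOF
# protoc would then generate solver_example.pb.h / solver_example.pb.cc:
# protoc --cpp_out=. solver_example.proto
cat solver_example.proto
```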

Note: Files generated by an older version of ProtoBuffer can sometimes produce hard-to-troubleshoot errors when used with a newer version, so it is recommended to use the same ProtoBuffer version in every environment where Caffe needs to run.

installation process

# Download a suitable release from GitHub; this article uses v2.5.0
wget https://github.com/google/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz
tar zxvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0/
./configure --prefix=/your/install/path
make
make install

After the installation completes, executables are generated under /your/install/path/bin, so add this folder to your $PATH variable.
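For example, assuming the install prefix used above, the lines below (typically added to ~/.bashrc) make protoc visible; the version check is commented out since it only works once the binary is actually installed.

```shell
# Prepend the environment prefix's bin directory to PATH
export PATH=/your/install/path/bin:$PATH
# Verify the compiler is found (should print: libprotoc 2.5.0)
# protoc --version
echo "$PATH" | tr ':' '\n' | head -n 1   # prints /your/install/path/bin
```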

1.2. Boost


Anyone who has worked with C++ should know the Boost library. It is a powerful, well-structured, cross-platform, open-source and free library, often called the "quasi-standard library" of C++. It makes heavy use of modern programming techniques and covers string processing, regular expressions, containers (not Docker) and data structures, concurrent programming, functional programming, generic programming, design-pattern implementations and many other areas, making C++ development more flexible and efficient. See http://www.boost.org/ for details.

Caffe mainly uses Boost's smart pointers, which carry their own reference counting and avoid memory leaks or double frees when pointers are shared.

In addition, pycaffe uses Boost.Python to bridge C/C++ and Python, making it easy for Python to call modules written in C/C++.

installation process

# Download a suitable version; this article uses 1.72. Since building boost from the
# git sources is cumbersome, we use the SourceForge source package instead.
# Download page: https://sourceforge.net/projects/boost/files/boost/1.72.0/
tar zxvf boost_1_72_0.tar.gz
cd boost_1_72_0
./bootstrap.sh --with-libraries=system,thread,python3
./b2
cp -r boost /your/install/path/include/
cp -r stage/lib/* /your/install/path/lib/

Note: If you are not using the system default python (that is, it is not under /usr/bin), you need to manually point the CPLUS_INCLUDE_PATH environment variable at the include directory of the python you are using, for example:

export CPLUS_INCLUDE_PATH=/your/anaconda/path/include/python3.7:$CPLUS_INCLUDE_PATH

1.3. GFLAGS


GFLAGS mainly handles command-line argument parsing in Caffe. Its role is similar to ProtoBuffer's, but the parameters come from a different source. For how to use GFLAGS, see tools/caffe.cpp in the Caffe source tree.
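As an illustration, gflags-based programs such as the caffe tool accept flags on the command line or, via gflags' built-in --flagfile option, from a file. The -solver and -gpu flag names below follow tools/caffe.cpp, but the solver path is a placeholder, and the caffe invocation is commented out since the binary is not built yet.

```shell
# A hypothetical flag file for `caffe train`
cat > caffe_flags.txt <<'EOF'
-solver=lenet_solver.prototxt
-gpu=0
EOF
# caffe train --flagfile=caffe_flags.txt   # equivalent to passing the flags inline
cat caffe_flags.txt
```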

installation process

# Download a suitable release from GitHub; this article uses v2.2.0
wget https://codeload.github.com/gflags/gflags/tar.gz/v2.2.0 -O gflags-2.2.0.tar.gz
tar zxvf gflags-2.2.0.tar.gz
cd gflags-2.2.0/
mkdir build
cd build/
cmake ..
ccmake ..

ccmake brings up an interactive configuration interface; adjust the options there, in particular set CMAKE_INSTALL_PREFIX to /your/install/path.

make -j4
make install

1.4. GLOG


The GLOG library is a utility library developed by Google for recording application logs. It provides an interface based on C++ standard streams, and logging can be done at different severity levels, which makes it easy to separate important logs from ordinary ones.

GLOG's main role in Caffe is logging: it lets developers inspect the intermediate output produced during training and decide, based on that information, how to adjust parameters to control convergence. From the log file we can easily follow the program's execution, which helps when tracing the source code and locating problems. For how to use GLOG, see tools/caffe.cpp in the Caffe source tree.
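For instance, glog reads GLOG_* environment variables at startup, so the logging behavior of any glog-linked binary (such as caffe) can be adjusted without recompiling. The variables below are standard glog settings; the log directory is just an example path.

```shell
# Standard glog environment controls, read by any glog-linked binary
export GLOG_log_dir=/tmp/caffe_logs   # directory for log files
export GLOG_minloglevel=0             # 0=INFO, 1=WARNING, 2=ERROR, 3=FATAL
export GLOG_logtostderr=1             # send logs to stderr instead of files
                                      # (unset this to keep files in GLOG_log_dir)
mkdir -p "$GLOG_log_dir"
```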

installation process

# Download a suitable release from GitHub; this article uses v0.3.3
wget https://github.com/google/glog/archive/v0.3.3.tar.gz -O glog-0.3.3.tar.gz
tar zxvf glog-0.3.3.tar.gz
cd glog-0.3.3/
./configure --prefix=/your/install/path/
make -j4
make install

1.5. BLAS


The mathematics in convolutional neural networks is mainly matrix and vector computation, for which Caffe calls the corresponding routines in BLAS (Basic Linear Algebra Subprograms). The most commonly used BLAS implementations are Intel MKL, ATLAS, and OpenBLAS; Caffe can use any of them.
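The choice of implementation is made later, when building Caffe itself, in its Makefile.config (`BLAS` may be `atlas`, `mkl`, or `open`). The sketch below shows the relevant lines for the OpenBLAS installed under the prefix used throughout this article; the paths are placeholders, and the snippet is written to a scratch file rather than a real Makefile.config.

```shell
# Sketch of the BLAS-related lines in Caffe's Makefile.config
cat > blas_config_snippet.txt <<'EOF'
BLAS := open
BLAS_INCLUDE := /your/install/path/include
BLAS_LIB := /your/install/path/lib
EOF
cat blas_config_snippet.txt
```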

installation process

# Download a suitable release from GitHub; this article uses v0.3.9
wget https://codeload.github.com/xianyi/OpenBLAS/tar.gz/v0.3.9 -O OpenBLAS-0.3.9.tar.gz
tar zxvf OpenBLAS-0.3.9.tar.gz
cd OpenBLAS-0.3.9/
make -j4
make PREFIX=/your/install/path/ install

1.6. ZLIB


The first zlib version, 0.9, was released on May 1, 1995. zlib uses the DEFLATE algorithm; it was originally written for the libpng library and was later adopted by many other programs. The library is free software, released under the zlib license. As of March 2007, zlib was one of the open-source projects selected for continued review in Coverity's scan initiative sponsored by the US Department of Homeland Security.
This library is installed by default on most systems, but it often ships only the dynamic library without the header files, so here we compile and install it as well.

installation process

tar zxvf zlib-1.2.11.tar.gz
cd zlib-1.2.11
./configure --prefix=/your/install/path
make -j4
make install

1.7. HDF5


HDF (Hierarchical Data Format) is a data format developed by the National Center for Supercomputing Applications (NCSA) to store and distribute scientific data efficiently across research fields. It can hold different types of images and numeric data, can be transferred between different kinds of machines, and comes with a library for handling the format uniformly. Models trained with Caffe can optionally be saved in HDF5 format instead of the default ProtoBuffer format.
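For reference, Caffe's SolverParameter includes a snapshot_format field whose values are BINARYPROTO (the default) and HDF5. The sketch below shows the relevant solver.prototxt lines; the interval and prefix values are placeholders.

```shell
# Hypothetical solver.prototxt fragment selecting HDF5 snapshots
cat > snapshot_example.prototxt <<'EOF'
snapshot: 5000
snapshot_prefix: "snapshots/lenet"
snapshot_format: HDF5
EOF
cat snapshot_example.prototxt
```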

installation process

# Download a suitable release; this article uses 1.10.3
wget https://support.hdfgroup.org/ftp/HDF5/releases/hdf5-1.10/hdf5-1.10.3/src/hdf5-1.10.3.tar.gz
tar zxvf hdf5-1.10.3.tar.gz
cd hdf5-1.10.3/
# Enable the C++ module and point at the zlib installed into the environment prefix
./configure --prefix=/your/install/path/ --enable-cxx --with-zlib=/your/install/path/include,/your/install/path/lib
# If the configure summary shows "I/O filters (external): deflate(zlib)", zlib is configured correctly
make -j4
make install

1.8. LMDB and LEVELDB


LMDB (Lightning Memory-Mapped Database) is a lightning-fast memory-mapped database manager. In Caffe its main role is data management: it converts all kinds of raw data (JPEG images, binary blobs) into a unified key-value store that Caffe's DataLayer can read conveniently.

The LEVELDB library, developed by Google, is the data storage backend used by earlier versions of Caffe. It is a persistent key-value store in which keys and values can be arbitrary byte arrays, with the ordering of keys determined by a user-defined comparison function.

Nowadays most examples use LMDB instead of LEVELDB, but this dependency is still compiled into Caffe for compatibility with earlier versions.
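Once Caffe is built, its convert_imageset tool turns a plain image/label list into such a key-value store. The sketch below prepares the list format the tool expects (file names and labels are hypothetical); the conversion command itself is commented out since it requires the built tool.

```shell
# Each line of the list is "<image path> <integer label>"
cat > train_list.txt <<'EOF'
images/cat_001.jpg 0
images/dog_001.jpg 1
EOF
# build/tools/convert_imageset --backend=lmdb images/ train_list.txt train_lmdb
# (--backend=leveldb would produce a LEVELDB instead)
cat train_list.txt
```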

installation process

# lmdb can be cloned from GitHub
git clone https://github.com/LMDB/lmdb.git
cd lmdb/libraries/liblmdb
make -j4
# Copy the header and library into the environment prefix
cp lmdb.h /your/install/path/include/
cp liblmdb.so /your/install/path/lib/

# leveldb can be downloaded as a tarball and built directly
wget https://github.com/google/leveldb/archive/v1.7.tar.gz -O leveldb-1.7.tar.gz
tar zxvf leveldb-1.7.tar.gz
cd leveldb-1.7/
make -j4
cp -r include/leveldb /your/install/path/include/
cp libleveldb.so* /your/install/path/lib/

1.9. Snappy


Snappy is a C++ library for compression and decompression that aims for very high speed with a reasonable compression ratio. It is faster than zlib, but its output is 20% to 100% larger.

installation process

tar zxvf snappy-1.1.7.tar.gz
cd snappy-1.1.7/
mkdir build
cd build/
cmake ..
ccmake ..

Adjust the options in the ccmake configuration interface, in particular the install prefix (CMAKE_INSTALL_PREFIX).

Note that the Gflags_DIR parameter must match the gflags package you installed; usually it is found automatically.

# After configuring, build and install directly
make install

1.10. OpenCV


OpenCV is the world's most popular open-source computer vision library and contains a large number of image processing functions. Although Caffe only uses OpenCV for some image I/O and preprocessing, we will also want to perform other image processing work, so we install the complete library here.

Download the source code (and, if needed, the extra contrib package) from the OpenCV website. This article uses OpenCV 3.4.9. The main configuration parameters are:

# Build type
-D CMAKE_BUILD_TYPE=Release
# Install path
-D CMAKE_INSTALL_PREFIX=/your/install/path 
# Whether to build the python2 bindings
-D BUILD_opencv_python2=OFF
# Whether to build the python3 bindings
-D BUILD_opencv_python3=ON
# Whether to enable CUDA acceleration; note that CUDA 9 and later no longer support
# the 2.0 compute architecture, so older OpenCV versions need source patches
-D WITH_CUDA=OFF
# Extra contrib modules (separate download; not used in this article)
-D OPENCV_EXTRA_MODULES_PATH=/your/opencv/ext/modules
# python3 path
-D PYTHON3_EXECUTABLE=/your/python/path/python3.x
# If you build with a CUDA version that uses the C++11 standard, add the flag below.
# OpenCV 3.4.9 does not yet fully support C++11, so some code fails to compile
# without it; adding the flag resolves the problem.
-D CUDA_NVCC_FLAGS=--expt-relaxed-constexpr

During configuration, downloads may stall because of network problems (for example the IPPICV download). In that case you can fetch the file in advance and modify the corresponding cmake file to load it locally; the details are easy to find online and are not covered here.

Since we do not enable CUDA acceleration here, we invoke cmake with the following parameters:

mkdir build
cd build
cmake -D CMAKE_BUILD_TYPE=Release -D CMAKE_INSTALL_PREFIX=/your/install/path -D BUILD_opencv_python2=OFF -D BUILD_opencv_python3=ON -D WITH_CUDA=OFF -D PYTHON3_EXECUTABLE=/your/python/path/python3.x ..

make -j4
make install

# After make install finishes, you can optionally install the python bindings from build/python_loader:
python3 setup.py install
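After installation it is worth checking that the library is visible. The sketch below assumes pkg-config metadata was installed under the prefix; the actual checks are commented out because they require the finished install.

```shell
# Make the installed OpenCV visible to pkg-config
export PKG_CONFIG_PATH=/your/install/path/lib/pkgconfig:$PKG_CONFIG_PATH
# pkg-config --modversion opencv                    # should report 3.4.9
# python3 -c "import cv2; print(cv2.__version__)"   # checks the python bindings
echo "${PKG_CONFIG_PATH%%:*}"   # prints /your/install/path/lib/pkgconfig
```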

Finish


So far, all of Caffe's dependencies have been installed. Viewing the directory structure with tree shows:

├── bin
├── include
│   ├── boost
│   ├── gflags
│   ├── glog
│   ├── google
│   ├── leveldb
│   ├── opencv
│   └── opencv2
├── lib
│   ├── cmake
│   ├── pkgconfig
│   └── python3.6
├── share
│   ├── doc
│   ├── hdf5_examples
│   ├── licenses
│   └── OpenCV
└── soft
    └── caffe-master

In the next chapter we will start the installation of Caffe itself.


Origin blog.csdn.net/TchaikovskyBear/article/details/129141958