Installing TensorFlow C++ on Ubuntu


This article follows the installation guide on the TensorFlow website: https://www.tensorflow.org/install/source

1. Install protobuf

2. Install bazel

3. Download the TensorFlow source

4. Build TensorFlow with bazel to produce the library files we need

5. Build the other dependencies

6. Test


1. Install protobuf

    The protobuf source is hosted on GitHub. Note that the protobuf version must match the TensorFlow version you intend to build. The installation steps are as follows:

wget https://github.com/protocolbuffers/protobuf/releases/download/v3.6.1/protobuf-cpp-3.6.1.tar.gz
tar -xzvf protobuf-cpp-3.6.1.tar.gz
sudo apt-get install automake libtool
cd protobuf-3.6.1
./autogen.sh
./configure
make
sudo make install
sudo ldconfig
# sudo make uninstall   # uninstall command, in case the wrong version was installed
protoc --version        # check the installed protobuf version
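
  If you are unsure which protobuf release a given TensorFlow version expects, one way to check is to look at the version pinned in the TensorFlow source tree once you have cloned it (step 3 below). The file layout varies between releases, so treat this as a rough sketch:

# Search the TensorFlow workspace definition for the pinned protobuf version
cd tensorflow
grep -n "protobuf" tensorflow/workspace.bzl | head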

  If you hit the error "protoc: error while loading shared libraries: libprotoc.so.17: cannot open shared object file: No such file or directory", then run:

export LD_LIBRARY_PATH=/usr/local/lib/ 
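
  To make this setting persistent for new shells, one option (a small sketch, assuming protobuf was installed under /usr/local/lib) is to append it to ~/.bashrc:

# Append the library path to ~/.bashrc so every new shell picks it up
echo 'export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH' >> ~/.bashrc
source ~/.bashrc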

2. Install bazel

   bazel is an open-source build tool from Google, used widely inside the company, including for the TensorFlow project. If you modify TensorFlow source code you will need bazel to rebuild it, so it is worth getting familiar with. Its main advantages are:

      1) Fast builds: it supports incremental compilation and optimizes the dependency graph so that independent actions run in parallel;
      2) Multi-language: bazel can build Java, C++, Android, iOS and many other languages and frameworks, and runs on macOS, Windows and Linux;
      3) Scalable: it handles codebases of any size, whether split across multiple repositories or kept in a single one;
      4) Extensible: new languages and platforms can be supported through bazel's extension language.

  See the official installation documentation for details. The main steps for installing bazel are:

  (1) First method (this can only install the latest bazel release): using Bazel's custom APT repository

  Step 1: Install the JDK    

sudo apt-get install openjdk-8-jdk

 Step 2: Add Bazel distribution URI as a package source

echo "deb [arch=amd64] http://storage.googleapis.com/bazel-apt stable jdk1.8" | sudo tee /etc/apt/sources.list.d/bazel.list
curl https://bazel.build/bazel-release.pub.gpg | sudo apt-key add -

Step 3: Install and update Bazel 

sudo apt-get update && sudo apt-get install bazel

    Or, to only upgrade an existing bazel installation:

sudo apt-get install --only-upgrade bazel

(2) Second method (any bazel version can be installed): installing using the binary installer

    Because the bazel version must match the TensorFlow version, this second method is recommended; see the tested build configurations on the TensorFlow site for the exact version pairings.

Step 1: Install required packages

sudo apt-get install pkg-config zip g++ zlib1g-dev unzip python

Step 2: Download Bazel 

   Download the installer named bazel-<version>-installer-linux-x86_64.sh from the bazel GitHub releases page.
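
   For example, for bazel 0.15.0 (the version that appears in the configure transcript below), the download looks roughly like this; substitute whichever release matches your TensorFlow version:

# Download the 0.15.0 installer from the bazel GitHub releases page
wget https://github.com/bazelbuild/bazel/releases/download/0.15.0/bazel-0.15.0-installer-linux-x86_64.sh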

Step 3: Run the installer

chmod +x bazel-<version>-installer-linux-x86_64.sh
./bazel-<version>-installer-linux-x86_64.sh --user

   The --user flag installs bazel into the $HOME/bin directory and points its .bazelrc path at $HOME/.bazelrc.

Step 4: Set up your environment

export PATH="$PATH:$HOME/bin"
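
   To make the PATH change permanent and verify the installation, a small sketch:

# Persist the PATH change and check that bazel is callable
echo 'export PATH="$PATH:$HOME/bin"' >> ~/.bashrc
source ~/.bashrc
bazel version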

3. Download the TensorFlow source

git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow

  The repository's default branch is the master development branch; to build a specific release, check out the corresponding release branch:

git checkout branch_name  # r1.9, r1.10, etc.
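
  If you are unsure which release branches exist, you can list them first; for example (r1.10 is just one of the releases mentioned above):

# List the remote release branches, then check one out
git branch -r | grep "origin/r1"
git checkout r1.10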

4. Build TensorFlow with bazel to produce the library files we need

(1) Enter the tensorflow directory and configure the project first:

./configure      # for a CPU-only build, just press Enter at every prompt

   For a CPU-only build you can simply press Enter through every prompt. If you need GPU support, refer to the TensorFlow website -> Build from source -> View sample configuration session; the key settings are the Python path, the CUDA and cuDNN versions and install locations, and the compute capability of your GPU. A sample configuration session follows:

./configure
You have bazel 0.15.0 installed.
Please specify the location of python. [Default is /usr/bin/python]: /usr/bin/python2.7

Found possible Python library paths:
  /usr/local/lib/python2.7/dist-packages
  /usr/lib/python2.7/dist-packages
Please input the desired Python library path to use.  Default is [/usr/lib/python2.7/dist-packages]

Do you wish to build TensorFlow with jemalloc as malloc support? [Y/n]:
jemalloc as malloc support will be enabled for TensorFlow.

Do you wish to build TensorFlow with Google Cloud Platform support? [Y/n]:
Google Cloud Platform support will be enabled for TensorFlow.

Do you wish to build TensorFlow with Hadoop File System support? [Y/n]:
Hadoop File System support will be enabled for TensorFlow.

Do you wish to build TensorFlow with Amazon AWS Platform support? [Y/n]:
Amazon AWS Platform support will be enabled for TensorFlow.

Do you wish to build TensorFlow with Apache Kafka Platform support? [Y/n]:
Apache Kafka Platform support will be enabled for TensorFlow.

Do you wish to build TensorFlow with XLA JIT support? [y/N]:
No XLA JIT support will be enabled for TensorFlow.

Do you wish to build TensorFlow with GDR support? [y/N]:
No GDR support will be enabled for TensorFlow.

Do you wish to build TensorFlow with VERBS support? [y/N]:
No VERBS support will be enabled for TensorFlow.

Do you wish to build TensorFlow with OpenCL SYCL support? [y/N]:
No OpenCL SYCL support will be enabled for TensorFlow.

Do you wish to build TensorFlow with CUDA support? [y/N]: Y
CUDA support will be enabled for TensorFlow.

Please specify the CUDA SDK version you want to use. [Leave empty to default to CUDA 9.0]: 9.0

Please specify the location where CUDA 9.0 toolkit is installed. Refer to README.md for more details. 
[Default is /usr/local/cuda]:

Please specify the cuDNN version you want to use. [Leave empty to default to cuDNN 7.0]: 7.0

Please specify the location where cuDNN 7 library is installed. Refer to README.md for more details. 
[Default is /usr/local/cuda]:

Do you wish to build TensorFlow with TensorRT support? [y/N]:
No TensorRT support will be enabled for TensorFlow.

Please specify the NCCL version you want to use. If NCLL 2.2 is not installed, then you can use version 1.3 
that can be fetched automatically but it may have worse performance with multiple GPUs. [Default is 2.2]: 1.3

Please specify a list of comma-separated Cuda compute capabilities you want to build with.
You can find the compute capability of your device at: https://developer.nvidia.com/cuda-gpus.
Please note that each additional compute capability significantly increases your build time and binary size. 
[Default is: 3.5,7.0] 6.1

Do you want to use clang as CUDA compiler? [y/N]:
nvcc will be used as CUDA compiler.

Please specify which gcc should be used by nvcc as the host compiler. [Default is /usr/bin/gcc]:

Do you wish to build TensorFlow with MPI support? [y/N]:
No MPI support will be enabled for TensorFlow.

Please specify optimization flags to use during compilation when bazel option "--config=opt" is 
specified [Default is -march=native]:

Would you like to interactively configure ./WORKSPACE for Android builds? [y/N]:
Not configuring the WORKSPACE for Android builds.

Preconfigured Bazel build configs. You can use any of the below by adding "--config=<>" to your 
build command. See tools/bazel.rc for more details.
    --config=mkl            # Build with MKL support.
    --config=monolithic     # Config for mostly static monolithic build.
Configuration finished

  (2) Build TensorFlow with bazel:

       For a CPU-only build, run:

bazel build --config=opt //tensorflow:libtensorflow_cc.so

       For GPU support, run:

bazel build --config=opt --config=cuda //tensorflow:libtensorflow_cc.so

      If the same C++ program will also use OpenCV, it is recommended to build with the command below; without it you may run into the known issue where OpenCV's imread fails to load images:

bazel build --config=monolithic //tensorflow:libtensorflow_cc.so

    When the build finishes, the two library files we need appear under bazel-bin/tensorflow: libtensorflow_cc.so and libtensorflow_framework.so.
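
    A quick way to confirm that both libraries were actually produced:

# Verify that the two shared libraries exist in the bazel output directory
ls -lh bazel-bin/tensorflow/libtensorflow_cc.so bazel-bin/tensorflow/libtensorflow_framework.so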

5. Build the other dependencies (mainly protobuf and eigen)

   We already installed protobuf earlier, so only eigen needs to be built and installed here.

# eigen
./tensorflow/contrib/makefile/download_dependencies.sh
cd tensorflow/contrib/makefile/downloads/eigen
mkdir build
cd build
cmake ..
make
sudo make install

     After installation, an eigen3 directory will appear under /usr/local/include.
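
     A quick check that the headers landed where the CMake configuration below expects them:

# The Eigen headers should now live under /usr/local/include/eigen3
ls /usr/local/include/eigen3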

6. Test

#include <tensorflow/core/platform/env.h>
#include <tensorflow/core/public/session.h>
#include <iostream>

using namespace std;
using namespace tensorflow;

int main()
{
    // Create a TensorFlow session to verify that the library links and runs correctly
    Session* session;
    Status status = NewSession(SessionOptions(), &session);
    if (!status.ok()) {
        cout << status.ToString() << "\n";
        return 1;
    }
    cout << "Session successfully created.\n";
    session->Close();
    return 0;
}

  Put the code in a src directory and write a CMakeLists.txt; the project layout is as follows:

├── src
│   └── hello.cpp
├── CMakeLists.txt
├── build

    The contents of CMakeLists.txt are as follows (replace path_to_tensorflow with the actual path to your TensorFlow source tree):

cmake_minimum_required (VERSION 2.8.8)                       # minimum required CMake version
project (tf_test)                                            # project name

set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} -g -std=c++11 -W")  # compiler flags
aux_source_directory(./src DIR_SRCS)                        # collect the source files into a variable
link_directories(path_to_tensorflow/bazel-bin/tensorflow)   # directory containing the TensorFlow shared libraries
include_directories(                                        # header search paths
   path_to_tensorflow/
   path_to_tensorflow/bazel-genfiles
   path_to_tensorflow/bazel-bin/tensorflow
   path_to_tensorflow/tensorflow/contrib/makefile/downloads/absl
   /usr/local/include/eigen3
   )
add_executable(tf_test  ${DIR_SRCS})                                # build the executable from the sources
target_link_libraries(tf_test tensorflow_cc tensorflow_framework )  # link against the TensorFlow shared libraries

Then run:

cd build
cmake ..
make

This produces a tf_test executable; if it runs without errors, everything is set up correctly.
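
If the program fails to start because the dynamic loader cannot find the TensorFlow libraries, pointing LD_LIBRARY_PATH at the bazel output directory (the same one used in link_directories above) usually helps:

# Make the TensorFlow shared libraries visible to the loader, then run the test program
export LD_LIBRARY_PATH=path_to_tensorflow/bazel-bin/tensorflow:$LD_LIBRARY_PATH
./tf_test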

References:

https://blog.csdn.net/zwx1995zwx/article/details/79064064
https://blog.csdn.net/rockingdingo/article/details/75452711
https://www.cnblogs.com/seniusen/p/9756302.html
https://www.cnblogs.com/hrlnw/p/7383951.html
https://zhuanlan.zhihu.com/p/31283000
https://www.jianshu.com/p/3d925fe9c3cb?utm_campaign=maleskine&utm_content=note&utm_medium=writer_share&utm_source=weibo
https://blog.csdn.net/qq_37541097/article/details/86232687
