Caffe2 notes

Disclaimer: This is an original blog post. If you reproduce it, please include a link back to the original: https://blog.csdn.net/m0_37263345/article/details/81110503

1. Using LMDB in Caffe: https://blog.csdn.net/haluoluo211/article/details/54427421

2. Protobuf: Google Protocol Buffers (Protobuf for short) is a lightweight, highly efficient format for storing structured data.
https://www.ibm.com/developerworks/cn/linux/l-cn-gpb/index.html
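To make the idea concrete, here is a minimal, hypothetical .proto message sketch. The message and field names are illustrative only, loosely in the spirit of Caffe2's OperatorDef, not the real schema:

```protobuf
// Hypothetical message type describing one operator.
// Field names here are illustrative, not Caffe2's actual OperatorDef.
syntax = "proto3";

message Operator {
  string type = 1;             // e.g. "Conv" or "FC"
  repeated string input = 2;   // input blob names
  repeated string output = 3;  // output blob names
}
```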

3. Compiling Caffe2

Causes of and solutions for the "error while loading shared libraries: xxx.so.x" error:

https://blog.csdn.net/u012839187/article/details/48025225

4. Caffe2 defines the network and its operators as a file serialized with Protocol Buffers. Nothing runs at definition time; only at actual run time does the C++ backend initialize the network from the protobuf file and then execute it.
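As a rough analogy in plain Python (not Caffe2's actual API or internals), the "define first, run later" pattern looks like this: the net is just a data structure, standing in for the serialized protobuf, until a backend walks it:

```python
# Conceptual sketch: the network definition is pure data; nothing
# executes until a "backend" interprets it.
net_def = []  # stand-in for the serialized protobuf NetDef

def add_op(op_type, inputs, output):
    # Declaring an op only records a description; it does not compute
    net_def.append({"type": op_type, "in": inputs, "out": output})

add_op("Add", ["a", "b"], "c")
add_op("Mul", ["c", "c"], "d")

def run(net, blobs):
    # Stand-in for the C++ backend: walk the description and execute each op
    ops = {"Add": lambda x, y: x + y, "Mul": lambda x, y: x * y}
    for op in net:
        args = [blobs[name] for name in op["in"]]
        blobs[op["out"]] = ops[op["type"]](*args)
    return blobs

blobs = run(net_def, {"a": 2, "b": 3})  # execution happens only here
```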

5. brew is a smart collection of helper functions that you can use to build networks. When you create a network or an operator through these helpers, they initialize the parameters, define the operators, and select the device for you. A single import of the brew module gives you access to all of Caffe2's helper functions; for example, you can add an FC layer with brew.fc(model, blob_in, blob_out, ...).

6. A model is split into two networks: the main network and the parameter-initialization network.

For example:

# Run the parameter-initialization network once to create and fill the parameters
workspace.RunNetOnce(train_model.param_init_net)
# Instantiate the main network in the backend
workspace.CreateNet(train_model.net, overwrite=True)

# Run the main network (call once per training iteration)
workspace.RunNet(train_model.net)

7. The intended division of labor:

The ModelHelper class should only contain the network definition and parameter information (it is used to declare the network).

The brew module has the functions that build the network and initialize the network parameters (it is used to add operators to the network).

For example:

train_model = model_helper.ModelHelper(name="mnist_train", arg_scope=arg_scope)

# Add a conv layer: 1 input channel, 20 output channels, 5x5 kernel
conv1 = brew.conv(train_model, data, 'conv1', dim_in=1, dim_out=20, kernel=5)
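This split can be sketched in plain Python (hypothetical names, not the real brew implementation): a brew-style helper writes to both nets at once, appending the compute op to the main net and the matching weight initializers to the init net, so a single call keeps the two in sync:

```python
# Conceptual sketch of a brew-style helper (not the real Caffe2 code).
class Model:
    def __init__(self):
        self.param_init_net = []  # ops that create/fill parameters (run once)
        self.net = []             # ops that do the actual computation

def fc(model, blob_in, blob_out, dim_in, dim_out):
    # One helper call records both the parameter initializers and the op
    w, b = blob_out + "_w", blob_out + "_b"
    model.param_init_net.append(("XavierFill", w, (dim_out, dim_in)))
    model.param_init_net.append(("ConstantFill", b, (dim_out,)))
    model.net.append(("FC", [blob_in, w, b], blob_out))
    return blob_out

m = Model()
fc(m, "data", "fc1", dim_in=784, dim_out=10)
```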

8. Once training and testing are finished, save the network structure and the trained parameters so that the network can be deployed and reused later.

For example:

deploy_model = model_helper.ModelHelper(
    name="mnist_deploy", arg_scope=arg_scope, init_params=False)
AddModel(deploy_model, "data")

# Gather everything the deployed predictor needs: the net definition,
# the trained parameter blobs, and the input/output blob names
pe_meta = pe.PredictorExportMeta(
    predict_net=deploy_model.net.Proto(),
    parameters=[str(b) for b in deploy_model.params], 
    inputs=["data"],
    outputs=["softmax"],
)

# save the model to a file. Use minidb as the file format
pe.save_to_db("minidb", os.path.join(root_folder, "mnist_model.minidb"), pe_meta)
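Conceptually, what PredictorExportMeta plus save_to_db achieve can be sketched in plain Python, with pickle standing in for the minidb format (a sketch only; the real Caffe2 API and file format differ):

```python
# Conceptual sketch: deployment needs two things persisted together --
# the serialized network definition and the trained parameter blobs.
import os
import pickle
import tempfile

def save_model(path, predict_net_bytes, params):
    # Bundle the net definition and the parameters into one file
    with open(path, "wb") as f:
        pickle.dump({"predict_net": predict_net_bytes, "params": params}, f)

def load_model(path):
    # Restore both pieces so the network can be rebuilt and run later
    with open(path, "rb") as f:
        bundle = pickle.load(f)
    return bundle["predict_net"], bundle["params"]

path = os.path.join(tempfile.mkdtemp(), "mnist_model.pkl")
save_model(path, b"<serialized NetDef>", {"conv1_w": [0.1, 0.2]})
net_bytes, params = load_model(path)
```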

 
