NCNN quantization with ncnn2table and ncnn2int8

This document describes the quantization workflow of the NCNN tools. NCNN quantization consists of two parts: ncnn2table and ncnn2int8.

Running ncnn2table to generate the quantization table is required before the ncnn2int8 quantization step.

The following describes the first step, generating the quantization table:

One, ncnn2table: generating the quantization table

1, First, complete the preparatory work; refer to the NCNN deep learning framework optimizer documentation

2, In a terminal, enter the ncnn/build/tools/quantize directory

3、./ncnn2table --image imagepath --param parampath --bin binpath --output newModel.table

  • Note: when executing the command, you can also append mean, norm, size, and thread parameters; the tool provides defaults for these, which are left unmodified here;
  • Note: image here refers to a calibration image set, which should contain a large number of representative pictures;
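The steps above can be sketched as a single invocation. This is a hypothetical example: the image directory, model file names, and the mean/norm/size values are placeholders (the mean/norm values shown are the commonly used Caffe ImageNet ones), and exact flag names may differ across ncnn versions.

```shell
# Sketch of a full ncnn2table run with the optional parameters spelled out.
# All paths and preprocessing values below are placeholders.
./ncnn2table --image ./calibration-images \
             --param model-opt.param \
             --bin model-opt.bin \
             --output newModel.table \
             --mean 104,117,123 \
             --norm 0.017,0.017,0.017 \
             --size 224,224 \
             --thread 8
```

The mean and norm values must match the preprocessing used when the model was trained; otherwise the collected activation ranges, and hence the quantization table, will be wrong.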

4, After the command completes, the quantization table newModel.table is generated in the original file's directory

Two, ncnn2int8: quantizing the network model

1, Run the ncnn2table executable to generate the quantization table (as described above)

2, In a terminal, enter the ncnn/build/tools/quantize directory

3、./ncnn2int8 [inparam] [inbin] [outparam] [outbin] [calibration table]

4, After the command completes, the output param and bin files are generated in the original file's directory (i.e., the int8-quantized ncnn model)
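Filling in the positional arguments from step 3 gives an invocation like the following. The file names here are placeholders chosen for illustration; substitute your own model files and the table produced by ncnn2table.

```shell
# Sketch: quantize an optimized fp32 model to int8 using the calibration table.
# Arguments are positional: input param, input bin, output param, output bin, table.
./ncnn2int8 model-opt.param model-opt.bin \
            model-int8.param model-int8.bin \
            newModel.table
```

The resulting model-int8.param and model-int8.bin are loaded in application code the same way as the original fp32 model.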


Origin www.cnblogs.com/wanggangtao/p/11352948.html