SQ-B(TWN) is an incremental quantization algorithm: since different weights incur different quantization errors, it starts by quantizing the weights that are already close to their quantized values, then gradually raises the quantization ratio until all weights are quantized.
Quantizing only a portion of the weights during each training phase is meant to address the problem that quantization error distorts the gradients: gradually quantizing an increasing ratio of the weights is more appropriate than quantizing all of them at once, which produces misleading gradient directions.
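The incremental scheme described above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the TWN-style `ternary_quantize` helper, the row-wise partition of the weights, and the inverse-error sampling probabilities are assumptions made for the sketch.

```python
import numpy as np

def ternary_quantize(w):
    # TWN-style ternary quantization: threshold at 0.7 * mean|w|,
    # scale alpha = mean of |w| over the surviving entries.
    delta = 0.7 * np.mean(np.abs(w))
    mask = np.abs(w) > delta
    alpha = np.mean(np.abs(w[mask])) if mask.any() else 0.0
    return alpha * np.sign(w) * mask

def sq_step(weights, ratio, rng):
    """One stochastic-quantization step: quantize a `ratio` fraction of the
    weight groups (here, rows standing in for filters), sampled with
    probability inversely related to their relative quantization error;
    keep the remaining groups at full precision."""
    errors = np.array([
        np.linalg.norm(w - ternary_quantize(w)) / (np.linalg.norm(w) + 1e-12)
        for w in weights
    ])
    # Smaller quantization error -> higher chance of being quantized now.
    inv = 1.0 / (errors + 1e-12)
    probs = inv / inv.sum()
    n_quant = int(round(ratio * len(weights)))
    chosen = rng.choice(len(weights), size=n_quant, replace=False, p=probs)
    out = [w.copy() for w in weights]
    for i in chosen:
        out[i] = ternary_quantize(weights[i])
    return out, set(chosen.tolist())
```

Over successive training phases `ratio` would be stepped up (e.g. 0.5 → 0.75 → 1.0) until every group is quantized.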
Judging from the reported results, the algorithm does not look particularly strong; at 3 bits it is still not as good as INQ, which likewise uses incremental training.
Paper: Learning Accurate Low-Bit Deep Neural Networks with Stochastic Quantization
Origin blog.csdn.net/qq_35608277/article/details/104900136