Learning Accurate Low-Bit Deep Neural Networks with Stochastic Quantization — paper notes

SQ-BWN / SQ-TWN is an incremental quantization algorithm. Viewed from the angle of per-weight quantization error, it stochastically selects only a portion of the weights to quantize in each training phase (weights whose quantized values are closer to their real values are more likely to be selected), while the remaining weights stay at full precision. The quantization ratio is gradually increased during training until eventually all weights are quantized.
The motivation is that quantizing every weight at once introduces a large quantization error, which perturbs the gradients; gradually raising the quantized fraction yields better gradient directions than quantizing everything in one step.
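A minimal NumPy sketch of this selection step, under my own assumptions about the details: filters are rows of a 2-D weight matrix, binarization follows the BWN style (sign times mean absolute value), and the selection probability is taken inversely proportional to each row's relative quantization error. Function names and the `1e-12` stabilizers are mine, not from the paper.

```python
import numpy as np

def binarize(w):
    # BWN-style binarization: sign(w) scaled by the mean absolute value
    alpha = np.mean(np.abs(w))
    return alpha * np.sign(w)

def stochastic_quantize(W, r, rng):
    """Quantize a fraction r of the rows (filters) of W, chosen
    stochastically with probability inversely proportional to each
    row's relative quantization error; the rest stay full precision."""
    n = W.shape[0]
    Q = np.stack([binarize(W[i]) for i in range(n)])
    # relative quantization error per row
    err = np.linalg.norm(W - Q, axis=1) / (np.linalg.norm(W, axis=1) + 1e-12)
    # smaller error -> larger selection probability
    inv = 1.0 / (err + 1e-12)
    p = inv / inv.sum()
    k = int(round(r * n))
    chosen = rng.choice(n, size=k, replace=False, p=p)
    out = W.copy()
    out[chosen] = Q[chosen]
    return out, chosen

rng = np.random.default_rng(0)
W = rng.standard_normal((8, 16))
Wq, chosen = stochastic_quantize(W, r=0.5, rng=rng)
```

During training, `r` would be raised stage by stage (e.g. 50% up to 100%), so that the network ends fully quantized.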
Judging from the reported results, the algorithm does not look particularly strong; at 3 bits it is not as good as INQ, which also uses incremental training.


Origin blog.csdn.net/qq_35608277/article/details/104900136