In deep learning, batch_size is the number of training samples the network processes in a single forward/backward pass. When training a neural network, we usually split the dataset into small batches rather than feeding it all at once; each batch contains batch_size samples, and the model's weights are updated once per batch. Choosing an appropriate batch_size matters: larger batches make better use of hardware parallelism and give smoother gradient estimates, while smaller batches use less memory and can add helpful noise to the updates, so in practice it is a trade-off between training speed and training quality.
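The idea can be shown with a minimal sketch in plain Python (no framework assumed): the dataset is split into consecutive mini-batches of at most batch_size samples, and a training step would run once per batch.

```python
def make_batches(samples, batch_size):
    """Split `samples` into consecutive mini-batches of at most batch_size items."""
    return [samples[i:i + batch_size] for i in range(0, len(samples), batch_size)]

dataset = list(range(10))               # 10 toy "samples"
batches = make_batches(dataset, batch_size=4)
print(batches)                          # [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
# Note the final batch is smaller (2 samples) because 10 is not divisible by 4;
# frameworks typically either keep or drop this partial batch (e.g. a drop_last option).
```

With batch_size=4, one pass over the 10 samples produces 3 weight updates instead of 10 (one per sample) or 1 (full batch), which is the speed/quality trade-off described above.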
What does batch_size mean?
Origin blog.csdn.net/weixin_35753431/article/details/129566766