Some evaluation indicators of lightweight deep convolutional models

Foreword:
This post explains the common evaluation indicators of lightweight deep learning models and provides simple PyTorch demos for computing them, so that readers can understand the concepts and use them as a reference for their own work.


1. The concept of FLOPs

  • FLOPs:
    Note that the s is lowercase. It is the abbreviation of FLoating point OPerations (the s marks the plural), i.e. the number of floating-point operations, and is understood as the amount of computation. It can be used to measure the complexity of a model; when evaluating the complexity of a neural network model, the correct term is FLOPs, not FLOPS. A sketch that counts the FLOPs of a single convolutional layer is given after this list.
  • FLOPS:
    Note that it is all uppercase. It is the abbreviation of FLoating point Operations Per Second, i.e. the number of floating-point operations performed per second, and is understood as the computation speed. It is a measure of hardware performance; for example, NVIDIA's official website uses this indicator, in units of TeraFLOPS, to list the computing power (Compute Capability) of each graphics card, where the prefix Tera denotes a magnitude of 10^12. A rough way to probe the throughput achieved on your own device is sketched after this list.
    1. Some basic conversion relations:
       1 KFLOPS = 10^3 FLOPS, 1 MFLOPS = 10^6 FLOPS, 1 GFLOPS = 10^9 FLOPS, 1 TFLOPS = 10^12 FLOPS, 1 PFLOPS = 10^15 FLOPS.
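
To make the FLOPs notion concrete, below is a minimal PyTorch sketch that counts the operations of a single convolutional layer by hand. The layer shape and input size are arbitrary choices for illustration, and the convention FLOPs = 2 × MACs (one multiply plus one add) is assumed; some counting tools report MACs instead, so their numbers may differ from this by a factor of two.

```python
import torch
import torch.nn as nn

# Hypothetical layer and input size, chosen only for illustration.
conv = nn.Conv2d(in_channels=64, out_channels=128, kernel_size=3, padding=1)
x = torch.randn(1, 64, 56, 56)

# Run the layer once to obtain the output spatial size.
with torch.no_grad():
    y = conv(x)
_, c_out, h_out, w_out = y.shape
c_in = conv.in_channels
k_h, k_w = conv.kernel_size

# Multiply-accumulate operations (MACs) of a standard convolution:
# every output element needs c_in * k_h * k_w multiplications and additions.
macs = c_out * h_out * w_out * c_in * k_h * k_w
flops = 2 * macs  # one multiply and one add counted as two floating-point operations

print(f"MACs:  {macs / 1e9:.3f} G")
print(f"FLOPs: {flops / 1e9:.3f} GFLOPs")
```

The same per-layer formula is what packages such as thop or ptflops apply to every layer of a whole model; the manual version above is only meant to show where the number comes from.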
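For the FLOPS side (hardware speed), a rough, hedged way to estimate the achieved throughput of the current device is to time a large matrix multiplication whose operation count is known. The matrix size and number of timing runs below are arbitrary choices; the result reflects the achieved throughput of this one kernel, not the peak TeraFLOPS figure quoted on a vendor's specification sheet.

```python
import time
import torch

# A rough throughput probe: time a large matrix multiplication and convert the
# known operation count into achieved FLOPS. The matrix size is an arbitrary choice.
n = 4096
device = "cuda" if torch.cuda.is_available() else "cpu"
a = torch.randn(n, n, device=device)
b = torch.randn(n, n, device=device)

# Warm-up so lazy initialisation does not distort the timing.
for _ in range(3):
    torch.matmul(a, b)
if device == "cuda":
    torch.cuda.synchronize()

runs = 10
start = time.perf_counter()
for _ in range(runs):
    torch.matmul(a, b)
if device == "cuda":
    torch.cuda.synchronize()
elapsed = (time.perf_counter() - start) / runs

flops_per_matmul = 2 * n ** 3  # n^3 multiply-add pairs -> 2 * n^3 FLOPs
print(f"Achieved throughput: {flops_per_matmul / elapsed / 1e12:.2f} TFLOPS")
```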

Origin blog.csdn.net/u013537270/article/details/130252208