About calculating accuracy and loss in PyTorch

Recently I had some doubts about how accuracy and loss are calculated in PyTorch; the original code was still not entirely clear to me.

An example is given below:

import torch

# `device` and `logger` are assumed to be defined elsewhere in the script.
def train(train_loader, model, criteon, optimizer, epoch):
    train_loss = 0
    num_correct = 0
    model.train()
    for step, (x, y) in enumerate(train_loader):

        # x: [b, 3, 224, 224], y: [b]
        x, y = x.to(device), y.to(device)

        logits = model(x)
        loss = criteon(logits, y)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        train_loss += loss.item()              # loss is already the mean over this batch
        pred = logits.argmax(dim=1)            # predicted class per sample
        num_correct += torch.eq(pred, y).sum().item()

    # loss is divided by the number of steps, accuracy by the number of samples
    logger.info("Train Epoch: {}\t Loss: {:.6f}\t Acc: {:.6f}".format(
        epoch, train_loss / len(train_loader), num_correct / len(train_loader.dataset)))
    return num_correct / len(train_loader.dataset), train_loss / len(train_loader)
  • First, one full pass over the training set is called an epoch; completing an epoch takes (total number of samples / batch size) steps. Correspondingly, len(train_loader.dataset) is the total number of samples, and len(train_loader) is the number of steps (mini-batches) per epoch, as the sketch below makes concrete.
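
A minimal sketch of the two lengths, using a hypothetical toy dataset (TensorDataset and DataLoader are the standard torch.utils.data classes):

import torch
from torch.utils.data import TensorDataset, DataLoader

# 100 samples, batch size 32 -> ceil(100 / 32) = 4 steps per epoch
dataset = TensorDataset(torch.randn(100, 3, 224, 224), torch.randint(0, 10, (100,)))
loader = DataLoader(dataset, batch_size=32)

print(len(loader.dataset))  # 100 -> total number of samples
print(len(loader))          # 4   -> number of steps (mini-batches) per epoch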

Then, the accuracy over the whole train_loader is computed inside the for loop (i.e., across steps) by adding up the number of correct predictions in each mini-batch and finally dividing by the total number of samples; a toy example follows.
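
For instance, on a toy mini-batch (hypothetical values, not taken from the training code above), argmax and eq yield the per-batch correct count:

import torch

logits = torch.tensor([[2.0, 0.5], [0.1, 1.0], [0.3, 0.2]])  # [b=3, 2 classes]
y = torch.tensor([0, 1, 1])

pred = logits.argmax(dim=1)               # tensor([0, 1, 0])
correct = torch.eq(pred, y).sum().item()  # 2 correct out of 3
print(correct / len(y))                   # batch accuracy ~0.667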

  • The calculation of the loss needs more care. Here we use cross-entropy, which involves two values: the logits produced by the model (for 10 classes, a score for each class per sample) and the sample's label. In PyTorch, these two values can be passed directly to nn.CrossEntropyLoss, which by default computes the mean loss over the mini-batch. Because each step's loss is already a per-batch mean, the accumulated total must be divided by the number of steps (mini-batches), not, as with accuracy, by the number of samples. This distinction is important; see the check below.
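
A short sanity check of that point (assuming equal-size batches): nn.CrossEntropyLoss defaults to reduction='mean', so a batch's loss equals its summed loss divided by the batch size:

import torch
import torch.nn as nn

logits = torch.randn(4, 10)    # [b=4, 10 classes]
y = torch.randint(0, 10, (4,))

mean_loss = nn.CrossEntropyLoss()(logits, y)                # default: mean over batch
sum_loss = nn.CrossEntropyLoss(reduction='sum')(logits, y)  # sum over batch

print(torch.allclose(mean_loss, sum_loss / 4))  # True: 'mean' = 'sum' / batch size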

Origin www.cnblogs.com/yqpy/p/11497199.html