Deep Eye PyTorch Training Camp, Season 2 --- 4. autograd and Logistic Regression

I. The automatic differentiation system ---- torch.autograd

1. torch.autograd.backward — automatically computes gradients (see the sketch after the parameter list below)

    •   tensors: the tensors to differentiate, e.g. the loss
    •   retain_graph: keep the computation graph so backward can be called on it again
    •   create_graph: build a graph of the derivative computation, needed for higher-order derivatives
    •   grad_tensors: weights applied when there are multiple gradients (multiple losses)
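A minimal sketch of how these arguments are used, assuming a toy computation graph (the tensors w, x, y0, y1 and the weights in grad_tensors are illustrative, not from the original post):

```python
import torch

w = torch.tensor([1.0], requires_grad=True)
x = torch.tensor([2.0], requires_grad=True)

a = w + x            # a = 3
b = w + 1            # b = 2
y0 = a * b           # y0 = 6,  dy0/dw = b + a = 5
y1 = a + b           # y1 = 5,  dy1/dw = 1 + 1 = 2

loss = torch.cat([y0, y1], dim=0)   # treat both outputs as "losses"

# grad_tensors weights the two gradients; retain_graph keeps the graph alive
# so backward could be called on it again.
torch.autograd.backward(loss, grad_tensors=torch.tensor([1.0, 2.0]), retain_graph=True)
print(w.grad)        # 1*5 + 2*2 = tensor([9.])
```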

2. torch.autograd.grad — computes and returns gradients of the outputs with respect to the inputs (see the sketch after the parameter list below)

    •   outputs: the tensors to differentiate, e.g. y
    •   inputs: the tensors for which gradients are required, e.g. w, x
    •   create_graph: build a graph of the derivative computation, needed for higher-order derivatives
    •   retain_graph: keep the computation graph
    •   grad_outputs: weights applied when there are multiple gradients
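A minimal sketch of torch.autograd.grad used for a second-order derivative; the tensors x and y are illustrative:

```python
import torch

x = torch.tensor([3.0], requires_grad=True)
y = x ** 2                                      # y = x^2

# create_graph=True builds a graph for the derivative itself,
# which allows taking a second-order derivative afterwards.
grad_1 = torch.autograd.grad(outputs=y, inputs=x, create_graph=True)
print(grad_1)                                   # (tensor([6.], grad_fn=...),)  dy/dx = 2x

grad_2 = torch.autograd.grad(outputs=grad_1[0], inputs=x)
print(grad_2)                                   # (tensor([2.]),)               d2y/dx2 = 2
```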

Tips (see the sketch after this list):

  (1) Gradients are not cleared automatically; reset them manually with grad.zero_()

  (2) Nodes that depend on leaf nodes have requires_grad set to True by default

  (3) Leaf nodes may not be modified by in-place operations
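A small sketch illustrating the three tips, assuming the same toy tensors as above (w and x are illustrative):

```python
import torch

w = torch.tensor([1.0], requires_grad=True)
x = torch.tensor([2.0], requires_grad=True)

# (1) gradients accumulate across backward() calls and must be cleared manually
for _ in range(2):
    y = (w + x) * (w + 1)        # dy/dw = (w + 1) + (w + x) = 5
    y.backward()
print(w.grad)                    # tensor([10.]) -- accumulated over two passes
w.grad.zero_()                   # reset to tensor([0.])

# (2) nodes that depend on leaf nodes get requires_grad=True automatically
a = w + x
print(a.requires_grad)           # True

# (3) a leaf tensor that requires grad may not be modified in place
# w.add_(1.0)  # raises RuntimeError: a leaf Variable that requires grad
#              # is being used in an in-place operation
```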

    

II. Logistic Regression

Logistic regression is a linear model for binary classification.

Model expression: y = f(Wx + b)

                      f(x) = 1 / (1 + e^(-x)), where f(x) is called the sigmoid function, also known as the logistic function

      class = { 0,  if y < 0.5
                1,  if y > 0.5
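A minimal sketch of the model y = f(Wx + b) with f the sigmoid, implemented as a PyTorch module; the class name LR and the feature size are illustrative:

```python
import torch
import torch.nn as nn

class LR(nn.Module):
    def __init__(self, in_features=2):
        super().__init__()
        self.features = nn.Linear(in_features, 1)   # Wx + b
        self.sigmoid = nn.Sigmoid()                  # f(x) = 1 / (1 + e^(-x))

    def forward(self, x):
        return self.sigmoid(self.features(x))

model = LR(in_features=2)
y = model(torch.randn(4, 2))       # probabilities in (0, 1)
pred_class = (y > 0.5).float()     # class 1 if y > 0.5, else class 0
print(y, pred_class)
```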

  • The difference between linear regression and logistic regression
    • Linear regression analyzes the relationship between the independent variable x and the dependent variable y, where y is a scalar
    • Logistic regression analyzes the relationship between the independent variable x and the dependent variable y, where y is a probability

 

  • Steps in training a machine learning model (see the sketch after this list)
    •   Data
    •   Model
    •   Loss function
    •   Optimizer
    •   Iterative training
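A minimal end-to-end sketch of the five steps for logistic regression; the toy data, cluster means, learning rate, and epoch count are illustrative choices, not values from the original post:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# 1) Data: two toy Gaussian clusters
n = 100
x0 = torch.normal(mean=1.7, std=1.0, size=(n, 2))    # class 0 samples
x1 = torch.normal(mean=-1.7, std=1.0, size=(n, 2))   # class 1 samples
x = torch.cat([x0, x1], dim=0)
y = torch.cat([torch.zeros(n, 1), torch.ones(n, 1)], dim=0)

# 2) Model: logistic regression = Linear + Sigmoid
model = nn.Sequential(nn.Linear(2, 1), nn.Sigmoid())

# 3) Loss function: binary cross-entropy
criterion = nn.BCELoss()

# 4) Optimizer
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)

# 5) Iterative training
for epoch in range(100):
    y_pred = model(x)
    loss = criterion(y_pred, y)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    if (epoch + 1) % 20 == 0:
        acc = ((y_pred > 0.5).float() == y).float().mean()
        print(f"epoch {epoch + 1:3d}  loss={loss.item():.4f}  acc={acc.item():.2f}")
```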

 
