6. Activation layer

6.1 Nonlinear Activation

① inplace means in-place replacement. If it is True, the input variable itself is overwritten with the result. If it is False (the default), a new tensor is created to hold the result and the original variable is left unchanged.
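
A minimal sketch of the difference (the tensor x and its values below are chosen only for illustration):

import torch
from torch import nn

x = torch.tensor([1.0, -0.5, -1.0, 3.0])

# inplace=False (the default): x is left unchanged and the result is returned as a new tensor
out = nn.ReLU(inplace=False)(x)
print(x)    # tensor([ 1.0000, -0.5000, -1.0000,  3.0000])
print(out)  # tensor([1., 0., 0., 3.])

# inplace=True: x itself is overwritten with the result
nn.ReLU(inplace=True)(x)
print(x)    # tensor([1., 0., 0., 3.])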

import torch
from torch import nn
from torch.nn import ReLU

# a 2x2 input containing both positive and negative values
input = torch.tensor([[1, -0.5],
                      [-1, 3]])
# reshape to (batch_size, channels, height, width) = (1, 1, 2, 2)
input = torch.reshape(input, (-1, 1, 2, 2))
print(input.shape)

class Tudui(nn.Module):
    def __init__(self):
        super(Tudui, self).__init__()
        self.relu1 = ReLU()  # ReLU keeps positive values and replaces negative values with 0

    def forward(self, input):
        output = self.relu1(input)
        return output
    
tudui = Tudui()
output = tudui(input)
print(output)

result:
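
Running the script should print the reshaped input's shape followed by the ReLU output, roughly:

torch.Size([1, 1, 2, 2])
tensor([[[[1., 0.],
          [0., 3.]]]])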

6.2 TensorBoard Display

import torch
import torchvision
from torch import nn 
from torch.nn import ReLU
from torch.nn import Sigmoid
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter

# CIFAR10 test set, with images converted to tensors in [0, 1]
dataset = torchvision.datasets.CIFAR10("./dataset", train=False, transform=torchvision.transforms.ToTensor(), download=True)
dataloader = DataLoader(dataset, batch_size=64)

class Tudui(nn.Module):
    def __init__(self):
        super(Tudui, self).__init__()
        self.relu1 = ReLU()        # defined but not used in forward
        self.sigmoid1 = Sigmoid()  # squashes every input value into (0, 1)

    def forward(self, input):
        output = self.sigmoid1(input)
        return output

tudui = Tudui()
writer = SummaryWriter("logs")  # event files are written to the ./logs directory
step = 0

for data in dataloader:
    imgs, targets = data
    writer.add_images("input", imgs, step)     # log the original batch
    output = tudui(imgs)                       # apply Sigmoid to every pixel
    writer.add_images("output", output, step)  # log the transformed batch
    step = step + 1

writer.close()  # flush and close the event file so TensorBoard can read it

how to view the logs:

① In the Anaconda terminal, activate the py3.6.3 environment, run tensorboard --logdir=C:\Users\wangy\Desktop\03CV\logs, then copy the URL it prints into the browser's address bar and press Enter to view the logged images in TensorBoard.

result:
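
Since Sigmoid maps the input pixel values, which ToTensor scales into [0, 1], to the narrower range (0.5, about 0.73), the batches logged under the "output" tag should look brighter and lower in contrast than the original CIFAR10 batches under the "input" tag.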

 


Reprinted from: blog.csdn.net/qq_54932411/article/details/132512717