foreword
Notes on how to use TensorBoard.
As an aside: when the graphics card is occupied, you can take the PID of the occupying process, run a command to see where it is actually executing, and from that work out who is using it:
ll /proc/<PID>
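For instance, a minimal sketch of the idea (assuming a Linux box with procfs; on a GPU machine you would take the PID from `nvidia-smi`'s process table, while here the current shell's own PID stands in so the commands run anywhere):

```shell
# Stand-in: $$ is this shell's own PID; replace it with the PID from nvidia-smi.
ls -l /proc/$$/cwd                    # symlink to the directory the process was started in
readlink /proc/$$/exe                 # the binary actually being executed
tr '\0' ' ' < /proc/$$/cmdline; echo  # the full command line of the process
```

The `cwd` symlink is usually the most telling one: it points at the directory the training script was launched from.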
step
First, enter the following in a local terminal:
ssh -L 9002:127.0.0.1:9001 username@hostname
Here username is your login name on the server and hostname is the server's IP address. 9001 is the port (on the server) that TensorBoard will occupy later; you can change it to any other port, as long as it is not already taken by another program. 9002 is the port your local browser will use for access.
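Since the only requirement on the port is that nothing else is using it, here is a quick way to check before launching TensorBoard (a small sketch using only the standard library; `port_is_free` is a helper name made up for this example):

```python
import socket

def port_is_free(port, host="127.0.0.1"):
    """Return True if nothing is listening on (host, port)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        # connect_ex returns 0 on success, i.e. something is already listening
        return s.connect_ex((host, port)) != 0

print(port_is_free(9001))
```

Run it on the server for the TensorBoard port (9001 above) and locally for the browser port (9002).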
After the connection succeeds, change into the directory containing your log files and run:
tensorboard --logdir=./ --port=9001
Do not close this terminal after it starts; enter
http://localhost:9002/
in your browser and you can see the data on the TensorBoard panel.
common problem
Conflicts between multiple installed TensorBoard versions:
ValueError: Duplicate plugins for name projector
See: pitfalls in TensorBoard use (ValueError: Duplicate plugins for name projector)
Use site.getsitepackages() to locate the site-packages directory, find the leftover tensorboard-xxx.dist-info folder there, and delete it.
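A runnable sketch of the locating step (only the deletion itself is left to you, since which dist-info folder is stale depends on your machine):

```python
import os
import site

# site.getsitepackages() returns the interpreter's site-packages directories
for p in site.getsitepackages():
    if not os.path.isdir(p):
        continue
    # look for leftover tensorboard distributions / dist-info folders
    leftovers = [d for d in os.listdir(p) if d.lower().startswith("tensorboard")]
    print(p, leftovers)
```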
SummaryWriter
Lately I have found TensorBoard truly indispensable...
Here is a small example:
import random
import time

from torch.utils.tensorboard import SummaryWriter

# logs are written under the Tensorboard_logs directory
writer = SummaryWriter("Tensorboard_logs")

for step in range(0, 100, 2):
    time.sleep(3)  # emit one point every three seconds, slow enough to watch the metric change while it runs
    # simulate a model prediction
    output = random.random() * step / 10 + step + random.randint(0, 5)
    target = 80
    loss = (output - target) ** 2
    print('step {}, loss: {}'.format(step, loss))
    writer.add_scalar("mse loss", loss, step)
writer.close()
After opening TensorBoard:
Generated files:
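To see what was written, you can list the log directory (a small sketch; the directory name matches the example above, and the guard keeps it runnable even if the directory does not exist yet):

```python
import os

logdir = "Tensorboard_logs"
if os.path.isdir(logdir):
    for name in sorted(os.listdir(logdir)):
        # SummaryWriter produces files named events.out.tfevents.<timestamp>.<host>...
        print(name)
else:
    print("no log directory yet")
```

Pointing tensorboard --logdir at this directory (or any parent of it) picks these event files up.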