1. nohup
nohup python -u filename.py > log.log &

Here -u runs Python unbuffered so print output reaches log.log immediately, > log.log redirects stdout to the log file, & runs the job in the background, and nohup keeps the process alive after the terminal closes.
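A minimal end-to-end sketch of this workflow (the file names /tmp/demo_script.py and /tmp/demo.log are illustrative, not from the article):

```shell
# Write a tiny script, run it in the background with nohup,
# wait for the job to finish, then inspect the log.
echo 'print("hello from background")' > /tmp/demo_script.py
nohup python -u /tmp/demo_script.py > /tmp/demo.log &
wait $!            # wait for the background job to finish
cat /tmp/demo.log
```

In real use you would not wait; you would log out and check the log later, e.g. with tail -f log.log.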
2. tmux
Start a session: tmux new -s name
Detach from the session: Ctrl+b, then d
Re-attach to the session: tmux a -t name
List all current sessions: tmux ls
Split the screen
tmux split-window -h splits into two side-by-side panes
tmux split-window splits into two stacked panes
Move between panes
Ctrl+b, then an arrow key (up, down, left, or right)
Close
Ctrl+b, then x closes the current pane
Press Ctrl+b, then n to switch to the next window.
Press Ctrl+b, then p to switch to the previous window.
Press Ctrl+b, then & to kill the current window. You will be prompted to confirm; type y and press Enter. Once all windows are closed, the tmux session ends.
One more question: can you run commands in several areas without creating extra windows? Yes, you can. A single window can be split into multiple panes.
Press Ctrl+b, then % to split the window into two side-by-side panes.
Press Ctrl+b, then " to split the window into two stacked panes.
In this way you can divide one window into several panes, which is very convenient.
Press Ctrl+b, then use the arrow keys to select a pane. A green border marks the currently selected pane.
Press Ctrl+b, then x to close the current pane; confirm by typing y and pressing Enter.
The above covers the basic usage of tmux; mastering these shortcut keys and commands is enough for daily use. tmux is an excellent terminal tool with many more features beyond these basics.
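Everything above can also be driven non-interactively, which is handy in scripts. A rough sketch, assuming tmux is installed (the session name demo is arbitrary):

```shell
tmux new-session -d -s demo            # start a detached session
tmux split-window -h -t demo           # split into two side-by-side panes
tmux send-keys -t demo 'echo hello from tmux' Enter
sleep 1                                # give the shell time to run the command
tmux capture-pane -p -t demo           # print the active pane's contents
tmux kill-session -t demo              # clean up
```

send-keys types into a pane exactly as if you were attached, so this pattern can launch long-running jobs inside a session you attach to later.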
3. Calculate the running time of the program
from datetime import datetime

start = datetime.now()
run_program()            # the code you want to time
end = datetime.now()
print(str(end - start))  # prints the elapsed time as H:MM:SS.ffffff
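datetime.now() works, but time.perf_counter is the clock intended for interval timing. A small reusable sketch (the timer helper and its label are my own, not from the article):

```python
import time
from contextlib import contextmanager

@contextmanager
def timer(label="elapsed"):
    # perf_counter is a monotonic, high-resolution clock,
    # unaffected by system clock adjustments
    start = time.perf_counter()
    yield
    print(f"{label}: {time.perf_counter() - start:.3f} s")

with timer("sum"):
    total = sum(range(1_000_000))
```

Wrapping the block in a context manager avoids repeating the start/end boilerplate around every piece of code you want to time.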
4. Calculate model size
pip install torchsummary
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchsummary import summary

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = nn.Dropout2d()
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = F.relu(self.fc1(x))
        x = F.dropout(x, training=self.training)
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")  # PyTorch v0.4.0
model = Net().to(device)
summary(model, (1, 28, 28))
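torchsummary prints a per-layer table; if you only need the total parameter count, you can compute it directly from model.parameters() without any extra package. A sketch (the helper name count_parameters and the small model below are my own illustration):

```python
import torch.nn as nn

def count_parameters(model):
    # numel() gives the element count of each weight/bias tensor;
    # filtering on requires_grad counts only trainable parameters
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

model = nn.Sequential(nn.Linear(320, 50), nn.Linear(50, 10))
n = count_parameters(model)
print(n, f"~ {n * 4 / 1024:.1f} KiB as float32")  # 4 bytes per float32 weight
```

Multiplying by the bytes per element (4 for float32) gives a quick estimate of the model's in-memory weight size.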