PyCharm + AutoDL for deep learning

Prepare

PyCharm Professional is required (only the Professional edition supports remote SSH interpreters), along with an SSH connection tool.

Register an AutoDL account and create a container instance.

Principle

PyCharm edits the code locally and uploads it to the cloud server over SSH; the cloud server then runs the uploaded code against the dataset and environment that already live on the server to do the deep learning.

In short: the code is edited and uploaded locally, while the cloud server runs it.

Therefore, the following three problems need to be solved:
1. Cloud service environment creation
2. Local code upload
3. Dataset upload on the cloud server

Cloud service environment creation

When creating a container instance, you can select an image directly as the environment.

To get started, use the base image, which comes with some basic packages preinstalled.

After using it for a while, once you have installed additional packages, you can save the instance as a custom image and migrate it to new instances; the detailed steps are not covered here.
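Before installing project-specific packages, it helps to confirm what the base image actually provides. A minimal sketch (the PyTorch check in the comment assumes a PyTorch base image, which depends on the image you picked):

```shell
# Inspect the Python environment from the instance terminal.
python3 --version
python3 -c "import sys; print(sys.executable)"
# On a PyTorch base image you would additionally check, e.g.:
#   python3 -c "import torch; print(torch.__version__, torch.cuda.is_available())"
```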

Code upload

Create an instance on AutoDL


Login command format: ssh -p <port> <username>@<host>
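For example, with hypothetical connection details copied from the instance page (port 12345, host region-1.autodl.com), the command is assembled like this; it is echoed rather than executed so the sketch is safe to run without a live instance:

```shell
# Hypothetical connection details from the AutoDL instance page.
PORT=12345
HOST=region-1.autodl.com
USER=root
# Print the login command; on a real instance you would run it directly.
echo "ssh -p ${PORT} ${USER}@${HOST}"
```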

In PyCharm, select the remote server's interpreter (Interpreter) and configure the folder synchronization mapping (Sync folders).

Result of file synchronization: files in the local project are automatically synchronized to /root/autodl-tmp/remotetorch on the remote server. You can also upload files manually: right-click the file to upload and select Upload to xxx under Deployment.
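Outside PyCharm, the same manual upload can be done with scp. A sketch with the hypothetical port and host from before, where the target path matches the Sync folders mapping (printed rather than executed so it runs without a server):

```shell
# Hypothetical instance details; the target path mirrors the Sync folders mapping.
PORT=12345
HOST=region-1.autodl.com
# Recursively copy the local project to the mapped remote folder.
# Echoed rather than executed so the sketch needs no live instance:
echo "scp -r -P ${PORT} ./remotetorch root@${HOST}:/root/autodl-tmp/remotetorch"
```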

Dataset upload on the cloud server

1. Datasets already on the server

If the dataset already exists on the server, you can copy it directly into your own project folder.
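Such a copy is a plain cp. The sketch below demos it on throwaway temp directories so it runs anywhere; the AutoDL paths in the comment are illustrative:

```shell
# Demo: copy a "dataset" from one directory into a project folder.
src=$(mktemp -d)
dst=$(mktemp -d)
touch "$src/train.zip"
# On the server this would be something like:
#   cp -r /path/to/shared/dataset /root/autodl-tmp/your-project/
cp -r "$src"/. "$dst"/
ls "$dst"
rm -rf "$src" "$dst"
```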

2. Upload via an authorized online drive

After clicking Download, the download path is displayed.

3. Xftp upload

Download the dataset locally from the internet, then use Xftp to transfer it to the server.
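Whichever route you use, it is worth checking that the dataset actually arrived and is non-empty. A small helper sketch (the demo uses a temp directory so it runs without the server; on the server you would pass your real data path):

```shell
# Returns success if the directory exists and contains at least one entry.
dataset_ready() {
  [ -d "$1" ] && [ -n "$(ls -A "$1" 2>/dev/null)" ]
}

# Demo; on the server you would pass e.g. your project's data directory.
tmp=$(mktemp -d)
touch "$tmp/images.tar"
dataset_ready "$tmp" && echo "dataset ready"
rm -rf "$tmp"
```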

Effect


Other

1. Rename the remote configuration

Using a consistent name for the interpreter and deployment configuration avoids confusion when you work with multiple remote servers.

2. Empty the Trash

sudo rm -rf ~/.local/share/Trash/*
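Before clearing it, you can check how much space the Trash occupies. This is safe to run anywhere; it falls back to a message if the directory does not exist:

```shell
# Show the Trash directory's total size, or a note if it is absent.
du -sh "$HOME/.local/share/Trash" 2>/dev/null || echo "no Trash directory"
```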


Origin blog.csdn.net/m0_46692607/article/details/129286389