YOLOv8 step-by-step tutorial (training on your own dataset)


1. Environment configuration

First, it is recommended to use Anaconda to manage your Python environment; the demo below uses PyCharm as the IDE.

Since you have already made it to YOLOv8, I will assume you are not a complete beginner.

conda create -n YOLOv8 python=3.8  # create the YOLOv8 environment

conda activate YOLOv8   # activate the environment


After setting up the environment, install PyTorch.

I offer two options here; pick the one that matches your CUDA version.
CUDA 11.6
pip install torch==1.12.0+cu116 torchvision==0.13.0+cu116 --extra-index-url https://download.pytorch.org/whl/cu116
CUDA 11.3
pip install torch==1.12.0+cu113 torchvision==0.13.0+cu113 --extra-index-url https://download.pytorch.org/whl/cu113 

It is best to use torch 1.12.0 or later.
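A quick way to confirm that PyTorch installed correctly with GPU support is to run the following one-liner, which prints the torch version and whether CUDA is available:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"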

Next, install the dependencies.

 pip install ultralytics     # YOLOv8 is highly integrated; if you only want to use it, this single package is all you need, which is very convenient.

If you need to modify the source code, things get a bit more involved. This article covers how to use YOLOv8; how to modify it will be discussed later.
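In recent ultralytics versions, the package also installs a yolo command-line tool; a simple way to confirm the install worked is:

yolo checks     # prints the installed ultralytics/torch versions and basic environment info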

2. Dataset preparation

Prepare a dataset in YOLO format. I will not demonstrate converting VOC annotations to YOLO txt here; just prepare a ready-made dataset yourself.

If you have worked with YOLOv5 before, that dataset can be reused directly.

Place the dataset under the top-level YOLOv8 project directory.

The data.yaml file is placed in the directory shown in the figure (a sketch of a typical layout is shown below).
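As a rough guide, one common layout looks like this; the folder names datasets and mydata are placeholders that you should adapt to your own dataset:

YOLOv8/
└── datasets/
    └── mydata/
        ├── data.yaml
        ├── images/
        │   ├── train/
        │   └── val/
        └── labels/
            ├── train/
            └── val/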

 

The yaml format can be modeled after the coco.yaml shipped with YOLOv8; it follows the same format as YOLOv5, and it worked fine in my own test.
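For reference, a minimal data.yaml might look like the following sketch; the path and the class names are placeholders for your own data:

path: datasets/mydata     # dataset root directory
train: images/train       # training images, relative to path
val: images/val           # validation images, relative to path

names:
  0: cat
  1: dog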

3. Demonstration: training, validation, and inference

Open PyCharm, select the conda environment as the project interpreter, and open the project.

Find default.yaml at the path ultralytics/cfg/default.yaml; because of version iteration, the exact path may differ slightly.

The relevant parameters are explained below.

 

task: set it to detect, classify, or segment, according to your needs.

mode: choose train, val, or predict, depending on what you want to do at the moment.

model: yolov8n.pt, or yolov8n.yaml; if you use the yaml, the model is trained from scratch.

data: the yaml file of your own training dataset.

workers: it is best to set this to 0; otherwise some people may run into errors.

With that, the most basic parameters are configured; a sketch of how these fields might look in default.yaml is shown below.
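Filled in, the relevant lines of default.yaml might look roughly like this; the values are only illustrative and depend on your own setup:

task: detect         # detect, classify, or segment
mode: train          # train, val, or predict
model: yolov8n.pt    # or yolov8n.yaml to train from scratch
data: data.yaml      # your dataset's yaml file
workers: 0           # 0 avoids dataloader errors on some machines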

Next, open the terminal in PyCharm, which is the Terminal tab at the bottom of the window.

Enter the following command in the terminal. The yolo cfg= part is fixed; what follows it is the path to default.yaml:

yolo cfg=ultralytics/cfg/default.yaml 
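If you prefer not to edit default.yaml at all, the same training run can be started by passing arguments directly on the command line; the data file name and hyperparameter values below are just examples:

yolo detect train data=data.yaml model=yolov8n.pt epochs=100 imgsz=640 workers=0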

Training then starts.

 

Validation

Note that you need to change mode to val and set the model parameter in the train settings section to the weights file saved during training, then run the same command:

yolo cfg=ultralytics/cfg/default.yaml 
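Equivalently, you can skip editing the file and pass the overrides on the command line; the weights path below assumes the default save location and may differ on your machine:

yolo detect val model=runs/detect/train/weights/best.pt data=data.yaml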

The results on the validation set are as follows:

 

Inference

Scroll down in default.yaml to find the source parameter and enter the path of the image you want to run inference on.

yolo cfg=ultralytics/cfg/default.yaml 
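The command-line equivalent looks something like this; the weights path and image path are placeholders:

yolo detect predict model=runs/detect/train/weights/best.pt source=path/to/your/image.jpg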

 

 

 The result is as follows

 

For users who do not need to modify the YOLO source code, the highly integrated v8 package is particularly convenient and easy to use.
