RuntimeError: stack expects each tensor to be equal size, but got [3, 500, 656] at entry 0 and [3, 500, 666] at entry 1
This error occurs because the images reaching the DataLoader have inconsistent sizes: the default collate function stacks the samples of a batch with torch.stack, which requires every tensor to have exactly the same shape. Check whether the train, val, and test datasets apply a transform that normalizes the image size. If not, you can add, for example:
transforms.Resize((256, 256))
You can also use transforms.RandomResizedCrop or transforms.CenterCrop to produce a fixed output size (the older transforms.Scale is deprecated in favor of transforms.Resize).