How to keep two PyTorch datasets shuffled in the same order: instead of building two separate DataLoaders, wrap both datasets in a single Dataset so that one DataLoader draws the same shuffled index for both, keeping the samples paired.

from torch.utils.data import Dataset, DataLoader

class MyDataset(Dataset):
    """Pairs two datasets so a single DataLoader shuffles them together."""
    def __init__(self, datasetA, datasetB):
        self.datasetA = datasetA
        self.datasetB = datasetB

    def __getitem__(self, index):
        # The same index is used for both datasets, so the samples stay aligned
        # even when the DataLoader shuffles.
        xA = self.datasetA[index]
        xB = self.datasetB[index]
        return xA, xB

    def __len__(self):
        return len(self.datasetA)

datasetA = ...
datasetB = ...
dataset = MyDataset(datasetA, datasetB)
loader = DataLoader(dataset, batch_size=10, shuffle=True)
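To see the pairing in action, here is a minimal sketch that continues from the MyDataset class above. The two TensorDataset objects standing in for datasetA and datasetB are assumptions for illustration only; they hold identical values so the assert can verify that shuffled batches stay aligned.

import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical stand-ins for datasetA and datasetB: 100 paired samples.
features = torch.arange(100, dtype=torch.float32).unsqueeze(1)
labels = torch.arange(100, dtype=torch.float32).unsqueeze(1)

datasetA = TensorDataset(features)
datasetB = TensorDataset(labels)

dataset = MyDataset(datasetA, datasetB)
loader = DataLoader(dataset, batch_size=10, shuffle=True)

for (xA,), (xB,) in loader:
    # Both samples in each pair came from the same shuffled index,
    # so they remain aligned even though the batch order is random.
    assert torch.equal(xA, xB)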

https://discuss.pytorch.org/t/dataloader-shuffle-same-order-with-multiple-dataset/94800/2


Origin: blog.csdn.net/aab11235/article/details/116203567