PyTorch's DataLoader

The _DataLoaderIter class in torch/utils/data/dataloader.py

    def __next__(self):
        if self.num_workers == 0:  # same-process loading
            indices = next(self.sample_iter)  # indices of the samples in one batch; the list has batch_size entries
            batch = self.collate_fn([self.dataset[i] for i in indices])
            if self.pin_memory:
                batch = pin_memory_batch(batch)
            return batch
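To see what the same-process branch above is doing, here is a minimal sketch that mimics it: a sampler yields index lists, the dataset is indexed item by item, and a collate function assembles the batch. ToyDataset, simple_collate, and batches are hypothetical stand-ins for illustration, not the actual torch internals.

```python
# Minimal sketch of single-process (num_workers == 0) loading.
# ToyDataset and simple_collate are illustrative stand-ins, not torch code.

class ToyDataset:
    def __init__(self, data):
        self.data = data

    def __getitem__(self, i):
        return self.data[i]

    def __len__(self):
        return len(self.data)


def simple_collate(samples):
    # The real DataLoader's default_collate stacks tensors into a batch
    # tensor; here we just gather the raw samples into a list.
    return list(samples)


def batches(dataset, batch_size):
    # Mimics the loop above: sample_iter yields index lists,
    # then collate_fn assembles each batch from dataset[i].
    indices = list(range(len(dataset)))
    for start in range(0, len(indices), batch_size):
        batch_indices = indices[start:start + batch_size]
        yield simple_collate(dataset[i] for i in batch_indices)


if __name__ == "__main__":
    ds = ToyDataset([10, 20, 30, 40, 50])
    print(list(batches(ds, 2)))  # [[10, 20], [30, 40], [50]]
```

With num_workers > 0, the real DataLoader instead hands index lists to worker processes over queues, but the per-batch logic (sample indices, fetch items, collate) is the same as this single-process path.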

Reprinted from blog.csdn.net/qq_44167992/article/details/88540958