pytorch CUDA out of memory

Solution:

Call del on CUDA tensors as soon as you are done with them.

This can roughly double the amount of free GPU memory.

    import time

    import torch

    # BiSeNet is assumed to be imported/defined elsewhere (the model under test)
    net = BiSeNet()
    net.cuda()
    net.eval()
    in_ten = torch.randn(8, 3, 640, 640).cuda()

    for i in range(8):
        start = time.time()
        out, out16, out32 = net(in_ten)
        print(time.time() - start, out.shape, out16.shape, out32.shape)
        # Drop the references to the outputs so their GPU memory
        # can be reused by the next forward pass.
        del out
        del out16
        del out32
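
To check that del actually frees memory, you can inspect PyTorch's allocator counters. The sketch below is a minimal, self-contained example (independent of the BiSeNet code above) using torch.cuda.memory_allocated() and torch.cuda.memory_reserved(); the tensor shape is only for illustration.

    import torch

    def print_mem(tag):
        # memory_allocated: bytes held by live tensors
        # memory_reserved: bytes held by PyTorch's caching allocator
        print(tag,
              "allocated=%.1f MB" % (torch.cuda.memory_allocated() / 1e6),
              "reserved=%.1f MB" % (torch.cuda.memory_reserved() / 1e6))

    x = torch.randn(8, 3, 640, 640, device="cuda")
    print_mem("after alloc")

    del x  # drop the last reference; the block becomes reusable by the allocator
    print_mem("after del")

    torch.cuda.empty_cache()  # return cached blocks to the driver (visible in nvidia-smi)
    print_mem("after empty_cache")

Note that del only removes the Python reference: the memory goes back to PyTorch's caching allocator and is reused by later allocations, but nvidia-smi will not show it as released unless torch.cuda.empty_cache() is called.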

Reposted from blog.csdn.net/jacke121/article/details/106107365