Reading a file of hundreds of GB in Python without exhausting memory

    def myreadlines(f, newline):
        # Accumulate fixed-size chunks in a buffer and yield one record
        # per delimiter, so only a small window of the file is in memory.
        buf = ''
        while True:
            # Emit every complete record currently in the buffer.
            while newline in buf:
                pos = buf.index(newline)
                yield buf[:pos]
                buf = buf[pos + len(newline):]
            chunk = f.read(4096)  # size of each read
            if not chunk:
                # End of file: yield whatever remains in the buffer.
                yield buf
                break
            buf += chunk
        

    if __name__ == '__main__':
        # Delimiter between records in the file. Note: "\n", not r"\n" —
        # the raw string would be the two characters backslash and n.
        flite = "\n"
        with open("contain.txt") as f:
            for line in myreadlines(f, flite):
                print(line)
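The generator above is not limited to newline-delimited files: any delimiter string works, which is useful when records contain embedded newlines. Below is a minimal, self-contained sketch (the generator is repeated so the snippet runs on its own, and the `{|}` delimiter and sample data are illustrative assumptions, not from the original file):

```python
import io

def myreadlines(f, newline):
    # Read fixed-size chunks, buffer them, and yield delimiter-separated records.
    buf = ''
    while True:
        while newline in buf:
            pos = buf.index(newline)
            yield buf[:pos]
            buf = buf[pos + len(newline):]
        chunk = f.read(4096)
        if not chunk:
            yield buf
            break
        buf += chunk

# Records separated by a custom multi-character delimiter instead of "\n".
data = io.StringIO("alpha{|}beta{|}gamma")
print(list(myreadlines(data, "{|}")))  # → ['alpha', 'beta', 'gamma']
```

Because records are produced lazily and `buf` only ever holds a little more than one record plus one 4096-byte chunk, peak memory stays small regardless of the total file size.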

This article is reprinted from https://fishc.com.cn/thread-145508-1-1.html

Reprinted from www.cnblogs.com/whx2008/p/12628542.html