Reading hundreds of GB of files in Python without exhausting memory

The trick is a generator that reads the file in fixed-size chunks and yields one record at a time, so only a small buffer is ever held in memory:

def myreadlines(f, newline):
    buf = ''
    while True:
        # Yield every complete record currently in the buffer.
        while newline in buf:
            pos = buf.index(newline)
            yield buf[:pos]
            buf = buf[pos + len(newline):]
        chunk = f.read(4096)  # size of each read
        if not chunk:
            # End of file: yield any trailing record and stop.
            if buf:
                yield buf
            break
        buf += chunk

if __name__ == '__main__':
    # Delimiter between records in the file. Use "\n" (a real newline),
    # not r"\n", which is the two literal characters backslash and "n".
    flite = "\n"
    with open("contain.txt") as f:
        for line in myreadlines(f, flite):
            print(line)
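
Because the delimiter is just a string argument, the same generator also handles records separated by multi-character tokens, which Python's built-in line iteration cannot split on. A minimal sketch, assuming a hypothetical file records.txt whose records are separated by the token "{|}":

# Hypothetical example: records.txt uses the multi-character
# delimiter "{|}" instead of newlines.
with open("records.txt") as f:
    for record in myreadlines(f, "{|}"):
        print(record)

Since the buffer keeps accumulating 4096-character chunks until the full delimiter appears, a token that straddles a chunk boundary is still matched correctly.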

This article is reproduced from https://fishc.com.cn/thread-145508-1-1.html

Originally published at www.cnblogs.com/whx2008/p/12628542.html