read, readline, readlines

Original link: http://www.cnblogs.com/qihui/p/4244475.html

read reads up to a specified length and returns it as a single string; with no argument it reads the whole file.

readline reads one line at a time.

readlines reads the entire file into memory as a list of lines.
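
A minimal sketch of the three calls (assuming the same test.txt used in the example below):

with open('test.txt') as f:
    head = f.read(10)      # read: at most 10 characters, returned as one string
    line = f.readline()    # readline: the rest of the current line, newline included
    rest = f.readlines()   # readlines: every remaining line, as a list of strings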

For processing large files (larger than available memory), iterating line by line with readline hurts efficiency. Online I came across a very good approach posted by a foreign developer:

def readInChunks(fileObj, chunkSize=2048):
    """
    Lazy function (generator) to read a file piece by piece.
    Default chunk size: 2kB.
    """
    while True:
        data = fileObj.read(chunkSize)
        if not data:  # empty string means end of file
            break
        yield data

with open('test.txt') as f, open('New Text Document.txt', 'w') as g:
    for chunk in readInChunks(f):
        g.write(chunk)
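
The same lazy chunking can also be written with the built-in iter() and a sentinel value; this is an equivalent idiom, not part of the original post:

with open('test.txt') as f, open('New Text Document.txt', 'w') as g:
    # iter() calls f.read(2048) repeatedly until it returns the sentinel ''
    for chunk in iter(lambda: f.read(2048), ''):
        g.write(chunk)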

Incidentally, while and for loops do not run at the same speed in Python.

A while loop's condition test and counter update are executed as PVM bytecode on every iteration, which is slower.

A for loop fetches items through the iterator protocol, which is implemented in C inside the interpreter, so it runs faster.
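
A rough way to check this with the standard timeit module (a sketch; the exact numbers depend on the interpreter version and the machine):

import timeit

def with_while(n=100000):
    # condition test and increment run as bytecode each iteration
    i = 0
    total = 0
    while i < n:
        total += i
        i += 1
    return total

def with_for(n=100000):
    # iteration is driven by range's C-level iterator
    total = 0
    for i in range(n):
        total += i
    return total

print(timeit.timeit(with_while, number=100))
print(timeit.timeit(with_for, number=100))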

