Stream (file streams)

One: Since the maximum size of a Buffer (the in-memory cache) is about 1GB, Buffers cannot be used to read and write very large files. For this purpose, Node.js provides streams.
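As a quick check, newer Node.js versions (8.2+) expose this limit through buffer.constants; a minimal sketch, assuming such a version (the exact limit varies by platform and Node version):

var buffer = require('buffer');
// the largest size allowed for a single Buffer on this platform;
// historically about 1GB on 32-bit and 2GB on 64-bit systems
console.log(buffer.constants.MAX_LENGTH);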

  (1): The stream module provides streaming I/O, and many objects in Node.js implement this interface. For example, the request object used to initiate a request to a target server is an HTTP stream, and so is stdout (standard output).
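For example, stdout can be written to directly through the writable-stream interface:

// process.stdout is a writable stream, so it has write(), end(),
// and the usual stream events
process.stdout.write('hello from a writable stream\n');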

  (2): There are four types of stream: Readable (readable stream), Writable (writable stream), Duplex (readable and writable stream), and Transform (data is written, transformed, and then read back out: a transforming stream).
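A minimal sketch of the fourth type, assuming the stream module's simplified Transform constructor (available since Node.js 4): a Transform stream that upper-cases whatever is written into it.

var Transform = require('stream').Transform;

// a Transform stream reads back a modified version of what was written
var upperCase = new Transform({
    transform: function (chunk, encoding, callback) {
        callback(null, chunk.toString().toUpperCase());
    }
});
upperCase.on('data', function (data) {
    process.stdout.write(data); // prints: HELLO
});
upperCase.write('hello\n');
upperCase.end();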

  (3): I/O in Node.js is asynchronous, so reading and writing data on disk or over the network requires callback functions, and those callbacks are triggered by events. All stream objects are instances of EventEmitter (the event trigger). Common events:

    data: fired when there is data available to read.

    end: fired when there is no more data to read.

    error: fired when an error occurs while receiving or writing data.

    finish: fired when all data has been flushed to the underlying system.

  (4): Readable stream example (readable-stream.js):

/*
Readable stream:
It carves the source file into blocks of data, which are then read out of the read stream.
fs.createReadStream(path[, options]);
The options object can contain:
flags: how to open the file (default 'r').
encoding: the character encoding; the default is null, and it can also be set with readStream.setEncoding('utf-8').
start: the byte offset to start reading from.
end: the byte offset to stop reading at (inclusive).
*/
var fs = require('fs');
var total_data = '';
// the second argument is an options object; flags defaults to 'r'
var readStream = fs.createReadStream('a.txt', { flags: 'r' });
readStream.setEncoding('utf-8');
// note: streams are event driven
readStream.on('data', function (data_block) {
    total_data = total_data + data_block;
});
readStream.on('end', function () {
    console.log(total_data);
});
readStream.on('error', function (err) {
    console.log(err.stack);
});
Output:

D:\Program Files\nodejs\chapter> node readable-stream.js
(garbled output; the file actually contains "hello", but the book does not show how to handle this)
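The start and end options described above take byte offsets (end is inclusive), not placeholder strings; a small sketch that reads only the first ten bytes of a.txt:

var fs = require('fs');
// read bytes 0 through 9 (inclusive) of a.txt
var partialStream = fs.createReadStream('a.txt', { start: 0, end: 9 });
partialStream.setEncoding('utf-8');
partialStream.on('data', function (data_block) {
    console.log(data_block);
});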

  (5): Use a readable stream to read data, then use a writable stream to store the data to a file.

  

var fs = require('fs');
var readStream = fs.createReadStream('a.txt');
var writeStream = fs.createWriteStream('D:/Program Files/nodejs/chapter/output.txt');
readStream.setEncoding('utf-8');
readStream.on('data', function (data_block) {
    writeStream.write(data_block); // write the block into the file stream
});
readStream.on('error', function (err) {
    console.log(err.stack);
});
readStream.on('end', function () {
    // the 'end' event fires after all data has been read; close the write stream
    writeStream.end();
});
writeStream.on('error', function (write_err) {
    console.log(write_err.stack);
});
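One caveat with calling write() inside the 'data' handler: it returns false once the internal buffer is full. A sketch of handling this backpressure with pause()/resume() and the 'drain' event (same file names as above, for illustration):

var fs = require('fs');
var readStream = fs.createReadStream('a.txt');
var writeStream = fs.createWriteStream('output.txt');
readStream.on('data', function (data_block) {
    // write() returns false when the write buffer is full
    if (!writeStream.write(data_block)) {
        readStream.pause(); // stop reading until the buffer drains
    }
});
writeStream.on('drain', function () {
    readStream.resume(); // buffer flushed, safe to read again
});
readStream.on('end', function () {
    writeStream.end();
});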

  (6): Use pipe() to handle large files:

/*
pipe() translates as "pipe".
In Node.js the readable stream is connected to the writable stream through this pipe.
*/
var fs = require('fs');
// source and destination file paths
var srcPath = 'D:/Program Files/nodejs/chapter/a.txt';
var distPath = 'D:/Program Files/nodejs/chapter/out.txt';
var readStream = fs.createReadStream(srcPath);
var writeStream = fs.createWriteStream(distPath);
// a readable stream is connected to a writable stream with pipe()
readStream.pipe(writeStream);
// pipe() returns the destination stream, so its return value does not
// signal success; listen for the 'finish' and 'error' events instead
writeStream.on('finish', function () {
    console.log('file copied successfully');
});
writeStream.on('error', function () {
    console.log('file copy failed');
});
Output:

D:\Program Files\nodejs\chapter> node pipe-large-files.js
file copied successfully
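Newer Node.js versions (10+) also offer stream.pipeline(), which forwards errors from either stream and cleans up automatically; a sketch assuming such a version:

var fs = require('fs');
var pipeline = require('stream').pipeline;

// pipeline() connects the streams and reports the result in one callback
pipeline(
    fs.createReadStream('a.txt'),
    fs.createWriteStream('out.txt'),
    function (err) {
        if (err) {
            console.log('file copy failed');
        } else {
            console.log('file copied successfully');
        }
    }
);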

 

  (7): You must be familiar with the execution mechanism of callback functions and with the event-trigger mechanism.

  The on() function is used to bind events, and off() to unbind them; both are also used extensively in jQuery.js.
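A minimal sketch of the same bind/unbind pattern on Node's own EventEmitter (off() has been an alias of removeListener() since Node.js 10):

var EventEmitter = require('events');
var emitter = new EventEmitter();

function greet(name) {
    console.log('hello ' + name);
}
emitter.on('greet', greet);    // bind the handler
emitter.emit('greet', 'node'); // prints: hello node
emitter.off('greet', greet);   // unbind it again
emitter.emit('greet', 'node'); // nothing happens now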

 

Origin: www.cnblogs.com/1314bjwg/p/12507971.html