Understanding Streams in Node.js

1. What is a stream in Node.js, and what is it for?

  • What is a stream? Think of water flow in daily life, for example a faucet: the water flowing out of it is ordered and directional (it flows from a high place to a low place). Streams in Node.js are the same; data moves through them in order and in one direction.
  • A stream is a way of handling buffered data in the system.
  • A stream in Node.js can be readable, writable, or both readable and writable.
    Streams inherit from EventEmitter, so all streams are instances of EventEmitter.
1. There are four basic stream types in Node.js:
  1. Readable: a readable stream (e.g. fs.createReadStream()).
  2. Writable: a writable stream (e.g. fs.createWriteStream()).
  3. Duplex: a stream that is both readable and writable.
  4. Transform: a Duplex stream whose data can be modified and transformed as it is read and written.

The biggest benefit of streams in Node.js: when reading a large file, the content is not loaded into memory all at once. Instead, one chunk of the data source is read at a time (as soon as a chunk arrives it is read, so processing starts before all the data has been received).
Each chunk can then be processed immediately in subsequent steps (and garbage-collected once processing is done), without waiting for all the data.

2. What are the benefits of using streams?
  • Reading a file with fs.readFile() loads the entire file content into memory as a Buffer before handing it to the callback. This is very inefficient, and if the file is 1 GB or 2 GB or larger, it can exhaust memory or crash the server outright.
  • Reading with the stream method fs.createReadStream() avoids occupying that much memory: while reading a large file, it is not loaded into memory at once, and only one chunk of the data source is read at a time. That is the advantage of streaming.

II. Commonly used stream operations

1. Reading from a stream

fs.createReadStream()

var fs = require("fs");
var data = '';

// Create a readable stream
var readerStream = fs.createReadStream('input.txt');

// Set the encoding to utf8 so chunks arrive as strings instead of Buffers
readerStream.setEncoding('UTF8');

// Handle the stream events --> data, end, and error
readerStream.on('data', function(chunk) {
   data += chunk;
});

readerStream.on('end', function() {
   console.log(data);
});

readerStream.on('error', function(err) {
   console.log(err.stack);
});

console.log("Program finished");
2. Writing to a stream

fs.createWriteStream()

var fs = require("fs");
var data = 'Runoob official site: www.runoob.com';

// Create a writable stream that writes to the file output.txt
var writerStream = fs.createWriteStream('output.txt');

// Write the data using utf8 encoding
writerStream.write(data, 'UTF8');

// Mark the end of the file
writerStream.end();

// Handle the stream events --> finish and error
writerStream.on('finish', function() {
    console.log("Write completed.");
});

writerStream.on('error', function(err) {
   console.log(err.stack);
});

console.log("Program finished");
3. Pipes (pipe)

readerStream.pipe(writerStream)
pipe() provides a mechanism for connecting the output of one stream to the input of another. It is usually used to take data from one stream and pass that data along to another stream.
Think of a file as a bucket full of water, where the water is the file's content. A pipe connects two buckets so that water flows from one into the other, which is how copying a large file gradually takes place.

In the following example, we read the content of one file and write it to another file.

var fs = require("fs");

// Create a readable stream
var readerStream = fs.createReadStream('input.txt');

// Create a writable stream
var writerStream = fs.createWriteStream('output.txt');

// Pipe the read and write operations:
// read the contents of input.txt and write them into output.txt
readerStream.pipe(writerStream);

console.log("Program finished");
4. Chained streams

Chaining is a mechanism for connecting the output of one stream to another stream, creating a chain of multiple stream operations. Chaining is generally used with pipe operations.

  • Compressing a file: create a gzip stream with zlib.createGzip()
  • Decompressing a file: zlib.createGunzip()

Next, let's use pipes and chaining to compress and decompress files.

  • Compressing a file
var fs = require("fs");
var zlib = require('zlib');

// Compress input.txt into input.txt.gz
// (the completion message fires on the write stream's 'finish' event,
// so it prints only after the compressed file is fully written)
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'))
  .on('finish', function() {
    console.log("File compression completed.");
  });
  • Decompressing a file
var fs = require("fs");
var zlib = require('zlib');

// Decompress input.txt.gz back into input.txt
fs.createReadStream('input.txt.gz')
  .pipe(zlib.createGunzip())
  .pipe(fs.createWriteStream('input.txt'))
  .on('finish', function() {
    console.log("File decompression completed.");
  });

Origin blog.csdn.net/isfor_you/article/details/114060941