Front-end interview questions about Node

This article shares a set of front-end interview questions about Node. It should be a useful reference for friends preparing interviews; I hope it helps everyone.


Related recommendation: "nodejs tutorial"

Node front-end interview questions
**1. Why use node?**

Features: simple and powerful, lightweight and scalable. Simplicity shows in node's use of JavaScript and JSON, which every front-end developer already knows. Power shows in non-blocking I/O, which adapts well to chunked data transfer and slower network environments and is especially good at high-concurrency access. Lightness shows in node being both the code and the server, with front end and back end sharing one language. Scalability shows in multi-instance, multi-server architectures and a large ecosystem of third-party components.

2. What does the node architecture look like?

It is mainly divided into three layers: application (app) >> V8 and node's built-in architecture >> operating system.
V8 is the environment in which node runs and can be understood as the node virtual machine. Node's built-in architecture can itself be divided into three layers: core modules (implemented in JavaScript) >> C++ bindings >> libuv + c-ares + http_parser.

3. What core modules does the node have?


EventEmitter, Stream, FS, Net and global objects

4. What global objects does the node have?


process, console, Buffer and exports

5. What are the common methods of process?


process.stdin, process.stdout, process.stderr, process.on, process.env, process.argv, process.arch, process.platform, process.exit
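As a quick illustration, a few of these members can be inspected directly. This is only a sketch; the printed values vary by machine.

```javascript
// Inspect some common process members (values vary by machine).
console.log(process.platform); // e.g. 'linux', 'darwin', 'win32'
console.log(process.arch);     // e.g. 'x64'
console.log(process.argv);     // [node path, script path, ...user args]
console.log(Object.keys(process.env).length, 'environment variables');

// process.on lets you hook lifecycle events, e.g. exit:
process.on('exit', function(code) {
    console.log('process exiting with code', code);
});
```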

6. What are the common methods of console?


console.log/console.info, console.error/console.warn, console.time/console.timeEnd, console.trace, console.table

7. What timing functions does node have?


setTimeout/clearTimeout, setInterval/clearInterval, setImmediate/clearImmediate, process.nextTick

8. What does the event loop in node look like?

The overall execution order is: process.nextTick >> setImmediate >> setTimeout/setInterval. process.nextTick callbacks always run first, before the event loop continues; setImmediate fires before setTimeout/setInterval when both are scheduled from inside an I/O callback, while at the top level the relative order of setImmediate and a zero-delay setTimeout is not guaranteed.
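A small sketch makes the deferral visible: scheduled callbacks do not run until the current synchronous code has finished.

```javascript
// Record the order in which callbacks run.
var order = [];

process.nextTick(function() { order.push('nextTick'); });
setImmediate(function() { order.push('setImmediate'); });
setTimeout(function() { order.push('setTimeout'); }, 0);

order.push('sync');
// At this point only the synchronous push has happened: order is ['sync'].
// After the current tick, 'nextTick' is guaranteed to run next; the relative
// order of a top-level setTimeout(0) and setImmediate is not guaranteed.
console.log(order);
```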

9. How is Buffer used in node?

Buffer is used to process binary data, such as pictures, mp3s, database files, etc. Buffer supports various encodings and decodings, and conversion between binary data and strings.
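A small sketch of that encoding/decoding support:

```javascript
// string -> binary
var buf = Buffer.from('hello', 'utf8');

// binary -> various text encodings
var hex = buf.toString('hex');    // '68656c6c6f'
var b64 = buf.toString('base64'); // 'aGVsbG8='

// and back again: decode base64 into the original string
var back = Buffer.from(b64, 'base64').toString('utf8'); // 'hello'

console.log(hex, b64, back);
```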

**10. What is EventEmitter?**

EventEmitter is a class in node that implements the observer pattern. Its main function is to listen for and emit messages, to handle interaction between multiple modules.

11. How to implement an EventEmitter?

Mainly three steps: define a subclass, call the EventEmitter constructor inside it, and inherit from EventEmitter.

Code demonstration

```javascript
var util = require('util');
var EventEmitter = require('events').EventEmitter;

function MyEmitter() {
    EventEmitter.call(this);
} // Constructor
util.inherits(MyEmitter, EventEmitter); // inherit

var em = new MyEmitter();
em.on('hello', function(data) {
    console.log('Data of event hello received:', data);
}); // Receive the event and print it to the console
em.emit('hello', 'EventEmitter is so convenient to deliver messages!');
```

12. What are the typical applications of EventEmitter?

1) Message transfer between modules
2) Passing messages into and out of callback functions
3) Processing stream data, because streams are implemented on the basis of EventEmitter
4) Applications of the observer pattern's emit/trigger mechanism

13. How to capture EventEmitter error events?

Just listen to the error event. If there are multiple EventEmitters, you can also use domain to handle error events uniformly (note that the domain module is deprecated in current Node versions).

Code demonstration

```javascript
var domain = require('domain');

var myDomain = domain.create();
myDomain.on('error', function(err) {
    console.log('error event received by domain:', err);
}); // Receive the event and print it
myDomain.run(function() {
    var emitter1 = new MyEmitter();
    emitter1.emit('error', 'The error event comes from emitter1');
    var emitter2 = new MyEmitter();
    emitter2.emit('error', 'The error event comes from emitter2');
});
```

14. What is the use of the newListener event in EventEmitter?

newListener can be used for reflection on the event mechanism, special applications, event management, etc. Whenever any on listener is added to an EventEmitter, the newListener event is triggered; based on this model we can do a lot of custom processing.

Code demonstration

```javascript
var emitter3 = new MyEmitter();
emitter3.on('newListener', function(name, listener) {
    console.log("New event name:", name);
    console.log("New event code:", listener);
    setTimeout(function() {
        console.log("I am a custom delay processing mechanism");
    }, 1000);
});
emitter3.on('hello', function() {
    console.log('hello node');
});
```

15. What is a Stream?

Stream is a data management mode based on EventEmitter. It is composed of various abstract interfaces, including writable, readable, readable-and-writable (duplex), and transform.

16. What are the benefits of Stream?

Non-blocking data processing improves efficiency, fragment processing saves memory, pipeline processing is convenient and scalable, etc.

17. What are the typical applications of Stream?

File, network, data conversion, audio and video, etc.

18. How to capture Stream error events?

Listen for the error event, the same way as with EventEmitter.

**19. What are the commonly used Streams and when are they used?**

Readable is a stream that can be read from and is used as an input data source; Writable is a stream that can be written to and is used as an output destination; Duplex is a readable and writable stream, used as an output destination to be written into while also serving as an input source for a subsequent stream. Transform has the same mechanism as Duplex, a two-way stream; the difference is that Transform only needs to implement one function, _transform(chunk, encoding, callback), while Duplex needs to implement both a _read(size) function and a _write(chunk, encoding, callback) function.

**20. Implement a Writable Stream?**

Three steps:

1) Call Writable in the constructor
2) Inherit from Writable
3) Implement the _write(chunk, encoding, callback) function

Code demonstration

```javascript
var Writable = require('stream').Writable;
var util = require('util');

function MyWritable(options) {
    Writable.call(this, options);
} // Constructor
util.inherits(MyWritable, Writable); // inherit from Writable

MyWritable.prototype._write = function(chunk, encoding, callback) {
    console.log("The data to be written is:", chunk.toString()); // The written data can be processed here
    callback();
};

process.stdin.pipe(new MyWritable()); // stdin is the input source, MyWritable is the output destination
```

21. What does the built-in fs module architecture look like?

The fs module is mainly composed of the following parts:

1) POSIX file wrappers, corresponding to the operating system's native file operations
2) File streams, fs.createReadStream and fs.createWriteStream
3) Synchronous file reading and writing, fs.readFileSync and fs.writeFileSync
4) Asynchronous file reading and writing, fs.readFile and fs.writeFile

**22. How many methods are there to read and write a file?**

1) POSIX-style low-level read and write
2) Streaming reading and writing
3) Synchronous file reading and writing
4) Asynchronous file reading and writing

23. How to read a json configuration file?

The first method is to use node's built-in require('data.json') mechanism, which directly yields a js object. The second is to read the file content and then use JSON.parse(content) to convert it into a js object. The difference between the two: under the require mechanism, if multiple modules load the same json file and one of them changes the js object, the others change accordingly, because node's module caching means there is only one js object; with the second method, each loader can change its own js variables at will without affecting the other modules, because they are independent, separate js objects.

24. What is the difference between fs.watch and fs.watchFile, and when is each used?

fs.watch uses the operating system's native mechanism to monitor files and may not work on network file systems; fs.watchFile periodically checks file status changes and is suitable for network file systems, but it is slower than fs.watch because it polls rather than reacting in real time.

25. What does node's network module architecture look like?

Node fully supports various network servers and clients, including tcp, http/https, udp, dns, tls/ssl, etc.

26. How does node support https and tls?

1) Use openssl to generate a public key and a private key
2) The server or client uses https instead of http
3) The server or client loads the public key and private key certificates

27. Implement a simple http server?

The idea is to load the http module, create the server, and listen on a port.

Code demonstration

```javascript
var http = require('http'); // Load the http module

http.createServer(function(req, res) {
    res.writeHead(200, {'Content-Type': 'text/html'}); // 200 means success; the document type is html
    res.write('<h1>I am the title!</h1>Such a native, elementary server, can it be used in the next life?!'); // HTML data returned to the client
    res.end(); // End the output stream
}).listen(3000); // Bind port 3000, then visit http://localhost:3000

**28. Why do you need child_process?**

Node is asynchronous and non-blocking, which is very effective for high concurrency. However, there are other common requirements, such as interacting with operating system shell commands, calling executable files, and creating child processes for blocking access or CPU-heavy computation. child_process exists to meet these requirements: as the name implies, it delegates work that would block node to child processes.

29. What do exec, execFile, spawn and fork do?

exec executes commands the way the operating system shell does, including pipelines such as cat ab.txt | grep hello;
execFile executes a file directly;
spawn interacts with the operating system through streams;
fork lets two node programs (JavaScript) interact by passing messages.

30. Implement a simple command line interactive program?

Use spawn.

Code demonstration

```javascript
var cp = require('child_process');

var child = cp.spawn('echo', ['hello', 'hook']); // execute the command
child.stdout.pipe(process.stdout); // child.stdout is the child's output stream, process.stdout is this program's output stream
// This line takes the output of the child process as input to the current program
// and redirects it to the current program's standard output, i.e. the console
```

**31. How do two node programs interact?**

Use fork, as mentioned above. The principle is that the child program uses process.on and process.send, and the parent program uses child.on and child.send, to interact.

Code demonstration

1) fork-parent.js

```javascript
var cp = require('child_process');

var child = cp.fork('./fork-child.js');
child.on('message', function(msg) {
    console.log('Data the father received from the son:', msg);
});
child.send('I am your father, I am sending care!');
```

2) fork-child.js

```javascript
process.on('message', function(msg) {
    console.log("Data the son received from the father:", msg);
    process.send("I don't care, I want silver coins!");
});
```

**32. How to make a js file executable like a linux command?**

1) Add #!/usr/bin/env node at the head of the myCommand.js file
2) Use the chmod command to make the js file executable
3) In the file's directory, ./myCommand is then equivalent to node myCommand.js

33. Are stdin, stdout and stderr of child_process the same as those of process?

The concepts are the same: input, output, errors, all streams. The difference is that from the parent program's point of view, the child's stdout is an input stream and the child's stdin is an output stream.

34. How to understand asynchrony and synchrony in node?

Node is single-threaded; asynchrony is realized through a recurring event queue. Synchrony refers to blocking I/O, which can be a big performance problem in a high-concurrency environment, so synchronous calls are generally used only when the basic framework starts up, to load configuration files, initialize the program, and so on.

**35. Which methods can be used to control asynchronous flow?**

1) Multi-level nested callbacks
2) Write a separate named function for each callback, then call it from inside the callback
3) Use third-party frameworks such as async, q, promise, etc.

36. How to bind a node program to port 80?

1) sudo

37. What methods are there to make the node program automatically restart after encountering an error?

1) runit
2) forever
3) nohup npm start &

38. How to make full use of multiple CPUs?

Run one node instance per CPU.

39. How to adjust the memory limit of the node execution unit?

Use --max-old-space-size and --max-new-space-size to set the upper limits of v8 memory usage.

**40. The program always crashes. How to find the problem?**

1) node --prof to check which functions are called most frequently
2) memwatch and heapdump to obtain memory snapshots and compare them, to find memory leaks

**41. What are the common methods to prevent program crashes?**

1) try-catch-finally
2) Error-event handling on EventEmitter/Stream
3) Unified control with domain
4) Static checking with jshint
5) Unit testing with jasmine/mocha

42. How to debug a node program? What are the common methods?

node --debug app.js together with node-inspector (in current Node versions, node --inspect together with Chrome DevTools).

43. What is async and how is it used?

async is a js library whose purpose is to solve the problem of controlling asynchronous flow in js, which is otherwise difficult. async is applicable not only in node.js but also in browsers.
1) async.parallel executes multiple functions in parallel and calls the final callback when all are done

```javascript
async.parallel([
    function() { ... },
    function() { ... }
], callback);
```

2) async.series executes multiple functions serially and then calls the final callback

```javascript
async.series([
    function() { ... },
    function() { ... }
]);
```

3) async.waterfall executes multiple functions in turn; each later function takes the result of the previous function as its input parameters

```javascript
async.waterfall([
    function(callback) {
        callback(null, 'one', 'two');
    },
    function(arg1, arg2, callback) {
        // arg1 now equals 'one' and arg2 now equals 'two'
        callback(null, 'three');
    },
    function(arg1, callback) {
        // arg1 now equals 'three'
        callback(null, 'done');
    }
], function(err, result) {
    // result now equals 'done'
});
```

4) async.map applies an asynchronous function to multiple array items and returns the array of results

```javascript
async.map(['file1', 'file2', 'file3'], fs.stat, function(err, results) {
    // results is now an array of stats for each file
});
```

5) async.filter filters multiple array items asynchronously and returns the array of items that pass

```javascript
async.filter(['file1', 'file2', 'file3'], fs.exists, function(results) {
    // results now equals an array of the existing files
});
```

44. What does an express project directory roughly look like?

app.js, package.json, bin/www, public, routes, views.

45. What are express's commonly used functions?

express.Router routing component, app.get routing, app.configure configuration, app.set sets parameters, app.use uses middleware.

46. How does express obtain routing parameters?

For /users/:name, use req.params.name to obtain it; req.body.username obtains the username parameter passed in from a form. Express routes also support the common wildcards ?, +, *, and ().

47. What are the common response methods of express?

res.download() triggers a file download
res.end() ends the response
res.json() returns json
res.jsonp() returns jsonp
res.redirect() redirects the request
res.render() renders a template
res.send() returns data in various forms
res.sendFile() returns a file
res.sendStatus() returns a status

48. What are the common optimization measures for mongodb?

Similar to traditional databases: indexes and partitioning (sharding).

49. What is mongoose and what features does it support?

Mongoose is a document mapping model for mongodb. It is mainly composed of three parts: Schema, Model and Instance. A Schema defines data types, a Model binds a Schema to a js class, and an Instance is an object instance. Common mongoose operations include save, update, find, findOne, findById, static methods, etc.

50. What functions does redis support?


set/get, mset/hset/hmset/hmget/hgetall/hkeys, sadd/smembers, publish/subscribe, expire

51. The simplest application of redis

```javascript
var redis = require("redis"),
    client = redis.createClient();

client.set("foo_rand000000000000", "some fantastic value");
client.get("foo_rand000000000000", function(err, reply) {
    console.log(reply.toString());
});
client.end();
```

52. What is the difference between apache and nginx?

Both are proxy servers with similar functions. Apache is simple to use and quite widespread; nginx has advantages in distributed deployment and static forwarding.


Origin blog.csdn.net/zy17822307856/article/details/112925880