About the JS 0.1 + 0.2 !== 0.3 problem

Problem: console.log(0.1 + 0.2 == 0.3) // false

Recently I read the book "JavaScript Advanced Programming" and came across this problem in it. After sorting it out and adding my own understanding, I'm recording it here.

Analyzing the cause of the problem:

The numbers we use every day are decimal, while computers work in binary, and not every decimal fraction can be represented exactly in binary. Converted to binary, 0.1 and 0.2 both become infinitely repeating fractions, so the computer stores the nearest representable values instead (JavaScript numbers are IEEE 754 double-precision floats). Adding those two approximations and converting the result back to decimal gives a value that is extremely close to, but not exactly, 0.3. That is why 0.1 + 0.2 evaluates to 0.30000000000000004.
To put it simply, the error comes from converting the decimal numbers into binary for the calculation and then converting the result back into decimal.
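
To make the approximation visible, here is a minimal snippet you can paste into a browser console or Node.js; toPrecision(20) reveals the digits hidden by the default formatting:

console.log((0.1).toPrecision(20)) // 0.10000000000000000555
console.log((0.2).toPrecision(20)) // 0.20000000000000001110
console.log((0.3).toPrecision(20)) // 0.29999999999999998890
console.log(0.1 + 0.2)             // 0.30000000000000004
console.log(0.1 + 0.2 == 0.3)      // false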

How to avoid it:

1. When calculating with decimals, first multiply by 10 to the nth power so the arithmetic happens on integers, then divide the result by 10 to the nth power.
2. When the accuracy requirements are not especially strict, you can also keep a fixed number of decimal places with toFixed(); both approaches are sketched below.
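
A minimal sketch of both workarounds, assuming we only need to add two small decimals; the helper name addDecimals is my own, not from the book:

// 1. Scale to integers, add, then scale back down.
//    Math.round guards against the scaled values themselves being slightly off.
function addDecimals(a, b, n = 2) {
  const factor = 10 ** n
  return (Math.round(a * factor) + Math.round(b * factor)) / factor
}
console.log(addDecimals(0.1, 0.2))        // 0.3
console.log(addDecimals(0.1, 0.2) == 0.3) // true

// 2. Keep a few decimal places with toFixed() when exact precision is not required.
//    toFixed() returns a string, so convert back to a number before comparing.
const sum = Number((0.1 + 0.2).toFixed(2))
console.log(sum)        // 0.3
console.log(sum == 0.3) // true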

Origin: blog.csdn.net/weixin_43299180/article/details/110640332