During Spark development I found that multiplying floating-point values with java.math.BigDecimal directly loses precision. The same logic seemed fine in pure Java, but calling Java's BigDecimal from Scala showed the problem.
By accident I discovered that Scala implements its own scala.math.BigDecimal. Switching to Scala's BigDecimal makes the problem disappear.
I don't know the deeper reason; I'm just sharing what I found in the hope of drawing out better insight. If anyone knows, please enlighten me.
Below is the experimental code.
package com.cisco.test

import java.math.BigDecimal

object TestSyntax {
  def main(args: Array[String]): Unit = {
    val local_price = new BigDecimal(0.015)
    val exchange_rate = new BigDecimal(2)
    // prints 0.02999999999999999888977697537484345957636833190917968750
    println(local_price.multiply(exchange_rate))

    import scala.math.BigDecimal
    // prints 0.030
    println(BigDecimal("0.015") * BigDecimal("2"))
    println(BigDecimal("0.0") * BigDecimal("0.751879699"))
  }
}
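For what it's worth, my guess at the underlying reason: the culprit looks like the constructor, not the language. new BigDecimal(double) stores the exact binary value of the double (and 0.015 has no exact binary representation), while scala.math.BigDecimal("0.015") parses the decimal string exactly. The sketch below is plain Java (class name is mine) and appears consistent with the outputs above; if this is right, the "pure Java" version only behaves well when it uses the String constructor or BigDecimal.valueOf.

```java
import java.math.BigDecimal;

public class BigDecimalConstructors {
    public static void main(String[] args) {
        // new BigDecimal(double) captures the exact binary value of the double,
        // so 0.015 becomes a long, imprecise decimal before the multiply.
        System.out.println(new BigDecimal(0.015).multiply(new BigDecimal(2)));
        // prints 0.02999999999999999888977697537484345957636833190917968750

        // The String constructor parses the decimal digits exactly.
        System.out.println(new BigDecimal("0.015").multiply(new BigDecimal("2")));
        // prints 0.030

        // BigDecimal.valueOf(double) goes through Double.toString, which yields
        // the shortest decimal form of the double - similar in effect to what
        // scala.math.BigDecimal does.
        System.out.println(BigDecimal.valueOf(0.015).multiply(BigDecimal.valueOf(2)));
        // prints 0.030
    }
}
```

So if you want to stay with java.math.BigDecimal in Scala, using the String constructor should give the same clean result as scala.math.BigDecimal.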