[TensorFlowJS First Encounter] Hands-On 1: Finding a Minimum with Gradient Descent in Plain JavaScript

  • Problem statement:
    Find the point (x, y) where y1 = x*x - 2*x + 3 + 0.01*(a random value in [-1, 1]) comes closest to y2 = 0,
    with x restricted to the range (0, 3).
    Do not use a learning framework; implement the gradient descent update by hand. Hint: x = x - alp*(derivative of y1 - y2), where alp is the learning rate.
    The graph of the function:
    (figure omitted: plot of y1 = x*x - 2*x + 3 over (0, 3))
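The hint above can be made concrete before reading the full listing. A minimal sketch (the names `analyticGrad` and `numericGrad` are ours, and the ±0.01 noise term is ignored): the noiseless objective y = x*x - 2*x + 3 has derivative 2x - 2, which vanishes at x = 1, where y = 2.

```javascript
// Analytic derivative of the noiseless objective y = x*x - 2*x + 3.
function analyticGrad(x) {
  return 2 * x - 2;
}

// Central finite difference on the noiseless function, as a cross-check
// of the hand-derived gradient.
function numericGrad(x, h) {
  var f = function (t) { return t * t - 2 * t + 3; };
  return (f(x + h) - f(x - h)) / (2 * h);
}

console.log(analyticGrad(1));        // 0: x = 1 is the stationary point
console.log(numericGrad(1.5, 1e-5)); // matches analyticGrad(1.5) = 1
```

Because the objective is quadratic, the central difference agrees with the analytic gradient up to floating-point error, which is a quick sanity check before writing the update loop.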
  • HTML code
<html>

<head>
  <script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs">
  </script>

</head>

<body>
  <button class="btn btn-primary" onclick="fnRun0();">Start 0</button>
  <div id="p0Id">out0</div>
  <button class="btn btn-primary" onclick="fnRun1();">Start 1</button>
  <div id="p1Id">out1</div>
  <button class="btn btn-primary" onclick="fnRun2();">Start 2</button>
  <div id="p2Id">out2</div>
</body>

<script>

  // Evaluate y1 = x^2 - 2x + 3 plus uniform +-0.01 noise at each x.
  function get_ys(xs) {
    var ys = new Array();
    for (var i = 0; i < xs.length; i++) {
      ys[i] = xs[i]*xs[i] - 2*xs[i] + 3 + (0.01*(2*Math.random() - 1));
    }
    return ys;
  }

  // Analytic gradient of the noiseless objective: d/dx (x^2 - 2x + 3) = 2x - 2.
  function get_grad(xi) {
    return 2 * xi - 2;
  }

  function fnRun0() {
    // Sample the function on a 300-point grid over (0, 3);
    // ys is not used further in this demo.
    var xs = new Array();
    for (var i = 0; i < 300; i++) {
      xs[i] = 0.01 * i;
    }
    var ys = get_ys(xs);

    // Random starting point in (0, 3).
    var x_ = Math.random() * 3;
    const alp = 0.001; // learning rate

    // 2000 gradient steps, logging every 200th.
    for (var j = 0; j < 2001; j++) {
      x_ = x_ - alp * get_grad(x_);
      if (j % 200 == 0) {
        console.log(j + " steps x_is " + x_ + " loss is " + get_ys([x_]));
      }
    }
    document.getElementById("p0Id").innerHTML = get_ys([x_]);
  }
  
  fnRun0();
  
</script>

</html>
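The loop in fnRun0 can be reduced to a minimal DOM-free sketch runnable under Node.js (the name `gradientDescent` is ours; the learning rate and step count match the listing above):

```javascript
// Gradient descent on the noiseless y = x*x - 2*x + 3, mirroring fnRun0.
function gradientDescent(x0, alp, steps) {
  var x = x0;
  for (var j = 0; j < steps; j++) {
    x = x - alp * (2 * x - 2); // x <- x - alp * dy/dx
  }
  return x;
}

var x = gradientDescent(Math.random() * 3, 0.001, 2000);
console.log(x, x * x - 2 * x + 3); // x approaches 1, y approaches 2
```

With alp = 0.001 the distance to x = 1 shrinks by a factor (1 - 0.002) per step, so after 2000 steps it is down to roughly e^-4, about 2% of its starting value, which matches the slow convergence visible in the log below.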
  • Output
    The true minimum loss is 2, and the final output of the program is 2.0025028420273303, so the gradient descent procedure located the minimum quite accurately.
"0 steps x_is 2.088901048405839 loss is 3.188181588041007"
"200 steps x_is 1.7296199046851404 loss is 2.5291956018861788"
"400 steps x_is 1.488882994549514 loss is 2.2340937472027194"
"600 steps x_is 1.32757683942688 loss is 2.1157200627422927"
"800 steps x_is 1.2194933898811164 loss is 2.044291255588076"
"1000 steps x_is 1.1470719000946268 loss is 2.015143069495562"
"1200 steps x_is 1.0985457639938911 loss is 2.003506844441931"
"1400 steps x_is 1.0660307481911326 loss is 2.0048429348246173"
"1600 steps x_is 1.044244009381783 loss is 1.9943007052299424"
"1800 steps x_is 1.029645769884495 loss is 1.9943557333293662"
"2000 steps x_is 1.0198641959516062 loss is 2.0025028420273303"

Reproduced from blog.csdn.net/xiaosongshine/article/details/84639968