TensorFlow Tutorial: 21: Tracking the linear regression training process with TensorBoard

Copyright notice: This is an original article by the blogger. Reposting without prior permission is welcome, but please cite the source: https://blog.csdn.net/liumiaocn/article/details/82924284

We covered the basic usage of TensorBoard earlier in this series. Here we return to the example that fits a linear model of Iris petal width against petal length, and look at TensorBoard usage in a little more detail.

Introducing TensorBoard

TensorBoard support can be added simply by creating a FileWriter in the code. The writer needs an output directory for the event files it generates; this is the same directory passed to tensorboard at startup, and those generated files are exactly what TensorBoard draws its charts from.
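As a minimal sketch (assuming the same TensorFlow 1.x API used throughout this series; the tiny graph and the log directory are only illustrative), a FileWriter is all that is needed to get a graph into TensorBoard:

import tensorflow as tf

# A tiny graph whose structure will show up under TensorBoard's GRAPHS tab.
a = tf.constant(1.0, name="a")
b = tf.constant(2.0, name="b")
c = tf.add(a, b, name="c")

with tf.Session() as sess:
    # The FileWriter serializes the graph (and any later summaries) into
    # event files under this directory, which is then passed to tensorboard.
    writer = tf.summary.FileWriter("/tmp/tensorboard", sess.graph)
    print(sess.run(c))
    writer.close()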

Sample code

The code only needs the small changes below: a switch boardflag that controls whether TensorBoard data is generated, and a directory boarddir for the generated files.

liumiaocn:tensorflow liumiao$ cat linearmodel.py
import tensorflow as tf
import numpy      as np
import os
import matplotlib.pyplot as plt
from   sklearn import datasets
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

class LinearModel:
  irisdata =  datasets.load_iris()
  xdata = irisdata.data[:,2]
  ydata = irisdata.data[:,3]
  X = tf.placeholder("float",name="X")
  Y = tf.placeholder("float",name="Y")
  W = tf.Variable(-3., name="W")
  B = tf.Variable(3., name="B")
  linearmodel = tf.add(tf.multiply(X,W),B)
  lossfunc = (tf.pow(Y - linearmodel, 2))
  learningrate  = 0.01
  learningsteps = 100
  boarddir      = '/tmp/tensorboard'
  boardflag     = False

  def load(self, xindex, yindex):
    self.irisdata =  datasets.load_iris()
    self.xdata    =  self.irisdata.data[:,xindex]
    self.ydata    =  self.irisdata.data[:,yindex]

  def train(self):
    trainoperation = tf.train.GradientDescentOptimizer(self.learningrate).minimize(self.lossfunc)
    sess = tf.Session()
    init = tf.global_variables_initializer()
    sess.run(init)

    index = 1
    writer  = tf.summary.FileWriter(self.boarddir,sess.graph)
    print("caculation begins ...")
    for i in range(self.learningsteps):
      for (x,y)  in zip(self.xdata,self.ydata):
        sess.run(trainoperation, feed_dict={self.X: x, self.Y:y})
      #if self.boardflag:
      #  writer.add_summary(summary_str,i)
    print("caculation ends ...")
    return self.B.eval(session=sess),self.W.eval(session=sess)
liumiaocn:tensorflow liumiao$

Running the code

liumiaocn:tensorflow liumiao$ mkdir -p /tmp/tensorboard/
liumiaocn:tensorflow liumiao$ cat basic-operation-16.py 
import matplotlib.pyplot as plt
from linearmodel import LinearModel

model = LinearModel()
model.load(2,3)
model.boardflag = True
(B,W) = model.train()
plt.scatter(model.xdata,model.ydata)
plt.plot(model.xdata,model.xdata*W + B, 'r', label='fitted line')
plt.show()
liumiaocn:tensorflow liumiao$

Checking the result
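To check the result, start tensorboard against the directory the FileWriter wrote to (the standard invocation; the web UI listens on port 6006 by default) and open the printed address in a browser:

liumiaocn:tensorflow liumiao$ tensorboard --logdir=/tmp/tensorboard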

(Screenshot: the result displayed in TensorBoard)

Loss function

How the learning converges over the 100 iterations can also be seen very intuitively in TensorBoard. This requires writing out the loss information at each iteration; the code only needs the following changes.

liumiaocn:tensorflow liumiao$ cat linearmodel.py
import tensorflow as tf
import numpy      as np
import os
import matplotlib.pyplot as plt
from   sklearn import datasets
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'

class LinearModel:
  irisdata =  datasets.load_iris()
  xdata = irisdata.data[:,2]
  ydata = irisdata.data[:,3]
  X = tf.placeholder("float",name="X")
  Y = tf.placeholder("float",name="Y")
  W = tf.Variable(-3., name="W")
  B = tf.Variable(3., name="B")
  linearmodel = tf.add(tf.multiply(X,W),B)
  lossfunc = (tf.pow(Y - linearmodel, 2))
  learningrate  = 0.01
  learningsteps = 100
  boarddir      = '/tmp/tensorboard'
  boardflag     = False

  def load(self, xindex, yindex):
    self.irisdata =  datasets.load_iris()
    self.xdata    =  self.irisdata.data[:,xindex]
    self.ydata    =  self.irisdata.data[:,yindex]

  def train(self):
    trainoperation = tf.train.GradientDescentOptimizer(self.learningrate).minimize(self.lossfunc)
    sess = tf.Session()
    init = tf.global_variables_initializer()
    sess.run(init)

    index       = 1
    writer      = tf.summary.FileWriter(self.boarddir,sess.graph)
    losssummary = tf.summary.scalar("loss", self.lossfunc)
    print("caculation begins ...")
    for i in range(self.learningsteps):
      for (x,y)  in zip(self.xdata,self.ydata):
        sess.run(trainoperation, feed_dict={self.X: x, self.Y:y})
        summaryinfo = sess.run(losssummary, feed_dict={self.X: x, self.Y: y})
      if self.boardflag:
        writer.add_summary(summaryinfo,i)
    print("caculation ends ...")
    return self.B.eval(session=sess),self.W.eval(session=sess)
liumiaocn:tensorflow liumiao$ 

Confirming the run

The entire descent curve can now be seen very clearly; the chart can be adjusted and redrawn interactively, and the data can be exported in CSV or JSON form.
(Screenshot: the loss curve shown in TensorBoard)
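As a side note, once more than one quantity is being tracked it is common in the TensorFlow 1.x API to merge all summary ops and run them in a single call. The following is only a sketch on toy data (the synthetic data, variable names, and the directory /tmp/tensorboard-merged are illustrative, not part of the original code):

import tensorflow as tf
import numpy as np

# Toy data: y = 2x + 1, just to have something to fit.
x_data = np.linspace(0.0, 1.0, 50).astype(np.float32)
y_data = 2.0 * x_data + 1.0

X = tf.placeholder(tf.float32, name="X")
Y = tf.placeholder(tf.float32, name="Y")
W = tf.Variable(0.0, name="W")
B = tf.Variable(0.0, name="B")
loss = tf.reduce_mean(tf.square(Y - (W * X + B)))
trainoperation = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

# One scalar summary per tracked value ...
tf.summary.scalar("loss", loss)
tf.summary.scalar("W", W)
tf.summary.scalar("B", B)
# ... and one merged op that evaluates all of them at once.
merged = tf.summary.merge_all()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter("/tmp/tensorboard-merged", sess.graph)
    for step in range(100):
        _, summaryinfo = sess.run([trainoperation, merged],
                                  feed_dict={X: x_data, Y: y_data})
        writer.add_summary(summaryinfo, step)
    writer.close()

Each tracked value then appears as its own curve under the SCALARS tab.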
