【T-TensorFlow Framework Study】Building a Basic Linear Regression Model with TensorFlow

Copyright notice: please credit the source when reposting. https://blog.csdn.net/u010591976/article/details/82155558

Building a basic linear regression model with TensorFlow

'''
Created by HuangDandan
[email protected]
2018-08-26
Build a basic linear regression model
'''

# Build a linear regression model
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt

# Randomly generate 1000 points scattered around the line y = 0.1x + 0.3
num_points = 1000
vectors_set = []
for i in range(num_points):
    x1 = np.random.normal(0.0,0.55)
    y1 = x1*0.1 + 0.3 + np.random.normal(0.0,0.03)
    vectors_set.append([x1,y1])

# Separate the samples into x and y lists
x_data = [v[0] for v in vectors_set]
y_data = [v[1] for v in vectors_set]

# Visualize the raw samples
plt.scatter(x_data, y_data, c='r')
plt.show()
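As a sanity check (an illustration, not part of the original script): on data generated this way, an ordinary least-squares fit should already recover parameters close to the true W = 0.1 and b = 0.3, which is what the TensorFlow training below converges toward. A minimal NumPy sketch:

```python
import numpy as np

# Regenerate data the same way as the script: y = 0.1*x + 0.3 plus noise.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.55, size=1000)
y = 0.1 * x + 0.3 + rng.normal(0.0, 0.03, size=1000)

# Closed-form least squares: stack a column of ones for the bias term.
A = np.column_stack([x, np.ones_like(x)])
W_hat, b_hat = np.linalg.lstsq(A, y, rcond=None)[0]
print(W_hat, b_hat)  # both close to 0.1 and 0.3
```

This gives a reference answer to compare the gradient-descent result against.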

# Create a 1-D weight W, randomly initialized with uniform values in [-1, 1]
W = tf.Variable(tf.random_uniform([1], -1.0, 1.0), name='W')
# Initialize the bias b as a constant 0
b = tf.Variable(tf.zeros([1]), name='b')
# Construct the predicted value
y = W * x_data + b


# Use the mean squared error between the prediction y and the true y_data as the loss
loss = tf.reduce_mean(tf.square(y - y_data), name='loss')
# Optimize the parameters with gradient descent, learning rate 0.5
optimizer = tf.train.GradientDescentOptimizer(0.5)
# Training minimizes this loss (train defines how the model is solved)
train = optimizer.minimize(loss, name='train')
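What `GradientDescentOptimizer(0.5).minimize(loss)` does for this two-parameter model can be written out by hand. A NumPy sketch of the same update rule (an illustration of the math, not TensorFlow's internal code): the gradients of the mean-squared-error loss are dL/dW = 2·mean(err·x) and dL/db = 2·mean(err), where err = Wx + b − y.

```python
import numpy as np

# Same synthetic data as the script: y = 0.1*x + 0.3 plus Gaussian noise.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 0.55, 1000)
y = 0.1 * x + 0.3 + rng.normal(0.0, 0.03, 1000)

W, b, lr = 0.0, 0.0, 0.5          # lr matches GradientDescentOptimizer(0.5)
for _ in range(20):                # 20 steps, as in the training loop below
    err = W * x + b - y            # prediction error
    grad_W = 2.0 * np.mean(err * x)  # d(loss)/dW
    grad_b = 2.0 * np.mean(err)      # d(loss)/db
    W -= lr * grad_W               # one gradient-descent step
    b -= lr * grad_b
print(W, b)  # approaches 0.1 and 0.3
```

Note that b converges almost immediately (matching the training log below, where b jumps to ~0.3 after one step), while W approaches 0.1 more gradually.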

sess = tf.Session()
# Initialize all global variables
init = tf.global_variables_initializer()
sess.run(init)

# Print the initial values of W and b
print("W=",sess.run(W),'b=',sess.run(b),'loss=',sess.run(loss))
# Run 20 training steps
for step in range(20):
    sess.run(train)
    # Print W, b, and the loss after each step
    print("W=",sess.run(W),"b=",sess.run(b),"loss=",sess.run(loss)) 

# Visualize the samples with the fitted line
plt.scatter(x_data, y_data, c='r')
plt.plot(x_data, sess.run(W) * x_data + sess.run(b))
plt.show()

(Figure: scatter of the sample points with the fitted line y = Wx + b)

Output:

W= [0.00664806] b= [0.] loss= 0.09335411
W= [0.03346747] b= [0.29992056] loss= 0.0021463665
W= [0.05288571] b= [0.2999717] loss= 0.001504327
W= [0.06653099] b= [0.30000874] loss= 0.0011872927
W= [0.07611956] b= [0.3000348] loss= 0.0010307436
W= [0.08285749] b= [0.3000531] loss= 0.000953441
W= [0.08759226] b= [0.30006593] loss= 0.0009152693
W= [0.0909194] b= [0.30007496] loss= 0.0008964207
W= [0.09325739] b= [0.30008134] loss= 0.0008871133
W= [0.0949003] b= [0.30008578] loss= 0.00088251743
W= [0.09605478] b= [0.3000889] loss= 0.00088024774
W= [0.09686604] b= [0.30009112] loss= 0.0008791272
W= [0.09743612] b= [0.30009267] loss= 0.0008785739
W= [0.09783671] b= [0.30009377] loss= 0.0008783006
W= [0.09811821] b= [0.30009452] loss= 0.0008781659
W= [0.09831601] b= [0.30009505] loss= 0.0008780991
W= [0.09845501] b= [0.30009544] loss= 0.00087806606
W= [0.09855269] b= [0.3000957] loss= 0.00087805
W= [0.09862133] b= [0.3000959] loss= 0.00087804196
W= [0.09866957] b= [0.30009604] loss= 0.00087803777
W= [0.09870346] b= [0.30009612] loss= 0.00087803614
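The loss plateaus near 0.00088 rather than 0: with a good fit, the remaining error is just the Gaussian noise injected into the data, so the minimum achievable mean-squared error is about the noise variance, 0.03² = 0.0009. A quick check (an illustration, not from the original post):

```python
import numpy as np

# Even with the true parameters W=0.1, b=0.3, the MSE cannot fall below
# the variance of the injected noise, 0.03**2 = 0.0009.
rng = np.random.default_rng(2)
x = rng.normal(0.0, 0.55, 100000)
y = 0.1 * x + 0.3 + rng.normal(0.0, 0.03, 100000)
mse_at_truth = np.mean((0.1 * x + 0.3 - y) ** 2)
print(mse_at_truth)  # close to 0.0009, matching the plateau in the log
```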
