Implementing Linear Regression in Octave (Gradient Descent)

I have been watching Andrew Ng's machine learning videos these past few days, and when I got to neural networks I wanted to consolidate the earlier material, so I implemented linear regression in Octave.
For now I'm posting the code directly, with brief notes inline.
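For reference (this recap is mine, not part of the original post), the model fitted below is the standard linear hypothesis with a sum-of-squares cost from the course:

    h_\theta(x) = \theta^\top x = \theta_0 + \theta_1 x_1 + \theta_2 x_2,
    \qquad
    J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)^2

and the gradient descent update that gives the post its title is

    \theta_j := \theta_j - \alpha \cdot \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_j^{(i)}.

In the code the minimisation is actually delegated to fminunc rather than written as an explicit update loop, and costFunction drops the 1/(2m) factor, which does not change the minimiser.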

function jVal = costFunction(theta)
    len = numel(theta);                          % number of parameters (bias + one per feature)
    load('train3.txt')                           % restores the matrix 'data' saved from Octave
    % load('train2.txt')                         % alternative dataset
    x = [ones(size(data,1),1) data(:,1:len-1)]'; % len x m design matrix with a bias row of ones
    y = data(:,len);                             % targets, m x 1
    m = size(x,2);                               % number of training examples
    h_y = fitFunction(x,theta)';                 % hypothesis values, m x 1
    delta = h_y - y;                             % residuals
    jVal = sum(delta.^2);                        % sum of squared errors (the 1/(2m) factor is omitted)
    % gradient = (x*delta)/m;                    % analytic gradient, unused while 'GradObj' is 'off'
end
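The commented-out gradient line hints at supplying the analytic gradient to the optimiser. A minimal sketch of how that could look, assuming train3.txt stores the matrix 'data' with the feature columns followed by the target column (the name costFunctionGrad and the 1/(2m) scaling are my additions):

function [jVal, gradient] = costFunctionGrad(theta)
    load('train3.txt');                           % restores the matrix 'data', as in costFunction
    len = numel(theta);
    X = [ones(size(data,1),1) data(:,1:len-1)];   % m x len design matrix with a bias column
    y = data(:,len);
    m = size(X,1);
    delta = X*theta - y;                          % residuals, m x 1
    jVal = sum(delta.^2) / (2*m);                 % least-squares cost
    gradient = (X'*delta) / m;                    % analytic gradient, len x 1
end

With this variant, 'GradObj' would be set to 'on' in the optimset call so that fminunc uses the returned gradient instead of estimating it numerically.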
function data = randData(len)
    % Build a 50 x len matrix of synthetic data: column c holds roughly c*b for b = 1..50,
    % each entry reduced by up to 15% of uniform random noise.
    data = [];
    for c = 1:len
        a = 1:50;
        for b = 1:50
            a(b) = (1 - rand(1)*0.15) * c * b;
        end
        data = [data a'];
    end
end
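randData is only called from a commented-out line below, but it can be used to produce train3.txt in the first place. A minimal sketch, assuming Octave's default text format so that load() restores the variable name 'data':

data = randData(3);          % 50 x 3 matrix: roughly [b, 2b, 3b] for b = 1..50, minus up to 15% noise
save('train3.txt', 'data');  % default Octave text format keeps the variable name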

function hx = fitFunction(x,theta)
    % Hypothesis h_theta(x) = theta' * x, where x is len x m with a leading row of ones (bias).
    % hx = theta(1) + theta(2) * x;              % single-feature version
    hx = theta' * x;
end
function [optTheta,functionVal,exitFlag] = GradientFunction()
    % Minimise costFunction with fminunc; no analytic gradient is supplied ('GradObj' is 'off').
    options = optimset('GradObj','off','MaxIter',100000);
    initialTheta = zeros(3,1);
    [optTheta,functionVal,exitFlag] = fminunc(@costFunction,initialTheta,options);

    % data = randData(3);                        % alternatively, regenerate the data
    load('train3.txt');                          % restores the matrix 'data' saved from Octave
    % scatter(data(:,1),data(:,2),'*');          % 2-D plot for the single-feature case
    scatter3(data(:,1),data(:,2),data(:,3));     % training points
    hold on
    x = [ones(50,1) (1:50)' (1:2:100)'];         % grid of inputs with a bias column
    % y = optTheta(1) + optTheta(2) * x;         % single-feature version of the fit
    y = optTheta' * x';                          % fitted values on the grid, 1 x 50
    plot3(x(:,2),x(:,3),y);                      % overlay the fitted line
    % plot(x,y)
end
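A quick way to run everything from the Octave prompt, assuming the functions above are on the path and train3.txt is in the working directory:

[optTheta, functionVal, exitFlag] = GradientFunction();
disp(optTheta')              % fitted parameters: bias, then one coefficient per feature column

GradientFunction also scatters the training points with scatter3 and overlays the fitted values along the sampled grid with plot3.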

Reposted from blog.csdn.net/yhy1315/article/details/79160197