Machine Learning Code Exercises

ex1 (Linear Regression)


1. computeCost (compute the cost)

function J = computeCost(X, y, theta)

m = length(y); % number of training examples

J = 0; % initialize the cost

% Vectorized squared-error cost over all m examples
J = sum((X * theta - y).^2) / (2*m);

end
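This implements the standard squared-error cost J(theta) = 1/(2m) * sum_i (theta' * x(i) - y(i))^2. As a sanity check, here is a minimal call on toy data; the matrix X, targets y, and values below are illustrative assumptions, not data from the exercise:

% Toy data set: y = 2x, with a column of ones for the intercept term
X = [1 1; 1 2; 1 3];   % 3 training examples, 2 columns (bias + x)
y = [2; 4; 6];
theta = [0; 0];        % all-zero parameters

J = computeCost(X, y, theta)
% residuals are [-2; -4; -6], so J = (4 + 16 + 36) / (2*3) = 9.3333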

2. gradientDescent (gradient descent)

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)

m = length(y); % number of training examples
J_history = zeros(num_iters, 1);
theta_s = theta; % copy of theta from the previous iteration
for iter = 1:num_iters
    theta(1) = theta(1) - alpha / m * sum(X * theta_s - y);
    theta(2) = theta(2) - alpha / m * sum((X * theta_s - y) .* X(:,2));

    % theta(1) and theta(2) must be updated simultaneously, so the updates
    % cannot use X * theta directly; theta_s holds the previous values.
    theta_s = theta;

    J_history(iter) = computeCost(X, y, theta); % record the cost each iteration
end

end
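A quick end-to-end sketch on the same toy data; the learning rate and iteration count here are assumptions chosen for illustration:

% Run gradient descent from theta = [0; 0] on the toy data
X = [1 1; 1 2; 1 3];
y = [2; 4; 6];
[theta, J_history] = gradientDescent(X, y, [0; 0], 0.1, 1500);
theta            % should approach [0; 2], since y = 2x fits the data exactly
J_history(end)   % should be close to 0, with J_history decreasing monotonically

Note that the per-component updates above are specific to one feature plus an intercept; the fully vectorized form theta = theta - (alpha / m) * X' * (X * theta - y) handles any number of features and makes the simultaneous update automatic.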


 


Reprinted from blog.csdn.net/qq_35962520/article/details/84328789