Career-Switching Programmer, Part 3: Machine Learning, Logistic Regression (notes purely to push myself to keep studying)

Exercise page: http://openclassroom.stanford.edu/MainFolder/DocumentPage.php?course=MachineLearning&doc=exercises/ex4/ex4.html

The code I wrote:

%%%%%%%%%%%%%%%%%%%%%%%%%%

x = load('ex4x.dat');

y = load('ex4y.dat');
m = length(y); % store the number of training examples
x = [ones(m, 1), x]; % Add a column of ones to x  
% find returns the indices of the
% rows meeting the specified condition
pos = find(y == 1);
neg = find(y == 0);

% Assume the features are in the 2nd and 3rd
% columns of x
plot(x(pos, 2), x(pos,3), '+'); hold on
plot(x(neg, 2), x(neg, 3), 'o');

g = inline('1.0 ./ (1.0 + exp(-z))');
theta=zeros(3,1);

% Number of Newton iterations
num = 5;
J = zeros(num, 1);
for iter = 1:num
    h_theta = g(x*theta);                                  % hypothesis h_theta(x)
    deltaJ = (1/m) * x' * (h_theta - y);                   % gradient of J
    H = (1/m) * x' * diag(h_theta) * diag(1-h_theta) * x;  % Hessian
    theta = theta - H\deltaJ;                              % Newton step
    % Cost uses h_theta from before the update, so it tracks the previous theta
    J(iter) = (1/m) * sum(-y.*log(h_theta) - (1-y).*log(1-h_theta));
end
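As a cross-check of the update above, the same Newton's method loop can be sketched in Python/NumPy (this is my own translation, not part of the exercise; note the Hessian is built by row-weighting `X` instead of with `diag` matrices, which is equivalent but lighter on memory):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def newton_logistic(X, y, num_iters=5):
    """Fit logistic regression by Newton's method.

    X: (m, n) design matrix, intercept column included; y: (m,) labels in {0, 1}.
    Returns the fitted theta and the cost J at each iteration.
    """
    m, n = X.shape
    theta = np.zeros(n)
    J = np.zeros(num_iters)
    for i in range(num_iters):
        h = sigmoid(X @ theta)                   # hypothesis h_theta(x)
        grad = X.T @ (h - y) / m                 # gradient of J
        H = (X.T * (h * (1.0 - h))) @ X / m      # Hessian: X' * diag(h(1-h)) * X / m
        J[i] = np.mean(-y * np.log(h) - (1.0 - y) * np.log(1.0 - h))
        theta -= np.linalg.solve(H, grad)        # Newton step: theta - H \ grad
    return theta, J
```

On non-separable data the cost should drop quickly, since Newton's method typically converges in a handful of iterations for this problem.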



Now comparing with the answer code... mine feels so much rougher. Keep at it.

Below is the provided answer code:

% Exercise 4 -- Logistic Regression

clear all; close all; clc

x = load('ex4x.dat');
y = load('ex4y.dat');

[m, n] = size(x);

% Add intercept term to x
x = [ones(m, 1), x];

% Plot the training data
% Use different markers for positives and negatives
figure
pos = find(y); neg = find(y == 0);
plot(x(pos, 2), x(pos,3), '+')
hold on
plot(x(neg, 2), x(neg, 3), 'o')
hold on
xlabel('Exam 1 score')
ylabel('Exam 2 score')


% Initialize fitting parameters
theta = zeros(n+1, 1);

% Define the sigmoid function
g = inline('1.0 ./ (1.0 + exp(-z))');

% Newton's method
MAX_ITR = 5;
J = zeros(MAX_ITR, 1);

for i = 1:MAX_ITR
    % Calculate the hypothesis function
    z = x * theta;
    h = g(z);
    
    % Calculate gradient and hessian.
    % The formulas below are equivalent to the summation formulas
    % given in the lecture videos.
    grad = (1/m).*x' * (h-y);
    H = (1/m).*x' * diag(h) * diag(1-h) * x;
    
    % Calculate J (for testing convergence)
    J(i) =(1/m)*sum(-y.*log(h) - (1-y).*log(1-h));
    
    theta = theta - H\grad;
end
% Display theta
theta

% Calculate the probability that a student with
% Score 20 on exam 1 and score 80 on exam 2
% will not be admitted
prob = 1 - g([1, 20, 80]*theta)

% Plot Newton's method result
% Only need 2 points to define a line, so choose two endpoints
plot_x = [min(x(:,2))-2,  max(x(:,2))+2];
% Calculate the decision boundary line: the boundary is where
% theta(1) + theta(2)*x1 + theta(3)*x2 = 0, i.e. x2 = -(theta(2)*x1 + theta(1))/theta(3)
plot_y = (-1./theta(3)).*(theta(2).*plot_x + theta(1));
plot(plot_x, plot_y)
legend('Admitted', 'Not admitted', 'Decision Boundary')
hold off

% Plot J
figure
plot(0:MAX_ITR-1, J, 'o--', 'MarkerFaceColor', 'r', 'MarkerSize', 8)
xlabel('Iteration'); ylabel('J')
% Display J
J
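The last two steps of the answer code (the admission probability and the decision boundary) follow directly from the fitted theta, and can be sanity-checked in Python. The theta values below are placeholders I chose for illustration, not the exercise's actual fitted result:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical fitted parameters (placeholders, NOT the exercise's actual output).
theta = np.array([-16.0, 0.15, 0.16])

# P(not admitted) for exam scores (20, 80) is 1 - h_theta(x), as in the answer code.
prob_not_admitted = 1.0 - sigmoid(np.array([1.0, 20.0, 80.0]) @ theta)

# Decision boundary: theta0 + theta1*x1 + theta2*x2 = 0
#   =>  x2 = -(theta0 + theta1*x1) / theta2
def boundary_x2(x1, theta):
    return -(theta[0] + theta[1] * x1) / theta[2]
```

Any point on the boundary line satisfies theta'x = 0, so the model assigns it probability exactly 0.5, which is why two endpoints are enough to draw the boundary.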


Reposted from blog.csdn.net/luxtime/article/details/51277068