Coursera Machine Learning (Andrew Ng) - Week 3 Programming Assignment

1. Logistic Regression

The main difference between logistic regression and linear regression is the hypothesis function. In logistic regression the hypothesis is:

                                                                        hθ(x) = g(θᵀx) = sigmoid(θᵀx) = 1 / (1 + e^(−θᵀx))

1)sigmoid implementation

% sigmoid implementation
function g = sigmoid(z)
%SIGMOID Compute sigmoid function
%   g = SIGMOID(z) computes the sigmoid of z.

% You need to return the following variables correctly 
g = zeros(size(z));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the sigmoid of each value of z (z can be a matrix,
%               vector or scalar).

g = 1 ./ (1 + exp(-z));		% note the element-wise "./" so z can be a matrix, vector, or scalar


% =============================================================

end
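
A quick sanity check (illustrative values that follow directly from the definition): sigmoid(0) is exactly 0.5, large positive inputs approach 1, and large negative inputs approach 0.

% Sanity check (sketch): run after sigmoid.m is on the path
sigmoid(0)               % ans = 0.5000
sigmoid([-10 0 10])      % ans = [0.0000 0.5000 1.0000] (approximately)
sigmoid(zeros(2))        % element-wise: a 2x2 matrix of 0.5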

2)Cost function and gradient

Before writing the code, first be clear about the dimensions of X, y, and theta.
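
For reference, with m training examples and n features, X (including the intercept column of ones) is m×(n+1), y is m×1, and theta is (n+1)×1. The unregularized cost and its gradient are:

                J(θ) = (1/m) Σᵢ [ −y(i)·log(hθ(x(i))) − (1 − y(i))·log(1 − hθ(x(i))) ]

                ∂J(θ)/∂θj = (1/m) Σᵢ (hθ(x(i)) − y(i))·xj(i)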

function [J, grad] = costFunction(theta, X, y)
%COSTFUNCTION Compute cost and gradient for logistic regression
%   J = COSTFUNCTION(theta, X, y) computes the cost of using theta as the
%   parameter for logistic regression and the gradient of the cost
%   w.r.t. to the parameters.

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta
%
% Note: grad should have the same dimensions as theta
%

J = (-y' * log(sigmoid(X * theta)) - ...
		(1 - y)' * log(1 - sigmoid(X * theta))) / m;

% for i = 1 : length(theta)
	% grad(i) = (sigmoid(X * theta) - y)' * X(:, i) / m;
% endfor

grad = X' * (sigmoid(X * theta) - y) / m;



% =============================================================

end
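
A quick check of this implementation (a sketch, assuming X already contains the intercept column of ones as in the exercise script): with theta all zeros, every prediction is sigmoid(0) = 0.5, so the cost is −log(0.5) ≈ 0.693 regardless of the data.

% Sanity check (sketch): cost at theta = 0 should be -log(0.5) = 0.693
initial_theta = zeros(size(X, 2), 1);
[J, grad] = costFunction(initial_theta, X, y);
fprintf('Cost at initial theta (zeros): %f\n', J);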

3)fminunc

fminunc is an optimization solver that finds the minimum of an unconstrained function. For logistic regression we need the minimum of the cost function and the corresponding value of theta.

Parameter notes:

options = optimset('GradObj', 'on', 'MaxIter', 400);

Setting 'GradObj' to 'on' tells fminunc that our function returns both the cost and the gradient, so fminunc uses our own gradient when minimizing the cost.

Setting 'MaxIter' to 400 allows fminunc at most 400 iterations before it returns.

[theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

@(t)(costFunction(t, X, y)) is the cost function to be minimized. @ creates a function handle; the t in parentheses after @ is the function's argument, i.e. the parameter θ for which we minimize the cost. For more on function handles, see https://blog.csdn.net/yhl_leo/article/details/50699990

initial_theta is the starting value of θ; since the logistic regression cost is convex, the choice generally does not affect the final result.
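
A minimal sketch of how the handle works: @(t) defines an anonymous function of t, with X and y captured from the current workspace, so fminunc can evaluate the cost at any theta it tries.

% Illustrative only: the handle behaves like costFunction with X, y fixed
f = @(t) costFunction(t, X, y);
J0 = f(initial_theta);    % same as costFunction(initial_theta, X, y)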

%% ============= Part 3: Optimizing using fminunc  =============
%  In this exercise, you will use a built-in function (fminunc) to find the
%  optimal parameters theta.

%  Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);

%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost 
[theta, cost] = ...
	fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);

4)predict.m

function p = predict(theta, X)
%PREDICT Predict whether the label is 0 or 1 using learned logistic 
%regression parameters theta
%   p = PREDICT(theta, X) computes the predictions for X using a 
%   threshold at 0.5 (i.e., if sigmoid(theta'*x) >= 0.5, predict 1)

m = size(X, 1); % Number of training examples

% You need to return the following variables correctly
p = zeros(m, 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the following code to make predictions using
%               your learned logistic regression parameters. 
%               You should set p to a vector of 0's and 1's
%

% ------------- method 1 -------------------
% p = sigmoid(X * theta);

% for i = 1 : m
	% if(p(i) >= 0.5)
		% p(i) = 1;
	% else
		% p(i) = 0;
	% endif
% endfor


% ------------- method 2 -------------------
p = round(sigmoid(X * theta));		% round(>= 0.5) = 1, round(< 0.5) = 0

% =========================================================================


end
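
Once fminunc has found theta, the same function gives the training-set accuracy (a sketch matching how the exercise script calls predict):

% Illustrative usage: percentage of training examples classified correctly
p = predict(theta, X);
fprintf('Train Accuracy: %f\n', mean(double(p == y)) * 100);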

2. Regularization
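
Regularization adds a penalty on the magnitude of the parameters to reduce overfitting. The regularized cost (note that θ0, i.e. theta(1) in MATLAB's 1-based indexing, is not penalized) is:

                J(θ) = (1/m) Σᵢ [ −y(i)·log(hθ(x(i))) − (1 − y(i))·log(1 − hθ(x(i))) ] + (λ/(2m)) · Σ θj²   (sum over j = 1, ..., n)

and each gradient component for j ≥ 1 gains an extra (λ/m)·θj term.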

function [J, grad] = costFunctionReg(theta, X, y, lambda)
%COSTFUNCTIONREG Compute cost and gradient for logistic regression with regularization
%   J = COSTFUNCTIONREG(theta, X, y, lambda) computes the cost of using
%   theta as the parameter for regularized logistic regression and the
%   gradient of the cost w.r.t. to the parameters. 

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly 
J = 0;
grad = zeros(size(theta));

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta.
%               You should set J to the cost.
%               Compute the partial derivatives and set grad to the partial
%               derivatives of the cost w.r.t. each parameter in theta


J = (-y' * log(sigmoid(X * theta)) - ...
		(1 - y)' * log(1 - sigmoid(X * theta))) / m + ...
		lambda / (2 * m) * sum(theta(2 : end).^2);

grad(1) = (sigmoid(X * theta) - y)' * X(:, 1) / m;

for i = 2 : length(theta)
	grad(i) = (sigmoid(X * theta) - y)' * X(:, i) / m + lambda / m * theta(i);
endfor




% =============================================================

end
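
The loop above works, but the regularized gradient can also be vectorized in the same style as the unregularized version (a sketch; the result should be identical):

% Vectorized alternative (sketch): unregularized gradient first, then add
% the penalty term to every component except theta(1)
grad = X' * (sigmoid(X * theta) - y) / m;
grad(2 : end) = grad(2 : end) + lambda / m * theta(2 : end);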

Reposted from blog.csdn.net/hugh___/article/details/81736271