41 changes: 20 additions & 21 deletions mlclass-ex1/gradientDescentMulti.m
@@ -1,36 +1,35 @@
-function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
+function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, numberOfIterations)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
-% theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, num_iters) updates theta by
-% taking num_iters gradient steps with learning rate alpha
+% theta = GRADIENTDESCENTMULTI(x, y, theta, alpha, numberOfIterations) updates theta by
+% taking numberOfIterations gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
-J_history = zeros(num_iters, 1);
+J_history = zeros(numberOfIterations, 1);

-for iter = 1:num_iters
+for iteration = 1:numberOfIterations
% Perform a single gradient step on the parameter vector theta.

% ====================== YOUR CODE HERE ======================
% Instructions: Perform a single gradient step on the parameter vector
% theta.
%
% Hint: While debugging, it can be useful to print out the values
% of the cost function (computeCostMulti) and gradient here.
%
% We minimize J(theta) by adjusting the parameter vector theta;
% X and y stay fixed throughout.

% alpha = learning rate as a single number

% hypothesis = mx1 column vector
% X = mxn matrix
% theta = nx1 column vector
hypothesis = X * theta;

% errors = mx1 column vector
% y = mx1 column vector
% plain '-' is element-wise for same-size vectors; the Octave-only
% '.-' spelling is not valid MATLAB
errors = hypothesis - y;

% ============================================================
% newDecrement = 1xn row vector: the gradient scaled by the learning rate
newDecrement = (alpha * (1/m) * errors' * X);

theta = theta - newDecrement';
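
% An equivalent form of the same update (a hypothetical variant, not
% part of this change): X' * errors is already an nx1 column vector,
% so the transposes can be avoided entirely:
%   theta = theta - (alpha / m) * (X' * errors);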

% Save the cost J in every iteration
-J_history(iter) = computeCostMulti(X, y, theta);
+J_history(iteration) = computeCostMulti(X, y, theta);

end
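
For reference, the loop body implements the vectorized update theta := theta - (alpha/m) * X' * (X*theta - y). Below is a minimal smoke test sketching how the function might be called; the data, learning rate, and iteration count are invented for illustration and are not part of the exercise.

% Hypothetical usage (all values made up for illustration)
X = [1 1; 1 2; 1 3];        % m = 3 examples: intercept column plus one feature
y = [1; 2; 3];              % targets the model can fit exactly (theta = [0; 1])
theta = zeros(2, 1);        % start from the zero vector
alpha = 0.1;                % learning rate
numberOfIterations = 1500;  % gradient steps

[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, numberOfIterations);

% With a suitable alpha, J_history is non-increasing and theta
% approaches the least-squares solution [0; 1] for this data.

Since J_history records computeCostMulti after every step, plotting it is a quick way to confirm that alpha is small enough for the descent to converge.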
