fminunc in Octave

options = optimset('GradObj', 'on', 'MaxIter', 100);
initialTheta = zeros(2, 1);
[optTheta, functionVal, exitFlag] = fminunc(@costFunction, initialTheta, options);

fminunc is Octave's unconstrained minimization function. When calling it, you pass in an options variable containing configuration settings. In the code above, 'GradObj', 'on' sets the gradient-objective option to on, which means you must supply the gradient to the algorithm yourself. 'MaxIter', 100 sets the maximum number of iterations to 100 (the value is a number, not a string). initialTheta is our initial guess for θ.

Then we call fminunc with three arguments. The first is @costFunction, where the @ symbol creates a function handle to the costFunction we defined earlier. The other two are the initial value of θ and the configuration variable options.

When we call fminunc, it automatically chooses among several advanced optimization algorithms (you can loosely think of it as gradient descent that automatically picks a suitable learning rate α).

In the end, we get three return values: optTheta, the θ that minimizes the cost function J(θ); functionVal, the value of jVal computed in costFunction at that θ; and exitFlag, a status flag indicating convergence: 1 if the algorithm converged, 0 otherwise.
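For concreteness, here is a minimal sketch of a costFunction compatible with the call above. With 'GradObj' set to on, fminunc expects the function to return two values: the cost jVal and the gradient vector. The quadratic cost J(θ) = (θ₁ − 5)² + (θ₂ − 5)² used here is an illustrative assumption, not fixed by the text:

```octave
% Illustrative cost function: J(theta) = (theta(1)-5)^2 + (theta(2)-5)^2.
% With 'GradObj' on, fminunc expects two return values:
% jVal (the cost) and gradient (one partial derivative per parameter).
function [jVal, gradient] = costFunction(theta)
  jVal = (theta(1) - 5)^2 + (theta(2) - 5)^2;
  gradient = zeros(2, 1);
  gradient(1) = 2 * (theta(1) - 5);  % dJ/d theta(1)
  gradient(2) = 2 * (theta(2) - 5);  % dJ/d theta(2)
end
```

With this cost, the call above should return optTheta close to [5; 5], functionVal close to 0, and exitFlag equal to 1, since the quadratic is minimized at θ = (5, 5).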
