Python Metaphysics Modeling (3): Least Squares

(My first blog post in six months; I don't know if anyone is still reading...)

As usual, this is a translation of and commentary on the official SciPy documentation, here based on the latest version, 1.3.0.

Documentation link: https://docs.scipy.org/doc/scipy-1.3.0/reference/generated/scipy.optimize.least_squares.html#scipy.optimize.least_squares

Function prototype: scipy.optimize.least_squares(fun, x0, jac='2-point', bounds=(-inf, inf), method='trf', ftol=1e-08, xtol=1e-08, gtol=1e-08, x_scale=1.0, loss='linear', f_scale=1.0, diff_step=None, tr_solver=None, tr_options={}, jac_sparsity=None, max_nfev=None, verbose=0, args=(), kwargs={})

Important parameters explained:

fun: the function that computes the vector of residuals (the cost function is built from its output); this is the object being minimized;

x0: a one-dimensional array (a single float is treated as an array with one element) giving the starting point of the optimization. Note that iterative least squares only finds a local optimum here, not a guaranteed global one, so the choice of starting point is vital;

bounds: the bounds on the solution; see the first blog post in this series for the specific format;

method: the optimization algorithm. Three algorithms are supported (see the documentation link); the default, 'trf' (Trust Region Reflective), is well suited to solving large sparse problems;

loss: how the residuals are processed into the loss. The default 'linear' applies no processing (standard least squares); smooth robust alternatives such as 'soft_l1' and 'huber' are also available;

verbose: 0 (no output), 1 (print a report after termination), or 2 (print progress during iteration); the default is 0. Turn it up if you want to inspect the optimization report.

Those are the main parameters; the others are used relatively rarely, and the official documentation covers them if needed. A sketch putting the parameters above together follows below.
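To see these parameters in one place, here is a minimal sketch of my own (the model, data, and settings below are illustrative assumptions, not from the SciPy docs): fitting y = a * exp(-b * t) to noisy data with bounds, the default 'trf' method, and a robust loss.

import numpy as np
from scipy.optimize import least_squares

# Illustrative synthetic data: y = 2.5 * exp(-1.3 * t) plus noise
t = np.linspace(0, 5, 50)
y = 2.5 * np.exp(-1.3 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)

def residuals(params, t, y):
    a, b = params
    return a * np.exp(-b * t) - y   # one residual per data point

res = least_squares(
    residuals,
    x0=[1.0, 1.0],                  # starting point; a poor one may land in a worse local optimum
    bounds=([0, 0], [10, 10]),      # lower and upper bounds on (a, b)
    method='trf',                   # Trust Region Reflective (the default)
    loss='soft_l1',                 # smooth robust loss to soften the effect of outliers
    verbose=1,                      # print a short report after termination
    args=(t, y),                    # extra arguments forwarded to the residual function
)

Setting verbose=1 here is a cheap way to sanity-check convergence without wading through the full iteration log.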

The return value is SciPy's standard OptimizeResult class for optimization problems. Its most important attributes are x (the optimized parameters) and status (the termination status code). You can view the details here: https://docs.scipy.org/doc/scipy-1.3.0/reference/generated/scipy.optimize.OptimizeResult.html?highlight=optimizeresult
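As a quick illustration (a toy problem of my own, not from the documentation), the commonly inspected attributes can be read straight off the returned object:

import numpy as np
from scipy.optimize import least_squares

def fun(x):
    return np.array([x[0] - 1.0, x[1] + 2.0])  # toy residuals, zero at (1, -2)

res = least_squares(fun, np.zeros(2))
print(res.x)        # optimized parameters, here close to [1., -2.]
print(res.status)   # termination status code; a positive value means convergence
print(res.success)  # True when status > 0
print(res.message)  # human-readable reason for termination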

The documentation also gives a corresponding example:

>>> import numpy as np
>>> def fun_rosenbrock(x):  # the Rosenbrock function again
...     return np.array([10 * (x[1] - x[0]**2), (1 - x[0])])

>>> from scipy.optimize import least_squares
>>> x0_rosenbrock = np.array([2, 2])
>>> res_1 = least_squares(fun_rosenbrock, x0_rosenbrock)
>>> res_1.x  # the optimized parameters
array([ 1.,  1.])
>>> res_1.cost
9.8669242910846867e-30
>>> res_1.optimality
8.8928864934219529e-14

That covers the basic usage of the least_squares function from the SciPy library.
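One more sketch, as a variation on the example above (the bound value is my own choice for illustration): adding a lower bound on x[1] makes the unconstrained optimum (1, 1) infeasible, and the solver instead converges to a point on the boundary.

import numpy as np
from scipy.optimize import least_squares

def fun_rosenbrock(x):
    # same Rosenbrock residuals as above
    return np.array([10 * (x[1] - x[0]**2), (1 - x[0])])

# constrain x[1] >= 1.5; all other bounds stay infinite
res_2 = least_squares(fun_rosenbrock, np.array([2.0, 2.0]),
                      bounds=([-np.inf, 1.5], np.inf))
print(res_2.x)  # the solution now lies on the boundary x[1] = 1.5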

 
