---
Title: 'minimize()'
Description: 'Returns the minimum of a scalar function of one or more variables using optimization methods from SciPy.'
Subjects:
  - 'Data Science'
  - 'Machine Learning'
Tags:
  - 'Math'
  - 'Optimization'
  - 'Python'
CatalogContent:
  - 'learn-python'
  - 'paths/data-science'
---

The **`minimize()`** function, from SciPy's `scipy.optimize` module, finds the minimum of a scalar function of one or more variables. It supports a range of optimization algorithms, including both gradient-based and derivative-free methods.

## Syntax

```python
scipy.optimize.minimize(fun, x0, args=(), method=None, jac=None, hess=None, bounds=None, constraints=(), tol=None, options=None)
```

## Parameters

- `fun`: The objective function to be minimized.
- `x0`: Initial guess for the variables.
- `args` (Optional): Extra arguments passed to the objective function.
- `method` (Optional): The optimization method to use (e.g., `'BFGS'`, `'Nelder-Mead'`, `'Powell'`). If not specified, SciPy selects one of `'BFGS'`, `'L-BFGS-B'`, or `'SLSQP'`, depending on whether the problem has bounds or constraints.
- `jac` (Optional): The gradient (Jacobian) of the objective function. If not provided, it is approximated numerically.
- `hess` (Optional): The Hessian matrix of the objective function. Typically used with second-order methods such as `'Newton-CG'` or `'trust-ncg'`.
- `bounds` (Optional): Bounds on the variables, given as a sequence of `(min, max)` pairs, one for each element of `x`.
- `constraints` (Optional): Constraint definitions, such as a dictionary or sequence of dictionaries describing equality (`'eq'`) or inequality (`'ineq'`) constraints.
- `tol` (Optional): Tolerance for termination; specifies the convergence threshold.
- `options` (Optional): A dictionary of additional, method-specific options (e.g., `maxiter` for the maximum number of iterations or `disp` to print convergence messages).

It returns an `OptimizeResult` object whose attributes include the solution (`x`), the objective value at the solution (`fun`), a success flag (`success`), and other optimization details.
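
The sketch below shows how `method`, `jac`, `bounds`, `constraints`, and `tol` can fit together, and how the returned `OptimizeResult` can be inspected. The objective, gradient, and constraint here are invented purely for illustration:

```py
from scipy.optimize import minimize

# Hypothetical objective: f(x, y) = x^2 + y^2
def objective(x):
  return x[0] ** 2 + x[1] ** 2

# Analytical gradient of the objective, passed via `jac`
def gradient(x):
  return [2 * x[0], 2 * x[1]]

# Inequality constraint x + y >= 1, written so that fun(x) >= 0
constraint = {'type': 'ineq', 'fun': lambda x: x[0] + x[1] - 1}

# Each variable is bounded to the interval [0, 2]
bounds = [(0, 2), (0, 2)]

# 'SLSQP' supports both bounds and constraints
result = minimize(
  objective,
  x0=[2.0, 2.0],
  method='SLSQP',
  jac=gradient,
  bounds=bounds,
  constraints=[constraint],
  tol=1e-9
)

print(result.x)        # Solution, approximately [0.5, 0.5]
print(result.fun)      # Objective value at the solution, approximately 0.5
print(result.success)  # True if the optimizer converged
print(result.message)  # Human-readable termination message
```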

## Example

In this example, the `minimize()` function is used to find the minimum value of a quadratic objective function:

```py
from scipy.optimize import minimize

# Define the objective function
def objective_function(x):
  return x**2

# Initial guess
x0 = 2

# Perform the minimization
result = minimize(objective_function, x0)

# Print the result
print("Optimal value:", result.fun)
print("Optimal point:", result.x)
```

It produces the following output:

```shell
Optimal value: 3.5662963072207506e-16
Optimal point: [-1.88846401e-08]
```
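
The same call pattern extends to multivariate problems. As a rough sketch, the Rosenbrock test function and its gradient, which SciPy ships as `scipy.optimize.rosen` and `scipy.optimize.rosen_der`, can be minimized with a gradient-based method such as `'BFGS'`:

```py
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Initial guess for a 5-dimensional Rosenbrock problem
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# Supply the analytical gradient through `jac` to speed up convergence
result = minimize(rosen, x0, method='BFGS', jac=rosen_der)

print("Optimal point:", result.x)    # Close to [1. 1. 1. 1. 1.]
print("Optimal value:", result.fun)  # Close to 0
```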

## Codebyte Example

Run the following codebyte example to understand how the `minimize()` function works:

```codebyte/python
from scipy.optimize import minimize

# Define the objective function
def objective_function(x):
  return (x - 3)**2 + 4

# Initial guess
x0 = 0

# Perform the minimization
result = minimize(objective_function, x0)

# Print the result
print("Optimal value:", result.fun)
print("Optimal point:", result.x)
```