A multistart framework for global optimization with scatter search and local NLP solvers written in Rust
globalsearch-rs: a Rust implementation of a modified version of the OQNLP (OptQuest/NLP) algorithm, based on the core ideas of "Scatter Search and Local NLP Solvers: A Multistart Framework for Global Optimization" by Ugray et al. (2007) [1]. It combines scatter search metaheuristics with local minimization for the global optimization of nonlinear problems.
Similar to MATLAB's GlobalSearch [2], built on cobyla, argmin, rayon, and ndarray.
- Python bindings
- Multistart heuristic framework for global optimization
- Local optimization using the cobyla [3] and argmin [4] crates
- Parallel execution using Rayon
- Checkpointing support for long-running optimizations
Add this to your `Cargo.toml`:

```toml
[dependencies]
globalsearch = "0.4.0"
```

Or run `cargo add globalsearch` in your project directory.
1. Install the Rust toolchain using rustup.

2. Clone the repository:

   ```shell
   git clone https://github.com/GermanHeim/globalsearch-rs.git
   cd globalsearch-rs
   ```

3. Build the project:

   ```shell
   cargo build --release
   ```
Define a problem by implementing the `Problem` trait:

```rust
use ndarray::{array, Array1, Array2};
use globalsearch::problem::Problem;
use globalsearch::types::EvaluationError;

pub struct MinimizeProblem;

impl Problem for MinimizeProblem {
    fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError> {
        Ok(
            ..., // Your objective function here
        )
    }

    fn gradient(&self, x: &Array1<f64>) -> Result<Array1<f64>, EvaluationError> {
        Ok(array![
            ..., // Optional: gradient of your objective function here
        ])
    }

    fn hessian(&self, x: &Array1<f64>) -> Result<Array2<f64>, EvaluationError> {
        Ok(array![
            ..., // Optional: Hessian of your objective function here
        ])
    }

    fn variable_bounds(&self) -> Array2<f64> {
        array![[..., ...], [..., ...]] // Lower and upper bounds for each variable
    }

    fn constraints(&self) -> Vec<fn(&[f64], &mut ()) -> f64> {
        vec![
            ..., // Optional: constraint functions here, only valid with COBYLA
        ]
    }
}
```
Where the `Problem` trait is defined as:

```rust
pub trait Problem {
    fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError>;
    fn gradient(&self, x: &Array1<f64>) -> Result<Array1<f64>, EvaluationError>;
    fn hessian(&self, x: &Array1<f64>) -> Result<Array2<f64>, EvaluationError>;
    fn variable_bounds(&self) -> Array2<f64>;
    fn constraints(&self) -> Vec<fn(&[f64], &mut ()) -> f64>;
}
```
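As a concrete illustration (a sketch, not taken from the crate's examples), the objective and gradient for the two-dimensional Rosenbrock function could be written as below. Plain slices are used here so the snippet stands alone; the trait itself works with `ndarray` types and `EvaluationError` results.

```rust
// Rosenbrock function: f(x, y) = (1 - x)^2 + 100 * (y - x^2)^2
// Global minimum f = 0 at (1, 1).
fn rosenbrock(x: &[f64]) -> f64 {
    (1.0 - x[0]).powi(2) + 100.0 * (x[1] - x[0].powi(2)).powi(2)
}

// Analytical gradient, as a gradient-based local solver (e.g. L-BFGS) would require.
fn rosenbrock_grad(x: &[f64]) -> Vec<f64> {
    vec![
        -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0].powi(2)),
        200.0 * (x[1] - x[0].powi(2)),
    ]
}

fn main() {
    // The gradient vanishes at the global minimum (1, 1).
    println!("f(1, 1) = {}", rosenbrock(&[1.0, 1.0]));
    println!("grad f(1, 1) = {:?}", rosenbrock_grad(&[1.0, 1.0]));
}
```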
The `constraints` method allows you to define constraint functions for constrained optimization problems. Constraints follow this sign convention:

- Positive or zero: constraint satisfied
- Negative: constraint violated
Example:

```rust
impl Problem for MinimizeProblem {
    // ...
    fn constraints(&self) -> Vec<fn(&[f64], &mut ()) -> f64> {
        vec![
            |x: &[f64], _: &mut ()| 1.0 - x[0] - x[1], // x[0] + x[1] <= 1.0
            |x: &[f64], _: &mut ()| x[0] - 0.5,        // x[0] >= 0.5
        ]
    }
}
```
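The sign convention can be checked by evaluating constraint functions of that signature directly at a sample point; this standalone sketch (independent of the crate, with a hypothetical `evaluate` helper) does exactly that:

```rust
// Constraint functions with the same signature the `constraints` method returns.
fn constraints() -> Vec<fn(&[f64], &mut ()) -> f64> {
    vec![
        |x: &[f64], _: &mut ()| 1.0 - x[0] - x[1], // x[0] + x[1] <= 1.0
        |x: &[f64], _: &mut ()| x[0] - 0.5,        // x[0] >= 0.5
    ]
}

// Hypothetical helper: evaluates every constraint at `x`.
fn evaluate(cons: &[fn(&[f64], &mut ()) -> f64], x: &[f64]) -> Vec<f64> {
    cons.iter().map(|c| c(x, &mut ())).collect()
}

fn main() {
    // At (0.6, 0.3) both values are non-negative, so both constraints hold.
    // At (0.6, 0.6) the first value is negative: x[0] + x[1] <= 1.0 is violated.
    for x in [[0.6, 0.3], [0.6, 0.6]] {
        for v in evaluate(&constraints(), &x) {
            let status = if v >= 0.0 { "satisfied" } else { "violated" };
            println!("{v:+.3} -> {status}");
        }
    }
}
```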
Depending on your choice of local solver, you might need to implement the `gradient` and `hessian` methods. Learn more about the local solver configuration in the argmin docs or the `LocalSolverType`.

**Note:** If you use a solver other than COBYLA, variable bounds are only applied in the scatter-search phase of the algorithm. The local solver is unconstrained (see argmin issue #137) and can therefore return solutions outside the bounds. You can use OQNLP's `exclude_out_of_bounds` method to handle this if needed.
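To make the note concrete, here is a minimal standalone sketch of the kind of bounds check such filtering performs (the `within_bounds` helper is hypothetical, not part of the crate; bounds are rows of `[lower, upper]` pairs, mirroring `variable_bounds`):

```rust
// Hypothetical check: true if every coordinate of `x` lies within its
// [lower, upper] bounds. Rows of `bounds` correspond to variables.
fn within_bounds(x: &[f64], bounds: &[[f64; 2]]) -> bool {
    x.iter().zip(bounds).all(|(xi, b)| b[0] <= *xi && *xi <= b[1])
}

fn main() {
    let bounds = [[-5.0, 5.0], [-5.0, 5.0]];
    println!("{}", within_bounds(&[1.0, 2.0], &bounds)); // inside the box
    println!("{}", within_bounds(&[6.0, 0.0], &bounds)); // first coordinate out of bounds
}
```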
Set the OQNLP parameters:

```rust
use globalsearch::types::{LocalSolverType, OQNLPParams};
use globalsearch::local_solver::builders::SteepestDescentBuilder;

let params: OQNLPParams = OQNLPParams {
    iterations: 125,
    wait_cycle: 10,
    threshold_factor: 0.2,
    distance_factor: 0.75,
    population_size: 250,
    local_solver_type: LocalSolverType::SteepestDescent,
    local_solver_config: SteepestDescentBuilder::default().build(),
    seed: 0,
};
```
Or use the default parameters (which use COBYLA):

```rust
let params = OQNLPParams::default();
```
Where `OQNLPParams` is defined as:

```rust
pub struct OQNLPParams {
    pub iterations: usize,
    pub wait_cycle: usize,
    pub threshold_factor: f64,
    pub distance_factor: f64,
    pub population_size: usize,
    pub local_solver_type: LocalSolverType,
    pub local_solver_config: LocalSolverConfig,
    pub seed: u64,
}
```
And `LocalSolverType` is defined as:

```rust
pub enum LocalSolverType {
    LBFGS,
    NelderMead,
    SteepestDescent,
    TrustRegion,
    NewtonCG,
    COBYLA,
}
```
You can also modify the local solver configuration for each type of local solver. See `builders.rs` for more details.
Run the optimizer:

```rust
use globalsearch::oqnlp::OQNLP;
use globalsearch::types::{LocalSolverType, OQNLPParams, SolutionSet};
use globalsearch::local_solver::builders::SteepestDescentBuilder;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let problem = MinimizeProblem;
    let params: OQNLPParams = OQNLPParams {
        iterations: 125,
        wait_cycle: 10,
        threshold_factor: 0.2,
        distance_factor: 0.75,
        population_size: 250,
        local_solver_type: LocalSolverType::SteepestDescent,
        local_solver_config: SteepestDescentBuilder::default().build(),
        seed: 0,
    };

    let mut optimizer: OQNLP<MinimizeProblem> = OQNLP::new(problem, params)?;

    // OQNLP returns a solution set with the best solutions found
    let solution_set: SolutionSet = optimizer.run()?;
    println!("{}", solution_set);

    Ok(())
}
```
```
src/
├── lib.rs             # Module declarations
├── oqnlp.rs           # Core OQNLP algorithm implementation
├── scatter_search.rs  # Scatter search component
├── local_solver/
│   ├── builders.rs    # Local solver configuration builders
│   └── runner.rs      # Local solver runner
├── filters.rs         # Merit and distance filtering logic
├── problem.rs         # Problem trait
├── types.rs           # Data structures and parameters
└── checkpoint.rs      # Checkpointing module
python/                # Python bindings
```
- argmin
- COBYLA
- ndarray
- rayon [feature: `rayon`]
- kdam [feature: `progress_bar`]
- rand
- thiserror
- criterion.rs [dev-dependency]
- serde [feature: `checkpointing`]
- chrono [feature: `checkpointing`]
- bincode [feature: `checkpointing`]
Distributed under the MIT License. See LICENSE.txt for more information.
If GlobalSearch-rs has been significant in your research, and you would like to acknowledge the project in your academic publication, we suggest citing the following paper:
```bibtex
@article{Heim2025,
  author    = {Heim, Germán Martín},
  doi       = {10.21105/joss.09234},
  journal   = {Journal of Open Source Software},
  number    = {115},
  pages     = {9234},
  publisher = {The Open Journal},
  title     = {GlobalSearch-rs: A multistart framework for global optimization written in Rust},
  url       = {https://doi.org/10.21105/joss.09234},
  volume    = {10},
  year      = {2025}
}
```

[1] Zsolt Ugray, Leon Lasdon, John Plummer, Fred Glover, James Kelly, Rafael Martí (2007). Scatter Search and Local NLP Solvers: A Multistart Framework for Global Optimization. INFORMS Journal on Computing 19(3):328-340. http://dx.doi.org/10.1287/ijoc.1060.0175
[2] GlobalSearch. The MathWorks, Inc. Available at: https://www.mathworks.com/help/gads/globalsearch.html (Accessed: 27 January 2025)
[3] Rémi Lafage. cobyla - a pure Rust implementation. GitHub repository. MIT License. Available at: https://github.com/relf/cobyla (Accessed: 17 September 2025)
[4] Kroboth, S. argmin. Available at: https://argmin-rs.org/ (Accessed: 25 January 2025)
