
GlobalSearch-rs

A multistart framework for global optimization with scatter search and local NLP solvers written in Rust

Website | Docs | Examples

globalsearch-rs: Rust implementation of a modified version of the OQNLP (OptQuest/NLP) algorithm, based on the core ideas of "Scatter Search and Local NLP Solvers: A Multistart Framework for Global Optimization" by Ugray et al. (2007) [1]. It combines the scatter search metaheuristic with local minimization for global optimization of nonlinear problems.

Similar to MATLAB's GlobalSearch [2], it is built on the cobyla, argmin, rayon, and ndarray crates.

Features

  • 🐍 Python Bindings

  • 🎯 Multistart heuristic framework for global optimization

  • 📦 Local optimization using the cobyla [3] and argmin [4] crates

  • 🚀 Parallel execution using Rayon

  • 🔄 Checkpointing support for long-running optimizations

Installation

Using as a dependency

Add this to your Cargo.toml:

[dependencies]
globalsearch = "0.4.0"

Or use cargo add globalsearch in your project directory.

Building from source

  1. Install Rust toolchain using rustup.

  2. Clone repository:

    git clone https://github.com/GermanHeim/globalsearch-rs.git
    cd globalsearch-rs
  3. Build the project:

    cargo build --release

Usage

  1. Define a problem by implementing the Problem trait.

    use ndarray::{array, Array1, Array2};
    use globalsearch::problem::Problem;
    use globalsearch::types::EvaluationError;
    
    pub struct MinimizeProblem;
    impl Problem for MinimizeProblem {
        fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError> {
            Ok(
                ..., // Your objective function here
            )
        }
    
        fn gradient(&self, x: &Array1<f64>) -> Result<Array1<f64>, EvaluationError> {
            Ok(array![
                ..., // Optional: Gradient of your objective function here
            ])
        }
    
        fn hessian(&self, x: &Array1<f64>) -> Result<Array2<f64>, EvaluationError> {
            Ok(array![
                ..., // Optional: Hessian of your objective function here
            ])
        }
    
        fn variable_bounds(&self) -> Array2<f64> {
            array![[..., ...], [..., ...]] // Lower and upper bounds for each variable
        }
    
        fn constraints(&self) -> Vec<fn(&[f64], &mut ()) -> f64> {
             vec![
               ..., // Optional: Constraint functions here, only valid with COBYLA
             ]
        }
    }

    Where the Problem trait is defined as:

    pub trait Problem {
        fn objective(&self, x: &Array1<f64>) -> Result<f64, EvaluationError>;
        fn gradient(&self, x: &Array1<f64>) -> Result<Array1<f64>, EvaluationError>;
        fn hessian(&self, x: &Array1<f64>) -> Result<Array2<f64>, EvaluationError>;
        fn variable_bounds(&self) -> Array2<f64>;
        fn constraints(&self) -> Vec<fn(&[f64], &mut ()) -> f64>;
    }

    The constraints method allows you to define constraint functions for constrained optimization problems. Constraints should follow the sign convention:

    • Positive or zero: constraint satisfied
    • Negative: constraint violated

    Example:

    impl Problem for MinimizeProblem {
        // ...
        fn constraints(&self) -> Vec<fn(&[f64], &mut ()) -> f64> {
            vec![
                |x: &[f64], _: &mut ()| 1.0 - x[0] - x[1], // x[0] + x[1] <= 1.0
                |x: &[f64], _: &mut ()| x[0] - 0.5,        // x[0] >= 0.5
            ]
        }
    }
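Under this sign convention, a candidate point is feasible exactly when every constraint closure evaluates to a non-negative value. A dependency-free sketch of that check (the `is_feasible` helper is illustrative, not part of the crate's API):

```rust
// Constraints use the COBYLA sign convention: >= 0.0 means satisfied.
type Constraint = fn(&[f64], &mut ()) -> f64;

// Hypothetical helper: a point is feasible iff no constraint is negative.
fn is_feasible(x: &[f64], constraints: &[Constraint]) -> bool {
    constraints.iter().all(|c| c(x, &mut ()) >= 0.0)
}

fn main() {
    let constraints: Vec<Constraint> = vec![
        |x, _| 1.0 - x[0] - x[1], // x[0] + x[1] <= 1.0
        |x, _| x[0] - 0.5,        // x[0] >= 0.5
    ];

    // (0.6, 0.3): 1.0 - 0.9 = 0.1 >= 0 and 0.6 - 0.5 = 0.1 >= 0 -> feasible
    assert!(is_feasible(&[0.6, 0.3], &constraints));
    // (0.4, 0.3): violates x[0] >= 0.5 -> infeasible
    assert!(!is_feasible(&[0.4, 0.3], &constraints));
    println!("feasibility checks passed");
}
```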

    Depending on your choice of local solver, you might need to implement the gradient and hessian methods. Learn more about the local solver configuration in the argmin docs or the LocalSolverType.

    🔴 Note: If you use a solver other than COBYLA, variable bounds are only enforced in the scatter search phase of the algorithm. The local solver is unconstrained (see argmin issue #137) and can therefore return solutions that lie outside the bounds. You can use OQNLP's exclude_out_of_bounds method to handle this if needed.
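To make the objective and gradient methods concrete, here is what those computations look like for the two-dimensional Rosenbrock function, written as plain functions over slices so the sketch compiles without the crate. In a real Problem implementation the same formulas would go inside objective and gradient, using the ndarray types from the trait:

```rust
// Rosenbrock function: f(x, y) = (1 - x)^2 + 100 (y - x^2)^2.
// Global minimum f = 0 at (1, 1).
fn rosenbrock(x: &[f64]) -> f64 {
    (1.0 - x[0]).powi(2) + 100.0 * (x[1] - x[0].powi(2)).powi(2)
}

// Analytical gradient, needed by gradient-based local solvers
// such as L-BFGS or steepest descent.
fn rosenbrock_grad(x: &[f64]) -> [f64; 2] {
    let df_dx = -2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0].powi(2));
    let df_dy = 200.0 * (x[1] - x[0].powi(2));
    [df_dx, df_dy]
}

fn main() {
    // The global minimum is a stationary point with objective value 0.
    assert_eq!(rosenbrock(&[1.0, 1.0]), 0.0);
    assert_eq!(rosenbrock_grad(&[1.0, 1.0]), [0.0, 0.0]);
    println!("f(0, 0) = {}", rosenbrock(&[0.0, 0.0]));
}
```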

  2. Set OQNLP parameters

    use globalsearch::types::{LocalSolverType, OQNLPParams};
    use globalsearch::local_solver::builders::SteepestDescentBuilder;
    
    let params: OQNLPParams = OQNLPParams {
        iterations: 125,
        wait_cycle: 10,
        threshold_factor: 0.2,
        distance_factor: 0.75,
        population_size: 250,
        local_solver_type: LocalSolverType::SteepestDescent,
        local_solver_config: SteepestDescentBuilder::default().build(),
        seed: 0,
    };

    Or use the default parameters (which use COBYLA):

    let params = OQNLPParams::default();

    Where OQNLPParams is defined as:

    pub struct OQNLPParams {
        pub iterations: usize,
        pub wait_cycle: usize,
        pub threshold_factor: f64,
        pub distance_factor: f64,
        pub population_size: usize,
        pub local_solver_type: LocalSolverType,
        pub local_solver_config: LocalSolverConfig,
        pub seed: u64,
    }
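The threshold_factor and distance_factor fields tune the merit and distance filters from Ugray et al. [1]: a trial point is only handed to the local solver if its merit is good enough and it is not too close to an already-found local minimum. A dependency-free sketch of the distance-filter idea (the helper names are illustrative, not the crate's API):

```rust
// Euclidean distance between two points of equal dimension.
fn distance(a: &[f64], b: &[f64]) -> f64 {
    a.iter().zip(b).map(|(x, y)| (x - y).powi(2)).sum::<f64>().sqrt()
}

// Distance filter: reject a trial point that lies within
// `distance_factor` of any previously found local minimum,
// avoiding redundant local solver runs in the same basin.
fn passes_distance_filter(trial: &[f64], minima: &[Vec<f64>], distance_factor: f64) -> bool {
    minima.iter().all(|m| distance(trial, m) > distance_factor)
}

fn main() {
    let minima = vec![vec![1.0, 1.0]];
    // (1.1, 1.0) is only 0.1 away: filtered out, no local solve started.
    assert!(!passes_distance_filter(&[1.1, 1.0], &minima, 0.75));
    // (3.0, 3.0) is far from every known minimum: passes the filter.
    assert!(passes_distance_filter(&[3.0, 3.0], &minima, 0.75));
    println!("distance filter checks passed");
}
```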

    And LocalSolverType is defined as:

    pub enum LocalSolverType {
        LBFGS,
        NelderMead,
        SteepestDescent,
        TrustRegion,
        NewtonCG,
        COBYLA,
    }

    You can also modify the local solver configuration for each type of local solver. See builders.rs for more details.
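Since OQNLPParams provides a default configuration, you can also override only the fields you care about with Rust's struct update syntax (a sketch; field names as in the definition above):

```rust
use globalsearch::types::OQNLPParams;

// Keep the defaults (including the COBYLA local solver) and
// change only the fields of interest.
let params = OQNLPParams {
    iterations: 300,
    population_size: 500,
    ..OQNLPParams::default()
};
```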

  3. Run the optimizer

    use globalsearch::oqnlp::OQNLP;
    use globalsearch::types::{LocalSolverType, OQNLPParams, SolutionSet};
    use globalsearch::local_solver::builders::SteepestDescentBuilder;
    
    fn main() -> Result<(), Box<dyn std::error::Error>> {
        let problem = MinimizeProblem;
        let params: OQNLPParams = OQNLPParams {
            iterations: 125,
            wait_cycle: 10,
            threshold_factor: 0.2,
            distance_factor: 0.75,
            population_size: 250,
            local_solver_type: LocalSolverType::SteepestDescent,
            local_solver_config: SteepestDescentBuilder::default().build(),
            seed: 0,
        };
    
        let mut optimizer: OQNLP<MinimizeProblem> = OQNLP::new(problem, params)?;
    
        // OQNLP returns a solution set with the best solutions found
        let solution_set: SolutionSet = optimizer.run()?;
        println!("{}", solution_set);
    
        Ok(())
    }

Project Structure

src/
 
├── lib.rs # Module declarations
├── oqnlp.rs # Core OQNLP algorithm implementation
├── scatter_search.rs # Scatter search component
├── local_solver/
│   ├── builders.rs # Local solver configuration builders
│   └── runner.rs # Local solver runner
├── filters.rs # Merit and distance filtering logic
├── problem.rs # Problem trait
├── types.rs # Data structures and parameters
└── checkpoint.rs # Checkpointing module
python/ # Python bindings

License

Distributed under the MIT License. See LICENSE.txt for more information.

Citing Globalsearch-rs

If GlobalSearch-rs has been significant in your research, and you would like to acknowledge the project in your academic publication, we suggest citing the following paper:

@article{Heim2025,
  author    = {Heim, Germán Martín},
  doi       = {10.21105/joss.09234},
  journal   = {Journal of Open Source Software},
  number    = {115},
  pages     = {9234},
  publisher = {The Open Journal},
  title     = {GlobalSearch-rs: A multistart framework for global optimization written in Rust},
  url       = {https://doi.org/10.21105/joss.09234},
  volume    = {10},
  year      = {2025}
}

References

[1] Zsolt Ugray, Leon Lasdon, John Plummer, Fred Glover, James Kelly, Rafael Martí (2007) Scatter Search and Local NLP Solvers: A Multistart Framework for Global Optimization. INFORMS Journal on Computing 19(3):328-340. http://dx.doi.org/10.1287/ijoc.1060.0175

[2] GlobalSearch. The MathWorks, Inc. Available at: https://www.mathworks.com/help/gads/globalsearch.html (Accessed: 27 January 2025)

[3] Rémi Lafage. cobyla - a pure Rust implementation. GitHub repository. MIT License. Available at: https://github.com/relf/cobyla (Accessed: 17 September 2025)

[4] Kroboth, S. argmin. Available at: https://argmin-rs.org/ (Accessed: 25 January 2025)
