
Function Minimization in Rust, Simplified


ganesh (/ɡəˈneɪʃ/), named after the Hindu god of wisdom, provides several common minimization algorithms as well as a straightforward, trait-based interface for creating your own extensions. This crate is intended to be as simple as possible. The user needs to implement the Function trait on some struct which takes a vector of parameters and returns a single-valued Result ($f: \mathbb{R}^n \to \mathbb{R}$). Users can optionally provide a gradient function to speed up some algorithms, but a default central finite-difference implementation is provided so that all algorithms work out of the box.
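To illustrate the kind of default gradient described above, here is a standalone sketch of a central finite-difference gradient in plain Rust (`central_gradient` is an illustrative name, not the crate's API):

```rust
// Central finite-difference gradient: g_i ≈ (f(x + h e_i) - f(x - h e_i)) / (2h).
// Illustrative sketch only; the crate's internal implementation may differ.
fn central_gradient<F: Fn(&[f64]) -> f64>(f: F, x: &[f64]) -> Vec<f64> {
    // A common heuristic: step size ~ cbrt(machine epsilon) balances
    // truncation error against floating-point rounding error.
    let h = f64::EPSILON.cbrt();
    (0..x.len())
        .map(|i| {
            let (mut xp, mut xm) = (x.to_vec(), x.to_vec());
            xp[i] += h;
            xm[i] -= h;
            (f(&xp) - f(&xm)) / (2.0 * h)
        })
        .collect()
}

fn main() {
    // For f(x, y) = x^2 + y^2 the gradient is (2x, 2y), so (6, -4) at (3, -2).
    let grad = central_gradient(|x| x[0] * x[0] + x[1] * x[1], &[3.0, -2.0]);
    println!("{grad:?}");
    assert!((grad[0] - 6.0).abs() < 1e-6);
    assert!((grad[1] + 4.0).abs() < 1e-6);
}
```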

Caution

This crate is still in an early development phase, and the API is not yet stable. It can (and likely will) see breaking changes before the 1.0.0 release (and hopefully not many after that).


Key Features

  • Simple but powerful trait-oriented library which tries to follow the Unix philosophy of "do one thing and do it well".
  • Generics to allow for different numeric types to be used in the provided algorithms.
  • Algorithms that are simple to use with sensible defaults.
  • Traits which make developing future algorithms simple and consistent.

Quick Start

This crate provides some common test functions in the test_functions module. Consider the following implementation of the Rosenbrock function:

use std::convert::Infallible;
use ganesh::prelude::*;

pub struct Rosenbrock {
    pub n: usize,
}
impl Function<f64, (), Infallible> for Rosenbrock {
    fn evaluate(&self, x: &[f64], _user_data: &mut ()) -> Result<f64, Infallible> {
        Ok((0..(self.n - 1))
            .map(|i| 100.0 * (x[i + 1] - x[i].powi(2)).powi(2) + (1.0 - x[i]).powi(2))
            .sum())
    }
}
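As a quick sanity check of the implementation above: the Rosenbrock function has its global minimum $f = 0$ at $x = (1, \ldots, 1)$. This can be verified standalone with a free-function version of the same formula (plain Rust, no crate dependencies):

```rust
// Free-function form of the Rosenbrock function implemented above,
// used here only to check the known analytic minimum.
fn rosenbrock(x: &[f64]) -> f64 {
    (0..x.len() - 1)
        .map(|i| 100.0 * (x[i + 1] - x[i].powi(2)).powi(2) + (1.0 - x[i]).powi(2))
        .sum()
}

fn main() {
    // The global minimum is exactly 0 at (1, 1); any other point is positive.
    assert_eq!(rosenbrock(&[1.0, 1.0]), 0.0);
    assert!(rosenbrock(&[2.0, 2.0]) > 0.0);
    println!("f(1, 1) = {}", rosenbrock(&[1.0, 1.0]));
}
```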

To minimize this function, we could consider using the Nelder-Mead algorithm:

use ganesh::prelude::*;
use ganesh::algorithms::NelderMead;

let problem = Rosenbrock { n: 2 };
let nm = NelderMead::default();
let mut m = Minimizer::new(nm, 2);
let x0 = &[2.0, 2.0];
m.minimize(&problem, x0, &mut ()).unwrap();
println!("{}", m.status);

This should output

MSG:       term_f = STDDEV
X:         [0.9999999946231828, 0.9999999884539057]
F(X):      0.00000000000000009170942877687133
N_EVALS:   160
CONVERGED: true

Bounds

All minimizers in ganesh support bounds: algorithms that normally operate in an unbounded parameter space can be constrained to return results inside a bounding box. This is done via a parameter transformation, the same one used by LMFIT and MINUIT. While the user supplies parameters within the bounds, unbounded algorithms convert those values to a set of unbounded "internal" parameters. When functions are called, these internal parameters are converted back into bounded "external" parameters via the following transformations:

Upper and lower bounds:

$$x_\text{int} = \arcsin\left(2\frac{x_\text{ext} - x_\text{min}}{x_\text{max} - x_\text{min}} - 1\right)$$ $$x_\text{ext} = x_\text{min} + \left(\sin(x_\text{int}) + 1\right)\frac{x_\text{max} - x_\text{min}}{2}$$

Upper bound only:

$$x_\text{int} = \sqrt{(x_\text{max} - x_\text{ext} + 1)^2 - 1}$$ $$x_\text{ext} = x_\text{max} + 1 - \sqrt{x_\text{int}^2 + 1}$$

Lower bound only:

$$x_\text{int} = \sqrt{(x_\text{ext} - x_\text{min} + 1)^2 - 1}$$ $$x_\text{ext} = x_\text{min} - 1 + \sqrt{x_\text{int}^2 + 1}$$
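As a concrete sketch, the two-sided (upper and lower bound) transform above can be implemented and checked for invertibility in a few lines of plain Rust (the function names here are illustrative, not the crate's API):

```rust
// Two-sided bound transform, as used by LMFIT and MINUIT (illustrative sketch).

/// Map a bounded "external" value in [min, max] into the unbounded "internal" space.
fn to_internal(x_ext: f64, min: f64, max: f64) -> f64 {
    (2.0 * (x_ext - min) / (max - min) - 1.0).asin()
}

/// Map an unbounded "internal" value back into the bounded interval [min, max].
fn to_external(x_int: f64, min: f64, max: f64) -> f64 {
    min + (x_int.sin() + 1.0) * (max - min) / 2.0
}

fn main() {
    let (min, max) = (-1.0, 3.0);
    let x = 0.5;
    let internal = to_internal(x, min, max);
    let round_trip = to_external(internal, min, max);
    // The pair is exactly inverse (up to floating-point rounding),
    // and the external value always lands inside the bounds.
    println!("x = {x}, internal = {internal}, round trip = {round_trip}");
    assert!((x - round_trip).abs() < 1e-12);
    assert!(round_trip >= min && round_trip <= max);
}
```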

As noted in the documentation for both LMFIT and MINUIT, these bounds should be used with caution. They turn linear problems into nonlinear ones, which can mess with error propagation and even fit convergence, not to mention increase function complexity. Methods which output covariance matrices need to be adjusted if bounded, and MINUIT recommends fitting a second time near a minimum without bounds to ensure proper error propagation.

Future Plans

  • Eventually, I would like to implement BFGS and variants, MCMC algorithms, and some more modern gradient-free optimization techniques.
  • There are probably many optimizations and algorithm extensions that I'm missing right now because I just wanted to get it working first.
  • A test suite
