Useful Optimizer

Useful Optimizer is a dedicated set of optimization algorithms for numeric problems. It's designed to provide a comprehensive collection of optimization techniques that can be easily used and integrated into any project.

Version

The current version of Useful Optimizer is 0.1.0.

Features

  • A wide range of optimization algorithms.
  • Easy to use and integrate.
  • Suitable for various numeric problems.
  • Fun to experiment with the algorithms.

Installation

To install Useful Optimizer, you can use pip:

pip install git+https://github.com/Anselmoo/useful-optimizer

Usage

Here's a basic example of how to use Useful Optimizer:

from opt.benchmark import CrossEntropyMethod
from opt.benchmark.functions import shifted_ackley

optimizer = CrossEntropyMethod(
    func=shifted_ackley,
    dim=2,
    lower_bound=-12.768,
    upper_bound=+12.768,
    population_size=100,
    max_iter=1000,
)
best_solution, best_fitness = optimizer.search()
print(f"Best solution: {best_solution}")
print(f"Best fitness: {best_fitness}")

You can also use the new gradient-based optimizers:

from opt.stochastic_gradient_descent import SGD
from opt.adamw import AdamW
from opt.bfgs import BFGS

# Gradient-based optimization
sgd = SGD(func=shifted_ackley, lower_bound=-12.768, upper_bound=12.768, dim=2, learning_rate=0.01)
best_solution, best_fitness = sgd.search()

# Adam variant with weight decay
adamw = AdamW(func=shifted_ackley, lower_bound=-12.768, upper_bound=12.768, dim=2, weight_decay=0.01)
best_solution, best_fitness = adamw.search()

# Quasi-Newton method
bfgs = BFGS(func=shifted_ackley, lower_bound=-12.768, upper_bound=12.768, dim=2, num_restarts=10)
best_solution, best_fitness = bfgs.search()

Implemented Optimizers

The current version of Useful Optimizer includes 54 optimization algorithms, each implemented as a separate module. Each optimizer is linked to its corresponding source code for easy reference and study.

🧠 Gradient-Based Optimizers

These optimizers use gradient information to guide the search process and are commonly used in machine learning and deep learning applications. A short comparison sketch follows the list below.

  • Adadelta - An adaptive learning rate method that uses only first-order information
  • Adagrad - Adapts the learning rate to the parameters, performing smaller updates for frequently occurring features
  • Adaptive Moment Estimation (Adam) - Combines advantages of AdaGrad and RMSProp with bias correction
  • AdaMax - Adam variant using infinity norm for second moment estimation
  • AdamW - Adam with decoupled weight decay for better regularization
  • AMSGrad - Adam variant with non-decreasing second moment estimates
  • BFGS - Quasi-Newton method approximating the inverse Hessian matrix
  • Conjugate Gradient - Efficient iterative method for solving systems of linear equations
  • L-BFGS - Limited-memory version of BFGS for large-scale optimization
  • Nadam - Nesterov-accelerated Adam combining Adam with Nesterov momentum
  • Nesterov Accelerated Gradient - Accelerated gradient method with lookahead momentum
  • RMSprop - Adaptive learning rate using moving average of squared gradients
  • SGD with Momentum - SGD enhanced with momentum for faster convergence
  • Stochastic Gradient Descent - Fundamental gradient-based optimization algorithm
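
To compare a few of these on the same benchmark, the sketch below reuses the SGD, AdamW, and BFGS classes and the constructor arguments shown in the usage example above; it is an illustrative loop, not a rigorous benchmark.

from opt.adamw import AdamW
from opt.benchmark.functions import shifted_ackley
from opt.bfgs import BFGS
from opt.stochastic_gradient_descent import SGD

# Constructor arguments mirror the usage example above.
candidates = {
    "SGD": SGD(func=shifted_ackley, lower_bound=-12.768, upper_bound=12.768, dim=2, learning_rate=0.01),
    "AdamW": AdamW(func=shifted_ackley, lower_bound=-12.768, upper_bound=12.768, dim=2, weight_decay=0.01),
    "BFGS": BFGS(func=shifted_ackley, lower_bound=-12.768, upper_bound=12.768, dim=2, num_restarts=10),
}

for name, optimizer in candidates.items():
    best_solution, best_fitness = optimizer.search()
    print(f"{name}: best fitness {best_fitness:.6f} at {best_solution}")
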
🦋 Nature-Inspired Metaheuristics

These algorithms are inspired by natural phenomena and biological behaviors to solve optimization problems.

🧬 Evolutionary and Population-Based Algorithms

These algorithms use principles of evolution and population dynamics to find optimal solutions.

🎯 Local Search and Classical Methods

Traditional optimization methods including local search techniques and classical mathematical approaches.

🔬 Physics and Mathematical-Inspired Algorithms

Algorithms inspired by physical phenomena and mathematical concepts.

📊 Statistical and Probabilistic Methods

Methods based on statistical inference and probabilistic approaches.

🔧 Specialized and Constrained Optimization

Specialized algorithms for particular types of optimization problems.

Note

Not all of these algorithms are suitable for every type of optimization problem. Some are better suited to continuous optimization, some to discrete optimization, and others to specific tasks such as quadratic programming or linear discriminant analysis.

Contributing

Contributions to Useful Optimizer are welcome! Please read the contributing guidelines before getting started.


Warning

This project was generated with GitHub Copilot and may not be completely verified. Please use with caution and feel free to report any issues you encounter. Thank you!

Warning

Some parts still contain the legacy np.random.rand call. See also: https://docs.astral.sh/ruff/rules/numpy-legacy-random/
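
For reference, the migration that rule describes replaces the legacy global-state API with a seeded Generator instance; the snippet below is a generic illustration, not code from this repository.

import numpy as np

# Legacy style flagged by the rule:
# sample = np.random.rand(100, 2)

# Modern equivalent using the Generator API:
rng = np.random.default_rng(seed=42)
sample = rng.random((100, 2))  # shape (population_size, dim), values in [0, 1)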
