
Releases: denehoffman/ganesh

v0.8.4

03 Sep 21:15
73476bf

Fixed

  • use absolute value for absolute tolerance

Other

  • reverse the operands of some dot products that had mismatched dimensions

v0.8.3

03 Sep 19:20
69b3a4e

Fixed

  • switch sign on function termination condition

v0.8.2

03 Sep 18:52
15304e8

Added

  • add function value terminators for BFGS algorithms
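
To illustrate the idea behind these terminators (and the sign and absolute-value fixes in v0.8.3 and v0.8.4 above), here is a minimal, hedged sketch; `FunctionTerminator`, `eps_abs`, and `terminated` are hypothetical names for illustration, not the crate's actual API:

```rust
/// Hypothetical function-value terminator: stop when successive function
/// values change by less than an absolute tolerance. Names are illustrative,
/// not the crate's actual API.
struct FunctionTerminator {
    eps_abs: f64,
    f_prev: Option<f64>,
}

impl FunctionTerminator {
    fn new(eps_abs: f64) -> Self {
        // The tolerance itself is taken as |eps| so a negative input still works.
        Self { eps_abs: eps_abs.abs(), f_prev: None }
    }

    /// Returns true once |f_prev - f| falls at or below the tolerance.
    fn terminated(&mut self, f: f64) -> bool {
        let done = match self.f_prev {
            Some(prev) => (prev - f).abs() <= self.eps_abs,
            None => false,
        };
        self.f_prev = Some(f);
        done
    }
}

fn main() {
    let mut term = FunctionTerminator::new(1e-6);
    // Simulated sequence of function values from a converging minimizer.
    for f in [10.0, 4.0, 1.5, 1.0000005, 1.0000001] {
        println!("f = {f}, terminated = {}", term.terminated(f));
    }
}
```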

v0.8.1

03 Sep 18:33
afcbf82

Added

  • add gradient tolerance to L-BFGS-B

Other

  • Merge branch 'main' into development
  • export BFGS methods in mod

v0.8.0

03 Sep 17:43
192d54c

Added

  • add L-BFGS-B algorithm
  • update line search to take an optional max_step argument and return a bool validity flag rather than an Option (see the sketch after this list)
  • add LineSearch trait and implementations of BFGS and L-BFGS algorithms
  • update NelderMead to count gradient evals and use bounded interface
  • add bounded evaluation shortcuts to Function trait and count gradient evaluations in Status
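
The LineSearch trait and the max_step/validity-flag convention mentioned above might look roughly like the following; the `Backtracking` type, its fields, and the exact `search` signature are assumptions for illustration (using the re-exported nalgebra::DVector), not the crate's real API:

```rust
use nalgebra::DVector;

/// Hypothetical line-search interface: `search` returns a step length and a
/// `bool` validity flag instead of wrapping the result in an `Option`.
trait LineSearch {
    fn search(
        &mut self,
        f: &dyn Fn(&DVector<f64>) -> f64,
        x: &DVector<f64>,
        direction: &DVector<f64>,
        max_step: Option<f64>, // optional upper bound on the step length
    ) -> (f64, bool);
}

/// Simple backtracking search used only to illustrate the interface.
struct Backtracking {
    shrink: f64,
    max_tries: usize,
}

impl LineSearch for Backtracking {
    fn search(
        &mut self,
        f: &dyn Fn(&DVector<f64>) -> f64,
        x: &DVector<f64>,
        direction: &DVector<f64>,
        max_step: Option<f64>,
    ) -> (f64, bool) {
        let f0 = f(x);
        let mut step = max_step.unwrap_or(1.0);
        for _ in 0..self.max_tries {
            if f(&(x + direction * step)) < f0 {
                return (step, true); // valid step found
            }
            step *= self.shrink;
        }
        (step, false) // flag failure instead of returning None
    }
}

fn main() {
    let f = |x: &DVector<f64>| x.dot(x); // simple quadratic bowl
    let x = DVector::from_vec(vec![1.0, 2.0]);
    let direction = -x.clone(); // descent direction for this f
    let mut ls = Backtracking { shrink: 0.5, max_tries: 20 };
    let (step, valid) = ls.search(&f, &x, &direction, Some(1.0));
    println!("step = {step}, valid = {valid}");
}
```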

Fixed

  • simplify logic by removing internal m
  • change the inequality to ensure a proper status message when the maximum number of iterations is exceeded

Other

  • fix brackets in readme and update main lib docs
  • update readme
  • remove unused collections module

v0.7.1

23 Aug 23:10
e854c59

Other

  • fix doctests
  • make minimize return Result<(), E> and store Status in the Minimizer struct
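
A rough sketch of the calling pattern this change implies: minimize returns Result<(), E> and the fit result is read from the stored Status afterwards. The Minimizer, Status, and minimize names come from the note above, but every field and signature below is a hypothetical stand-in:

```rust
/// Hypothetical stand-ins for the library's types, only to show the
/// `Result<(), E>` + stored-`Status` calling pattern.
#[derive(Debug, Default)]
struct Status {
    x_best: Vec<f64>,
    fx_best: f64,
    converged: bool,
}

struct Minimizer {
    status: Status,
}

impl Minimizer {
    fn new() -> Self {
        Self { status: Status::default() }
    }

    /// Returns `Result<(), E>`; the result of the fit is kept in `self.status`.
    fn minimize<E>(
        &mut self,
        f: impl Fn(&[f64]) -> Result<f64, E>,
        x0: &[f64],
    ) -> Result<(), E> {
        // Trivial "optimizer": just evaluate at the starting point.
        let fx = f(x0)?;
        self.status = Status { x_best: x0.to_vec(), fx_best: fx, converged: true };
        Ok(())
    }
}

fn main() {
    let mut m = Minimizer::new();
    // The objective may fail, so minimize returns Result<(), E>...
    let result: Result<(), String> = m.minimize(
        |x: &[f64]| Ok(x.iter().map(|xi| xi * xi).sum()),
        &[1.0, 2.0],
    );
    assert!(result.is_ok());
    // ...while the fit itself is read off the minimizer afterwards.
    println!("{:?}", m.status);
}
```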

v0.7.0

23 Aug 22:37
a5cf0d5

Added

  • add a useful assert warning when trying to construct a NelderMead Simplex with fewer than 2 points
  • add check to make sure starting position is within bounds
  • add a display method, methods for getting the lower and upper bounds, and a contains method to Bounds (see the sketch after this list)
  • add Debugs to NelderMead
  • add preliminary implementation of BFGS algorithm
  • add method to return the gradient and inverse of Hessian matrix
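
A small, hypothetical sketch of the Bounds additions described above (display, lower/upper accessors, and contains, plus the in-bounds check on a starting position); the crate's actual struct and method signatures may differ:

```rust
use std::fmt;

/// Hypothetical rectangular bounds, one (lower, upper) pair per parameter.
struct Bounds(Vec<(f64, f64)>);

impl Bounds {
    fn lower(&self) -> Vec<f64> {
        self.0.iter().map(|&(lo, _)| lo).collect()
    }
    fn upper(&self) -> Vec<f64> {
        self.0.iter().map(|&(_, hi)| hi).collect()
    }
    /// Check that a position lies within the bounds.
    fn contains(&self, x: &[f64]) -> bool {
        x.len() == self.0.len()
            && x.iter().zip(&self.0).all(|(xi, &(lo, hi))| (lo..=hi).contains(xi))
    }
}

impl fmt::Display for Bounds {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        for (i, (lo, hi)) in self.0.iter().enumerate() {
            writeln!(f, "x[{i}] in [{lo}, {hi}]")?;
        }
        Ok(())
    }
}

fn main() {
    let bounds = Bounds(vec![(-1.0, 1.0), (0.0, 10.0)]);
    println!("{bounds}");
    println!("lower = {:?}, upper = {:?}", bounds.lower(), bounds.upper());
    // A starting position outside the bounds can be rejected up front.
    assert!(bounds.contains(&[0.5, 2.0]));
    assert!(!bounds.contains(&[2.0, 2.0]));
}
```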

Fixed

  • stop tracking main.rs, which I use for quick demos
  • adaptive Nelder-Mead now requires inputting the dimension
  • remove out-of-bounds issue
  • step direction should be opposite the gradient
  • p is -grad_f so this was right all along
  • allow expect in Hessian inverse function
  • update BFGS algorithm to recent changes with ganesh
  • change learning_rate to an Option in gradient descent

Other

  • adds documentation to all parts of the crate and makes some Algorithm methods return Results
  • fix typo in example
  • update dependencies
  • update licensing
  • switch license to MIT
  • add Bounds section to TOC
  • correct statements about Function trait in readme
  • typo in readme
  • update README.md
  • major rewrite of library, adds experimental bounds to Nelder Mead
  • qualify path to abs function
  • Merge remote-tracking branch 'origin/bfgs' into development
  • change slice to DVector in documentation
  • update docs and fix links/footnotes

v0.6.0

17 Aug 23:05
c6abf48

Added

  • reduces the Field trait to use num traits rather than nalgebra's RealField
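
One way a num-traits-based Field bound could be shaped, shown only as a hedged sketch (the real trait's supertraits and methods may differ); it assumes the num-traits crate is available:

```rust
use num_traits::Float;

/// Hypothetical reduced Field trait: a blanket impl over anything that is
/// already a num_traits Float, instead of requiring nalgebra's RealField.
trait Field: Float {}
impl<T: Float> Field for T {}

/// Generic code then only needs the num-traits surface.
fn norm_sq<F: Field>(x: &[F]) -> F {
    x.iter().fold(F::zero(), |acc, &xi| acc + xi * xi)
}

fn main() {
    println!("{}", norm_sq(&[3.0_f64, 4.0])); // 25
    println!("{}", norm_sq(&[3.0_f32, 4.0])); // also works for f32
}
```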

Fixed

  • ensure all methods use the Field trait rather than just Float for better compatibility
  • re-export nalgebra::DVector

Other

  • fix some of the documentation to reflect recent changes to the crate

v0.5.0

15 Aug 19:31
a71f3b1

This release is fairly large, but the main differences will go unnoticed except by those who have implemented their own algorithms.

Added

  • The generics have been changed and the Field trait has been removed. The generics now cover the minimal traits required to run each method, along with a From<f32> bound, which does most of what the Field trait did anyway.
  • Most types that were &[F] are now &DVector<F>. The exceptions are the algorithms' new functions, which are more ergonomic when slices are used. In the actual implementations, DVectors are much more convenient than all the Vec iterations I was doing previously (see the sketch after this list).
  • Added some line search algorithms to make the GradientDescent method more functional. These will also be used in the next update, which will hopefully provide the BFGS algorithm family.
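
A hypothetical sketch of the slice-in-constructor, DVector-everywhere-else split described above; the GradientDescent fields and methods here are illustrative only, not the crate's actual API:

```rust
use nalgebra::DVector;

/// Hypothetical algorithm type: the constructor stays ergonomic by taking a
/// slice, while the internal state and evaluations use DVector.
struct GradientDescent {
    x: DVector<f64>,
    learning_rate: f64,
}

impl GradientDescent {
    /// `new` takes a slice for ergonomics...
    fn new(x0: &[f64], learning_rate: f64) -> Self {
        Self { x: DVector::from_vec(x0.to_vec()), learning_rate }
    }

    /// ...but the implementation works with DVectors directly.
    fn step(&mut self, gradient: &dyn Fn(&DVector<f64>) -> DVector<f64>) {
        let g = gradient(&self.x);
        self.x -= g * self.learning_rate;
    }
}

fn main() {
    // Minimize f(x) = |x|^2, whose gradient is 2x.
    let mut gd = GradientDescent::new(&[1.0, -2.0], 0.1);
    for _ in 0..100 {
        gd.step(&|x: &DVector<f64>| x * 2.0);
    }
    println!("{}", gd.x); // close to the zero vector
}
```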

Fixed

  • Some fields in the Newton and GradientDescent algorithm initializers were set incorrectly, making these algorithms function strangely or not at all. This has been fixed.

v0.4.0

30 Jul 15:52
6ffcc8d

Other

  • undo changes to the previous version; lifetimes make things more difficult for end-users to work with. Removed NelderMeadMessage.