OptimizationODE package for ODE solvers #916
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Merged
Conversation
Added OptimizationODE.jl and created the package for it.
Added basic tests for the gradient descent code in OptimizationODE.jl.
Added Project.toml for the initial OptimizationODE.jl code.
Added algorithm functionality along with helper functions for: ODEGradientDescent, RKChebyshevDescent, RKAccelerated, PRKChebyshevDescent.
Updated tests for the new functions and added an AD test for ODEGradientDescent.
Updated code for the four algorithms, removing redundant code.
Updated tests using the ODEOptimizer wrapper.
Added OrdinaryDiffEq as a dependency to reuse its ODE solvers.
Updated struct and constructors; ODEOptimizer now takes a type instead of an instance.
Updated tests.
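For context on what these commits implement: the optimizers integrate the gradient-flow ODE du/dt = -∇f(u), whose steady state is a stationary point of f, so running an ODE solver long enough performs the minimization. A minimal standalone sketch of that idea, assuming OrdinaryDiffEq and ForwardDiff (this is not the package's actual internals):

```julia
using OrdinaryDiffEq, ForwardDiff

# Objective with known minimizer at (1, -2).
f(u) = (u[1] - 1)^2 + (u[2] + 2)^2

# Gradient flow: du/dt = -∇f(u). Following this flow decreases f,
# so its steady state is a local minimizer.
grad_flow!(du, u, p, t) = (du .= -ForwardDiff.gradient(f, u))

prob = ODEProblem(grad_flow!, [5.0, 5.0], (0.0, 100.0))
sol = solve(prob, Tsit5())

sol.u[end]  # ≈ [1.0, -2.0]
```

An explicit Runge–Kutta method such as Tsit5 integrates this flow with adaptive steps; the Chebyshev-stabilized variants listed in the commit above presumably trade accuracy for a larger stability region, which pays off on stiff (ill-conditioned) gradient flows.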
ChrisRackauckas requested changes on May 28, 2025
Add callback and progress; the struct accepts a type instead of an instance; maxiters passed through.
Tests match the latest ODE code with callbacks and maxiters.
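A minimal sketch of the "type instead of instance" constructor pattern these commits describe; the struct and constructor here are illustrative, not the package's exact definitions:

```julia
using OrdinaryDiffEq

# Wrapper storing a concrete solver instance. The extra constructor
# accepts the solver *type* and instantiates it internally, so users
# write ODEOptimizer(Tsit5) rather than ODEOptimizer(Tsit5()).
struct ODEOptimizer{T}
    solver::T
end
ODEOptimizer(T::Type) = ODEOptimizer(T())

opt = ODEOptimizer(Tsit5)  # equivalent to ODEOptimizer(Tsit5())
```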
```julia
function SciMLBase.__init(prob::OptimizationProblem, opt::ODEOptimizer, data=Optimization.DEFAULT_DATA;
        η=0.1, dt=nothing, tmax=100.0, callback=Optimization.DEFAULT_CALLBACK, progress=false,
```
Suggested change:

```diff
- η=0.1, dt=nothing, tmax=100.0, callback=Optimization.DEFAULT_CALLBACK, progress=false,
+ dt=nothing, tmax=100.0, callback=Optimization.DEFAULT_CALLBACK, progress=false,
```
```julia
η = get(cache.solver_args, :η, 0.1)
dt = get(cache.solver_args, :dt, nothing)
tmax = get(cache.solver_args, :tmax, 100.0)
```
why is this set? Is that standard? That seems really odd.
Docs next
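For readers following the diff, a minimal sketch of the get-with-default pattern being discussed; the cache contents below are illustrative, not OptimizationODE's actual layout:

```julia
# Illustrative cache contents: the user passed only tmax, so the
# remaining solver kwargs fall back to their defaults on lookup.
solver_args = (tmax = 50.0,)

dt   = get(solver_args, :dt, nothing)   # -> nothing (adaptive stepping)
tmax = get(solver_args, :tmax, 100.0)   # -> 50.0 (user-supplied value wins)
```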
Checklist

- The new code follows the contributor guidelines, in particular the SciML Style Guide and COLPRAC.
Additional context
This is a package for using ODE solvers as optimizers for Optimization.jl, with methods like gradient descent (gradient flow), Chebyshev stabilization, and Runge–Kutta acceleration.
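A hedged usage sketch assembled from the pieces above (the __init keyword arguments, the ODEOptimizer wrapper taking a type, and the algorithm names from the commit log); the exact exported API may differ from this:

```julia
using Optimization, OptimizationODE

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0])

# ODEGradientDescent integrates the gradient flow up to tmax;
# callback and maxiters are forwarded as in the __init signature above.
sol = solve(prob, ODEOptimizer(ODEGradientDescent); tmax = 100.0, maxiters = 1000)
sol.u  # ≈ [1.0, 1.0] on this Rosenbrock test problem
```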