Update to the AdvancedVI@0.3 interface #2506

Open: wants to merge 33 commits into main

Conversation

@Red-Portal (Member) commented Mar 14, 2025

This PR aims to update Turing's Variational module to match AdvancedVI's new interface, introduced in v0.3. I will try not to change the interface too much, but given the new features in AdvancedVI, I think breaking changes will be inevitable. The focus, though, will be on providing good defaults rather than exposing every new feature.

Currently proposed interface:

using Turing
using AdvancedVI
using Random  # Random.default_rng() is used below

d = randn(100)
Turing.@model function model()
    x ~ MvNormal(randn(100), 1)
    y ~ InverseGamma()
end
m = model()
q = Turing.Variational.q_fullrank_gaussian(Random.default_rng(), m)
n_iters = 1000
Turing.Variational.vi(m, q, n_iters)
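
For comparison, a mean-field counterpart would presumably be constructed the same way. This is a hypothetical usage sketch: only q_fullrank_gaussian appears in the snippet above, so the name q_meanfield_gaussian is an assumption rather than confirmed API.

# Hypothetical mean-field variant of the interface proposed above
# (q_meanfield_gaussian is assumed by analogy with q_fullrank_gaussian).
q_mf = Turing.Variational.q_meanfield_gaussian(Random.default_rng(), m)
Turing.Variational.vi(m, q_mf, n_iters)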

Closes #2507
Closes #2508
Closes #2430

Red-Portal and others added 12 commits March 14, 2025 19:09
Red-Portal marked this pull request as a draft March 14, 2025 23:23
Red-Portal and others added 5 commits March 14, 2025 19:29
@yebai (Member) commented Mar 20, 2025

@Red-Portal, can you fix the tests before I take a look?

@Red-Portal (Member, Author) commented:

@yebai I marked the PR as a draft so that we can first agree on an interface; then I'll flesh out the implementation and the tests. Would you prefer to proceed differently?

@yebai (Member) commented Mar 20, 2025

Let's address the interface later or in a separate PR, since that might require more discussion. For this PR, let's try to keep the VI interface non-breaking where possible.

@yebai (Member) commented Apr 18, 2025

@Red-Portal can you take a look at the following error:

ERROR: LoadError: UndefVarError: `turnprogress` not defined in `AdvancedVI`

Red-Portal marked this pull request as ready for review April 20, 2025 05:26

codecov bot commented Apr 21, 2025

Codecov Report

Attention: Patch coverage is 0% with 78 lines in your changes missing coverage. Please review.

Project coverage is 47.32%. Comparing base (fc32e10) to head (43c19aa).

Files with missing lines | Patch % | Lines
src/variational/VariationalInference.jl | 0.00% | 46 Missing ⚠️
src/variational/bijectors.jl | 0.00% | 32 Missing ⚠️

❗ There is a different number of reports uploaded between BASE (fc32e10) and HEAD (43c19aa): HEAD has 6 fewer uploads than BASE (24 vs. 30).

Additional details and impacted files
@@             Coverage Diff             @@
##             main    #2506       +/-   ##
===========================================
- Coverage   84.05%   47.32%   -36.74%     
===========================================
  Files          21       21               
  Lines        1455     1456        +1     
===========================================
- Hits         1223      689      -534     
- Misses        232      767      +535     

@yebai (Member) left a comment

Thanks @Red-Portal. I left some comments below.

One high-level comment: I suggest we unify vi_fullrank_gaussian and vi_meanfield_gaussian into a single function, q_distribution(...; gaussian=true|false, fullrank=true|false), to reduce code redundancy.

Note on CI errors:

  • CI complains about missing ADVI() and TruncatedADAGrad().
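
For illustration, here is a rough sketch of the unified constructor suggested above. Everything in it is illustrative: q_distribution, its keyword flags, and the assumption that it simply dispatches to the existing full-rank constructor and an (assumed) mean-field counterpart.

# Illustrative only: a single entry point that dispatches on keyword flags,
# along the lines suggested in the review comment above.
function q_distribution(rng, model; gaussian::Bool=true, fullrank::Bool=true, kwargs...)
    gaussian || throw(ArgumentError("only Gaussian families are sketched here"))
    return if fullrank
        Turing.Variational.q_fullrank_gaussian(rng, model; kwargs...)   # dense covariance
    else
        Turing.Variational.q_meanfield_gaussian(rng, model; kwargs...)  # diagonal covariance (assumed name)
    end
end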


if isfinite(energy)
return scale
elseif n_trial == num_max_trials
Reviewer (Member) commented:

Suggested change:
- elseif n_trial == num_max_trials
+ else

@Red-Portal (Member, Author) replied:

If I commit this change, the initialization procedure will throw an error at the very first failure, right? The intention was to let the algorithm fail up to num_max_trials times before giving up.
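
For context, a minimal sketch of the retry loop this snippet presumably sits in, written only to illustrate the control-flow point above (find_initial_scale, the estimate_energy argument, and the halving of scale are assumptions, not the actual PR code):

# Sketch: keep shrinking `scale` until the energy estimate is finite,
# erroring only once `num_max_trials` attempts have failed.
function find_initial_scale(estimate_energy, q, scale; num_max_trials=10)
    for n_trial in 1:num_max_trials
        energy = estimate_energy(q, scale)   # finite energy means the current scale is usable
        if isfinite(energy)
            return scale                     # success on this trial
        elseif n_trial == num_max_trials
            error("No finite energy estimate after $num_max_trials trials")
        end
        scale = scale / 2                    # shrink and retry; a bare `else` above would error on the first failure instead
    end
end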

# Use linked `varinfo` to determine the correct number of parameters.
# TODO: Replace with `length` once this is implemented for `VarInfo`.
varinfo_linked = DynamicPPL.link(varinfo, model)
num_params = length(varinfo_linked[:])
Reviewer (Member) commented:

Can we get the dimensionality via num_params = length(varinfo_linked) instead of length(varinfo_linked[:])?

cc @mkarikom

return reshape_outer ∘ f ∘ reshape_inner
end

"""
Reviewer (Member) commented:

Please move Bijectors.bijector(model::DynamicPPL.Model,...) to DynamicPPL.

cc @mhauru

Wraps a bijector `f` such that it operates on vectors of length `prod(in_size)` and produces
a vector of length `prod(Bijectors.output(f, in_size))`.
"""
function wrap_in_vec_reshape(f, in_size)
Reviewer (Member) commented:

Since this function is only used once, I suggest we inline it and add comments to explain its behaviour.
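
For reference, a hedged sketch of the body that would be inlined at the call site, with the explanatory comments asked for above. It assumes the current implementation composes Bijectors.Reshape bijectors around f (consistent with the return reshape_outer ∘ f ∘ reshape_inner line quoted earlier) and that Bijectors.output_size behaves as its name suggests; none of this is verified against the PR source.

using Bijectors

# Mirrors what an inlined wrap_in_vec_reshape(f, in_size) would do:
# flatten the input, apply `f`, then flatten the output again.
function vectorized_bijector(f, in_size)
    out_size = Bijectors.output_size(f, in_size)                    # shape of f's output for inputs of shape in_size
    reshape_inner = Bijectors.Reshape((prod(in_size),), in_size)    # flat vector -> original input shape
    reshape_outer = Bijectors.Reshape(out_size, (prod(out_size),))  # output shape -> flat vector
    return reshape_outer ∘ f ∘ reshape_inner                        # acts on flat vectors end-to-end
end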

yebai removed the request for review from torfjelde April 21, 2025 11:13
@Red-Portal (Member, Author) commented:

Sorry for the delay! I've been traveling over the past few weeks, but I will start working on this now.

scale::Union{Nothing,<:LowerTriangular}=nothing,
kwargs...,
)
return q_init(rng, model; location, scale, meanfield=false, basedist=Normal(), kwargs...)
Contributor (bot) commented:

[JuliaFormatter] reported by reviewdog 🐶

Suggested change:
- return q_init(rng, model; location, scale, meanfield=false, basedist=Normal(), kwargs...)
+ return q_init(
+     rng, model; location, scale, meanfield=false, basedist=Normal(), kwargs...
+ )

@coveralls commented

Pull Request Test Coverage Report for Build 14736179503

Details

  • 0 of 78 (0.0%) changed or added relevant lines in 2 files are covered.
  • 460 unchanged lines in 11 files lost coverage.
  • Overall coverage decreased (-36.7%) to 47.321%

Changes Missing Coverage | Covered Lines | Changed/Added Lines | %
src/variational/bijectors.jl | 0 | 32 | 0.0%
src/variational/VariationalInference.jl | 0 | 46 | 0.0%

Files with Coverage Reduction | New Missed Lines | %
src/essential/container.jl | 1 | 87.1%
src/variational/VariationalInference.jl | 1 | 0.0%
src/mcmc/is.jl | 15 | 5.88%
ext/TuringDynamicHMCExt.jl | 27 | 0.0%
src/mcmc/mh.jl | 32 | 60.18%
src/mcmc/particle_mcmc.jl | 41 | 58.33%
src/mcmc/emcee.jl | 44 | 9.62%
src/mcmc/sghmc.jl | 50 | 21.54%
src/stdlib/distributions.jl | 50 | 0.0%
ext/TuringOptimExt.jl | 53 | 0.0%

Totals
Change from base Build 14321422608: -36.7%
Covered Lines: 689
Relevant Lines: 1456

💛 - Coveralls

Successfully merging this pull request may close these issues.

AdvancedVI 0.3 and compatibility with Turing.jl
3 participants