`HISTORY.md` (+9)

@@ -49,6 +49,15 @@ This release removes the feature of `VarInfo` where it kept track of which varia

This change also affects sampling in Turing.jl.

**Other changes**

LogDensityProblemsAD is now removed as a dependency. Instead of constructing a `LogDensityProblemsAD.ADgradient` object, we now use `DifferentiationInterface` directly to calculate the gradient of the log density with respect to the model parameters.

In practice, this means that if you want to calculate the gradient for a model, you can do:
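
A minimal sketch follows. The `LogDensityFunctionWithGrad` name comes from the docs change below, but the exact constructor signatures and the choice of ForwardDiff as the backend are assumptions, not the finalised API:

```julia
using DynamicPPL, Distributions
using LogDensityProblems
using ADTypes: AutoForwardDiff
import ForwardDiff  # the AD backend itself must be loaded

# A trivial model with a single parameter.
@model function demo()
    x ~ Normal()
end

model = demo()

# Wrap the model so it implements the LogDensityProblems.jl interface.
ldf = DynamicPPL.LogDensityFunction(model)
LogDensityProblems.logdensity(ldf, [0.5])

# For gradients, additionally wrap it with an AD backend selected via
# ADTypes; DifferentiationInterface is used under the hood.
# (Assumed constructor signature.)
ldf_grad = DynamicPPL.LogDensityFunctionWithGrad(ldf, AutoForwardDiff())
LogDensityProblems.logdensity_and_gradient(ldf_grad, [0.5])
```

In principle, any ADTypes.jl backend that DifferentiationInterface supports can be substituted for `AutoForwardDiff()` here.
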
## 0.34.2
- Fixed bugs in `ValuesAsInModelContext` as well as `DebugContext` where underlying `PrefixContext`s were not being applied.

`docs/src/api.md` (+2-1)

@@ -54,10 +54,11 @@ logjoint
### LogDensityProblems.jl interface

-The [LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl) interface is also supported by simply wrapping a [`Model`](@ref) in a `DynamicPPL.LogDensityFunction`:
+The [LogDensityProblems.jl](https://github.com/tpapp/LogDensityProblems.jl) interface is also supported by wrapping a [`Model`](@ref) in a `DynamicPPL.LogDensityFunction` or `DynamicPPL.LogDensityFunctionWithGrad`.
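
A hedged sketch of what the gradient-free side of this interface looks like (the model here is made up for illustration, and assumes `LogDensityFunction` implements `logdensity` and `dimension` as the LogDensityProblems.jl interface requires):

```julia
using DynamicPPL, Distributions, LogDensityProblems

# A one-parameter model, using @model's short function syntax.
@model demo2() = s ~ InverseGamma(2, 3)

ldf = DynamicPPL.LogDensityFunction(demo2())

LogDensityProblems.dimension(ldf)          # number of parameters, here 1
LogDensityProblems.logdensity(ldf, [1.0])  # log joint density at s = 1.0
```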