`tutorials/bayesian-differential-equations/index.qmd`

The fit is pretty good even though the data was quite noisy to start.
## Scaling to Large Models: Adjoint Sensitivities

Turing's gradient-based MCMC algorithms, such as NUTS, use ForwardDiff by default.
This works well for small models, but for larger models with many parameters, reverse-mode automatic differentiation is often more efficient (see [the automatic differentiation page]({{< meta usage-automatic-differentiation >}}) for more information).
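
For instance, the default behaviour corresponds to passing `adtype=AutoForwardDiff()` explicitly when constructing the sampler. A minimal sketch, assuming the Turing model defined earlier in the tutorial is bound to `model` and using a placeholder sample count:

```julia
using Turing
using ADTypes: AutoForwardDiff

# Spelling out the default: NUTS with forward-mode AD via ForwardDiff.
chain = sample(model, NUTS(; adtype=AutoForwardDiff()), 1_000)
```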

To use reverse-mode AD with differential equations, you first need to load the [SciMLSensitivity.jl package](https://docs.sciml.ai/SciMLSensitivity/stable/), which forms part of SciML's differential equation suite.
Here, 'sensitivity' refers to the derivative of the solution of a differential equation with respect to its parameters.
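
Concretely, if $u(t; p)$ denotes the solution for the parameter vector $p$, the local sensitivity with respect to a single parameter $p_j$ is

$$
s_j(t) = \frac{\partial u(t; p)}{\partial p_j},
$$

that is, a measure of how much the solution would change if $p_j$ were changed by a small amount.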
More details on the mathematical theory that underpins these methods can be found in [the SciMLSensitivity documentation](https://docs.sciml.ai/SciMLSensitivity/stable/sensitivity_math/).

Once SciMLSensitivity has been loaded, you can use any of the AD backends that are compatible with SciMLSensitivity.jl.
For example, if we wanted to use [Mooncake.jl](https://chalk-lab.github.io/Mooncake.jl/stable/), we could run:
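
(A minimal sketch rather than the tutorial's exact code cell; `model` stands for the Turing model defined earlier, and the sample count is a placeholder.)

```julia
import SciMLSensitivity  # loading this package lets AD differentiate through the ODE solve
import Mooncake

using Turing
using ADTypes: AutoMooncake

# NUTS with Mooncake as the reverse-mode AD backend.
chain = sample(model, NUTS(; adtype=AutoMooncake(; config=nothing)), 1_000)
```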
(If SciMLSensitivity is not loaded, the call to `sample` will error.)

SciMLSensitivity has a number of sensitivity analysis algorithms; in this case, it will automatically choose a default for you.
You can also manually specify an algorithm by providing the `sensealg` keyword argument to the `solve` function; the available algorithms are covered in [this page of the SciMLSensitivity docs](https://docs.sciml.ai/SciMLSensitivity/stable/manual/differential_equation_sensitivities/).
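
As an illustrative sketch (not the tutorial's own code: `prob`, the parameter vector `p`, and the `saveat` value are assumed from earlier in the tutorial, and `InterpolatingAdjoint` is just one of the available choices):

```julia
using DifferentialEquations
using SciMLSensitivity: InterpolatingAdjoint, ReverseDiffVJP

# Select the sensitivity algorithm explicitly instead of relying on the default:
# an interpolating adjoint with tape-compiled ReverseDiff vector-Jacobian products.
sol = solve(prob, Tsit5(); p=p, saveat=0.1,
            sensealg=InterpolatingAdjoint(; autojacvec=ReverseDiffVJP(true)))
```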
For more examples of adjoint usage on large parameter models, consult the [DiffEqFlux documentation](https://docs.sciml.ai/DiffEqFlux/stable/).