
Commit dca8549

improve text on DE page (#665)
1 parent 73b99c6 commit dca8549

File tree

1 file changed: +9 −10 lines changed
  • tutorials/bayesian-differential-equations

tutorials/bayesian-differential-equations/index.qmd

Lines changed: 9 additions & 10 deletions
@@ -328,17 +328,14 @@ The fit is pretty good even though the data was quite noisy to start.
 
 ## Scaling to Large Models: Adjoint Sensitivities
 
-DifferentialEquations.jl's efficiency for large stiff models has been shown in [multiple benchmarks](https://github.com/SciML/DiffEqBenchmarks.jl).
-To learn more about how to optimize solving performance for stiff problems you can take a look at the [docs](https://docs.sciml.ai/DiffEqDocs/stable/tutorials/advanced_ode_example/).
-
-_Sensitivity analysis_ is provided by the [SciMLSensitivity.jl package](https://docs.sciml.ai/SciMLSensitivity/stable/), which forms part of SciML's differential equation suite.
-The model sensitivities are the derivatives of the solution with respect to the parameters.
-Specifically, the local sensitivity of the solution to a parameter is defined by how much the solution would change if the parameter were changed by a small amount.
-Sensitivity analysis provides a cheap way to calculate the gradient of the solution which can be used in parameter estimation and other optimization tasks.
-The sensitivity analysis methods in SciMLSensitivity.jl are based on automatic differentiation (AD), and are compatible with many of Julia's AD backends.
+Turing's gradient-based MCMC algorithms, such as NUTS, use ForwardDiff by default.
+This works well for small models, but for larger models with many parameters, reverse-mode automatic differentiation is often more efficient (see [the automatic differentiation page]({{< meta usage-automatic-differentiation >}}) for more information).
+
+To use reverse-mode AD with differential equations, you need to first load the [SciMLSensitivity.jl package](https://docs.sciml.ai/SciMLSensitivity/stable/), which forms part of SciML's differential equation suite.
+Here, 'sensitivity' refers to the derivative of the solution of a differential equation with respect to its parameters.
 More details on the mathematical theory that underpins these methods can be found in [the SciMLSensitivity documentation](https://docs.sciml.ai/SciMLSensitivity/stable/sensitivity_math/).
 
-To enable sensitivity analysis, you will need to `import SciMLSensitivity`, and also use one of the AD backends that is compatible with SciMLSensitivity.jl when sampling.
+Once SciMLSensitivity has been loaded, you can use one of the AD backends which are compatible with SciMLSensitivity.jl.
 For example, if we wanted to use [Mooncake.jl](https://chalk-lab.github.io/Mooncake.jl/stable/), we could run:
 
 ```{julia}
@@ -352,7 +349,9 @@ adtype = AutoMooncake()
 sample(model, NUTS(; adtype=adtype), 1000; progress=false)
 ```
 
-In this case, SciMLSensitivity will automatically choose an appropriate sensitivity analysis algorithm for you.
+(If SciMLSensitivity is not loaded, the call to `sample` will error.)
+
+SciMLSensitivity has a number of sensitivity analysis algorithms: in this case it will automatically choose a default for you.
 You can also manually specify an algorithm by providing the `sensealg` keyword argument to the `solve` function; the existing algorithms are covered in [this page of the SciMLSensitivity docs](https://docs.sciml.ai/SciMLSensitivity/stable/manual/differential_equation_sensitivities/).
 
 For more examples of adjoint usage on large parameter models, consult the [DiffEqFlux documentation](https://docs.sciml.ai/DiffEqFlux/stable/).
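The revised passage describes the workflow only in fragments: load SciMLSensitivity, then sample with a reverse-mode AD backend such as Mooncake. For context, an end-to-end sketch of that workflow might look like the following. This is illustrative only and not part of the commit; the `decay!` ODE, priors, and synthetic data are hypothetical stand-ins for whatever the tutorial actually defines, and package versions are assumed to be recent enough that `AutoMooncake()` works without an explicit config.

```julia
using Turing, OrdinaryDiffEq, LinearAlgebra, ADTypes
import SciMLSensitivity  # must be loaded so reverse-mode AD can differentiate through solve()
import Mooncake

# Hypothetical two-parameter ODE standing in for the tutorial's model.
function decay!(du, u, p, t)
    du[1] = -p[1] * u[1]
    du[2] = p[2] * u[1] - u[2]
end
prob = ODEProblem(decay!, [1.0, 0.0], (0.0, 5.0), [0.5, 0.3])

# Synthetic noisy observations of the true solution.
sol = solve(prob, Tsit5(); saveat=0.5)
data = Array(sol) .+ 0.05 .* randn(size(Array(sol)))

@model function fit_ode(data, prob)
    p1 ~ truncated(Normal(0.5, 0.2); lower=0)
    p2 ~ truncated(Normal(0.3, 0.2); lower=0)
    σ ~ truncated(Normal(0.1, 0.05); lower=0)
    predicted = solve(prob, Tsit5(); p=[p1, p2], saveat=0.5)
    for i in eachindex(predicted)
        data[:, i] ~ MvNormal(predicted[i], σ^2 * I)
    end
end

# Reverse-mode AD via Mooncake; SciMLSensitivity picks a sensitivity
# algorithm automatically (override via solve(...; sensealg=...)).
model = fit_ode(data, prob)
chain = sample(model, NUTS(; adtype=AutoMooncake()), 1000; progress=false)
```

As the revised text notes, if the `import SciMLSensitivity` line is omitted, the `sample` call errors, because without it the ODE solve cannot be differentiated in reverse mode.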
