These divergences differ from the equivalent ones defined in the `Distances` package because they are **normalized**.

Also, the package provides methods for calculating their gradient and the (diagonal elements of the) Hessian matrix.
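
To illustrate what *normalized* means, here is a hedged sketch: the Kullback-Leibler case is assumed to use the integrand `u*log(u) - u + 1`, which vanishes with zero slope at `u = 1`, so the divergence stays non-negative even for vectors that do not sum to one. The helper names below are hypothetical, not package functions.

```julia
# Hypothetical comparison, not package code:
kl_unnormalized(x, y) = sum(@. x * log(x / y))          # Distances-style KL
kl_normalized(x, y)   = sum(@. x * log(x / y) - x + y)  # normalized variant
```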

The constructors for the types above are straightforward
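
For example (a sketch: only `KullbackLeibler` appears in this excerpt; `CressieRead` and its parameter value are assumed from the package's divergence family):

```julia
using Divergences

kl = KullbackLeibler()  # parameterless divergence
cr = CressieRead(2.0)   # assumed: parameterized families take their parameter here
```
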
Each divergence corresponds to a *divergence type*. You can always compute a certain divergence between two vectors using the following syntax

```julia
x = rand(100)
y = rand(100)
𝒦ℒ = KullbackLeibler()
𝒦ℒ(x, y)
```

Here, `𝒦ℒ` is an instance of the divergence type `KullbackLeibler`; calling the instance evaluates the divergence between the two vectors. In what follows, `div` stands for a generic instance of a divergence type.

We can also calculate the divergence between the vector `x` and the unit vector

```julia
r = 𝒦ℒ(x)
```
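
Presumably this is equivalent to evaluating against a vector of ones (an assumption inferred from the sentence above, not confirmed by this excerpt):

```julia
r ≈ 𝒦ℒ(x, ones(length(x)))  # assumed: the "unit vector" is a vector of ones
```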

The `Divergence` type is a subtype of `PreMetric` defined in the `Distances` package. As such, the divergences can be evaluated column-wise for `X::Matrix` and `Y::Matrix`.

```julia
X = rand(100, 5)
Y = rand(100, 5)
colwise(𝒦ℒ, X, Y)  # colwise is defined in the Distances package
```
### Gradient of the divergence

The gradient of the divergence can be calculated as

```julia
g = gradient(div, x, y)
```

or through its in-place version

```julia
u = Vector{Float64}(undef, size(x))  # preallocate the output array
gradient!(u, div, x, y)
```

### Hessian of the divergence

The (diagonal of the) Hessian of the divergence can be calculated as

```julia
h = hessian(div, x, y)
```

Its in-place variant is also defined

```julia
u = Vector{Float64}(undef, size(x))  # preallocate the output array
hessian!(u, div, x, y)
```

Notice that the divergence's Hessian is sparse: the diagonal entries are the only ones different from zero. For this reason, `hessian(div, x, y)` returns an `Array{T,1}` with the diagonal entries of the Hessian.
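
If the full Hessian matrix is ever needed, it can be assembled from that diagonal with `LinearAlgebra.Diagonal` (a usage sketch building on the call above, not a function provided by the package):

```julia
using LinearAlgebra
H = Diagonal(hessian(div, x, y))  # full (sparse) Hessian as a diagonal matrix
```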