Commit 42c20fa

Redefine residual as d_pred - d_obs (#65)
This way we don't need to carry the minus sign through the derivatives (e.g. in the gradient).
1 parent 4109f24 commit 42c20fa
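
The motivation can be made explicit with a short derivation (a sketch using the same symbols as the `residual` docstring in the diff below, with :math:`\phi` denoting the weighted data misfit):

```latex
% Data misfit and Jacobian (same notation as the docstring):
\phi(\mathbf{m}) = \lVert \mathbf{W}\,\mathbf{r}(\mathbf{m}) \rVert^2 ,
\qquad
\mathbf{J} = \frac{\partial \mathcal{F}}{\partial \mathbf{m}}

% Old definition: r = d - F(m), so dr/dm = -J and
\nabla \phi = -2\, \mathbf{J}^T \mathbf{W}^T \mathbf{W}\, \mathbf{r}

% New definition: r = F(m) - d, so dr/dm = +J and
\nabla \phi = 2\, \mathbf{J}^T \mathbf{W}^T \mathbf{W}\, \mathbf{r}
```

With the new convention, the chain rule produces a positive factor, so the gradient (and higher derivatives) need no sign bookkeeping.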

File tree

1 file changed: +3 -3 lines changed

src/inversion_ideas/data_misfit.py

Lines changed: 3 additions & 3 deletions
@@ -109,7 +109,7 @@ def gradient(self, model: Model) -> npt.NDArray[np.float64]:
         """
         jac = self.simulation.jacobian(model)
         weights_matrix = self.weights_matrix
-        return -2 * jac.T @ (weights_matrix.T @ weights_matrix @ self.residual(model))
+        return 2 * jac.T @ (weights_matrix.T @ weights_matrix @ self.residual(model))
 
     def hessian(
         self, model: Model
@@ -171,12 +171,12 @@ def residual(self, model: Model):
 
         .. math::
 
-            \mathbf{r} = \mathbf{d} - \mathcal{F}(\mathbf{m})
+            \mathbf{r} = \mathcal{F}(\mathbf{m}) - \mathbf{d}
 
         where :math:`\mathbf{d}` is the vector with observed data, :math:`\mathcal{F}`
         is the forward model, and :math:`\mathbf{m}` is the model vector.
         """
-        return self.data - self.simulation(model)
+        return self.simulation(model) - self.data
 
     @property
     def weights(self) -> npt.NDArray[np.float64]:
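
A quick way to check the sign convention: a minimal NumPy sketch (the names `J`, `W`, `d`, and `phi` are illustrative, not this library's API) that computes the gradient formula used above, `2 * J.T @ (W.T @ W @ residual)`, for a linear forward model and compares it against central finite differences.

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.normal(size=(5, 3))        # Jacobian of a linear forward model F(m) = J @ m
d = rng.normal(size=5)             # observed data
W = np.diag(rng.uniform(1, 2, 5))  # data weights (e.g. reciprocal uncertainties)
m = rng.normal(size=3)


def phi(m_):
    """Weighted data misfit ||W (F(m) - d)||^2 with r = d_pred - d_obs."""
    return np.sum((W @ (J @ m_ - d)) ** 2)


residual = J @ m - d  # d_pred - d_obs, as redefined in this commit
gradient = 2 * J.T @ (W.T @ W @ residual)  # no minus sign needed

# Central finite-difference approximation of the gradient
eps = 1e-6
fd = np.empty(3)
for i in range(3):
    dm = np.zeros(3)
    dm[i] = eps
    fd[i] = (phi(m + dm) - phi(m - dm)) / (2 * eps)

print(np.allclose(gradient, fd))  # True
```

Since `phi` is quadratic in `m`, the central difference is exact up to rounding, so the two gradients agree to high precision.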
