
Commit 1027661

Merge branch 'main' into feat/bidirectional-displacement-filter
2 parents e6cc6af + 956faf3 commit 1027661

28 files changed: +551 −438 lines

CONTRIBUTING.md

Lines changed: 25 additions & 10 deletions

````diff
@@ -3,7 +3,6 @@
 ## Contributing code
 
 ### Creating a development environment
-
 It is recommended to use [conda](conda:)
 or [mamba](mamba:) to create a
 development environment for movement. In the following we assume you have
@@ -34,7 +33,6 @@ pre-commit install
 ```
 
 ### Pull requests
-
 In all cases, please submit code to the main repository via a pull request (PR).
 We recommend, and adhere, to the following conventions:
 
@@ -62,7 +60,6 @@ A typical PR workflow would be:
 ## Development guidelines
 
 ### Formatting and pre-commit hooks
-
 Running `pre-commit install` will set up [pre-commit hooks](https://pre-commit.com/) to ensure a consistent formatting style. Currently, these include:
 * [ruff](https://github.com/astral-sh/ruff) does a number of jobs, including code linting and auto-formatting.
 * [mypy](https://mypy.readthedocs.io/en/stable/index.html) as a static type checker.
@@ -109,7 +106,6 @@ Make sure to provide docstrings for all public functions, classes, and methods.
 This is important as it allows for [automatic generation of the API reference](#updating-the-api-reference).
 
 ### Testing
-
 We use [pytest](https://docs.pytest.org/en/latest/) for testing and aim for
 ~100% test coverage (as far as is reasonable).
 All new features should be tested.
@@ -120,6 +116,31 @@ Do not include these data in the repository, especially if they are large.
 We store several sample datasets in an external data repository.
 See [sample data](#sample-data) for more information.
 
+### Logging
+We use the {mod}`loguru<loguru._logger>`-based {class}`MovementLogger<movement.utils.logging.MovementLogger>` for logging.
+The logger is configured to write logs to a rotating log file at the `DEBUG` level and to {obj}`sys.stderr` at the `WARNING` level.
+
+To import the logger:
+```python
+from movement.utils.logging import logger
+```
+
+Once the logger is imported, you can log messages with the appropriate [severity levels](inv:loguru#levels) using the same syntax as {mod}`loguru<loguru._logger>` (e.g. `logger.debug("Debug message")`, `logger.warning("Warning message")`).
+
+#### Logging and raising exceptions
+Both {meth}`logger.error()<movement.utils.logging.MovementLogger.error>` and {meth}`logger.exception()<movement.utils.logging.MovementLogger.exception>` can be used to log [](inv:python#tut-errors), with the difference that the latter will include the traceback in the log message.
+As these methods will return the logged Exception, you can log and raise the Exception in a single line:
+```python
+raise logger.error(ValueError("message"))
+raise logger.exception(ValueError("message"))  # with traceback
+```
+
+#### When to use `print`, `warnings.warn`, and `logger.warning`
+We aim to adhere to the [When to use logging guide](inv:python#logging-basic-tutorial) to ensure consistency in our logging practices.
+In general:
+* Use {func}`print` for simple, non-critical messages that do not need to be logged.
+* Use {func}`warnings.warn` for user input issues that are non-critical and can be addressed within movement, e.g. deprecated function calls that are redirected, an invalid `fps` number in {class}`ValidPosesDataset<movement.validators.datasets.ValidPosesDataset>` that is implicitly set to `None`, or data containing excessive NaNs, which the user can potentially address using appropriate methods, e.g. {func}`interpolate_over_time()<movement.filtering.interpolate_over_time>`.
+* Use {meth}`logger.warning()<loguru._logger.Logger.warning>` for non-critical issues where default values are assigned to optional parameters, e.g. `individual_names` and `keypoint_names` in {class}`ValidPosesDataset<movement.validators.datasets.ValidPosesDataset>`.
 
 ### Continuous integration
 All pushes and pull requests will be built by [GitHub actions](github-docs:actions).
@@ -155,7 +176,6 @@ The addition of a GitHub tag triggers the package's deployment to PyPI.
 The version number is automatically determined from the latest tag on the _main_ branch.
 
 ## Contributing documentation
-
 The documentation is hosted via [GitHub pages](https://pages.github.com/) at
 [movement.neuroinformatics.dev](target-movement).
 Its source files are located in the `docs` folder of this repository.
@@ -173,7 +193,6 @@ ensuring that the documentation is published in sync with each PyPI release.
 
 
 ### Editing the documentation
-
 To edit the documentation, first clone the repository, and install `movement` in a
 [development environment](#creating-a-development-environment).
 
@@ -279,7 +298,6 @@ For example, to reference the {meth}`xarray.Dataset.update` method, use:
 :::
 ::::
 
-
 ### Building the documentation locally
 We recommend that you build and view the documentation website locally, before you push your proposed changes.
 
@@ -338,7 +356,6 @@ make clean html linkcheck
 :::
 
 ## Sample data
-
 We maintain some sample datasets to be used for testing, examples and tutorials on an
 [external data repository](gin:neuroinformatics/movement-test-data).
 Our hosting platform of choice is called [GIN](gin:) and is maintained
@@ -409,8 +426,6 @@ To add a new file, you will need to:
 
 9. Upload the committed changes to the GIN repository by running `gin upload`. Latest changes to the repository can be pulled via `gin download`. `gin sync` will synchronise the latest changes bidirectionally.
 
-
-
 ### `metadata.yaml` example entry
 ```yaml
 "SLEAP_three-mice_Aeon_proofread.analysis.h5":
````
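The log-and-raise idiom documented in the new Logging section depends on `MovementLogger.error()` returning the exception it logs. Here is a minimal sketch of that mechanism using only the stdlib `logging` module; `DemoLogger` is a hypothetical stand-in, not movement's actual `MovementLogger` implementation:

```python
import logging


class DemoLogger(logging.Logger):
    """Toy logger whose ``error`` returns the exception it logs,
    enabling the one-line ``raise logger.error(...)`` idiom."""

    def error(self, exc: Exception) -> Exception:  # type: ignore[override]
        super().error(str(exc))  # log the message at ERROR level
        return exc  # return it so the caller can ``raise`` it


# Subsequently created loggers will be instances of DemoLogger.
logging.setLoggerClass(DemoLogger)
logger = logging.getLogger("demo")

try:
    # Logging and raising happen in a single statement.
    raise logger.error(ValueError("Invalid statistic 'sum'."))
except ValueError as err:
    caught = err
```

The same trick underlies the `raise logger.error(ValueError(...))` calls added throughout this commit.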

docs/source/community/roadmaps.md

Lines changed: 10 additions & 1 deletion

```diff
@@ -19,7 +19,16 @@ The following features are being considered for the first stable version `v1.0`.
   navigation, social interactions, etc.
 - __Integrate with neurophysiological data analysis tools__. We eventually aim to facilitate combined analysis of motion and neural data.
 
-## Short-term milestone - `v0.1`
+## Focus areas for 2025
+
+- Annotate space by defining regions of interest programmatically and via our [GUI](target-gui).
+- Annotate time by defining events of interest programmatically and via our [GUI](target-gui).
+- Enable workflows for aligning motion tracks with concurrently recorded neurophysiological signals.
+- Enrich the interactive visualisation of motion tracks in `napari`, providing more customisation options.
+- Enable the saving of filtered tracks and derived kinematic variables to disk.
+- Implement metrics useful for analysing spatial navigation, social interactions, and collective behaviour.
+
+## Version 0.1
 We've released version `v0.1` of `movement` in March 2025, providing a basic set of features to demonstrate the project's potential and to gather feedback from users. Our minimum requirements for this milestone were:
 
 - [x] Ability to import pose tracks from [DeepLabCut](dlc:), [SLEAP](sleap:) and [LightningPose](lp:) into a common `xarray.Dataset` structure.
```

docs/source/conf.py

Lines changed: 2 additions & 0 deletions

```diff
@@ -211,6 +211,8 @@
     "xarray": ("https://docs.xarray.dev/en/stable/", None),
     "scipy": ("https://docs.scipy.org/doc/scipy/", None),
     "pandas": ("https://pandas.pydata.org/pandas-docs/stable/", None),
+    "python": ("https://docs.python.org/3", None),
+    "loguru": ("https://loguru.readthedocs.io/en/stable/", None),
 }
```
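The two added entries let the docs resolve intersphinx cross-references such as `inv:python#logging-basic-tutorial` and `inv:loguru#levels` used in the new Logging section of CONTRIBUTING.md. A sketch of the relevant `conf.py` fragment (each entry maps a short project name to its docs base URL, plus an optional local `objects.inv` path, where `None` means fetch it remotely):

```python
# Fragment of a Sphinx conf.py: the intersphinx extension uses this
# mapping to resolve cross-references into external documentation.
intersphinx_mapping = {
    "xarray": ("https://docs.xarray.dev/en/stable/", None),
    "scipy": ("https://docs.scipy.org/doc/scipy/", None),
    "pandas": ("https://pandas.pydata.org/pandas-docs/stable/", None),
    "python": ("https://docs.python.org/3", None),  # newly added
    "loguru": ("https://loguru.readthedocs.io/en/stable/", None),  # newly added
}
```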

movement/__init__.py

Lines changed: 4 additions & 4 deletions

```diff
@@ -1,10 +1,10 @@
 from importlib.metadata import PackageNotFoundError, version
 
-from movement.utils.logging import configure_logging
+from movement.utils.logging import logger
 
 try:
     __version__ = version("movement")
-except PackageNotFoundError:
+except PackageNotFoundError:  # pragma: no cover
     # package is not installed
     pass
 
@@ -13,5 +13,5 @@
 
 xr.set_options(keep_attrs=True, display_expand_data=False)
 
-# initialize logger upon import
-configure_logging()
+# Configure logging to stderr and a file
+logger.configure()
```

movement/filtering.py

Lines changed: 17 additions & 1 deletion

```diff
@@ -6,8 +6,12 @@
 import xarray as xr
 from scipy import signal
 
 from movement.kinematics import compute_displacement
-from movement.utils.logging import log_to_attrs
+from movement.utils.logging import log_to_attrs, logger
 from movement.utils.reports import report_nan_values
 from movement.utils.vector import compute_norm
 
@@ -273,10 +277,16 @@ def rolling_filter(
 
     # Compute the statistic over each window
     allowed_statistics = ["mean", "median", "max", "min"]
     if statistic not in allowed_statistics:
-        raise ValueError(
-            f"Invalid statistic '{statistic}'. "
-            f"Must be one of {allowed_statistics}.",
+        raise logger.error(
+            ValueError(
+                f"Invalid statistic '{statistic}'. "
+                f"Must be one of {allowed_statistics}."
+            )
         )
 
     data_rolled = getattr(data_windows, statistic)(skipna=True)
@@ -343,7 +353,13 @@ def savgol_filter(
 
     """
     if "axis" in kwargs:
-        raise ValueError("The 'axis' argument may not be overridden.")
+        raise logger.error(
+            ValueError("The 'axis' argument may not be overridden.")
+        )
     data_smoothed = data.copy()
     data_smoothed.values = signal.savgol_filter(
         data,
```
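The `rolling_filter` change validates `statistic` against a whitelist before dispatching to the matching reduction. A self-contained sketch of that validate-then-dispatch shape, using plain Python lists and a plain `ValueError` (movement instead uses xarray's NaN-aware rolling methods and raises via `logger.error`); `rolling_stat` is a hypothetical helper:

```python
import statistics


def rolling_stat(values, window, statistic):
    """Apply a rolling statistic, validating ``statistic`` up front."""
    allowed_statistics = ["mean", "median", "max", "min"]
    if statistic not in allowed_statistics:
        raise ValueError(
            f"Invalid statistic '{statistic}'. "
            f"Must be one of {allowed_statistics}."
        )
    # Map the validated name to a reduction over each window.
    funcs = {
        "mean": statistics.fmean,
        "median": statistics.median,
        "max": max,
        "min": min,
    }
    func = funcs[statistic]
    return [
        func(values[i : i + window]) for i in range(len(values) - window + 1)
    ]


out = rolling_stat([1.0, 2.0, 3.0, 4.0], window=2, statistic="mean")
```

Validating before dispatch keeps the error message exhaustive (it lists every accepted value), which is friendlier than a bare `KeyError` from the lookup.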

movement/io/load_bboxes.py

Lines changed: 5 additions & 9 deletions

```diff
@@ -1,7 +1,6 @@
 """Load bounding boxes tracking data into ``movement``."""
 
 import ast
-import logging
 import re
 from collections.abc import Callable
 from pathlib import Path
@@ -11,16 +10,14 @@
 import pandas as pd
 import xarray as xr
 
-from movement.utils.logging import log_error
+from movement.utils.logging import logger
 from movement.validators.datasets import ValidBboxesDataset
 from movement.validators.files import (
     DEFAULT_FRAME_REGEXP,
     ValidFile,
     ValidVIATracksCSV,
 )
 
-logger = logging.getLogger(__name__)
-
 
 def from_numpy(
     position_array: np.ndarray,
@@ -229,8 +226,8 @@ def from_file(
             frame_regexp=frame_regexp,
         )
     else:
-        raise log_error(
-            ValueError, f"Unsupported source software: {source_software}"
+        raise logger.error(
+            ValueError(f"Unsupported source software: {source_software}")
         )
@@ -337,7 +334,7 @@ def from_via_tracks_file(
 
     # Specific VIA-tracks .csv file validation
     via_file = ValidVIATracksCSV(file.path, frame_regexp=frame_regexp)
-    logger.debug(f"Validated VIA tracks .csv file {via_file.path}.")
+    logger.info(f"Validated VIA tracks .csv file {via_file.path}.")
 
     # Create an xarray.Dataset from the data
     bboxes_arrays = _numpy_arrays_from_via_tracks_file(
@@ -363,8 +360,7 @@ def from_via_tracks_file(
     ds.attrs["source_software"] = "VIA-tracks"
     ds.attrs["source_file"] = file.path.as_posix()
 
-    logger.info(f"Loaded tracks of the bounding boxes from {via_file.path}:")
-    logger.info(ds)
+    logger.info(f"Loaded bounding boxes tracks from {via_file.path}:\n{ds}")
     return ds
```
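`from_file` dispatches on `source_software` and rejects unknown values with a `ValueError` (now raised via `logger.error`). A standalone sketch of that dispatch-and-reject shape, with a hypothetical stub loader and a plain `ValueError`:

```python
def load_tracks(source_software: str) -> str:
    """Dispatch to a per-software loader, rejecting unknown values."""
    loaders = {
        # Stub standing in for from_via_tracks_file(...).
        "VIA-tracks": lambda: "loaded VIA-tracks data",
    }
    if source_software not in loaders:
        raise ValueError(f"Unsupported source software: {source_software}")
    return loaders[source_software]()


result = load_tracks("VIA-tracks")
```

Centralising the dispatch in one table makes adding a new format a one-line change and keeps the unsupported-software error in a single place.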

movement/io/load_poses.py

Lines changed: 7 additions & 12 deletions

```diff
@@ -1,6 +1,5 @@
 """Load pose tracking data from various frameworks into ``movement``."""
 
-import logging
 from pathlib import Path
 from typing import Literal
 
@@ -11,7 +10,7 @@
 from sleap_io.io.slp import read_labels
 from sleap_io.model.labels import Labels
 
-from movement.utils.logging import log_error, log_warning
+from movement.utils.logging import logger
 from movement.validators.datasets import ValidPosesDataset
 from movement.validators.files import (
     ValidAniposeCSV,
@@ -20,8 +19,6 @@
     ValidHDF5,
 )
 
-logger = logging.getLogger(__name__)
-
 
 def from_numpy(
     position_array: np.ndarray,
@@ -151,8 +148,8 @@ def from_file(
     elif source_software == "Anipose":
         return from_anipose_file(file_path, fps, **kwargs)
     else:
-        raise log_error(
-            ValueError, f"Unsupported source software: {source_software}"
+        raise logger.error(
+            ValueError(f"Unsupported source software: {source_software}")
         )
@@ -291,8 +288,7 @@ def from_sleap_file(
     ds = _ds_from_sleap_labels_file(file.path, fps=fps)
     # Add metadata as attrs
     ds.attrs["source_file"] = file.path.as_posix()
-    logger.info(f"Loaded pose tracks from {file.path}:")
-    logger.info(ds)
+    logger.info(f"Loaded pose tracks from {file.path}:\n{ds}")
     return ds
@@ -436,8 +432,7 @@ def _ds_from_lp_or_dlc_file(
     ds = from_dlc_style_df(df=df, fps=fps, source_software=source_software)
     # Add metadata as attrs
     ds.attrs["source_file"] = file.path.as_posix()
-    logger.info(f"Loaded pose tracks from {file.path}:")
-    logger.info(ds)
+    logger.info(f"Loaded pose tracks from {file.path}:\n{ds}")
     return ds
@@ -469,7 +464,7 @@ def _ds_from_sleap_analysis_file(
     scores = np.full(tracks.shape[:1] + tracks.shape[2:], np.nan)
     individual_names = [n.decode() for n in f["track_names"][:]] or None
     if individual_names is None:
-        log_warning(
+        logger.warning(
             f"Could not find SLEAP Track in {file.path}. "
             "Assuming single-individual dataset and assigning "
             "default individual name."
@@ -513,7 +508,7 @@ def _ds_from_sleap_labels_file(
     tracks_with_scores = _sleap_labels_to_numpy(labels)
     individual_names = [track.name for track in labels.tracks] or None
     if individual_names is None:
-        log_warning(
+        logger.warning(
            f"Could not find SLEAP Track in {file.path}. "
            "Assuming single-individual dataset and assigning "
            "default individual name."
```
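Both SLEAP loaders use the `[...] or None` idiom to detect an empty track list, then warn and fall back to a default individual name. A minimal sketch of that pattern, with the stdlib `warnings` module standing in for `logger.warning` and `"individual_0"` as a hypothetical default name (the actual default is assigned elsewhere in movement):

```python
import warnings


def resolve_individual_names(track_names: list) -> list:
    """Return track names, assigning a default for single-individual data."""
    # An empty list is falsy, so ``or None`` turns "no tracks found"
    # into an explicit None sentinel.
    individual_names = track_names or None
    if individual_names is None:
        warnings.warn(
            "Could not find tracks. Assuming single-individual dataset "
            "and assigning default individual name."
        )
        individual_names = ["individual_0"]  # hypothetical default
    return individual_names


names = resolve_individual_names([])
```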

movement/io/save_poses.py

Lines changed: 9 additions & 11 deletions

```diff
@@ -1,6 +1,5 @@
 """Save pose tracking data from ``movement`` to various file formats."""
 
-import logging
 from pathlib import Path
 from typing import Literal
 
@@ -9,12 +8,10 @@
 import pandas as pd
 import xarray as xr
 
-from movement.utils.logging import log_error
+from movement.utils.logging import logger
 from movement.validators.datasets import ValidPosesDataset
 from movement.validators.files import ValidFile
 
-logger = logging.getLogger(__name__)
-
 
 def _ds_to_dlc_style_df(
     ds: xr.Dataset, columns: pd.MultiIndex
@@ -201,10 +198,11 @@ def to_dlc_file(
         split_individuals = _auto_split_individuals(ds)
 
     elif not isinstance(split_individuals, bool):
-        raise log_error(
-            ValueError,
-            "Expected 'split_individuals' to be a boolean or 'auto', but got "
-            f"{type(split_individuals)}.",
+        raise logger.error(
+            ValueError(
+                "Expected 'split_individuals' to be a boolean or 'auto', "
+                f"but got {type(split_individuals)}."
+            )
         )
 
     if split_individuals:
@@ -417,7 +415,7 @@ def _validate_file_path(
         )
     except (OSError, ValueError) as error:
         logger.error(error)
-        raise error
+        raise
     return file
 
@@ -438,8 +436,8 @@ def _validate_dataset(ds: xr.Dataset) -> None:
 
     """
     if not isinstance(ds, xr.Dataset):
-        raise log_error(
-            TypeError, f"Expected an xarray Dataset, but got {type(ds)}."
+        raise logger.error(
+            TypeError(f"Expected an xarray Dataset, but got {type(ds)}.")
        )
 
     missing_vars = set(ValidPosesDataset.VAR_NAMES) - set(ds.data_vars)
```
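`to_dlc_file` accepts `split_individuals` as either a bool or the string `"auto"`, resolving `"auto"` via a helper and rejecting anything else. A sketch of that resolution logic as a hypothetical standalone function; the "split only single-individual datasets" auto rule is an assumption for illustration, not necessarily what `_auto_split_individuals` does:

```python
def resolve_split_individuals(split_individuals, n_individuals: int) -> bool:
    """Resolve 'auto' to a bool; reject anything else that isn't a bool."""
    if split_individuals == "auto":
        # Hypothetical auto rule: split only single-individual datasets.
        return n_individuals == 1
    if not isinstance(split_individuals, bool):
        raise ValueError(
            "Expected 'split_individuals' to be a boolean or 'auto', "
            f"but got {type(split_individuals)}."
        )
    return split_individuals


choice = resolve_split_individuals("auto", n_individuals=1)
```

Checking `"auto"` first keeps the `isinstance` check simple: by the time it runs, only genuinely invalid inputs remain.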
