docs/source/usage.rst (+26 -3)
@@ -12,7 +12,7 @@ Firstly, why use CEBRA?
CEBRA is primarily designed for producing robust, consistent extractions of latent factors from time-series data. It supports three modes, and is a self-supervised representation learning algorithm that uses our modified contrastive learning approach designed for multi-modal time-series data. In short, it is a type of non-linear dimensionality reduction, like `tSNE <https://www.jmlr.org/papers/v9/vandermaaten08a.html>`_ and `UMAP <https://arxiv.org/abs/1802.03426>`_. We show in our original paper that it outperforms tSNE and UMAP at producing closer-to-ground-truth latents and is more consistent.
-That being said, CEBRA can be used on non-time-series data and it does not strictly require multi-modal data. In general, we recommend considering using CEBRA for measuring changes in consistency across conditions (brain areas, cells, animals), for hypothesis-guided decoding, and for toplogical exploration of the resulting embedding spaces. It can also be used for visualization and considering dynamics within the embedding space. For examples of how CEBRA can be used to map space, decode natural movies, and make hypotheses for neural coding of sensorimotor systems, see our paper (Schneider, Lee, Mathis, 2023).
+That being said, CEBRA can be used on non-time-series data and it does not strictly require multi-modal data. In general, we recommend considering using CEBRA for measuring changes in consistency across conditions (brain areas, cells, animals), for hypothesis-guided decoding, and for topological exploration of the resulting embedding spaces. It can also be used for visualization and considering dynamics within the embedding space. For examples of how CEBRA can be used to map space, decode natural movies, and make hypotheses for neural coding of sensorimotor systems, see our paper (Schneider, Lee, Mathis, 2023).
The CEBRA workflow
------------------
@@ -419,10 +419,10 @@ We can now fit the model in different modes.
.. rubric:: Multi-session training
-For multi-sesson training, lists of data are provided instead of a single dataset and eventual corresponding auxiliary variables.
+For multi-session training, lists of data are provided instead of a single dataset, along with any corresponding auxiliary variables.
.. warning::
-For now, multi-session training can only handle a **unique set of continuous labels**. All other combinations will raise an error.
+For now, multi-session training can only handle a **unique set of continuous labels** or a **unique discrete label**. All other combinations will raise an error. For the continuous case we provide the following example:
.. testcode::
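    # A minimal sketch of multi-session training with a shared continuous
    # label (the data shapes and hyperparameter values below are illustrative
    # assumptions, not the original docs example). Two synthetic sessions with
    # different neuron counts are aligned by passing lists to ``fit``.
    import numpy as np
    import cebra

    timesteps1, timesteps2 = 5000, 3000
    neurons1, neurons2 = 50, 30
    out_dim = 8

    session1 = np.random.uniform(0, 1, (timesteps1, neurons1))
    session2 = np.random.uniform(0, 1, (timesteps2, neurons2))
    index1 = np.random.uniform(0, 1, (timesteps1,))
    index2 = np.random.uniform(0, 1, (timesteps2,))

    multi_cebra_model = cebra.CEBRA(
        batch_size=512,
        output_dimension=out_dim,
        max_iterations=10,
    )

    # Passing lists of datasets and labels (instead of single arrays)
    # switches the estimator to multi-session training.
    multi_cebra_model.fit([session1, session2], [index1, index2])

    # Each session is transformed with its own session index.
    embedding1 = multi_cebra_model.transform(session1, session_id=0)
    embedding2 = multi_cebra_model.transform(session2, session_id=1)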
@@ -450,6 +450,29 @@ Once you defined your CEBRA model, you can run: