27 commits
- 76d9686 First files for the NIDM results examples SPM default batch (bclenet, May 23, 2025)
- 1507582 First files for the NIDM results examples SPM default batch (bclenet, May 23, 2025)
- c95c120 Adding provenance records for the example (bclenet, May 28, 2025)
- e4c41c7 Directory tree (bclenet, May 28, 2025)
- 5d65f71 Adding dataset in the listing (bclenet, May 28, 2025)
- cc03977 Realignment output (bclenet, Jun 4, 2025)
- 9efb7a8 GeneratedBy in sidecar JSONs (bclenet, Jun 4, 2025)
- d24493a Issue with Normalize id (bclenet, Jun 4, 2025)
- 0f780a5 Merge branch 'master' into BEP028_spm (bclenet, Jun 4, 2025)
- a145486 Codespell (bclenet, Jun 4, 2025)
- f538e28 Modified batch to avoid keeping .nii.gz files + Complete .mat files t… (bclenet, Jun 10, 2025)
- a31360f Update command lines for activities (bclenet, Jun 10, 2025)
- 1709078 Digest and Type allowed in the sidecar JSON (bclenet, Jun 10, 2025)
- 10d79c3 Adding more authors (bclenet, Jun 13, 2025)
- 29323f8 Add DatasetLinks to resolve BIDS URIs (bclenet, Jun 13, 2025)
- 0190aea Removing entities describing changes in .mat files (bclenet, Jun 19, 2025)
- 4582cf1 removing / after bids::prov (bclenet, Jul 11, 2025)
- dcf1651 removing / after bids::prov (bclenet, Jul 11, 2025)
- 93f31fe removing merge code (bclenet, Jul 11, 2025)
- 1e20f4d removing attributedto link (bclenet, Jul 11, 2025)
- 3738b43 Update directory tree (bclenet, Oct 1, 2025)
- 1373b53 Merge branch 'master' into BEP028_spm (bclenet, Oct 3, 2025)
- 702dad3 Moving RDF graphs into docs/ (bclenet, Oct 16, 2025)
- 5a52547 Example to be validated by BEP028 version of the schema (bclenet, Oct 16, 2025)
- 5ebe180 Codespell (bclenet, Oct 16, 2025)
- 4f32a1f Readme update (bclenet, Oct 16, 2025)
- eb0f115 bidsignore (bclenet, Oct 17, 2025)
6 changes: 3 additions & 3 deletions .github/workflows/validate_datasets.yml
@@ -107,7 +107,7 @@ jobs:
fi

- name: Skip legacy validation for post-legacy datasets
-        run: for DS in mrs_* dwi_deriv pet006 pheno004 volume_timing; do touch $DS/.SKIP_VALIDATION; done
+        run: for DS in mrs_* dwi_deriv pet006 pheno004 volume_timing provenance_*; do touch $DS/.SKIP_VALIDATION; done
if: matrix.bids-validator == 'legacy'

- name: Skip stable validation for datasets with unreleased validator features
@@ -119,7 +119,7 @@
- name: Skip main validation for datasets with unreleased spec features
# Replace ${EMPTY} with dataset patterns, when this is needed
# Reset to "for DS in ${EMPTY}; ..." after a spec release
-        run: for DS in dwi_deriv pheno004; do touch $DS/.SKIP_VALIDATION; done
+        run: for DS in dwi_deriv pheno004 provenance_*; do touch $DS/.SKIP_VALIDATION; done
if: matrix.bids-validator != 'dev'

- name: Set BIDS_SCHEMA variable for dev version
@@ -128,7 +128,7 @@
# Update this URL to the schema.json from PRs to the spec, when needed.
# If this variable is unset, dev will generally track the latest development
# release of https://jsr.io/@bids/schema
-        run: echo BIDS_SCHEMA=https://bids-specification.readthedocs.io/en/latest/schema.json >> $GITHUB_ENV
+        run: echo BIDS_SCHEMA=https://bids-specification--2099.org.readthedocs.build/en/2099/schema.json >> $GITHUB_ENV

- name: Validate all BIDS datasets using bids-validator
run: |
5 changes: 3 additions & 2 deletions README.md
@@ -372,5 +372,6 @@ TABLE BELOW IS GENERATED AUTOMATICALLY.
DO NOT EDIT DIRECTLY.
-->

-| name | description | datatypes | suffixes | link to full data | maintained by |
-|--------|---------------|-------------|------------|---------------------|-----------------|
+| name | description | datatypes | suffixes | link to full data | maintained by |
+|:-----|:------------|:----------|:---------|:------------------|:----------------|
+| [provenance_spm_default](https://github.com/bids-standard/bids-examples/tree/master/provenance_spm_default) | Provenance metadata for a derivative dataset after functional MRI preprocessing performed with [`SPM`](https://www.fil.ion.ucl.ac.uk/spm/). This example is based upon [ds000011](https://openfmri.org/dataset/ds000011/) data. | anat, func | T1w, bold, seg8, act, ent, soft | n/a | [@bclenet](https://github.com/bclenet) |
1 change: 1 addition & 0 deletions dataset_listing.tsv
@@ -86,3 +86,4 @@ xeeg_hed_score EEG and iEEG data with annotations of artifacts, seizures and mod
dwi_deriv exemplifies the storage of diffusion MRI derivatives that may be generated on the Siemens XA platform. dwi dwi
pheno004 Minimal dataset with subjects with imaging and/or phenotype data [@ericearl](https://github.com/ericearl) phenotype, anat T1w
mri_chunk Example MRI dataset to illustrate BIDS chunk entity. A single subject, two chunks. [@valosekj](https://github.com/valosekj) anat T1w
provenance_spm_default Provenance metadata for a derivative dataset after functional MRI preprocessing performed with [`SPM`](https://www.fil.ion.ucl.ac.uk/spm/). This example is based upon [ds000011](https://openfmri.org/dataset/ds000011/) data. [@bclenet](https://github.com/bclenet) anat, func T1w, bold, seg8, act, ent, soft
1 change: 1 addition & 0 deletions provenance_spm/.bidsignore
@@ -0,0 +1 @@
sub-01/*
77 changes: 77 additions & 0 deletions provenance_spm/README.md
@@ -0,0 +1,77 @@
# Provenance of fMRI preprocessing with SPM

This example shows provenance metadata for functional MRI preprocessing performed with [`SPM`](https://www.fil.ion.ucl.ac.uk/spm/). The provenance metadata was created manually; it serves as a guideline for future machine-generated provenance from `SPM`.

> [!WARNING]
> The `sub-01/` directory is excluded from validation through a `.bidsignore` file, since this example focuses on provenance metadata.

## Original dataset

This is a derivative dataset, based upon `sub-01` data from OpenfMRI DS000011 classification learning and tone counting experiment (cf. https://openfmri.org/dataset/ds000011/).

## Code

The MATLAB batch file `code/spm_preprocessing.m` performs the preprocessing.
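As a sketch of how such a batch is typically executed (the `spm_jobman` calls are standard SPM12 usage; running the batch from this directory is an assumption):

```matlab
% Sketch: execute the preprocessing batch with SPM12.
% Assumes SPM12 is on the MATLAB path and the working directory is the
% dataset root, so that the relative paths in the batch resolve.
spm('defaults', 'FMRI');         % load SPM's default fMRI settings
spm_jobman('initcfg');           % initialise the batch system
run('code/spm_preprocessing.m'); % populates the matlabbatch cell array
spm_jobman('run', matlabbatch);  % execute all ten batch steps in order
```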

## Directory tree

The directory tree is as follows. Files marked with a ✍️ were generated manually; the other files were generated by the preprocessing step.

> [!NOTE]
> The `docs/` directory contains explanatory material (see [Provenance as a RDF graph](#provenance-as-a-rdf-graph)) that is not required to encode provenance.

```
.
├── ✍️ code
│ └── ✍️ spm_preprocessing.m
├── ✍️ dataset_description.json
├── ✍️ docs
│ ├── ✍️ prov-spm.jsonld
│ └── ✍️ prov-spm.png
├── ✍️ prov
│ ├── ✍️ prov-spm_act.json
│ ├── ✍️ prov-spm_ent.json
│ └── ✍️ prov-spm_soft.json
├── ✍️ README.md
└── sub-01
├── anat
│ ├── ✍️ c1sub-01_T1w.json
│ ├── c1sub-01_T1w.nii
│ ├── ✍️ c2sub-01_T1w.json
│ ├── c2sub-01_T1w.nii
│ ├── ✍️ c3sub-01_T1w.json
│ ├── c3sub-01_T1w.nii
│ ├── ✍️ c4sub-01_T1w.json
│ ├── c4sub-01_T1w.nii
│ ├── ✍️ c5sub-01_T1w.json
│ ├── c5sub-01_T1w.nii
│ ├── ✍️ msub-01_T1w.json
│ ├── msub-01_T1w.nii
│ ├── ✍️ sub-01_T1w.json
│ ├── sub-01_T1w.nii
│ ├── ✍️ sub-01_T1w_seg8.json
│ ├── sub-01_T1w_seg8.mat
│ ├── ✍️ wmsub-01_T1w.json
│ ├── wmsub-01_T1w.nii
│ ├── ✍️ y_sub-01_T1w.json
│ └── y_sub-01_T1w.nii
└── func
├── ✍️ meansub-01_task-tonecounting_bold.json
├── meansub-01_task-tonecounting_bold.nii
├── ✍️ rp_sub-01_task-tonecounting_bold.json
├── rp_sub-01_task-tonecounting_bold.txt
├── ✍️ rsub-01_task-tonecounting_bold.json
├── rsub-01_task-tonecounting_bold.nii
├── sub-01_task-tonecounting_bold.mat
├── sub-01_task-tonecounting_bold.nii
├── ✍️ swrsub-01_task-tonecounting_bold.json
├── swrsub-01_task-tonecounting_bold.nii
├── ✍️ wrsub-01_task-tonecounting_bold.json
└── wrsub-01_task-tonecounting_bold.nii
```

## Provenance as a RDF graph

Provenance metadata can be aggregated into a JSON-LD RDF graph, available in [`docs/prov-spm.jsonld`](docs/prov-spm.jsonld). A rendered version of the graph is shown below and is also available in [`docs/prov-spm.png`](docs/prov-spm.png).

![Rendered version of the RDF graph](docs/prov-spm.png)
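For readers who want to inspect such a graph programmatically, here is a minimal sketch using only the Python standard library. The inline document is a toy PROV-shaped example, not the actual contents of `docs/prov-spm.jsonld`, and the helper name is illustrative:

```python
import json

# Toy JSON-LD document shaped like a PROV graph (illustrative only).
doc = json.loads("""
{
  "@context": {"prov": "http://www.w3.org/ns/prov#"},
  "@graph": [
    {"@id": "urn:realign", "@type": "prov:Activity"},
    {"@id": "urn:rbold", "@type": "prov:Entity",
     "prov:wasGeneratedBy": {"@id": "urn:realign"}}
  ]
}
""")

def ids_by_type(document, rdf_type):
    """Return the @id of every node in @graph with the given @type."""
    return [n["@id"] for n in document["@graph"] if n.get("@type") == rdf_type]

print(ids_by_type(doc, "prov:Activity"))  # activities in the graph
print(ids_by_type(doc, "prov:Entity"))    # entities in the graph
```

The same loop applies to the real file after `json.load(open("docs/prov-spm.jsonld"))`, provided its top level carries an `@graph` array.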
89 changes: 89 additions & 0 deletions provenance_spm/code/spm_preprocessing.m
@@ -0,0 +1,89 @@
%-----------------------------------------------------------------------
% Job saved on 06-Jun-2025 15:40:49 by cfg_util (rev $Rev: 7345 $)
% spm SPM - SPM12 (7771)
% cfg_basicio BasicIO - Unknown
%-----------------------------------------------------------------------
matlabbatch{1}.cfg_basicio.file_dir.file_ops.file_move.files = {'data/sub-01/func/sub-01_task-tonecounting_bold.nii.gz'};
matlabbatch{1}.cfg_basicio.file_dir.file_ops.file_move.action.copyto = {'sub-01/func'};
matlabbatch{2}.cfg_basicio.file_dir.file_ops.file_move.files = {'data/sub-01/anat/sub-01_T1w.nii.gz'};
matlabbatch{2}.cfg_basicio.file_dir.file_ops.file_move.action.copyto = {'sub-01/anat'};
matlabbatch{3}.cfg_basicio.file_dir.file_ops.cfg_gunzip_files.files(1) = cfg_dep('Move/Delete Files: Moved/Copied Files', substruct('.','val', '{}',{1}, '.','val', '{}',{1}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('.','files'));
matlabbatch{3}.cfg_basicio.file_dir.file_ops.cfg_gunzip_files.outdir = {''};
matlabbatch{3}.cfg_basicio.file_dir.file_ops.cfg_gunzip_files.keep = false;
matlabbatch{4}.cfg_basicio.file_dir.file_ops.cfg_gunzip_files.files(1) = cfg_dep('Move/Delete Files: Moved/Copied Files', substruct('.','val', '{}',{2}, '.','val', '{}',{1}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('.','files'));
matlabbatch{4}.cfg_basicio.file_dir.file_ops.cfg_gunzip_files.outdir = {''};
matlabbatch{4}.cfg_basicio.file_dir.file_ops.cfg_gunzip_files.keep = false;
matlabbatch{5}.spm.spatial.realign.estwrite.data{1}(1) = cfg_dep('Gunzip Files: Gunzipped Files', substruct('.','val', '{}',{3}, '.','val', '{}',{1}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('()',{':'}));
matlabbatch{5}.spm.spatial.realign.estwrite.eoptions.quality = 0.9;
matlabbatch{5}.spm.spatial.realign.estwrite.eoptions.sep = 4;
matlabbatch{5}.spm.spatial.realign.estwrite.eoptions.fwhm = 5;
matlabbatch{5}.spm.spatial.realign.estwrite.eoptions.rtm = 1;
matlabbatch{5}.spm.spatial.realign.estwrite.eoptions.interp = 2;
matlabbatch{5}.spm.spatial.realign.estwrite.eoptions.wrap = [0 0 0];
matlabbatch{5}.spm.spatial.realign.estwrite.eoptions.weight = '';
matlabbatch{5}.spm.spatial.realign.estwrite.roptions.which = [2 1];
matlabbatch{5}.spm.spatial.realign.estwrite.roptions.interp = 4;
matlabbatch{5}.spm.spatial.realign.estwrite.roptions.wrap = [0 0 0];
matlabbatch{5}.spm.spatial.realign.estwrite.roptions.mask = 1;
matlabbatch{5}.spm.spatial.realign.estwrite.roptions.prefix = 'r';
matlabbatch{6}.spm.spatial.coreg.estimate.ref(1) = cfg_dep('Realign: Estimate & Reslice: Mean Image', substruct('.','val', '{}',{5}, '.','val', '{}',{1}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('.','rmean'));
matlabbatch{6}.spm.spatial.coreg.estimate.source(1) = cfg_dep('Gunzip Files: Gunzipped Files', substruct('.','val', '{}',{4}, '.','val', '{}',{1}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('()',{':'}));
matlabbatch{6}.spm.spatial.coreg.estimate.other = {''};
matlabbatch{6}.spm.spatial.coreg.estimate.eoptions.cost_fun = 'nmi';
matlabbatch{6}.spm.spatial.coreg.estimate.eoptions.sep = [4 2];
matlabbatch{6}.spm.spatial.coreg.estimate.eoptions.tol = [0.02 0.02 0.02 0.001 0.001 0.001 0.01 0.01 0.01 0.001 0.001 0.001];
matlabbatch{6}.spm.spatial.coreg.estimate.eoptions.fwhm = [7 7];
matlabbatch{7}.spm.spatial.preproc.channel.vols(1) = cfg_dep('Gunzip Files: Gunzipped Files', substruct('.','val', '{}',{4}, '.','val', '{}',{1}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('()',{':'}));
matlabbatch{7}.spm.spatial.preproc.channel.biasreg = 0.001;
matlabbatch{7}.spm.spatial.preproc.channel.biasfwhm = 60;
matlabbatch{7}.spm.spatial.preproc.channel.write = [0 1];
matlabbatch{7}.spm.spatial.preproc.tissue(1).tpm = {'/opt/spm12-r7219/spm12_mcr/spm12/tpm/TPM.nii,1'};
matlabbatch{7}.spm.spatial.preproc.tissue(1).ngaus = 1;
matlabbatch{7}.spm.spatial.preproc.tissue(1).native = [1 0];
matlabbatch{7}.spm.spatial.preproc.tissue(1).warped = [0 0];
matlabbatch{7}.spm.spatial.preproc.tissue(2).tpm = {'/opt/spm12-r7219/spm12_mcr/spm12/tpm/TPM.nii,2'};
matlabbatch{7}.spm.spatial.preproc.tissue(2).ngaus = 1;
matlabbatch{7}.spm.spatial.preproc.tissue(2).native = [1 0];
matlabbatch{7}.spm.spatial.preproc.tissue(2).warped = [0 0];
matlabbatch{7}.spm.spatial.preproc.tissue(3).tpm = {'/opt/spm12-r7219/spm12_mcr/spm12/tpm/TPM.nii,3'};
matlabbatch{7}.spm.spatial.preproc.tissue(3).ngaus = 2;
matlabbatch{7}.spm.spatial.preproc.tissue(3).native = [1 0];
matlabbatch{7}.spm.spatial.preproc.tissue(3).warped = [0 0];
matlabbatch{7}.spm.spatial.preproc.tissue(4).tpm = {'/opt/spm12-r7219/spm12_mcr/spm12/tpm/TPM.nii,4'};
matlabbatch{7}.spm.spatial.preproc.tissue(4).ngaus = 3;
matlabbatch{7}.spm.spatial.preproc.tissue(4).native = [1 0];
matlabbatch{7}.spm.spatial.preproc.tissue(4).warped = [0 0];
matlabbatch{7}.spm.spatial.preproc.tissue(5).tpm = {'/opt/spm12-r7219/spm12_mcr/spm12/tpm/TPM.nii,5'};
matlabbatch{7}.spm.spatial.preproc.tissue(5).ngaus = 4;
matlabbatch{7}.spm.spatial.preproc.tissue(5).native = [1 0];
matlabbatch{7}.spm.spatial.preproc.tissue(5).warped = [0 0];
matlabbatch{7}.spm.spatial.preproc.tissue(6).tpm = {'/opt/spm12-r7219/spm12_mcr/spm12/tpm/TPM.nii,6'};
matlabbatch{7}.spm.spatial.preproc.tissue(6).ngaus = 2;
matlabbatch{7}.spm.spatial.preproc.tissue(6).native = [0 0];
matlabbatch{7}.spm.spatial.preproc.tissue(6).warped = [0 0];
matlabbatch{7}.spm.spatial.preproc.warp.mrf = 1;
matlabbatch{7}.spm.spatial.preproc.warp.cleanup = 1;
matlabbatch{7}.spm.spatial.preproc.warp.reg = [0 0.001 0.5 0.05 0.2];
matlabbatch{7}.spm.spatial.preproc.warp.affreg = 'mni';
matlabbatch{7}.spm.spatial.preproc.warp.fwhm = 0;
matlabbatch{7}.spm.spatial.preproc.warp.samp = 3;
matlabbatch{7}.spm.spatial.preproc.warp.write = [0 1];
matlabbatch{7}.spm.spatial.preproc.warp.vox = NaN;
matlabbatch{7}.spm.spatial.preproc.warp.bb = [NaN NaN NaN NaN NaN NaN];
matlabbatch{8}.spm.spatial.normalise.write.subj.def(1) = cfg_dep('Segment: Forward Deformations', substruct('.','val', '{}',{7}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('.','fordef', '()',{':'}));
matlabbatch{8}.spm.spatial.normalise.write.subj.resample(1) = cfg_dep('Realign: Estimate & Reslice: Resliced Images (Sess 1)', substruct('.','val', '{}',{5}, '.','val', '{}',{1}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('.','sess', '()',{1}, '.','rfiles'));
matlabbatch{8}.spm.spatial.normalise.write.woptions.bb = [-78 -112 -70 78 76 85];
matlabbatch{8}.spm.spatial.normalise.write.woptions.vox = [2 2 2];
matlabbatch{8}.spm.spatial.normalise.write.woptions.interp = 4;
matlabbatch{8}.spm.spatial.normalise.write.woptions.prefix = 'w';
matlabbatch{9}.spm.spatial.normalise.write.subj.def(1) = cfg_dep('Segment: Forward Deformations', substruct('.','val', '{}',{7}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('.','fordef', '()',{':'}));
matlabbatch{9}.spm.spatial.normalise.write.subj.resample(1) = cfg_dep('Segment: Bias Corrected (1)', substruct('.','val', '{}',{7}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('.','channel', '()',{1}, '.','biascorr', '()',{':'}));
matlabbatch{9}.spm.spatial.normalise.write.woptions.bb = [-78 -112 -70 78 76 85];
matlabbatch{9}.spm.spatial.normalise.write.woptions.vox = [2 2 2];
matlabbatch{9}.spm.spatial.normalise.write.woptions.interp = 4;
matlabbatch{9}.spm.spatial.normalise.write.woptions.prefix = 'w';
matlabbatch{10}.spm.spatial.smooth.data(1) = cfg_dep('Normalise: Write: Normalised Images (Subj 1)', substruct('.','val', '{}',{8}, '.','val', '{}',{1}, '.','val', '{}',{1}, '.','val', '{}',{1}), substruct('()',{1}, '.','files'));
matlabbatch{10}.spm.spatial.smooth.fwhm = [6 6 6];
matlabbatch{10}.spm.spatial.smooth.dtype = 0;
matlabbatch{10}.spm.spatial.smooth.im = 0;
matlabbatch{10}.spm.spatial.smooth.prefix = 's';
29 changes: 29 additions & 0 deletions provenance_spm/dataset_description.json
@@ -0,0 +1,29 @@
{
"Name": "Provenance records for SPM-based fMRI preprocessing",
"BIDSVersion": "1.10.0",
"DatasetType": "derivative",
"License": "CC0",
"Authors": [
"Boris Clénet",
"Thomas Betton",
"Hermann Courteille",
"Cyril Regan"
],
"GeneratedBy": [
{
"Name": "SPM preprocessing"
}
],
"SourceDatasets": [
{
"URL": "ds000011",
"Version": "1.0.0"
},
{
"URL": "https://github.com/incf-nidash/nidmresults-examples/"
}
],
"DatasetLinks": {
"ds000011": "https://doi.org/10.18112/openneuro.ds000011.v1.0.0"
}
}
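The `DatasetLinks` entry above exists so that `bids:` URIs inside the provenance records can be resolved against named datasets. A minimal sketch of that resolution logic, following the BIDS URI scheme (`bids:<dataset>:<path>`, with an empty dataset name meaning the current dataset); the helper name and example paths are illustrative, not part of any BIDS tooling:

```python
def resolve_bids_uri(uri: str, dataset_links: dict) -> str:
    """Resolve a bids:<dataset>:<path> URI against a DatasetLinks mapping.

    An empty dataset name (bids::<path>) refers to the current dataset,
    so the relative path is returned unchanged.
    """
    if not uri.startswith("bids:"):
        raise ValueError(f"not a BIDS URI: {uri}")
    name, _, path = uri[len("bids:"):].partition(":")
    if name == "":
        return path  # bids::... points into the current dataset
    return dataset_links[name].rstrip("/") + "/" + path

links = {"ds000011": "https://doi.org/10.18112/openneuro.ds000011.v1.0.0"}
print(resolve_bids_uri("bids:ds000011:sub-01/anat/sub-01_T1w.nii.gz", links))
print(resolve_bids_uri("bids::prov/prov-spm_act.json", links))
```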