bruAristimunha committed (verified) · commit 38e0733 · 1 parent: f76fb96

Add architecture-only model card

Files changed (1): README.md (+186, −0)
---
license: bsd-3-clause
library_name: braindecode
pipeline_tag: feature-extraction
tags:
- eeg
- biosignal
- pytorch
- neuroscience
- braindecode
- convolutional
- transformer
- sleep-staging
---

# ContraWR

Contrast with the World Representation (ContraWR) from Yang et al. (2021).

> **Architecture-only repository.** This repo documents the
> `braindecode.models.ContraWR` class. **No pretrained weights are
> distributed here** — instantiate the model and train it on your own
> data, or fine-tune from a published foundation-model checkpoint
> separately.

## Quick start

```bash
pip install braindecode
```

```python
from braindecode.models import ContraWR

model = ContraWR(
    n_chans=22,
    sfreq=250,
    input_window_seconds=4.0,
    n_outputs=4,
)
```

The signal-shape arguments above are example defaults — adjust them
to match your recording.
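As a plain-Python sketch (no braindecode required), here is how the signal-shape arguments above relate to each other; ContraWR-style braindecode models consume batched windows of shape `(batch_size, n_chans, n_times)`, where `n_times` is the product of `sfreq` and `input_window_seconds`:

```python
# Sketch: how the signal-shape arguments relate (an illustration of the
# expected input shape, not braindecode internals).
sfreq = 250                  # sampling frequency in Hz
input_window_seconds = 4.0   # window length in seconds
n_chans = 22                 # number of EEG channels

n_times = int(sfreq * input_window_seconds)  # samples per window
batch_shape = (8, n_chans, n_times)          # (batch, channels, samples)

print(n_times)      # 1000
print(batch_shape)  # (8, 22, 1000)
```

If you change `sfreq` or `input_window_seconds` to match your recording, the model's expected `n_times` changes accordingly.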

## Documentation

- Full API reference (parameters, references, architecture figure):
  <https://braindecode.org/stable/generated/braindecode.models.ContraWR.html>
- Interactive browser with live instantiation:
  <https://huggingface.co/spaces/braindecode/model-explorer>
- Source on GitHub: <https://github.com/braindecode/braindecode/blob/master/braindecode/models/contrawr.py#L10>

## Architecture description

The block below is the rendered class docstring (parameters,
references, architecture figure where available).

<div class='bd-doc'><main>
<p>Contrast with the World Representation (ContraWR) from Yang et al. (2021) [Yang2021]_.</p>
<span style="display:inline-block;padding:2px 8px;border-radius:4px;background:#5cb85c;color:white;font-size:11px;font-weight:600;margin-right:4px;">Convolution</span>

This model is a convolutional neural network that uses a spectral
representation with a series of convolutional layers and residual blocks.
The model is designed to learn a representation of the EEG signal that can
be used for sleep staging.

Parameters
----------
steps : int, optional
    Number of steps used to determine the frequency decomposition's
    ``hop_length`` parameter, by default 20.
emb_size : int, optional
    Embedding size for the final layer, by default 256.
res_channels : list[int], optional
    Number of channels for each residual block, by default [32, 64, 128].
activation : nn.Module, default=nn.ELU
    Activation function class to apply. Should be a PyTorch activation
    module class like ``nn.ReLU`` or ``nn.ELU``. Default is ``nn.ELU``.
drop_prob : float, default=0.5
    The dropout rate for regularization. Values should be between 0 and 1.

.. versionadded:: 0.9

Notes
-----
This implementation has not been checked by the original authors, so its
correctness is not guaranteed. It adapts the original code from
[Code2023]_ with minimal modifications and is expected to work as
intended.

References
----------
.. [Yang2021] Yang, C., Xiao, C., Westover, M. B., & Sun, J. (2023).
    Self-supervised electroencephalogram representation learning for automatic
    sleep staging: model development and evaluation study. JMIR AI, 2(1), e46769.
.. [Code2023] Yang, C., Westover, M. B., & Sun, J. (2023). BIOT:
    Biosignal Transformer for Cross-data Learning in the Wild.
    GitHub. https://github.com/ycq091044/BIOT (accessed 2024-02-13)

.. rubric:: Hugging Face Hub integration

When the optional ``huggingface_hub`` package is installed, all models
automatically gain the ability to be pushed to and loaded from the
Hugging Face Hub. Install with::

    pip install braindecode[hub]

**Pushing a model to the Hub:**

.. code:: python

    from braindecode.models import ContraWR

    # Train your model
    model = ContraWR(n_chans=22, n_outputs=4, n_times=1000)
    # ... training code ...

    # Push to the Hub
    model.push_to_hub(
        repo_id="username/my-contrawr-model",
        commit_message="Initial model upload",
    )

**Loading a model from the Hub:**

.. code:: python

    from braindecode.models import ContraWR

    # Load pretrained model
    model = ContraWR.from_pretrained("username/my-contrawr-model")

    # Load with a different number of outputs (head is rebuilt automatically)
    model = ContraWR.from_pretrained("username/my-contrawr-model", n_outputs=4)

**Extracting features and replacing the head:**

.. code:: python

    import torch

    x = torch.randn(1, model.n_chans, model.n_times)
    # Extract encoder features (consistent dict across all models)
    out = model(x, return_features=True)
    features = out["features"]

    # Replace the classification head
    model.reset_head(n_outputs=10)

**Saving and restoring full configuration:**

.. code:: python

    import json

    config = model.get_config()  # all __init__ params
    with open("config.json", "w") as f:
        json.dump(config, f)

    model2 = ContraWR.from_config(config)  # reconstruct (no weights)

All model parameters (both EEG-specific and model-specific, such as
dropout rates, activation functions, and number of filters) are
automatically saved to the Hub and restored when loading.

See :ref:`load-pretrained-models` for a complete tutorial.</main>
</div>
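One parameter worth unpacking is `steps`, which the docstring above ties to the spectral decomposition's hop length. The following stdlib-only sketch shows one plausible reading, in which the hop length is derived as `n_times // steps`; this is an assumption for illustration, not verified against the braindecode source:

```python
# Hypothetical sketch of how `steps` could set the STFT hop length.
# ASSUMPTION: hop_length = n_times // steps. This mirrors the docstring's
# wording but is not verified against braindecode's actual implementation.
n_times = 1000   # e.g. 4 s at 250 Hz
steps = 20       # docstring default

hop_length = n_times // steps     # samples between successive frames
n_frames = n_times // hop_length  # roughly `steps` spectral frames

print(hop_length, n_frames)  # 50 20
```

Under this reading, larger `steps` values give a finer time resolution (more, more closely spaced spectral frames) at the same window length.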

## Citation

Please cite both the original paper for this architecture (see the
*References* section above) and braindecode:

```bibtex
@article{aristimunha2025braindecode,
  title   = {Braindecode: a deep learning library for raw electrophysiological data},
  author  = {Aristimunha, Bruno and others},
  journal = {Zenodo},
  year    = {2025},
  doi     = {10.5281/zenodo.17699192},
}
```

## License

BSD-3-Clause for the model code (matching braindecode).
If you fine-tune from a pretrained checkpoint, the derived weights
inherit the license of that checkpoint and its training corpus.