dbroustail committed
Commit ced21fc · verified · 1 Parent(s): 32e065b

Update README.md

Files changed (1)
1. README.md +28 -5
README.md CHANGED
```diff
@@ -127,7 +127,7 @@ model-index:
     metrics:
     - type: r2
       value: 0.116
-      name: R squared
+      name: R-squared
    - type: rmse
       value: 0.1482
       name: Root Mean Squared Error
@@ -144,6 +144,22 @@ model-index:
     - type: cohen_kappa
       value: 0.191
       name: Cohen's Kappa
+  - task:
+      type: time-series-classification
+      name: Major Depressive Disorder Detection
+    dataset:
+      type: MODMA
+      name: MODMA
+    metrics:
+    - type: balanced_accuracy
+      value: 0.595
+      name: Balanced Accuracy (%)
+    - type: roc_auc
+      value: 0.448
+      name: AUROC
+    - type: pr_auc
+      value: 0.420
+      name: AUC-PR
 ---
 <div align="center">
 <img src="https://raw.githubusercontent.com/pulp-bio/BioFoundation/refs/heads/main/docs/model/logo/LuMamba_logo.png" alt="LuMamba Logo" width="800"/>
@@ -214,12 +230,19 @@ Larger model sizes can be attained by increasing the number of bi-Mamba blocks `
 
 ---
 
-## 📊 Results (Highlights)
+## 📊 Results
 
-- **TUAB (abnormal vs normal):** 80.99 % Bal. Acc., 0.883 AUROC, 0.892 AUPR.
-(LuMamba-Tiny, pre-trained with LeJEPA-reconstruction).
+- **TUAB (abnormal vs normal)**: 80.99 % Bal. Acc., 0.883 AUROC, 0.892 AUPR (LuMamba-Tiny, pre-trained with LeJEPA-reconstruction).
+- **TUSL (slowing event vs. seizure detection)**: 0.708 AUROC, 0.289 AUPR (LuMamba-Tiny, pre-trained with reconstruction-only).
+- **TUAR (artifact detection)**: 0.914 AUROC, 0.510 AUPR (LuMamba-Tiny, pre-trained with reconstruction-only).
 - **APAVA (Alzheimer's detection)**: 0.955 AUROC, 0.970 AUPR (LuMamba-Tiny, pre-trained with LeJEPA-reconstruction).
 - **TDBrain (Parkinson's detection)**: 0.961 AUROC, 0.960 AUPR (LuMamba-Tiny, pre-trained with LeJEPA-reconstruction).
+- **Mumtaz2016 (depression detection)**: 0.725 Bal. Acc., 0.931 AUROC, 0.952 AUPR (LuMamba-Tiny, pre-trained with LeJEPA-reconstruction).
+- **SEED-V (5-class emotion detection)**: 0.350 Bal. Acc., 0.191 Cohen's Kappa (LuMamba-Tiny, pre-trained with reconstruction-only).
+- **MoBI (gait prediction)**: 0.116 R-squared, 0.148 RMSE (LuMamba-Tiny, pre-trained with reconstruction-only).
+- **MODMA (full 128-channel set)**: 59.47 % Bal. Acc., 0.448 AUROC, 0.420 AUPR (LuMamba-Tiny, pre-trained with reconstruction-only).
+- **MODMA (reduced 13-channel subset)**: 59.09 % Bal. Acc., 0.522 AUROC, 0.4153 AUPR (LuMamba-Tiny, pre-trained with LeJEPA-reconstruction).
+
 
 **Efficiency:** Up to **377× fewer FLOPs** relative to transformer-based baselines and support for up to **500× longer** EEG windows, thanks to the efficient FEMBA bi-Mamba encoder.
 
@@ -336,7 +359,7 @@ If you use LuMamba, please cite:
 - **[LUNA](https://huggingface.co/PulpBio/LUNA)** — Transformer-based topology-agnostic EEG foundation model (NeurIPS 2025). Source of the channel-unification cross-attention module that LuMamba reuses.
 - **[FEMBA](https://huggingface.co/PulpBio/FEMBA)** — Bidirectional Mamba foundation model for EEG. Source of the linear-complexity temporal backbone that LuMamba reuses.
 - **[TinyMyo](https://huggingface.co/PulpBio/TinyMyo)** — Tiny foundation model for flexible EMG signal processing at the edge.
--
+
 ## 🗒️ Changelog
 
 - **v1.0:** Initial release of LuMamba model card with task-specific checkpoints and instructions.
```
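The Results hunk above mixes classification metrics (Bal. Acc., AUROC, AUPR, Cohen's Kappa) with regression metrics (R-squared, RMSE) for MoBI gait prediction. For readers cross-checking the numbers, here is a minimal sketch of how such metrics are commonly computed with scikit-learn; the toy arrays and the 0.5 decision threshold are placeholder assumptions, not the authors' evaluation pipeline:

```python
# Minimal sketch of the metrics reported above, computed with scikit-learn
# on toy data. Illustrative only -- not the authors' evaluation pipeline.
import numpy as np
from sklearn.metrics import (
    average_precision_score,  # AUPR / AUC-PR
    balanced_accuracy_score,
    cohen_kappa_score,
    mean_squared_error,
    r2_score,
    roc_auc_score,
)

# Classification metrics (TUAB, TUSL, TUAR, APAVA, TDBrain, ...).
y_true = np.array([0, 1, 1, 0, 1, 0])                # toy binary labels
y_score = np.array([0.2, 0.8, 0.6, 0.4, 0.9, 0.3])   # positive-class scores
y_pred = (y_score >= 0.5).astype(int)                # assumed 0.5 threshold

print("Bal. Acc.:", balanced_accuracy_score(y_true, y_pred))
print("AUROC:", roc_auc_score(y_true, y_score))
print("AUPR:", average_precision_score(y_true, y_score))
print("Kappa:", cohen_kappa_score(y_true, y_pred))

# Regression metrics (MoBI gait prediction).
t_true = np.array([0.10, 0.40, 0.35, 0.80])
t_pred = np.array([0.15, 0.50, 0.30, 0.70])
print("R-squared:", r2_score(t_true, t_pred))
print("RMSE:", mean_squared_error(t_true, t_pred) ** 0.5)
```

Note that AUPR is computed here as average precision, the usual scikit-learn proxy for the area under the precision-recall curve.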
 
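The **Efficiency** line above attributes the FLOP savings to the linear-time bi-Mamba encoder that replaces quadratic self-attention. The sketch below works that scaling argument through with generic constants; the embedding width, state size, and FLOP formulas are illustrative assumptions, not LuMamba's actual configuration:

```python
# Illustrative scaling sketch: quadratic attention vs. linear SSM cost.
# All constants are assumptions for illustration, not LuMamba's real config.
def attention_flops(seq_len: int, d: int = 256) -> int:
    # QK^T and attention-weighted V each cost ~seq_len^2 * d;
    # Q/K/V/output projections cost ~4 * seq_len * d^2.
    return 2 * seq_len**2 * d + 4 * seq_len * d**2

def ssm_flops(seq_len: int, d: int = 256, state: int = 16) -> int:
    # A selective-scan pass costs ~seq_len * d * state;
    # input/output projections cost ~2 * seq_len * d^2.
    return seq_len * d * state + 2 * seq_len * d**2

for seq_len in (1_000, 10_000, 100_000):
    ratio = attention_flops(seq_len) / ssm_flops(seq_len)
    print(f"L={seq_len:>7}: attention/SSM FLOP ratio ~ {ratio:.0f}x")
```

Under these assumptions the attention/SSM ratio grows roughly linearly with window length, which is why a linear-time encoder can afford windows hundreds of times longer at a fixed compute budget.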
 
 
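Because the commit's main payload is the new `model-index` entry, it is worth sanity-checking that the YAML round-trips through the Hub's card parser. Here is a minimal sketch using `huggingface_hub`; the repo id `PulpBio/LuMamba` is an assumption inferred from the sibling `PulpBio/*` links in the README and may need adjusting:

```python
# Minimal sketch: parse the card and list its model-index eval results.
# Assumption: the card lives at "PulpBio/LuMamba" (inferred from the
# PulpBio/LUNA, PulpBio/FEMBA, and PulpBio/TinyMyo links; adjust as needed).
from huggingface_hub import ModelCard

card = ModelCard.load("PulpBio/LuMamba")
for res in card.data.eval_results or []:
    # One EvalResult per metric of each task/dataset pair in model-index.
    name = res.metric_name or res.metric_type
    print(f"{res.task_name} / {res.dataset_name}: {name} = {res.metric_value}")
```

Malformed front matter typically fails inside `ModelCard.load`, so this doubles as a cheap check after hand-editing the YAML.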