Files changed (1): README.md +1 -1
README.md CHANGED
@@ -93,7 +93,7 @@ We follow their training recipe and release our version of Mamba-7B.
 
 ## Training Details
 - Mamba-7B was trained using AWS SageMaker on 128 H100 80GB GPUs.
-- Training began in March 2024 and lasted around 3 weeks (some down time due to crashes and loss spikes)
+- Training began in March 2024 and lasted three weeks.
 | **Hyperparameter** | **Value** |
 |--------------------|------------|
 | Precision | `bfloat16` |