# Model Card for isaacchung/MiniCPM-2B-RAFT-lora-hotpotqa-dev
## Model Details

### Model Description

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.
- **Developed by:** Isaac Chung
- **License:** Apache 2.0
- **Finetuned from model:** [openbmb/MiniCPM-2B-sft-bf16](https://huggingface.co/openbmb/MiniCPM-2B-sft-bf16)
## Bias, Risks, and Limitations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.
## How to Get Started with the Model

Use the code below to get started with the model.
```python
# Load model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("isaacchung/MiniCPM-2B-RAFT-lora-hotpotqa-dev", trust_remote_code=True)
```
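Once the model is loaded, you can generate an answer with the standard `generate` API. The tokenizer source, prompt, and generation settings below are illustrative assumptions, not values taken from this repository:

```python
from transformers import AutoTokenizer

# Assumes the repo ships tokenizer files; otherwise load the tokenizer from the
# base model, openbmb/MiniCPM-2B-sft-bf16.
tokenizer = AutoTokenizer.from_pretrained("isaacchung/MiniCPM-2B-RAFT-lora-hotpotqa-dev", trust_remote_code=True)

# Example HotpotQA-style question; replace with your own prompt (and retrieved context for RAFT-style use).
prompt = "Which magazine was started first, Arthur's Magazine or First for Women?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Placeholder generation settings; tune max_new_tokens and sampling for your use case.
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```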
## Training Details

### Training Data
isaacchung/hotpotqa-dev-raft-subset
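The training subset can presumably be loaded with 🤗 Datasets, assuming it is hosted on the Hub under that id:

```python
from datasets import load_dataset

# Assumes the RAFT-formatted HotpotQA subset is published as a Hub dataset under this id.
ds = load_dataset("isaacchung/hotpotqa-dev-raft-subset")
print(ds)
print(ds["train"][0])  # inspect one example; the split name may differ
```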
### Training Procedure

#### Training Hyperparameters

See https://github.com/isaac-chung/MiniCPM/commit/213282b679eb8eb054bb13f02af71b9d71ad3721.
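For orientation only, a LoRA fine-tune of this kind is typically set up with 🤗 PEFT roughly as sketched below. Every value here (rank, alpha, dropout, target modules) is an assumption for illustration, not the configuration used for this checkpoint; the actual settings are in the commit linked above:

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("openbmb/MiniCPM-2B-sft-bf16", trust_remote_code=True)

# Illustrative values only; the real hyperparameters live in the linked commit.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed attention projection names
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()
```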
#### Speeds, Sizes, Times

- train_runtime: 4607.6477 s
- train_samples_per_second: 5.209
- train_steps_per_second: 0.651
- train_loss: 0.5153028686841329
- epoch: 9.52
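For reference, these figures imply roughly 3,000 optimizer steps (4607.6 s × 0.651 steps/s), about 24,000 training samples seen (4607.6 s × 5.209 samples/s), and an effective batch size of about 8 (5.209 / 0.651).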
#### Training Loss
From the last epoch:
{'loss': 0.4504, 'grad_norm': 2.259155507591921, 'learning_rate': 2.7586206896551725e-06, 'epoch': 9.02}
{'loss': 0.431, 'grad_norm': 1.7071411656099411, 'learning_rate': 2.586206896551724e-06, 'epoch': 9.05}
{'loss': 0.4627, 'grad_norm': 1.7915555416805786, 'learning_rate': 2.413793103448276e-06, 'epoch': 9.08}
{'loss': 0.4528, 'grad_norm': 1.9988269942330565, 'learning_rate': 2.2413793103448275e-06, 'epoch': 9.11}
{'loss': 0.445, 'grad_norm': 1.8423666856380017, 'learning_rate': 2.0689655172413796e-06, 'epoch': 9.14}
{'loss': 0.4424, 'grad_norm': 1.7539963730934427, 'learning_rate': 1.896551724137931e-06, 'epoch': 9.17}
{'loss': 0.3817, 'grad_norm': 1.755668315740134, 'learning_rate': 1.724137931034483e-06, 'epoch': 9.21}
{'loss': 0.4012, 'grad_norm': 1.8214703589809635, 'learning_rate': 1.5517241379310346e-06, 'epoch': 9.24}
{'loss': 0.4567, 'grad_norm': 1.6490771602855827, 'learning_rate': 1.3793103448275862e-06, 'epoch': 9.27}
{'loss': 0.491, 'grad_norm': 1.5838108179327266, 'learning_rate': 1.206896551724138e-06, 'epoch': 9.3}
{'loss': 0.516, 'grad_norm': 1.7848893180960532, 'learning_rate': 1.0344827586206898e-06, 'epoch': 9.33}
{'loss': 0.3674, 'grad_norm': 1.6589815898285354, 'learning_rate': 8.620689655172415e-07, 'epoch': 9.37}
{'loss': 0.455, 'grad_norm': 1.6377170040397837, 'learning_rate': 6.896551724137931e-07, 'epoch': 9.4}
{'loss': 0.4322, 'grad_norm': 1.7061632686271986, 'learning_rate': 5.172413793103449e-07, 'epoch': 9.43}
{'loss': 0.3934, 'grad_norm': 1.784527156508834, 'learning_rate': 3.4482758620689656e-07, 'epoch': 9.46}
{'loss': 0.4457, 'grad_norm': 1.5131773700813846, 'learning_rate': 1.7241379310344828e-07, 'epoch': 9.49}
{'loss': 0.4026, 'grad_norm': 1.8239453129182908, 'learning_rate': 0.0, 'epoch': 9.52}
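These are raw 🤗 Trainer log entries. To summarize them, for example to average the loss over this final epoch, a minimal sketch (assuming the entries above are saved one per line in a hypothetical file named `last_epoch_logs.txt`):

```python
import ast

# Each line is a Python dict literal, so ast.literal_eval can parse it safely.
with open("last_epoch_logs.txt") as f:
    entries = [ast.literal_eval(line.strip()) for line in f if line.strip()]

avg_loss = sum(e["loss"] for e in entries) / len(entries)
print(f"steps: {len(entries)}, average loss: {avg_loss:.4f}")
```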
## Technical Specifications

### Compute Infrastructure

#### Hardware
- 1x NVIDIA RTX 6000 Ada
## Model Card Authors

Isaac Chung

## Model Card Contact