---
library_name: transformers
tags:
- mergekit
- arcee-ai
datasets:
- arcee-ai/sec-data-mini
---

# Model Card for Model ID

<!-- Provide a quick summary of what the model is/does. -->



## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

- **Developed by:** Arcee-ai
- **Created from model:** mistral-ai/Mistral-7B-Instruct-v0.2 (see the layer-count check below)
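
As a rough sanity check of the derivation, the sketch below compares this model's layer count against the base model. It assumes the model was produced by PruneMe-style layer pruning of Mistral-7B-Instruct-v0.2; the repository id `arcee-ai/<this-model>` is a placeholder for this model's actual Hub id.

```python
from transformers import AutoConfig

# Placeholder id -- replace with this model's actual Hub repository id.
pruned_id = "arcee-ai/<this-model>"
# Hub id assumed for the base model.
base_id = "mistralai/Mistral-7B-Instruct-v0.2"

pruned_cfg = AutoConfig.from_pretrained(pruned_id)
base_cfg = AutoConfig.from_pretrained(base_id)

# A PruneMe-style model keeps the base architecture but drops decoder layers.
print("base layers:  ", base_cfg.num_hidden_layers)
print("pruned layers:", pruned_cfg.num_hidden_layers)
```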

### Model Sources

<!-- Provide the basic links for the model. -->

- **Repository:** https://github.com/arcee-ai/PruneMe
- **Paper:** https://arxiv.org/pdf/2403.17887.pdf

## Uses

Example use cases are described in the PruneMe README: https://github.com/arcee-ai/PruneMe/tree/main?tab=readme-ov-file#use-cases
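
A minimal inference sketch, assuming the tokenizer keeps the base model's chat template; the repository id `arcee-ai/<this-model>` is a placeholder for this model's actual Hub id.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id -- replace with this model's actual Hub repository id.
model_id = "arcee-ai/<this-model>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Mistral-Instruct tokenizers ship a chat template; apply it before generating.
messages = [{"role": "user", "content": "Summarize what a 10-K filing is."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```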

### Downstream Use

The model can be used as a base for fine-tuning; continual pre-training would also be worth exploring. A minimal fine-tuning sketch follows.
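
The sketch below uses the 🤗 Trainer. The repository id is a placeholder, and the `arcee-ai/sec-data-mini` dataset listed in the card front matter is assumed to expose a plain `text` column; adjust the tokenization step to the dataset's actual schema.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Placeholder id -- replace with this model's actual Hub repository id.
model_id = "arcee-ai/<this-model>"

tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id)

# Dataset listed in the card front matter; a `text` column is assumed.
dataset = load_dataset("arcee-ai/sec-data-mini", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=1024)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="pruned-mistral-sec-finetune",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        bf16=True,
        logging_steps=10,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```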