PierreColombo committed
Commit 4a6f144
1 Parent(s): cd3d59a

Update README.md

Files changed (1)
  1. README.md +12 -6
README.md CHANGED
@@ -1,11 +1,15 @@
 ---
 library_name: transformers
-tags: []
+tags:
+- legal
+license: mit
+language:
+- en
 ---
 
-# SaulLM-7B-Base
+# Equall/Saul-Base-v1
 
-The base model for SaulLM-7B, a large language model tailored for the legal domain. This model is obtained by continued pretraining of Mistral-7B.
+This is the base model for Equall/Saul-Base, a large instruct language model tailored for the legal domain. This model is obtained by continued pretraining of Mistral-7B.
 
 
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/644a900e3a619fe72b14af0f/OU4Y3s-WckYKMN4fQkNiS.png)
@@ -22,7 +26,6 @@ This is the model card of a 🤗 transformers model that has been pushed on the
 - **Model type:** 7B
 - **Language(s) (NLP):** English
 - **License:** MIT
-- **Finetuned from model:** Check Saul-7B-Instruct
 
 ### Model Sources
 
@@ -46,7 +49,7 @@ Here's how you can run the model using the pipeline() function from 🤗 Transfo
 import torch
 from transformers import pipeline
 
-pipe = pipeline("text-generation", model="Equall/Saul-Base", torch_dtype=torch.bfloat16, device_map="auto")
+pipe = pipeline("text-generation", model="Equall/Saul-Instruct-v1", torch_dtype=torch.bfloat16, device_map="auto")
 # We use the tokenizer’s chat template to format each message - see https://huggingface.co/docs/transformers/main/en/chat_templating
 messages = [
 {"role": "user", "content": "[YOUR QUERY GOES HERE]"},
@@ -65,6 +68,9 @@ This model is built upon the technology of LLM, which comes with inherent limita
 
 ## Citation
 
+<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
+
+**BibTeX:**
 
 
 ```bibtex
@@ -76,4 +82,4 @@ This model is built upon the technology of LLM, which comes with inherent limita
 archivePrefix={arXiv},
 primaryClass={cs.CL}
 }
-```
+```
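The diff only shows the changed line of the README's usage snippet. For context, a minimal end-to-end sketch of that pipeline() example might look like the following; the model ID is taken from the updated line of the diff, while the chat-template call and the generation parameters (max_new_tokens, do_sample) are illustrative assumptions, not part of the commit:

```python
import torch
from transformers import pipeline

# Model ID from the updated line of the diff
pipe = pipeline(
    "text-generation",
    model="Equall/Saul-Instruct-v1",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Format the conversation with the tokenizer's chat template
messages = [
    {"role": "user", "content": "[YOUR QUERY GOES HERE]"},
]
prompt = pipe.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Generation parameters here are placeholders, not taken from the README
outputs = pipe(prompt, max_new_tokens=256, do_sample=False)
print(outputs[0]["generated_text"])
```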