---
language:
- en
tags:
- fine-tuned
- legal
- Indian law
license: apache-2.0
metrics:
- perplexity
---

# Fine-Tuned Falcon 7B - Indian Law

This is a Falcon 7B model fine-tuned for question answering in the domain of Indian law. It has been trained to answer questions about various aspects of the Indian legal system, such as the Constitution, the roles of governmental positions, and more.

## Model Description

Falcon is a family of state-of-the-art language models created by the Technology Innovation Institute in Abu Dhabi. This version, Falcon 7B, has been fine-tuned to specialize in understanding and generating responses related to Indian law. The model was trained on a custom dataset of question-answer pairs about Indian law.

## Fine-tuning details

The model was fine-tuned with a batch size of 1 and a learning rate of 2e-4 over five epochs, using gradient accumulation with a step size of four. The maximum sequence length was set to 512 tokens, and training ran for a total of 250 steps.
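The fine-tuning hyperparameters above can be cross-checked with a little arithmetic. Note that the implied dataset size below is an inference from the stated numbers, not a figure reported in this model card:

```python
batch_size = 1        # sequences per forward pass
grad_accum_steps = 4  # gradient accumulation step size
epochs = 5
total_steps = 250     # total training steps reported

# Effective batch size seen by each optimizer update
effective_batch = batch_size * grad_accum_steps
print(effective_batch)  # 4 sequences per update

# Steps per epoch, and the dataset size this would imply
steps_per_epoch = total_steps // epochs
implied_examples = steps_per_epoch * effective_batch
print(steps_per_epoch, implied_examples)  # 50 steps/epoch, ~200 examples
```

This suggests the training set held on the order of 200 examples, which is consistent with a small custom question-answer dataset.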
## How to use
You can use this model to generate responses with the `transformers` library:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='path_to_your_model')
print(generator("<human>: What is the role of the Judiciary as per the Constitution of India?", max_length=100))
```
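The pipeline call above returns a list of dicts, each with a `'generated_text'` key containing the prompt followed by the model's continuation. A small helper for stripping the echoed prompt is sketched below; the helper name and the mocked output are illustrative, not part of this model card:

```python
def extract_answer(outputs, prompt):
    """Return the generated continuation with the echoed prompt removed."""
    text = outputs[0]['generated_text']
    return text[len(prompt):].strip() if text.startswith(prompt) else text.strip()

# Mocked output, shaped like a transformers text-generation result
mock = [{'generated_text': "<human>: What is the role of the Judiciary? The Judiciary interprets the Constitution."}]
print(extract_answer(mock, "<human>: What is the role of the Judiciary?"))
# → The Judiciary interprets the Constitution.
```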