Update README.md

README.md
---
language:
…
- ja
---

## FINGU-AI/FinguAI-Chat-v1

### Overview

The FINGU-AI/FinguAI-Chat-v1 model offers a specialized curriculum tailored to English, Korean, and Japanese speakers interested in finance, investment, and legal frameworks. It aims to enhance language proficiency while providing insights into global finance markets and regulatory landscapes.

### Key Features

…

### Model Information

- **Model Name**: FINGU-AI/FinguAI-Chat-v1
- **Description**: FINGU-AI/FinguAI-Chat-v1 is a chat model trained on several languages, including English, Korean, and Japanese.
- **Checkpoint**: FINGU-AI/FinguAI-Chat-v1
- **Author**:
- **License**: Apache-2.0

### How to Use

To use the FINGU-AI/FinguAI-Chat-v1 model, you can use the Hugging Face Transformers library. Here's a Python code snippet demonstrating how to load the model and generate predictions:

```python
#!pip install 'transformers>=4.39.0'
from transformers import AutoModelForCausalLM, AutoTokenizer, AutoConfig, TextStreamer
import torch

# … (tokenizer setup elided in the source diff)

streamer = TextStreamer(tokenizer)
model_id = 'FINGU-AI/FinguAI-Chat-v1'
# config = AutoConfig.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="flash_attention_2",  # requires the flash-attn package and a compatible GPU
    torch_dtype=torch.bfloat16,
)
model.to('cuda')
```
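The snippet above stops after moving the model to the GPU. In practice the prompt is built from chat turns, and Qwen-family chat models typically use the ChatML turn format, which `tokenizer.apply_chat_template` renders automatically. The hand-rolled formatter below is a hedged sketch of that format for illustration only; the README does not show this model's actual chat template, so prefer `apply_chat_template` in real use:

```python
# Sketch of a ChatML-style prompt, the turn format commonly used by
# Qwen-family chat models. This is an assumption for illustration --
# in practice, call tokenizer.apply_chat_template instead.

def format_chatml(messages, add_generation_prompt=True):
    """Render a list of {"role", "content"} turns as a ChatML prompt string."""
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>" for m in messages]
    if add_generation_prompt:
        # Append the assistant header so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

# Hypothetical chat turns, using the common role/content schema.
messages = [
    {"role": "system", "content": "You are a helpful financial assistant."},
    {"role": "user", "content": "What is yield to maturity?"},
]
prompt = format_chatml(messages)
```

With the model and tokenizer loaded, `inputs = tokenizer(prompt, return_tensors='pt').to('cuda')` followed by `model.generate(**inputs, streamer=streamer, max_new_tokens=256)` would then stream the reply token by token.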