---
license: cc-by-nc-4.0
language:
- ko
library_name: transformers
base_model: LDCC/LDCC-SOLAR-10.7B
pipeline_tag: text-generation
---

# **msy127/ft-240209-sft**

## Our Team

| Research & Engineering | Product Management |
| :--------------------: | :----------------: |
|       David Sohn       |     David Sohn     |

## **Model Details**

### **Base Model**

[LDCC/LDCC-SOLAR-10.7B](https://huggingface.co/LDCC/LDCC-SOLAR-10.7B)

### **Trained On**

- **OS**: Ubuntu 22.04
- **GPU**: 1× NVIDIA A100 40GB
- **transformers**: v4.37

## **Implementation Code**

This model's tokenizer ships with a chat_template that defines its instruction format. You can load the model with the code below.

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="msy127/ft-240209-sft")

# Or load the tokenizer and model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("msy127/ft-240209-sft")
model = AutoModelForCausalLM.from_pretrained("msy127/ft-240209-sft")
```
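
Because the tokenizer bundles a chat_template, prompts can be rendered with `apply_chat_template` before generation. The sketch below is a minimal, illustrative example, assuming the template accepts a single-turn list of `role`/`content` messages; the sample question and generation settings are not taken from the model card.

```python
# Minimal generation sketch using the bundled chat_template.
# The question and generation parameters below are illustrative only.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "msy127/ft-240209-sft"
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Use fp16 on GPU, fp32 on CPU.
dtype = torch.float16 if torch.cuda.is_available() else torch.float32
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=dtype)
model.to("cuda" if torch.cuda.is_available() else "cpu")

# Single-turn conversation; assumes the chat template accepts a plain user message.
messages = [
    {"role": "user", "content": "한국의 수도는 어디인가요?"},  # "What is the capital of Korea?"
]

# Render the conversation with the model's own chat_template and tokenize it.
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated tokens (skip the echoed prompt).
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Slicing the output at `input_ids.shape[-1]` prints only the assistant's reply instead of echoing the rendered prompt.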

## **Introduction to our service platform**

- An AI companion service platform that talks with you while looking at your face.
- You can preview the future of the world's best character AI service, character.ai.
- https://livetalkingai.com