---
license: cc-by-4.0
datasets:
  - lksy/ru_instruct_gpt4
  - IlyaGusev/ru_turbo_saiga
  - IlyaGusev/ru_sharegpt_cleaned
  - IlyaGusev/oasst1_ru_main_branch
language:
  - ru
pipeline_tag: text-generation
---

This is a generative model based on IlyaGusev/saiga_mistral_7b_lora, converted to fp16 format.
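Because the weights are stored as a regular fp16 checkpoint, the model can also be loaded directly with `transformers`. The snippet below is a minimal smoke-test sketch: it assumes this repository contains a standard Hugging Face checkpoint and feeds the prompt as raw text, without the Saiga chat formatting that the vLLM server setup below handles for you.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Gaivoronsky/Mistral-7B-Saiga"

# Load the fp16 weights; device_map="auto" places layers on available GPUs
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

prompt = "Привет"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```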

Install vLLM:

```bash
pip install vllm
```

Start server:

```bash
python -u -m vllm.entrypoints.openai.api_server --host 0.0.0.0 --model Gaivoronsky/Mistral-7B-Saiga
```
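The server exposes an OpenAI-compatible API, listening on port 8000 by default. A quick check that it is up and serving the model (a minimal sketch, assuming the default host and port):

```python
import requests

# List the models served by the vLLM OpenAI-compatible server
resp = requests.get("http://localhost:8000/v1/models")
print(resp.json())
```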

Client:

```python
import openai

# Point the client at the local vLLM server (default host/port; adjust if needed)
openai.api_base = "http://localhost:8000/v1"
openai.api_key = "EMPTY"  # vLLM does not check the key by default

response = openai.ChatCompletion.create(
    model="Gaivoronsky/Mistral-7B-Saiga",
    messages=[{"role": "user", "content": "Привет"}],
    max_tokens=512,
)
print(response["choices"][0]["message"]["content"])
```
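For longer generations the same endpoint can stream tokens as they are produced. This is a sketch using the same pre-1.0 `openai` client as above; `stream=True` is part of the standard ChatCompletion API, and vLLM's OpenAI-compatible server supports it.

```python
import openai

openai.api_base = "http://localhost:8000/v1"
openai.api_key = "EMPTY"

# Stream the reply chunk-by-chunk instead of waiting for the full response
stream = openai.ChatCompletion.create(
    model="Gaivoronsky/Mistral-7B-Saiga",
    messages=[{"role": "user", "content": "Привет"}],
    max_tokens=512,
    stream=True,
)
for chunk in stream:
    delta = chunk["choices"][0]["delta"]
    print(delta.get("content", ""), end="", flush=True)
```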