OpenVINO IR model with int8 quantization
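For reference, an int8 OpenVINO export of this kind is typically produced with Hugging Face Optimum Intel. The snippet below is only a minimal sketch, not the exact procedure used for this repository; it assumes optimum[openvino] is installed, uses the upstream 01-ai/Yi-1.5-6B-Chat checkpoint, and the output directory name is illustrative.

# Minimal sketch: convert a Transformers checkpoint to OpenVINO IR with
# int8 weight-only quantization via Optimum Intel (pip install optimum[openvino]).
# Source model id and output directory are illustrative assumptions.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "01-ai/Yi-1.5-6B-Chat"  # assumed upstream checkpoint

# export=True converts the PyTorch weights to OpenVINO IR;
# load_in_8bit=True applies int8 weight compression.
model = OVModelForCausalLM.from_pretrained(model_id, export=True, load_in_8bit=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Save the IR (openvino_model.xml/.bin) together with the tokenizer files.
model.save_pretrained("Yi-1.5-6B-Chat-ov-int8")
tokenizer.save_pretrained("Yi-1.5-6B-Chat-ov-int8")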

Model definition for LocalAI:

name: Yi-6B
backend: transformers
parameters:
  model: fakezeta/Yi-1.5-6B-Chat-ov-int8
context_size: 8192
type: OVModelForCausalLM
template:
  use_tokenizer_template: true

To run the model directly with LocalAI:

local-ai run huggingface://fakezeta/Yi-1.5-6B-Chat-ov-int8/model.yaml
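Once LocalAI has loaded the model it serves an OpenAI-compatible API, and the template section above tells it to build prompts with the tokenizer's own chat template. Below is a minimal sketch of a chat request using Python requests; it assumes LocalAI's default address http://localhost:8080, and the model field must match the name field in the definition above (adjust both to your setup).

# Minimal sketch: call LocalAI's OpenAI-compatible chat completions endpoint.
# Assumes LocalAI is running on its default port 8080.
import requests

response = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "model": "Yi-6B",  # must match the name field in the model definition
        "messages": [
            {"role": "user", "content": "Give a one-sentence summary of OpenVINO."}
        ],
        "temperature": 0.7,
    },
    timeout=300,
)
print(response.json()["choices"][0]["message"]["content"])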

πŸ™ GitHub β€’ πŸ‘Ύ Discord β€’ 🐀 Twitter β€’ πŸ’¬ WeChat
πŸ“ Paper β€’ πŸ™Œ FAQ β€’ πŸ“— Learning Hub

Intro

Yi-1.5 is an upgraded version of Yi. It is continuously pre-trained on Yi with a high-quality corpus of 500B tokens and fine-tuned on 3M diverse samples.

Compared with Yi, Yi-1.5 delivers stronger performance in coding, math, reasoning, and instruction following, while maintaining excellent capabilities in language understanding, commonsense reasoning, and reading comprehension.

| Model  | Context Length | Pre-trained Tokens |
|--------|----------------|--------------------|
| Yi-1.5 | 4K             | 3.6T               |

Models

Benchmarks

  • Chat models

    Yi-1.5-34B-Chat is on par with or excels beyond larger models in most benchmarks.

    Yi-1.5-9B-Chat is the top performer among similarly sized open-source models.

  • Base models

    Yi-1.5-34B is on par with or excels beyond larger models in some benchmarks.

    Yi-1.5-9B is the top performer among similarly sized open-source models.

Quick Start

To get up and running with the Yi-1.5 models quickly, see the README.
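
This OpenVINO export can also be run outside LocalAI with Hugging Face Optimum Intel. The snippet below is a minimal sketch assuming optimum[openvino] and transformers are installed:

# Minimal sketch: run the int8 OpenVINO IR directly with Optimum Intel
# (pip install optimum[openvino] transformers).
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "fakezeta/Yi-1.5-6B-Chat-ov-int8"
model = OVModelForCausalLM.from_pretrained(model_id)   # loads the OpenVINO IR
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build the prompt with the tokenizer's chat template, as LocalAI does
# when use_tokenizer_template is enabled.
messages = [{"role": "user", "content": "What can you do?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))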
