
⛰️Chinese Valley: A Bilingual Video Assistant with Large Language model Enhanced abilitY

Ziwang Zhao, Ruipu Luo, Da Li, Min Yang, Minghui Qiu, Zhiheng Wang, Changqing Li, Kang Wu

Chinese Valley is a derivative of Valley that extends the model with the ability to converse in both Chinese and English. It is built on BELLE-LLaMA-EXT-7B and trained on a Chinese-English multimodal dataset. We also provide 13B-scale Chinese Valley weights at Chinese-Valley-13B.

Usage:

You can run Chinese Valley with the following commands:

# clone repository
git clone https://github.com/RupertLuo/Valley.git
cd Valley

# install requirements
pip install -e .

# download model checkpoints
git lfs install
git lfs clone https://huggingface.co/Zhaoziwang/chinese_valley7b_v1

# generate result
python valley/inference/run_valley.py --model-name "./chinese_valley7b_v1" --video-file "./valley/serve/examples/videos/aa5dbc3a110f410bb02572408b0fb778.mp4" --query "视频内容是什么<video>" --system-prompt "你是大型语言视觉助手 Chinese-Valley。你能够理解用户提供的视觉内容或视频,并使用自然语言协助用户完成各种任务。请仔细按照人类的指令进行回答,并详细解释你的答案。"
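
If you prefer to run the same inference from Python rather than the shell, the sketch below downloads the checkpoint with huggingface_hub and then calls the inference script with the same arguments as the command above. The snapshot_download call and the subprocess wrapper are illustrative assumptions on our part, not part of the Valley codebase.

# A minimal Python sketch (assumes the Valley repo is cloned, `pip install -e .`
# has been run from its root, and huggingface_hub is installed).
import subprocess
from huggingface_hub import snapshot_download

# Download the checkpoint (equivalent to the `git lfs clone` step above).
ckpt_dir = snapshot_download(
    repo_id="Zhaoziwang/chinese_valley7b_v1",
    local_dir="./chinese_valley7b_v1",
)

# Invoke the inference script with the same flags as the shell example.
subprocess.run(
    [
        "python", "valley/inference/run_valley.py",
        "--model-name", ckpt_dir,
        "--video-file", "./valley/serve/examples/videos/aa5dbc3a110f410bb02572408b0fb778.mp4",
        "--query", "视频内容是什么<video>",
        "--system-prompt", "你是大型语言视觉助手 Chinese-Valley。你能够理解用户提供的视觉内容或视频,并使用自然语言协助用户完成各种任务。请仔细按照人类的指令进行回答,并详细解释你的答案。",
    ],
    check=True,
)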

A Colab notebook is also available, so you can try Chinese-Valley-7B directly in Colab.

License

Chinese Valley is released under the MIT License. All data and code in this project may be used for academic purposes only.

Citation

If this project is helpful to your research, please consider liking this model and following our future work.
