llm-jp/llm-jp-13b-instruct-full-ac_001_16x-dolly-ichikara_004_001_single-oasst-oasst2-v2.0
This repository provides large language models developed by LLM-jp, a collaborative project launched in Japan.
Pre-trained models:
- llm-jp-13b-v2.0
- llm-jp-13b-v1.0
- llm-jp-1.3b-v1.0
Checkpoints format: transformers (Megatron-DeepSpeed format available here)
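Since the checkpoints are published in the transformers format, they can be loaded with the standard Hugging Face API. The following is a minimal sketch, assuming the `transformers` and `torch` packages are installed; the generation parameters (`temperature`, `max_new_tokens`) are illustrative defaults, not values recommended by LLM-jp.

```python
# Minimal sketch of loading the instruct model with Hugging Face transformers.
# Generation settings below are illustrative assumptions, not official defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "llm-jp/llm-jp-13b-instruct-full-ac_001_16x-dolly-ichikara_004_001_single-oasst-oasst2-v2.0"

def generate(prompt: str, max_new_tokens: int = 128) -> str:
    """Load the model lazily and return a completion for the given prompt."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        device_map="auto",   # place weights on available GPU(s), else CPU
        torch_dtype="auto",  # use the checkpoint's native precision
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,
    )
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("自然言語処理とは何か"))
```

Note that the 13B parameter checkpoint requires roughly 26 GB of memory in 16-bit precision, so a suitably sized GPU (or CPU offloading) is needed.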