
BLOOM-zh

Traditional Chinese-enhanced BLOOM language model

Model Card

Version 1.0 / 10.Apr.2023

BLOOM-zh is a joint collaboration between the CKIP lab at Academia Sinica (link), MediaTek Research (link), and the National Academy for Educational Research (link). This model is released for non-commercial research purposes only.

Table of Contents

  1. Model Details
  2. Uses
  3. Training Data
  4. Risks and Limitations
  5. Recommendations
  6. Model Card Authors

Model Details

BLOOM-zh is a language model with enhanced Traditional Chinese capability. It is derived from BLOOMZ and further pretrained on a large amount of Traditional Chinese text data.

Basics

  • Developed by: MediaTek Research
  • Model Type: Transformer-based Language Model
  • Version: 1.0.0
  • Languages: Multiple; see training data
  • License: MEDIATEK RESEARCH License (link) and RAIL License v1.0 (link)
  • Release Date Estimate: Monday, 10 April 2023
  • Send Questions to: info@mtkresearch.com
  • Paper: https://arxiv.org/abs/2303.04715
  • Cite as: MediaTek Research: Traditional Chinese-enhanced BLOOM language model. International, February 2023.
  • Organizations of contributors:
    • MediaTek Research
    • Academia Sinica
    • National Academy for Educational Research

Technical Specifications

This section provides information for people who work on model development.

For technical specifications, please refer to BLOOM.
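
As a quick architecture sanity check, the checkpoint can be inspected with the Hugging Face transformers library. The sketch below is illustrative only: it assumes the checkpoint is published on the Hub as ckip-joint/bloom-3b-zh and follows the standard BLOOM configuration.

```python
# Minimal sketch (assumption: the checkpoint is available on the Hugging Face Hub
# as "ckip-joint/bloom-3b-zh" and follows the standard BLOOM architecture).
from transformers import AutoConfig, AutoModelForCausalLM

config = AutoConfig.from_pretrained("ckip-joint/bloom-3b-zh")
print(config.model_type)  # expected to report a BLOOM-style architecture

model = AutoModelForCausalLM.from_pretrained("ckip-joint/bloom-3b-zh")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e9:.2f}B parameters")  # roughly 3B for this release
```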

Environmental Impact

For environmental impact, please refer to BLOOM.

Uses

This section addresses questions around how the model is intended to be used, discusses the foreseeable users of the model (including those affected by the model), and describes uses that are considered out of scope or misuse of the model. It provides information for anyone considering using the model or who is affected by the model.

For the uses of the model, please refer to BLOOM.
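
As an illustration of the intended research use (prompting and text generation in Traditional Chinese), the following is a minimal sketch using the standard transformers causal-LM interface; the Hub id ckip-joint/bloom-3b-zh, the sampling settings, and the example prompt are assumptions for demonstration, not part of the official documentation.

```python
# Minimal generation sketch (assumption: Hub id "ckip-joint/bloom-3b-zh";
# the prompt is an arbitrary Traditional Chinese example).
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "ckip-joint/bloom-3b-zh"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "四月的台北"  # "Taipei in April"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```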

Training Data

This section provides a high-level overview of the training data. It is relevant for anyone who wants to know the basics of what the model is learning.

We trained the 3B-parameter model on a total of 13 billion tokens of mostly high-quality Traditional Chinese text. Details are provided in the paper.

Risks and Limitations

This section identifies foreseeable harms and misunderstandings.

For risks and limitations, please refer to BLOOM.

Factors

This section lists some different aspects of BLOOM models. Its focus is on those aspects that are likely to give rise to high variance in model behavior.

  • The model is further trained on Traditional Chinese text; however, the pretrained BLOOMZ weights cover more than 40 languages.

  • The model is trained on web-crawled data, news articles, novels, knowledge sources (encyclopedias, educational materials), and instructions.

Recommendations

This section provides information on warnings and potential mitigations.

For recommendations, please refer to BLOOM.

Model Card Authors

Ordered roughly chronologically and by amount of time spent.

Philipp Ennen, Po-Chun Hsu, Chan-Jan Hsu, Chang-Le Liu, Yen-Chen Wu, Yin-Hsiang Liao, Chin-Tung Lin, Chi-Ming Chung, Yi-Chang Chen, Da-Shan Shiu, Wei-Yun Ma
