Hoa 1B4 (BLOOM architecture)

Hoa is an autoregressive large language model (LLM) based on the BLOOM architecture. It was trained on a Vietnamese and English subset of the Common Crawl dataset.

Details will be available soon.
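Until then, the BLOOM-based design described above can be verified directly from the checkpoint's configuration. A minimal sketch (loads only the config, no weights; the expected "bloom" value is an assumption following from the architecture claim above):

from transformers import AutoConfig

# Fetch just the model configuration for the checkpoint.
config = AutoConfig.from_pretrained("vlsp-2023-vllm/hoa-1b4")
print(config.model_type)  # expected to print "bloom" for a BLOOM-based checkpoint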

To contact us, email: leanhcuong@gmail.com (Lê Anh Cường) | hieunguyen1053@outlook.com (Hiếu) | nv.cuong@int2.vn (Nguyễn Việt Cường)

How to use

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("vlsp-2023-vllm/hoa-1b4")
model = AutoModelForCausalLM.from_pretrained("vlsp-2023-vllm/hoa-1b4", low_cpu_mem_usage=True)

# Move the model to GPU if one is available.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model.to(device)

prompt = "Địa chỉ trường Đại học Tôn Đức Thắng nằm ở số"
input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"].to(device)

# max_length caps prompt + generated tokens; 100 is an arbitrary example value.
gen_tokens = model.generate(input_ids, max_length=100, repetition_penalty=1.1)

print(tokenizer.batch_decode(gen_tokens)[0])
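For quick experiments, the same generation can also be run through the transformers pipeline API. A minimal sketch using the same prompt; the generation settings mirror the example above and are assumptions, not values specified for this model:

from transformers import pipeline

# Build a text-generation pipeline; downloads the checkpoint on first use.
generator = pipeline("text-generation", model="vlsp-2023-vllm/hoa-1b4")

output = generator(
    "Địa chỉ trường Đại học Tôn Đức Thắng nằm ở số",
    max_length=100,           # assumed cap, same as the example above
    repetition_penalty=1.1,
)
print(output[0]["generated_text"])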