---
language: vi
tags:
- vi
- vietnamese
- gpt2
- gpt3
- text-generation
- lm
- nlp
datasets:
- wikilinguage
widget:
- text: "Hoa quả và rau thường rẻ hơn khi vào mùa. "
inference:
  parameters:
    max_length: 120
    do_sample: True
    temperature: 1.0
---

# GPT-3 small

Pretrained GPT-Neo (GPT-3 small): its architecture intentionally resembles that of GPT-3. The model was trained on a Vietnamese dataset for text generation.

## How to use the model

~~~~
from transformers import GPT2Tokenizer, GPTNeoForCausalLM

tokenizer = GPT2Tokenizer.from_pretrained('minhtoan/gpt3-small-vietnamese')
model = GPTNeoForCausalLM.from_pretrained('minhtoan/gpt3-small-vietnamese')

text = "Hoa quả và rau thường rẻ hơn khi vào mùa"
input_ids = tokenizer.encode(text, return_tensors='pt')

max_length = 100
sample_outputs = model.generate(input_ids, do_sample=True, max_length=max_length)

for i, sample_output in enumerate(sample_outputs):
    print(">> Generated text {}\n\n{}".format(i + 1, tokenizer.decode(sample_output.tolist())))
    print('\n---')
~~~~

## Author

`Phan Minh Toan`