Text Generation · Transformers · PyTorch · Thai · English · mpt · custom_code · text-generation-inference
mrp committed on
Commit 85f2d34
1 Parent(s): 0473525

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -34,7 +34,7 @@ Users (both direct and downstream) should be made aware of the risks, biases, an
 # How to Get Started with the Model
 Use the code [here](https://colab.research.google.com/drive/1y_7oOU3ZJI0h4chUrXFL3K4kelW_OI2G?usp=sharing#scrollTo=4yN3Bo6iAH2L) below to get started with the model.
 Or
-```
+```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
 tokenizer = AutoTokenizer.from_pretrained( "airesearch/WangchanLion7B", trust_remote_code=True)
 model = AutoModelForCausalLM.from_pretrained(
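
For reference, below is a minimal sketch of how the snippet in this hunk is typically completed (the diff cuts off mid-call). It is not part of the commit; the `torch_dtype`, `device_map`, prompt, and generation settings are assumptions.

```python
# Sketch only: completes the truncated from_pretrained(...) call shown in the hunk.
# Everything beyond the model id and trust_remote_code is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "airesearch/WangchanLion7B"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,      # required to load the repo's custom MPT code
    torch_dtype=torch.float16,   # assumption: half precision to reduce memory use
    device_map="auto",           # assumption: requires the accelerate package
)

prompt = "..."  # hypothetical prompt text
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)  # assumption: generation settings
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```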