DevWorld committed on
Commit
96e52fb
1 Parent(s): 76bed5c

Update README.md


Not Keras anymore; we use Transformers now.

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -37,7 +37,7 @@ Models are trained on a context length of 8192 tokens, which is equivalent to Ge
 
 ### Usage
 
-Below we share some code snippets on how to get quickly started with running the model. First make sure to `pip install -U keras keras-nlp`, then copy the snippet from the section that is relevant for your usecase.
+Below we share some code snippets on how to get quickly started with running the model. First make sure to `pip install -U transformers`, then copy the snippet from the section that is relevant for your usecase.
 
 #### Running the model with transformers
 [![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/deveworld/Gemago/blob/main/Gemago_2b_Infer.ipynb)
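
For context, a minimal sketch of the kind of Transformers-based snippet the updated instructions point to is shown below. The model id `devworld/gemago-2b` and the prompt are illustrative assumptions (the exact identifier is not shown in this diff); the linked Colab notebook is the authoritative example.

```python
# Minimal sketch, assuming the weights are published under "devworld/gemago-2b"
# (hypothetical id inferred from the repo name; check the model card for the real one).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "devworld/gemago-2b"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode an illustrative prompt and generate a short continuation.
inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

This mirrors the standard `from_pretrained`/`generate` workflow that `pip install -U transformers` enables; device placement and dtype options can be passed to `from_pretrained` if needed, but are omitted here to keep the sketch minimal.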