Fix tokenizer config

#2
by pcuenq

The main fix was to remove tokenizer_file, because it pointed to a path inside /home/osanseviero. That local path made the model unusable anywhere else (this can be verified with the inference widget). I also made a couple of additional changes:

  • Reformatted the file across multiple lines.
  • Removed do_basic_tokenize and never_split, since they were set to their default values.
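The cleanup above can be sketched in a few lines of Python. The config dict and path below are hypothetical stand-ins, not the actual contents of this repo's tokenizer_config.json:

```python
import json

# Hypothetical tokenizer_config.json contents, for illustration only.
config = {
    "tokenizer_file": "/home/osanseviero/tokenizer.json",  # local path: breaks loading elsewhere
    "do_basic_tokenize": True,   # default value, redundant
    "never_split": None,         # default value, redundant
    "do_lower_case": True,
    "model_max_length": 512,
}

# Drop the machine-specific path and the keys that just restate defaults.
for key in ("tokenizer_file", "do_basic_tokenize", "never_split"):
    config.pop(key, None)

# Re-serialize across multiple lines for readability.
print(json.dumps(config, indent=2))
```

With tokenizer_file gone, the tokenizer falls back to resolving its files from the repo itself rather than a nonexistent local path.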

Thanks!

osanseviero changed pull request status to merged
