Llama-3-70b tokenizer.
#116
by BigDeeper - opened
Which library do I need to get direct access to the tokenizer? These HF model pages don't seem to say which tokenizer is used.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model = AutoModelForCausalLM.from_pretrained(
    self.model_id,
    device_map='auto',
    trust_remote_code=True,
)
# AutoTokenizer loads the tokenizer files shipped in the same model repo,
# so you get direct access to it without knowing the tokenizer class in advance.
self.tokenizer = AutoTokenizer.from_pretrained(self.model_id)
self.pipeline = pipeline(
    'text-generation',
    model=model,
    tokenizer=self.tokenizer,
    trust_remote_code=True,
)
I think you can also do pipeline.tokenizer.
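For context on what that tokenizer actually is: Llama 3 uses a byte-pair-encoding (BPE) tokenizer with a tiktoken-style vocabulary, which `AutoTokenizer` loads from the model repo. As a rough illustration of how BPE merges work under the hood, here is a toy sketch — the merge table below is made up for the example and is not the real Llama 3 vocabulary:

```python
def bpe_encode(text, merges):
    """Toy BPE: start from single characters, then repeatedly apply the
    highest-priority (lowest-rank) adjacent merge found in `merges`."""
    tokens = list(text)
    while True:
        # find the adjacent pair with the best (lowest) merge rank
        best = None
        for i in range(len(tokens) - 1):
            pair = (tokens[i], tokens[i + 1])
            rank = merges.get(pair)
            if rank is not None and (best is None or rank < best[1]):
                best = (i, rank, pair)
        if best is None:
            return tokens  # no more merges apply
        i, _, pair = best
        tokens = tokens[:i] + [pair[0] + pair[1]] + tokens[i + 2:]

# hypothetical merge table: lower rank = merged earlier
merges = {("l", "l"): 0, ("h", "e"): 1, ("ll", "o"): 2}
print(bpe_encode("hello", merges))  # ['he', 'llo']
```

The real tokenizer works the same way in principle, just with ~128K learned tokens and byte-level inputs, all of which `AutoTokenizer.from_pretrained` handles for you.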
osanseviero changed discussion status to closed