Update "How to Use in `transformers`" to use `pipeline`

#18 opened by mishig (HF staff)
Meta Llama org

Update "How to Use in `transformers`" to use `pipeline`

cc: @osanseviero @pcuenq @ArthurZ wdyt?

Meta Llama org

If so, I can update the notebook as well.

Meta Llama org

Looks good to me! What do you think about loading the pipeline directly and removing the model line and the AutoModelForCausalLM import?

```python
generator = pipeline("text-generation", model=model_id, torch_dtype=dtype, device_map=device, tokenizer=tokenizer)
```

(Just an idea)
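A minimal sketch of that suggestion: `pipeline` resolves both the model and the tokenizer from the model id, so the separate `AutoModelForCausalLM` and `AutoTokenizer` lines can be dropped entirely. The thread only refers to the checkpoint as `model_id`, so `"gpt2"` below is a stand-in; any causal-LM id works the same way.

```python
import torch
from transformers import pipeline

# Stand-in for the thread's `model_id` (the actual checkpoint isn't shown here).
model_id = "gpt2"

# Loading the pipeline directly: no AutoModelForCausalLM / AutoTokenizer
# needed, since `pipeline` loads both from the model id.
generator = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.float32,  # on a GPU you would typically pass torch.bfloat16
)

out = generator("Hello, my name is", max_new_tokens=10, do_sample=False)
print(out[0]["generated_text"])
```

With `accelerate` installed, a `device_map="auto"` argument can also be passed straight to `pipeline`, matching the `device_map=device` usage in the snippet above.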

Ready to merge
This branch is ready to be merged automatically.
