lysandre and amyeroberts committed
Commit
08ab08c
1 Parent(s): cb32f77

Update README.md (#32)


- Update README.md (be7e3656b683ae3c15062377b2929d5237514d43)


Co-authored-by: Amy Roberts <amyeroberts@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +4 -4
README.md CHANGED
@@ -55,8 +55,8 @@ You can use this model directly with a pipeline for text generation.
  >>> from transformers import pipeline
 
  >>> generator = pipeline('text-generation', model="facebook/opt-350m")
- >>> generator("Hello, I'm am conscious and")
- [{'generated_text': "Hello, I'm am conscious and I'm a bit of a noob. I'm looking for"}]
+ >>> generator("What are we having for dinner?")
+ [{'generated_text': "What are we having for dinner?\nI'm having a steak and a salad.\nI'm"}]
  ```
 
  By default, generation is deterministic. In order to use the top-k sampling, please set `do_sample` to `True`.
@@ -66,8 +66,8 @@ By default, generation is deterministic. In order to use the top-k sampling, please set `do_sample` to `True`.
 
  >>> set_seed(32)
  >>> generator = pipeline('text-generation', model="facebook/opt-350m", do_sample=True)
- >>> generator("Hello, I'm am conscious and")
- [{'generated_text': "Hello, I'm am conscious and I'm interested in this project. Can I get an initial contact"}]
+ >>> generator("What are we having for dinner?")
+ [{'generated_text': "What are we having for dinner?\n\nWith spring fast approaching, it’s only appropriate"}]
  ```
 
  ### Limitations and bias
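For reference, below is a minimal, self-contained sketch of the updated usage shown in this diff, assuming the `transformers` library with a PyTorch backend is installed and the `facebook/opt-350m` weights can be downloaded from the Hub; the generated text will vary with library version and hardware, so it need not match the README samples exactly.

```python
# Sketch of the two updated README examples (deterministic generation, then
# seeded top-k sampling). Assumes `transformers` + PyTorch are installed.
from transformers import pipeline, set_seed

# Deterministic (greedy) generation, as in the first example.
generator = pipeline('text-generation', model="facebook/opt-350m")
print(generator("What are we having for dinner?"))

# Sampling: seed the RNG and enable do_sample, as in the second example.
set_seed(32)
sampler = pipeline('text-generation', model="facebook/opt-350m", do_sample=True)
print(sampler("What are we having for dinner?"))
```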