arnocandel committed
Commit 04cb502
1 Parent(s): 7c142cb

Update README.md

Files changed (1): README.md +5 -5
README.md CHANGED

@@ -16,12 +16,12 @@ datasets:
 # h2oGPT Model Card
 ## Summary
 
-H2O.ai's `h2ogpt-oig-oasst1-256-6.9b` is a 6.9 billion parameter instruction-following large language model licensed for commercial use.
+H2O.ai's `h2ogpt-oig-oasst1-256-6_9b` is a 6.9 billion parameter instruction-following large language model licensed for commercial use.
 
 - Base model: [EleutherAI/pythia-6.9b](https://huggingface.co/EleutherAI/pythia-6.9b)
 - Fine-tuning dataset: [h2oai/h2ogpt-oig-oasst1-instruct-cleaned-v1](https://huggingface.co/datasets/h2oai/h2ogpt-oig-oasst1-instruct-cleaned-v1)
 - Data-prep and fine-tuning code: [H2O.ai Github](https://github.com/h2oai/h2ogpt)
-- Training logs: [zip](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6.9b/blob/main/pythia-6.9b.h2ogpt-oig-oasst1-instruct-cleaned-v1.json.1_epochs.5fc91911bc2bfaaf3b6c2de577c4b0ae45a07a4a.9.zip)
+- Training logs: [zip](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b/blob/main/pythia-6.9b.h2ogpt-oig-oasst1-instruct-cleaned-v1.json.1_epochs.5fc91911bc2bfaaf3b6c2de577c4b0ae45a07a4a.9.zip)
 
 ## Usage
 
@@ -36,13 +36,13 @@ pip install accelerate==0.18.0
 import torch
 from transformers import pipeline
 
-generate_text = pipeline(model="h2oai/h2ogpt-oig-oasst1-256-6.9b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
+generate_text = pipeline(model="h2oai/h2ogpt-oig-oasst1-256-6_9b", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
 
 res = generate_text("Why is drinking water so healthy?", max_new_tokens=100)
 print(res[0]["generated_text"])
 ```
 
-Alternatively, if you prefer to not use `trust_remote_code=True` you can download [instruct_pipeline.py](https://huggingface.co/h2ogpt-oig-oasst1-256-6.9b/blob/main/h2oai_pipeline.py),
+Alternatively, if you prefer to not use `trust_remote_code=True` you can download [instruct_pipeline.py](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6_9b/blob/main/h2oai_pipeline.py),
 store it alongside your notebook, and construct the pipeline yourself from the loaded model and tokenizer:
 
 ```python
@@ -90,7 +90,7 @@ GPTNeoXForCausalLM(
 
 ```json
 GPTNeoXConfig {
-  "_name_or_path": "h2oai/h2ogpt-oig-oasst1-256-6.9b",
+  "_name_or_path": "h2oai/h2ogpt-oig-oasst1-256-6_9b",
   "architectures": [
     "GPTNeoXForCausalLM"
   ],
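The diff above cuts the README's manual-construction snippet off at the opening ```python fence, so the custom pipeline body itself is not shown. As a rough sketch of the first step such an instruct pipeline performs before calling the model — note that the `<human>:`/`<bot>:` turn markers and the helper name here are assumptions on my part, not taken from this diff — the prompt assembly looks roughly like:

```python
def build_instruct_prompt(instruction: str) -> str:
    """Wrap a user instruction in h2oGPT-style turn markers (assumed format).

    The model is fine-tuned to continue text after an empty `<bot>:` turn,
    so generation picks up where this prompt ends.
    """
    return f"<human>: {instruction}\n<bot>:"

print(build_instruct_prompt("Why is drinking water so healthy?"))
```

In the actual pipeline, the string returned here would be tokenized and passed to the model's `generate` call, with the text after `<bot>:` returned as `generated_text`.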