arnocandel committed
Commit 2fe366f · 1 parent: f8d2c63

commit files to HF hub
README.md CHANGED
@@ -36,7 +36,7 @@ print(res[0]["generated_text"])
 Alternatively, if you prefer to not use `trust_remote_code=True` you can download [instruct_pipeline.py](https://huggingface.co/h2oai/h2ogpt-oig-oasst1-256-6.9b/blob/main/h2oai_pipeline.py),
 store it alongside your notebook, and construct the pipeline yourself from the loaded model and tokenizer:
 
-```
+```python
 import torch
 from h2oai_pipeline import H2OTextGenerationPipeline
 from transformers import AutoModelForCausalLM, AutoTokenizer

@@ -79,7 +79,7 @@ GPTNeoXForCausalLM(
 
 ## Model Configuration
 
-```
+```json
 GPTNeoXConfig {
 "_name_or_path": "h2oai/h2ogpt-oig-oasst1-256-6.9b",
 "architectures": [
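The first hunk only retags the opening fence of the README's pipeline example as a `python`-tagged block; the code itself is truncated in the diff after the three imports shown above. For context, here is a minimal sketch of constructing the pipeline from the loaded model and tokenizer, assuming the model ID and class names that appear in the diff; the `padding_side`, `torch_dtype`, `device_map`, prompt, and `max_new_tokens` values are illustrative assumptions, not part of this commit.

```python
import torch
from h2oai_pipeline import H2OTextGenerationPipeline  # the file stored next to your notebook
from transformers import AutoModelForCausalLM, AutoTokenizer

# Model ID taken from the README diff; padding_side, dtype, and device_map are assumptions.
model_id = "h2oai/h2ogpt-oig-oasst1-256-6.9b"
tokenizer = AutoTokenizer.from_pretrained(model_id, padding_side="left")
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Construct the pipeline directly from the loaded model and tokenizer,
# avoiding trust_remote_code=True as the README describes.
generate_text = H2OTextGenerationPipeline(model=model, tokenizer=tokenizer)

# Hypothetical prompt and generation length, shown only to reach the
# print(res[0]["generated_text"]) line referenced in the hunk header.
res = generate_text("Why is drinking water so healthy?", max_new_tokens=100)
print(res[0]["generated_text"])
```

The second hunk makes the analogous change for the model-configuration block, retagging its fence as `json` so the GPTNeoXConfig dump is syntax-highlighted.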