---
license: other
language:
- en
library_name: transformers
inference: false
thumbnail: https://h2o.ai/etc.clientlibs/h2o/clientlibs/clientlib-site/resources/images/favicon.ico
tags:
- gpt
- llm
- large language model
- LLaMa
datasets:
- h2oai/h2ogpt-oig-oasst1-instruct-cleaned-v2
---
# h2oGPT Model Card
## Summary
H2O.ai's `h2oai/h2ogpt-research-oig-oasst1-512-30b` is a 30-billion-parameter instruction-following large language model intended for research use only.
- Base model: [decapoda-research/llama-30b-hf](https://huggingface.co/decapoda-research/llama-30b-hf)
- LoRA adapter: [h2oai/h2ogpt-research-oig-oasst1-512-30b-lora](https://huggingface.co/h2oai/h2ogpt-research-oig-oasst1-512-30b-lora)
- This HF version was built by merging the LoRA adapter into the base model using the [export script and steps](https://huggingface.co/h2oai/h2ogpt-research-oig-oasst1-512-30b-lora#build-hf-model); a generic merge sketch follows below.
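For orientation, here is a minimal sketch of how a LoRA adapter is typically folded into base weights with the `peft` library. The linked export script is the authoritative procedure; this is only a generic illustration, and the `h2ogpt-merged` output path is a placeholder name.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_id = "decapoda-research/llama-30b-hf"
lora_id = "h2oai/h2ogpt-research-oig-oasst1-512-30b-lora"

# Load the base weights (fp16 halves memory; a 30B model still needs ~60 GB).
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)

# Attach the LoRA adapter, then merge its deltas into the base weights so the
# result is a plain HF checkpoint with no peft dependency at load time.
model = PeftModel.from_pretrained(base, lora_id)
model = model.merge_and_unload()

model.save_pretrained("h2ogpt-merged")  # illustrative output path
```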
All details, including performance evaluation, are provided in the [LoRA Model Card](https://huggingface.co/h2oai/h2ogpt-research-oig-oasst1-512-30b-lora).
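
## Usage

This card sets `inference: false`, so the hosted inference widget is disabled; the sketch below loads the merged checkpoint locally with 🤗 Transformers. Note that `device_map="auto"` requires the `accelerate` package, and the `<human>:`/`<bot>:` prompt template is an assumption based on the OIG/OASST1-style instruction data; check the LoRA model card for the exact format.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "h2oai/h2ogpt-research-oig-oasst1-512-30b"

# float16 plus device_map="auto" spreads the 30B weights across available GPUs.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Assumed prompt format; consult the LoRA model card for the exact template.
prompt = "<human>: Why is drinking water so healthy?\n<bot>:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```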