Update README.md
Hyper-Pretrained Transformers (HPT) is a novel multimodal LLM framework from [HyperGAI](https://hypergai.com/), used to train vision-language models capable of multimodal understanding of both textual and visual inputs. Here we release our best open-sourced 8B multimodal LLM, HPT 1.5 Air. Built with Meta Llama 3, our hyper-capable HPT 1.5 Air packs a punch on real-world understanding and complex reasoning. This repository contains the open-source weights to reproduce the evaluation results of HPT 1.5 Air on different benchmarks.
For full details of this model, please read our [technical blog post](https://hypergai.com/blog/hpt-1-5-air-best-open-sourced-8b-multimodal-llm-with-llama-3).
## Run the model
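As a rough, unofficial sketch only: the repo id, prompt template, and `<image>` placeholder token below are assumptions, not the documented HPT API, and real multimodal inference also requires the image preprocessed into pixel values by the model's own processor (omitted here). Consult the official HPT repository for the supported inference code. Loading the released weights via Hugging Face `transformers` might look roughly like:

```python
# Unofficial sketch -- repo id, chat template, and <image> token are assumptions;
# see the official HPT repository for the supported inference code.

MODEL_ID = "HyperGAI/HPT1_5-Air-Llama-3-8B-Instruct-multimodal"  # hypothetical id


def build_prompt(question: str) -> str:
    """Format a single-image question (hypothetical template: an <image>
    placeholder followed by the user's text)."""
    return f"<image>\n{question}"


def run(question: str, max_new_tokens: int = 128) -> str:
    """Generate an answer with the released weights (text path only; the
    vision tower's image inputs are omitted in this sketch)."""
    # Heavy imports kept inside the function so the helpers above stay light.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, trust_remote_code=True, device_map="auto"
    )
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(run("What is shown in this image?"))
```

`trust_remote_code=True` is needed whenever a checkpoint ships custom modeling code on the Hub; a multimodal model like this one typically does.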