<html>
<head>
<title>orca_mini_v3_7B-GGUF (Q4_K_M)</title>
</head>
<body>
<h1>orca_mini_v3_7B-GGUF (Q4_K_M)</h1>
<p>
Using the
<a href="https://github.com/abetlen/llama-cpp-python">llama-cpp-python</a>
package, this GGUF model is hosted in a Hugging Face Docker Space and
exposed through an OpenAI-compatible API. The Space includes full API
documentation to make integration straightforward.
</p>
<ul>
<li>
The API endpoint:
<a href="https://limcheekin-orca-mini-v3-7b-gguf.hf.space/v1">https://limcheekin-orca-mini-v3-7b-gguf.hf.space/v1</a>
</li>
<li>
The API documentation:
<a href="https://limcheekin-orca-mini-v3-7b-gguf.hf.space/docs">https://limcheekin-orca-mini-v3-7b-gguf.hf.space/docs</a>
</li>
</ul>
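<p>
As a quick start, the endpoint above can be called with only the Python
standard library. This is a minimal sketch: the <code>model</code> name
passed below is an assumption, so check <code>/v1/models</code> for the
identifier the server actually reports.
</p>

```python
# Minimal sketch of calling the Space's OpenAI-compatible chat endpoint
# using only the Python standard library.
import json
import urllib.request

BASE_URL = "https://limcheekin-orca-mini-v3-7b-gguf.hf.space/v1"


def chat_completion(messages, model="orca_mini_v3_7b", max_tokens=256):
    """POST a chat-completion request and return the assistant's reply text.

    NOTE: the default ``model`` value is an assumption; query
    ``{BASE_URL}/models`` to see what the server actually serves.
    """
    payload = {"model": model, "messages": messages, "max_tokens": max_tokens}
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The response follows the OpenAI chat-completions schema.
    return body["choices"][0]["message"]["content"]


# Example usage (requires network access to the Space):
# reply = chat_completion([{"role": "user", "content": "Say hello."}])
# print(reply)
```

<p>
Because the API is OpenAI-compatible, any OpenAI client library should also
work by pointing its base URL at the endpoint above.
</p>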
<p>
If you find this resource useful, please consider starring the Space.
Stars support the application for a community GPU grant, which would
improve the performance and accessibility of this Space.
</p>
</body>
</html>