---
title: Hugging Face
sidebar:
    order: 9
---

import { FileTree } from "@astrojs/starlight/components"
import { Steps } from "@astrojs/starlight/components"
import { Tabs, TabItem } from "@astrojs/starlight/components"
import { Image } from "astro:assets"
import LLMProviderFeatures from "../../../components/LLMProviderFeatures.astro"

The `huggingface` provider allows you to use [Hugging Face Models](https://huggingface.co/models?other=text-generation-inference) using [Text Generation Inference](https://huggingface.co/docs/text-generation-inference/index).

```js "huggingface:"
script({ model: "huggingface:microsoft/Phi-3-mini-4k-instruct" })
```

To use Hugging Face models with GenAIScript, follow these steps:

<Steps>

<ol>

<li>

Sign up for a [Hugging Face account](https://huggingface.co/) and obtain an API key from their [console](https://huggingface.co/settings/tokens).
If you are creating a **Fine-grained** token, enable the **Make calls to the serverless inference API** option.

</li>

<li>

Add your Hugging Face API key to the `.env` file
as the `HUGGINGFACE_API_KEY`, `HF_TOKEN`, or `HUGGINGFACE_TOKEN` variable.

```txt title=".env"
HUGGINGFACE_API_KEY=hf_...
```

</li>

<li>

Find the model that best suits your needs by browsing the [Hugging Face models](https://huggingface.co/models?other=text-generation-inference) catalog.

</li>

<li>

Update your script to use the model you chose.

```js
script({
    // ...
    model: "huggingface:microsoft/Phi-3-mini-4k-instruct",
})
```

</li>

</ol>

</Steps>
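
Putting the steps together, a minimal script might look like this (a sketch: the model name and prompt are illustrative, so swap in the model you selected):

```js
script({
    // any Hugging Face text-generation-inference model, prefixed with "huggingface:"
    model: "huggingface:microsoft/Phi-3-mini-4k-instruct",
})

// the prompt sent to the model
$`Write a one-sentence summary of what Text Generation Inference is.`
```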

:::note

Some models may require a Pro account.

:::

### Logging

You can enable the `genaiscript:huggingface` and `genaiscript:huggingface:msg` [logging namespaces](/genaiscript/reference/scripts/logging) for more information about the requests and responses.
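
For example, assuming the `DEBUG` environment variable convention described in the logging reference, you can enable both namespaces when running a script from the command line (`my-script` is a placeholder for your script name):

```sh
DEBUG="genaiscript:huggingface,genaiscript:huggingface:msg" npx genaiscript run my-script
```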

<LLMProviderFeatures provider="huggingface" />
