---
title: Configuration
description: Set up your LLM connection and authorization with environment
    variables for seamless integration.
keywords: LLM setup, API configuration, environment variables, secure
    authorization, LLM integration
sidebar:
    order: 2
hero:
    image:
        alt:
            A small, flat, 8-bit style illustration features simple colored blocks
            linked by straight lines to depict API and cloud connections. Each block
            displays a basic symbol, including representations of artificial
            intelligence, a cloud, code, an environment file, and a lock for security.
            Iconic, abstract shapes indicate major tech providers like OpenAI, Azure,
            Google, Hugging Face, GitHub, and Anthropic. The arrangement is clear and
            geometric, with each block in a unique color from a five-color palette,
            all on a transparent background with no text, people, or visual effects.
        file: ./configuration.png
---

import { FileTree } from "@astrojs/starlight/components"
import { Steps } from "@astrojs/starlight/components"
import { Tabs, TabItem } from "@astrojs/starlight/components"
import { Image } from "astro:assets"
import { YouTube } from "astro-embed"
import LLMProviderFeatures from "../../../components/LLMProviderFeatures.astro"

import lmSrc from "../../../assets/vscode-language-models.png"
import lmAlt from "../../../assets/vscode-language-models.png.txt?raw"

import lmSelectSrc from "../../../assets/vscode-language-models-select.png"
import lmSelectAlt from "../../../assets/vscode-language-models-select.png.txt?raw"

import oaiModelsSrc from "../../../assets/openai-model-names.png"
import oaiModelsAlt from "../../../assets/openai-model-names.png.txt?raw"

import {
    AZURE_OPENAI_API_VERSION,
    AZURE_AI_INFERENCE_VERSION,
} from "../../../../../packages/core/src/constants"

You will need to [configure](/genaiscript/configuration) the LLM connection and authorization secrets.
GenAIScript works with both remote models (such as OpenAI or Azure) and local models (such as Ollama, Jan, or LMStudio).
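Once a connection is configured, a script selects its provider and model through the `model` option. The sketch below is illustrative: the file name, title, and the `openai:gpt-4o` identifier are placeholders you would replace with your own provider and model.

```js
// summarize.genai.mjs — illustrative file name
script({
    title: "summarize",
    // provider:model identifier; e.g., "ollama:phi3" for a local Ollama model
    model: "openai:gpt-4o",
})

// the prompt sent to the configured model
$`Summarize the input files.`
```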

In a few scenarios, GenAIScript detects the configuration automatically:

- in Visual Studio Code with GitHub Copilot Chat installed, GenAIScript will automatically use the Copilot Chat models
- in a GitHub Codespace, GenAIScript will automatically use GitHub Models
- if Ollama is running, GenAIScript will automatically use the Ollama models

**If none of these scenarios apply, follow [the configuration instructions](/genaiscript/configuration).**
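As a quick preview of what those instructions cover, secrets typically live in a `.env` file at the project root. This is a minimal sketch for the OpenAI provider; other providers use different variable names, which the configuration guide lists.

```txt
# .env — keep this file out of source control
OPENAI_API_KEY=...
```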

## Next steps

Write your [first script](/genaiscript/getting-started/your-first-genai-script).
