---
title: "VS Code Language Model API"
description: "Learn how to use Cline with the experimental VS Code Language Model API, enabling access to models from GitHub Copilot and other compatible extensions."
---

Cline offers _experimental_ support for the [VS Code Language Model API](https://code.visualstudio.com/api/extension-guides/language-model). This API enables extensions to grant access to language models directly within the VS Code environment. Consequently, you might be able to leverage models from:

- **GitHub Copilot:** Provided you have an active Copilot subscription and the extension installed.
- **Other VS Code Extensions:** Any extension that implements the Language Model API.

**Important Note:** This integration is currently in an experimental phase and might not perform as anticipated. Its functionality relies on other extensions correctly implementing the VS Code Language Model API.

### Prerequisites

- **VS Code:** The Language Model API is only available in VS Code itself; it is not currently supported by forks such as Cursor.
- **A Language Model Provider Extension:** An extension that furnishes a language model is required. Examples include:
    - **GitHub Copilot:** With a Copilot subscription, the GitHub Copilot and GitHub Copilot Chat extensions can serve as model providers.
    - **Alternative Extensions:** Explore the VS Code Marketplace for extensions mentioning "Language Model API" or "lm". Other experimental options may be available.

### Configuration Steps

1. **Ensure Your Copilot Account is Active and the Extensions are Installed:** A user signed in to either the GitHub Copilot or GitHub Copilot Chat extension should be able to access its models through Cline.
2. **Access Cline Settings:** Click the gear icon (⚙️) located in the Cline panel.
3. **Choose Provider:** Select "VS Code LM API" from the "API Provider" dropdown menu.
4. **Select Model:** Once the Copilot extension(s) are installed and you are signed in to your Copilot account, the "Language Model" dropdown will populate with the available models after a short delay. Models are identified by their vendor and family. For instance, with Copilot active you might see options such as:
    - `copilot - gpt-3.5-turbo`
    - `copilot - gpt-4o-mini`
    - `copilot - gpt-4`
    - `copilot - gpt-4-turbo`
    - `copilot - gpt-4o`
    - `copilot - claude-3.5-sonnet` **Note:** this model does not currently work.
    - `copilot - gemini-2.0-flash`
    - `copilot - gpt-4.1`

For best results with the VS Code LM API provider, we suggest using the OpenAI models (GPT-3.5, GPT-4, GPT-4.1, GPT-4o, etc.).
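Under the hood, provider extensions expose their models through the same vendor/family scheme shown in the dropdown. As a rough illustration (not Cline's actual implementation), an extension might select a Copilot-provided model like this; the specific `vendor` and `family` values are assumptions that depend on which provider extensions are installed:

```typescript
import * as vscode from "vscode";

// Sketch: ask the VS Code Language Model API for chat models
// matching a vendor/family pair, mirroring the dropdown naming above.
async function pickCopilotModel(): Promise<vscode.LanguageModelChat | undefined> {
    const models = await vscode.lm.selectChatModels({
        vendor: "copilot",
        family: "gpt-4o",
    });
    // An empty array means no matching model is available, e.g. the
    // user is not signed in to Copilot or the extension is disabled.
    return models[0];
}
```

This only runs inside a VS Code extension host, which is why the models list can be empty until the provider extension is installed, enabled, and signed in.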

### Current Limitations

- **Experimental API Status:** The VS Code Language Model API is still under active development. Anticipate potential changes and instability.
- **Dependency on Extensions:** This feature is entirely contingent on other extensions making models available. Cline does not directly control the list of accessible models.
- **Restricted Functionality:** The VS Code Language Model API might not encompass all features available through other API providers (e.g., image input capabilities, streaming responses, detailed usage metrics).
- **No Direct Cost Management:** Users are subject to the pricing structures and terms of service of the extension providing the model. Cline cannot directly monitor or regulate associated costs.
- **GitHub Copilot Rate Throttling:** When employing the VS Code LM API with GitHub Copilot, be mindful that GitHub may enforce rate limits on Copilot usage. These limitations are governed by GitHub, not Cline.

### Troubleshooting Tips

- **Models Not Appearing:**
    - Confirm that you are running Cline in VS Code itself (not a fork such as Cursor).
    - Verify that a language model provider extension (e.g., GitHub Copilot, GitHub Copilot Chat) is installed and enabled.
    - If utilizing Copilot, ensure you have previously sent a Copilot Chat message using the desired model.
- **Unexpected Operation:** Should you encounter unforeseen behavior, it is likely an issue stemming from the underlying Language Model API or the provider extension. Consider reporting the problem to the developers of the provider extension.
