---
title: ModelScope
---

>[ModelScope](https://www.modelscope.cn/home) is a large repository of models and datasets.

This page covers how to use the ModelScope ecosystem within LangChain.
It is broken into two parts: installation and setup, followed by references to the specific ModelScope wrappers.

## Installation

<CodeGroup>
```bash pip
pip install -U langchain-modelscope-integration
```

```bash uv
uv add langchain-modelscope-integration
```
</CodeGroup>

Head to [ModelScope](https://modelscope.cn/) to sign up and generate an [SDK token](https://modelscope.cn/my/myaccesstoken). Once you've done this, set the `MODELSCOPE_SDK_TOKEN` environment variable:

```bash
export MODELSCOPE_SDK_TOKEN=<your_sdk_token>
```
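Alternatively, the token can be set from within Python itself, for example at the top of a script or notebook. A minimal sketch (the placeholder value is illustrative; substitute your real token):

```python
import os

# Set the ModelScope SDK token for the current process if it is not
# already present in the environment (equivalent to the `export` above).
os.environ.setdefault("MODELSCOPE_SDK_TOKEN", "<your_sdk_token>")
```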

## Chat Models

The `ModelScopeChatEndpoint` class exposes chat models from ModelScope. See the available models [here](https://www.modelscope.cn/docs/model-service/API-Inference/intro).

```python
from langchain_modelscope import ModelScopeChatEndpoint

llm = ModelScopeChatEndpoint(model="Qwen/Qwen2.5-Coder-32B-Instruct")
llm.invoke("Sing a ballad of LangChain.")
```

## Embeddings

The `ModelScopeEmbeddings` class exposes embedding models from ModelScope.

```python
from langchain_modelscope import ModelScopeEmbeddings

embeddings = ModelScopeEmbeddings(model_id="damo/nlp_corom_sentence-embedding_english-base")
embeddings.embed_query("What is the meaning of life?")
```
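Embedding vectors returned by `embed_query` are typically compared with cosine similarity (for example, to rank documents against a query). A self-contained sketch of that comparison, using placeholder vectors in place of real model output:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Placeholder vectors standing in for `embeddings.embed_query(...)` output.
query_vec = [0.1, 0.2, 0.3]
doc_vec = [0.1, 0.2, 0.25]
print(cosine_similarity(query_vec, doc_vec))
```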

## LLMs

The `ModelScopeEndpoint` class exposes LLMs from ModelScope.

```python
from langchain_modelscope import ModelScopeEndpoint

llm = ModelScopeEndpoint(model="Qwen/Qwen2.5-Coder-32B-Instruct")
llm.invoke("The meaning of life is")
```
