# IBM

The `LangChain` integrations related to the [IBM watsonx.ai](https://www.ibm.com/products/watsonx-ai) platform.

IBM® watsonx.ai™ AI studio is part of the IBM [watsonx](https://www.ibm.com/watsonx)™ AI and data platform, bringing together new generative 
AI capabilities powered by [foundation models](https://www.ibm.com/products/watsonx-ai/foundation-models) and traditional machine learning (ML) 
into a powerful studio spanning the AI lifecycle. Tune and guide models with your enterprise data to meet your needs with easy-to-use tools for 
building and refining performant prompts. With watsonx.ai, you can build AI applications in a fraction of the time and with a fraction of the data. 
Watsonx.ai offers:

- **Multi-model variety and flexibility:** Choose from IBM-developed, open-source and third-party models, or build your own model.
- **Differentiated client protection:** IBM stands behind IBM-developed models and indemnifies the client against third-party IP claims.
- **End-to-end AI governance:** Enterprises can scale and accelerate the impact of AI with trusted data across the business, using data wherever it resides.
- **Hybrid, multi-cloud deployments:** IBM provides the flexibility to integrate and deploy your AI workloads into your hybrid-cloud stack of choice.


## Installation and Setup

Install the integration package with:
```bash
pip install -qU langchain-ibm
```

Get an IBM watsonx.ai API key and set it as an environment variable (`WATSONX_APIKEY`):
```python
import os

os.environ["WATSONX_APIKEY"] = "your IBM watsonx.ai API key"
```

## Chat Model

### ChatWatsonx

See a [usage example](/docs/integrations/chat/ibm_watsonx).

```python
from langchain_ibm import ChatWatsonx
```

## LLMs

### WatsonxLLM

See a [usage example](/docs/integrations/llms/ibm_watsonx).

```python
from langchain_ibm import WatsonxLLM
```

## Embedding Models

### WatsonxEmbeddings

See a [usage example](/docs/integrations/text_embedding/ibm_watsonx).

```python
from langchain_ibm import WatsonxEmbeddings
```
