---
sidebar_label: 'Custom Adapters'
---

import {PlatformSelector} from '@site/src/components/PlatformSelector/PlatformSelector';

import StreamAdaptersReactJs from './_001-custom-adapters/#react/streamAdapter.mdx';
import StreamAdaptersJavaScript from './_001-custom-adapters/#js/streamAdapter.mdx';

import BatchAdaptersReactJs from './_001-custom-adapters/#react/batchAdapter.mdx';
import BatchAdaptersJavaScript from './_001-custom-adapters/#js/batchAdapter.mdx';

import AdapterExtrasReactJs from './_001-custom-adapters/#react/adapterExtras.mdx';
import AdapterExtrasJavaScript from './_001-custom-adapters/#js/adapterExtras.mdx';

# Custom Adapters

If you're building your own API and want to use `NLUX` as the UI for your AI assistant,
you can do so by creating a custom adapter.

There are two types of custom adapters that you can create:

* Stream Adapters
* Batch Adapters

---

## Stream Adapters

Stream adapters are used when the API sends responses in a stream (e.g. Server-Sent Events, WebSockets).

The advantage of using a stream adapter is that **the chat UI is updated in real-time while the LLM is still
generating text**. This is particularly useful when the API takes a long time to produce a full response.
Most major LLM providers and frameworks (e.g. OpenAI, Hugging Face, LlamaIndex) support streaming responses.

<PlatformSelector reactJs={StreamAdaptersReactJs} javascript={StreamAdaptersJavaScript}/>
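As a framework-free illustration of the streaming flow, here is a minimal sketch. The `StreamingObserver` interface and the `streamText` method name are assumptions modeled on NLUX's adapter types; refer to the platform-specific examples above for the exact API.

```typescript
// Assumed observer shape, modeled on NLUX's StreamingAdapterObserver.
interface StreamingObserver {
  next(chunk: string): void; // push a partial chunk to the chat UI
  complete(): void;          // signal that the stream has ended
  error(err: Error): void;   // surface a failure to the UI
}

const myStreamAdapter = {
  // Forward chunks to the observer as they arrive, then signal completion.
  streamText: async (prompt: string, observer: StreamingObserver): Promise<void> => {
    // In a real adapter, chunks would come from Server-Sent Events or
    // WebSocket messages; here they are simulated with a fixed array.
    const chunks = ['Hello', ', ', 'world'];
    for (const chunk of chunks) {
      observer.next(chunk);
    }
    observer.complete();
  },
};
```

Because the adapter calls `observer.next()` for every chunk, the chat UI can render partial text long before the full response is available.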

---

## Batch Adapters

Batch adapters are used when the API returns the complete response in a single payload. The chat UI
displays the response only once the request has finished, which makes batch adapters simpler to implement
but slower to show output for long generations.

<PlatformSelector reactJs={BatchAdaptersReactJs} javascript={BatchAdaptersJavaScript}/>
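For contrast with the streaming case, here is a minimal framework-free sketch of a batch adapter. The `batchText` method name is an assumption modeled on NLUX's batch adapter type; refer to the platform-specific examples above for the exact signature.

```typescript
const myBatchAdapter = {
  // Resolve with the complete response once the request finishes.
  batchText: async (prompt: string): Promise<string> => {
    // A real adapter would call your API here (e.g. with fetch())
    // and return the assistant's full reply as a single string.
    return `You said: ${prompt}`;
  },
};
```

The returned promise resolves once, with the whole response, so the UI renders nothing until the request completes.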

---

## Adapter Extras

Adapter extras give your adapter access to additional context, such as the conversation history, when
handling a message.

<PlatformSelector reactJs={AdapterExtrasReactJs} javascript={AdapterExtrasJavaScript}/>
