---
license: other
license_name: deepseek
license_link: https://github.com/deepseek-ai/DeepSeek-MoE/blob/main/LICENSE-MODEL
---
[🏠 Homepage](https://www.deepseek.com/) | [🤖 Chat with DeepSeek LLM](https://chat.deepseek.com/) | [Discord] | [WeChat (微信)]

Paper: [DeepSeekMoE: Towards Ultimate Expert Specialization in Mixture-of-Experts Language Models](https://arxiv.org/abs/2401.06066)
### 1. Introduction to DeepSeekMoE
DeepSeekMoE 16B is a Mixture-of-Experts (MoE) language model with 16.4B total parameters, of which only about 2.8B are activated per token. See the [Introduction](https://github.com/deepseek-ai/DeepSeek-MoE) for more details.
### 2. How to Use
Here are some examples of how to use our model.
#### Text Completion
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, GenerationConfig

model_name = "deepseek-ai/deepseek-moe-16b-base"

# Load the tokenizer and the model. bfloat16 halves the memory footprint
# relative to float32, device_map="auto" places the weights across the
# available GPUs, and trust_remote_code=True is required because the
# repository ships its own modeling code for the MoE architecture.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.bfloat16, device_map="auto", trust_remote_code=True
)

# Use the generation settings shipped with the model, and pad with the
# EOS token since the base model defines no dedicated padding token.
model.generation_config = GenerationConfig.from_pretrained(model_name)
model.generation_config.pad_token_id = model.generation_config.eos_token_id

text = "An attention function can be described as mapping a query and a set of key-value pairs to an output, where the query, keys, values, and output are all vectors. The output is"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs.to(model.device), max_new_tokens=100)

result = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(result)
```
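The example above relies on the generation settings shipped with the model. If you want more varied completions, the standard `transformers` sampling arguments can be passed directly to `generate`; below is a minimal sketch, where the `temperature` and `top_p` values are illustrative choices rather than recommendations from the model card:

```python
# Sampled generation: do_sample enables stochastic decoding instead of the
# default settings; temperature and top_p below are illustrative, not tuned.
outputs = model.generate(
    **inputs.to(model.device),
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```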
### 3. License
This code repository is licensed under the MIT License. The use of DeepSeekMoE models is subject to the Model License. DeepSeekMoE supports commercial use.
See the [LICENSE-MODEL](https://github.com/deepseek-ai/DeepSeek-MoE/blob/main/LICENSE-MODEL) for more details.
### 4. Contact
If you have any questions, please raise an issue or contact us at [service@deepseek.com](mailto:service@deepseek.com).