---
license: mit
datasets:
- magicgh/Ask-before-Plan
language:
- en
base_model:
- meta-llama/Meta-Llama-3-8B-Instruct
- mistralai/Mistral-7B-Instruct-v0.2
---
|
|
|
# CEP Framework

<a href="https://arxiv.org/abs/2406.12639">Paper</a> •
<a href="https://huggingface.co/datasets/magicgh/Ask-before-Plan">Data</a> •
<a href="https://drive.google.com/file/d/1vMIhs8mpMgk33pFDv2rWg6AJNyD70Sod">Environment</a> •
<a href="https://github.com/magicgh/Ask-before-Plan">Code</a>
|
|
|
This repository contains the checkpoints for the CEP framework from our EMNLP 2024 paper, *Ask-before-Plan: Proactive Language Agents for Real-World Planning*.

We release our CEP models, including LLaMA-3-8B and Mistral-7B variants, fine-tuned on the Clarification and Execution subtasks.
|
|
|
## Get Started

1. Download our checkpoints.

```bash
git lfs install
git clone https://huggingface.co/magicgh/CEP
```
|
2. Launch an OpenAI-compatible server.

```bash
python3 -m vllm.entrypoints.openai.api_server \
    --served-model-name ${model_name} \
    --model ${model} \
    --kv-cache-dtype fp8 \
    --port ${port} \
    --enable-lora \
    --lora-modules ${lora_models} \
    --chat-template ${chat_template}
```
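
Once the server is up, it exposes the standard OpenAI chat-completions route. Below is a minimal Python sketch of how you might query it; the model name `cep-llama3-8b` and port `8000` are placeholders, not names from this repository — substitute whatever you passed via `--served-model-name` (or `--lora-modules`) and `--port`.

```python
import json
import urllib.request

def build_chat_request(model: str, user_message: str) -> dict:
    """Assemble an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.0,
    }

def query_server(payload: dict, port: int = 8000) -> dict:
    """POST the payload to the local vLLM server and parse the JSON reply."""
    req = urllib.request.Request(
        f"http://localhost:{port}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Placeholder model name -- use the one your server actually registered.
    payload = build_chat_request("cep-llama3-8b", "Plan a 3-day trip to Tokyo.")
    print(query_server(payload)["choices"][0]["message"]["content"])
```

Any OpenAI-compatible client (e.g. the official `openai` Python package pointed at `http://localhost:${port}/v1`) works equally well; raw `urllib` is used here only to keep the sketch dependency-free.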
|
## Citation

If you find our research helpful for your work, please star [this repository](https://github.com/magicgh/Ask-before-Plan) and cite our paper:

```
@article{ask-before-plan,
  author  = {Xuan Zhang and Yang Deng and Zifeng Ren and See-Kiong Ng and Tat-Seng Chua},
  journal = {ArXiv preprint},
  title   = {Ask-before-Plan: Proactive Language Agents for Real-World Planning},
  url     = {https://arxiv.org/abs/2406.12639},
  year    = {2024}
}
```