Model Card for arman-longformer-8k
This project applies Longformer's attention mechanism to alireza7/ARMAN-MSR-persian-base in order to perform abstractive summarization on long documents. The resulting model accepts inputs of up to 8K tokens (rather than 512). It should be fine-tuned on a summarization dataset before use.
The conversion code is available in the GitHub repository.
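To illustrate why the Longformer-style attention makes 8K-token inputs practical, here is a minimal sketch (not the repository's conversion code) comparing the number of query-key pairs scored by full self-attention versus a sliding window. The window size `w=256` below is an illustrative assumption, not a parameter taken from this model.

```python
# Illustrative sketch: full self-attention scores n*n query-key pairs,
# while a sliding window of w tokens per side scores only ~n*(2w+1) pairs,
# so memory and compute grow linearly rather than quadratically with length.

def attention_pairs_full(n: int) -> int:
    """Query-key pairs scored by full self-attention over n tokens."""
    return n * n

def attention_pairs_window(n: int, w: int) -> int:
    """Pairs scored with a symmetric sliding window of w tokens per side."""
    return sum(min(n, i + w + 1) - max(0, i - w) for i in range(n))

if __name__ == "__main__":
    for n in (512, 8192):  # old vs. new maximum input length
        full = attention_pairs_full(n)
        local = attention_pairs_window(n, w=256)
        print(f"n={n}: full={full}, windowed={local}")
```

At 8K tokens the windowed count stays roughly 16x smaller than the full-attention count, which is the property that lets the converted encoder handle long documents.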