---
license: cc-by-nc-sa-2.0
---

PISCES is a pre-trained many-to-many summarization model that acquires language modeling, cross-lingual ability, and summarization ability through a carefully designed three-stage pre-training procedure.

This model is introduced in *Towards Unifying Multi-Lingual and Cross-Lingual Summarization* (to appear at the ACL 2023 main conference).


```python
from transformers import MBart50Tokenizer, MBartForConditionalGeneration

# Load the PISCES checkpoint, which uses the mBART-50 tokenizer and architecture.
tokenizer = MBart50Tokenizer.from_pretrained('Krystalan/PISCES')
model = MBartForConditionalGeneration.from_pretrained('Krystalan/PISCES')
```
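
Since PISCES is loaded through the mBART-50 classes, summaries in a target language can presumably be generated the same way as with mBART-50: set the source language on the tokenizer and force the first decoder token to the target-language code. The snippet below is a minimal sketch under that assumption; the input document, the language pair (Chinese to English), and the generation hyperparameters are illustrative placeholders, not settings prescribed by the paper.

```python
from transformers import MBart50Tokenizer, MBartForConditionalGeneration

tokenizer = MBart50Tokenizer.from_pretrained('Krystalan/PISCES')
model = MBartForConditionalGeneration.from_pretrained('Krystalan/PISCES')

# Hypothetical cross-lingual setup: summarize a Chinese document into English.
tokenizer.src_lang = "zh_CN"
document = "..."  # the article to summarize
inputs = tokenizer(document, return_tensors="pt", truncation=True)

# Force the decoder to start with the target-language code so the summary
# is produced in the requested language.
summary_ids = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"],
    num_beams=4,
    max_length=128,
)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0])
```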