
PISCES is a pre-trained many-to-many summarization model that acquires language modeling, cross-lingual ability, and summarization ability through a specially designed three-stage pre-training procedure.

This model is introduced in the paper Towards Unifying Multi-Lingual and Cross-Lingual Summarization (ACL 2023 main conference).

```python
from transformers import MBart50Tokenizer, MBartForConditionalGeneration

# Load the PISCES checkpoint with the mBART-50 tokenizer and model classes.
tokenizer = MBart50Tokenizer.from_pretrained('Krystalan/PISCES')
model = MBartForConditionalGeneration.from_pretrained('Krystalan/PISCES')
```
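
Below is a minimal generation sketch, assuming the standard mBART-50 workflow: set `src_lang` on the tokenizer and force the target-language BOS token via `forced_bos_token_id`. The input text, language codes, and decoding settings are placeholders, not part of the original card.

```python
from transformers import MBart50Tokenizer, MBartForConditionalGeneration

tokenizer = MBart50Tokenizer.from_pretrained('Krystalan/PISCES')
model = MBartForConditionalGeneration.from_pretrained('Krystalan/PISCES')

# Example: summarize an English document into German (cross-lingual summarization).
# Language codes follow the mBART-50 convention, e.g. "en_XX", "de_DE", "zh_CN".
tokenizer.src_lang = "en_XX"
article = "Replace this placeholder with the document you want to summarize."

inputs = tokenizer(article, return_tensors="pt", truncation=True)
summary_ids = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["de_DE"],  # target language
    num_beams=5,
    max_length=128,
)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0])
```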