Model:
zedfum/arman-longformer-8k
This project applies Longformer's attention mechanism to alireza7/ARMAN-MSR-persian-base in order to perform abstractive summarization on long documents. The resulting model accepts inputs of up to 8K tokens (rather than 512) and should be fine-tuned for summarization tasks.
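A minimal usage sketch, assuming the checkpoint loads with the generic Auto* classes from Transformers (the class names and generation settings below are assumptions, not taken from the repository):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed entry point: the published checkpoint name from this model card.
model_name = "zedfum/arman-longformer-8k"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# A Persian document of up to 8K tokens goes here.
long_document = "..."
inputs = tokenizer(long_document, max_length=8192, truncation=True, return_tensors="pt")

# Note: the base model is not yet fine-tuned for summarization,
# so generated summaries will be poor until fine-tuning is done.
summary_ids = model.generate(inputs["input_ids"], max_new_tokens=256)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```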
The conversion code is available in the GitHub repository.
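For illustration only, here is a rough sketch of one step such conversions typically involve: extending the learned position embeddings from 512 to 8K positions by tiling the original table. The function and its name are hypothetical; the actual conversion code is the one in the repository.

```python
import torch

def extend_position_embeddings(old_emb: torch.Tensor, max_pos: int = 8192) -> torch.Tensor:
    """Tile the original learned position embeddings until they cover max_pos positions."""
    old_len, dim = old_emb.shape
    new_emb = old_emb.new_empty(max_pos, dim)
    for start in range(0, max_pos, old_len):
        end = min(start + old_len, max_pos)
        # Copy (a prefix of) the original table into the next chunk.
        new_emb[start:end] = old_emb[: end - start]
    return new_emb
```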