---
language: pt
license: mit
tags:
- msmarco
- t5
- pytorch
- tensorflow
- pt
- pt-br
datasets:
- msmarco
widget:
- text: Texto de exemplo em português
inference: false
---
# mt5-base-multi-msmarco Reranker finetuned on Multi MS MARCO

## Introduction
mt5-base-multi-msmarco is a mT5-based model fine-tuned on a multilingual translated version of the MS MARCO passage dataset. This dataset, named Multi MS MARCO, consists of the complete MS MARCO passage collection translated into 12 different languages. Further information about the dataset and the translation method can be found in our Cross-Lingual repository.
## Usage

```python
from transformers import T5Tokenizer, MT5ForConditionalGeneration

model_name = 'unicamp-dl/mt5-base-multi-msmarco'
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)
```
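
The snippet above only loads the tokenizer and model. Below is a minimal reranking sketch that assumes the monoT5-style prompt `Query: ... Document: ... Relevant:` and "yes"/"no" target tokens; the query, passage, and scoring details are illustrative assumptions, so check the Cross-Lingual repository for the exact prompt and target tokens used during fine-tuning.

```python
import torch
from transformers import T5Tokenizer, MT5ForConditionalGeneration

model_name = 'unicamp-dl/mt5-base-multi-msmarco'
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)
model.eval()

# Hypothetical query/passage pair used only for illustration.
query = "qual a capital do Brasil"
passage = "Brasília é a capital federal do Brasil."

# monoT5-style prompt (assumption): relevance is judged by the probability
# of the "yes" token at the first decoding step.
input_text = f"Query: {query} Document: {passage} Relevant:"
inputs = tokenizer(input_text, return_tensors="pt", truncation=True, max_length=512)

# First sub-token of "yes" and "no" in the mT5 vocabulary (assumption).
yes_id = tokenizer.encode("yes", add_special_tokens=False)[0]
no_id = tokenizer.encode("no", add_special_tokens=False)[0]

with torch.no_grad():
    # T5-style decoders start from the pad token; only the first step is needed.
    decoder_input_ids = torch.full((1, 1), model.config.decoder_start_token_id, dtype=torch.long)
    logits = model(**inputs, decoder_input_ids=decoder_input_ids).logits[0, 0]
    # Log-probability of "yes" relative to "no" serves as the relevance score.
    score = torch.nn.functional.log_softmax(logits[[no_id, yes_id]], dim=0)[1]

print(f"Relevance score (log P('yes')): {score.item():.4f}")
```

To rerank a list of candidate passages for a query, compute this score for each passage and sort the passages in descending order of score.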
## Citation

If you use mt5-base-multi-msmarco, please cite:
```bibtex
@article{rosa2021cost,
  title={A cost-benefit analysis of cross-lingual transfer methods},
  author={Rosa, Guilherme Moraes and Bonifacio, Luiz Henrique and de Souza, Leandro Rodrigues and Lotufo, Roberto and Nogueira, Rodrigo},
  journal={arXiv preprint arXiv:2105.06813},
  year={2021}
}
```