---
language: pt
license: mit
tags:
- msmarco
- miniLM
- pytorch
- tensorflow
- pt
- pt-br
datasets:
- msmarco
widget:
- text: "Texto de exemplo em português"
inference: false
---
# multilingual-MiniLM-L6-v2-en-pt-msmarco Reranker finetuned on mMARCO

## Introduction
multilingual-MiniLM-L6-v2-en-pt-msmarco is a multilingual MiniLM-based model finetuned on a bilingual version of the MS MARCO passage dataset. This bilingual version combines the original MS MARCO dataset (in English) with a Portuguese translation. Further information about the dataset and the translation method can be found in our Cross-Lingual repository.
## Usage
```python
from transformers import AutoTokenizer, AutoModel

model_name = 'unicamp-dl/multilingual-MiniLM-L6-v2-en-pt-msmarco'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)
```
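Since the model is used as a reranker, a typical setup scores a query-passage pair encoded together in cross-encoder fashion. The sketch below is only illustrative: it assumes the checkpoint ships a sequence-classification (reranking) head loadable via `AutoModelForSequenceClassification`, and the query and passage strings are made-up examples.

```python
# Minimal reranking sketch (assumes a sequence-classification head is present).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = 'unicamp-dl/multilingual-MiniLM-L6-v2-en-pt-msmarco'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

query = 'qual a capital do Brasil?'          # hypothetical query
passage = 'Brasília é a capital do Brasil.'  # hypothetical passage

# Cross-encoder input: query and passage are tokenized as a single pair.
inputs = tokenizer(query, passage, return_tensors='pt', truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits

# Higher score means the passage is judged more relevant to the query.
if logits.shape[-1] == 1:
    score = logits.squeeze().item()
else:
    score = logits.softmax(dim=-1)[0, -1].item()
print(score)
```

In practice, each candidate passage retrieved for a query is scored this way and the candidates are sorted by score.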
## Citation

If you use multilingual-MiniLM-L6-v2-en-pt-msmarco, please cite:
```bibtex
@article{rosa2021cost,
  title={A cost-benefit analysis of cross-lingual transfer methods},
  author={Rosa, Guilherme Moraes and Bonifacio, Luiz Henrique and de Souza, Leandro Rodrigues and Lotufo, Roberto and Nogueira, Rodrigo},
  journal={arXiv preprint arXiv:2105.06813},
  year={2021}
}
```