---
language: pt
license: mit
tags:
- msmarco
- t5
- pytorch
- tensorflow
- pt
- pt-br
datasets:
- msmarco
widget:
- text: "Texto de exemplo em português"
inference: false
---
# PTT5-base Reranker finetuned on Portuguese MS MARCO
## Introduction
**ptt5-base-msmarco-pt-100k** is a T5-based model pretrained on the BrWAC corpus and finetuned on a Portuguese-translated version of the MS MARCO passage ranking dataset. This model was finetuned for 100k steps.
Further information about the dataset and the translation method can be found in our paper [**mMARCO: A Multilingual Version of MS MARCO Passage Ranking Dataset**](https://arxiv.org/abs/2108.13897) and in the [mMARCO](https://github.com/unicamp-dl/mMARCO) repository.

## Usage
```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Load the tokenizer and the finetuned reranker checkpoint from the Hugging Face Hub
model_name = 'unicamp-dl/ptt5-base-msmarco-pt-100k'
tokenizer  = T5Tokenizer.from_pretrained(model_name)
model      = T5ForConditionalGeneration.from_pretrained(model_name)
```
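As a reranker, the model scores how relevant a passage is to a query. The sketch below shows a minimal monoT5-style scoring loop; the prompt template (`Query: ... Document: ... Relevant:`), the target tokens (`yes`/`no`), and the example query/passage are assumptions for illustration, not necessarily the exact format used during finetuning (see the mMARCO repository for the training scripts).

```python
import torch

# Hypothetical query/passage pair (assumed for illustration)
query   = "qual a capital do Brasil?"
passage = "Brasília é a capital federal do Brasil."

# Assumed monoT5-style input format; verify against the mMARCO training code
input_text = f"Query: {query} Document: {passage} Relevant:"
inputs = tokenizer(input_text, return_tensors="pt", truncation=True, max_length=512)

# Assumed relevance label tokens
yes_id = tokenizer.encode("yes", add_special_tokens=False)[0]
no_id  = tokenizer.encode("no", add_special_tokens=False)[0]

with torch.no_grad():
    # Run a single decoding step and read the logits of the first generated token
    decoder_input_ids = torch.full(
        (1, 1), model.config.decoder_start_token_id, dtype=torch.long
    )
    logits = model(**inputs, decoder_input_ids=decoder_input_ids).logits[0, 0]

# Relevance score: probability of "yes" relative to "no"
score = torch.softmax(logits[[no_id, yes_id]], dim=0)[1].item()
print(f"relevance score: {score:.4f}")
```

To rerank a candidate list, compute this score for each passage and sort in descending order.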
## Citation
If you use ptt5-base-msmarco-pt-100k, please cite:

    @misc{bonifacio2021mmarco,
      title={mMARCO: A Multilingual Version of MS MARCO Passage Ranking Dataset}, 
      author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
      year={2021},
      eprint={2108.13897},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
    }