---
language: pt
license: mit
tags:
- msmarco
- t5
- pytorch
- tensorflow
- pt
- pt-br
datasets:
- msmarco
widget:
- text: "Texto de exemplo em português"
inference: false
---
# PTT5-base Reranker finetuned on Portuguese MS MARCO
## Introduction
ptt5-base-msmarco-pt-10k is a T5-based model pretrained on the BrWaC corpus and finetuned on a Portuguese-translated version of the MS MARCO passage dataset. This model was finetuned for 10k steps.
Further information about the dataset or the translation method can be found on our [Cross-Lingual repository](https://github.com/unicamp-dl/cross-lingual-analysis).

## Usage
```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = 'unicamp-dl/ptt5-base-msmarco-pt-10k'
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
```
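As a reranker, the model scores query-passage pairs rather than generating free text. Below is a minimal sketch of monoT5-style scoring, assuming this checkpoint follows the "Query: ... Document: ... Relevant:" input template with "true"/"false" targets, as in the original monoT5; the exact template used for this checkpoint is an assumption, so please verify it against the training code in the [Cross-Lingual repository](https://github.com/unicamp-dl/cross-lingual-analysis). The helper names (`build_input`, `relevance_score`) are illustrative, not part of the released code.

```python
import torch


def build_input(query: str, passage: str) -> str:
    # Hypothetical monoT5-style prompt template (an assumption for this
    # checkpoint -- check the training code for the exact format).
    return f"Query: {query} Document: {passage} Relevant:"


def relevance_score(model, tokenizer, query: str, passage: str) -> float:
    """Return P("true") for the first decoded token as the relevance score."""
    inputs = tokenizer(
        build_input(query, passage),
        return_tensors="pt",
        truncation=True,
        max_length=512,
    )
    # The T5 decoder starts from the decoder_start_token_id; only the
    # logits of the first generated position are needed.
    decoder_input_ids = torch.full(
        (1, 1), model.config.decoder_start_token_id, dtype=torch.long
    )
    with torch.no_grad():
        logits = model(**inputs, decoder_input_ids=decoder_input_ids).logits[0, 0]
    true_id = tokenizer.encode("true", add_special_tokens=False)[0]
    false_id = tokenizer.encode("false", add_special_tokens=False)[0]
    # Softmax over just the "true"/"false" logits, as in monoT5.
    probs = torch.softmax(logits[[true_id, false_id]], dim=0)
    return probs[0].item()
```

To rerank a candidate list, compute `relevance_score(model, tokenizer, query, p)` for each passage `p` and sort in descending order.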
## Citation
If you use ptt5-base-msmarco-pt-10k, please cite:

    @article{rosa2021cost,
      title={A cost-benefit analysis of cross-lingual transfer methods},
      author={Rosa, Guilherme Moraes and Bonifacio, Luiz Henrique and de Souza, Leandro Rodrigues and Lotufo, Roberto and Nogueira, Rodrigo},
      journal={arXiv preprint arXiv:2105.06813},
      year={2021}
    }