---
language: pt
license: mit
tags:
- t5
- pytorch
- tensorflow
datasets:
- brWaC
---

# Portuguese T5 (aka "PTT5")

## Introduction
PTT5 is a T5 model pretrained on Portuguese data that improves T5's performance on Portuguese sentence similarity and entailment tasks. It is available in three sizes (small, base, and large) and with two vocabularies: Google's original T5 vocabulary and our Portuguese vocabulary, trained on Portuguese Wikipedia data.

For further information or requests, please visit the [PTT5 repository](https://github.com/unicamp-dl/PTT5).

## Available models
| Model | Architecture | #Params | Vocabulary |
| :-: | :-: | :-: | :-: |
| `unicamp-dl/ptt5-small-t5-vocab` | t5-small | 60M | Google's T5 |
| `unicamp-dl/ptt5-base-t5-vocab` | t5-base | 220M | Google's T5 |
| `unicamp-dl/ptt5-large-t5-vocab` | t5-large | 740M | Google's T5 |
| `unicamp-dl/ptt5-small-portuguese-vocab` | t5-small | 60M | Portuguese |
| `unicamp-dl/ptt5-base-portuguese-vocab` | t5-base | 220M | Portuguese |
| `unicamp-dl/ptt5-large-portuguese-vocab` | t5-large | 740M | Portuguese |

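The two vocabularies segment Portuguese text differently. As a rough illustration (the example sentence, and the expectation that the Portuguese vocabulary yields shorter token sequences, are our own assumptions rather than figures from this card), the sketch below compares how the two base checkpoints tokenize the same sentence:

```python
# Illustrative comparison (not from the original card): a Portuguese-specific
# vocabulary typically splits Portuguese text into fewer subwords than
# Google's original (English-centric) T5 vocabulary.
from transformers import T5Tokenizer

tok_t5 = T5Tokenizer.from_pretrained('unicamp-dl/ptt5-base-t5-vocab')
tok_pt = T5Tokenizer.from_pretrained('unicamp-dl/ptt5-base-portuguese-vocab')

sentence = 'O tokenizador com vocabulário português produz sequências mais curtas.'
print(len(tok_t5.tokenize(sentence)), len(tok_pt.tokenize(sentence)))
```
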
## Usage
```python
# Tokenizer
from transformers import T5Tokenizer  # or AutoTokenizer

# PyTorch (bare model, bare model + language modeling head)
from transformers import T5Model, T5ForConditionalGeneration

# TensorFlow (bare model, bare model + language modeling head)
from transformers import TFT5Model, TFT5ForConditionalGeneration

model_name = 'unicamp-dl/ptt5-base-portuguese-vocab'

tokenizer = T5Tokenizer.from_pretrained(model_name)

# PyTorch
model_pt = T5ForConditionalGeneration.from_pretrained(model_name)

# TensorFlow
model_tf = TFT5ForConditionalGeneration.from_pretrained(model_name)
```
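
As a quick sanity check, the sketch below runs the pretrained PyTorch checkpoint on a sentence with a masked span. The example sentence and decoding settings are illustrative assumptions, not part of the original card; since PTT5 is released as a pretrained (span-corruption) model, it fills the `<extra_id_0>` sentinel rather than solving a downstream task, and fine-tuning is still required for tasks such as sentence similarity or entailment.

```python
# Illustrative inference sketch (not from the original card): run the pretrained
# checkpoint on a masked Portuguese sentence using T5's sentinel tokens.
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = 'unicamp-dl/ptt5-base-portuguese-vocab'
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Example sentence with a masked span; the model was pretrained to fill sentinels.
text = 'A capital do Brasil é <extra_id_0>.'
inputs = tokenizer(text, return_tensors='pt')

# Greedy decoding with standard transformers generation arguments.
output_ids = model.generate(**inputs, max_length=10)
print(tokenizer.decode(output_ids[0], skip_special_tokens=False))
```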

## Citation
We are preparing an arXiv submission and will soon provide a citation. In the meantime, if you need to cite this work, use:
```bibtex
@misc{ptt5_2020,
  Author = {Carmo, Diedre and Piau, Marcos and Campiotti, Israel and Nogueira, Rodrigo and Lotufo, Roberto},
  Title = {PTT5: Pre-training and validating the T5 transformer in Brazilian Portuguese data},
  Year = {2020},
  Publisher = {GitHub},
  Journal = {GitHub repository},
  Howpublished = {\url{https://github.com/unicamp-dl/PTT5}}
}
```