Update README.md
# Notes:

1. This is compatible with the latest version of transformers, but it was developed with version 4.3.2, so consider using version 4.3.2 if possible.
2. While I have only shown how to get logits and loss and how to generate outputs, you can do pretty much everything the MBartForConditionalGeneration class can do, as described at https://huggingface.co/docs/transformers/model_doc/mbart#transformers.MBartForConditionalGeneration
3. Note that the tokenizer I have used is based on sentencepiece and not BPE. Therefore, I used the AlbertTokenizer class and not the MBartTokenizer class.

# Fine-tuning on a downstream task