Update README.md
README.md CHANGED

@@ -74,8 +74,11 @@ print(decoded_output) # I am happy
 # Notes:
 1. This is compatible with the latest version of transformers, but it was developed with version 4.3.2, so consider using 4.3.2 if possible.
 2. While I have only shown how to get logits and loss and how to generate outputs, you can do pretty much everything the MBartForConditionalGeneration class can do, as described at https://huggingface.co/docs/transformers/model_doc/mbart#transformers.MBartForConditionalGeneration
-3.
-
+3. Note that the tokenizer I have used is based on sentencepiece and not BPE. Therefore, I used the AlbertTokenizer class and not the MBartTokenizer class.
+
+# Fine-tuning on a downstream task
+
+If you wish to fine-tune this model, you can do so using the toolkit <a href="https://github.com/prajdabre/yanmtt">YANMTT</a>, following the instructions at https://github.com/AI4Bharat/indic-bart

 # Contributors
 <ul>
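The tokenizer note added in this commit (sentencepiece-based, hence AlbertTokenizer rather than MBartTokenizer) can be sketched as a short loading snippet. This is a minimal sketch, not the README's own code: the model ID "ai4bharat/IndicBART" and the tokenizer keyword arguments are assumptions here — substitute the actual checkpoint name and options this README documents.

```python
# Sketch of loading the model described in this README.
# ASSUMPTION: the checkpoint is published on the Hugging Face Hub as
# "ai4bharat/IndicBART"; replace with the actual model ID if it differs.
from transformers import AlbertTokenizer, MBartForConditionalGeneration


def load_model_and_tokenizer(model_id: str = "ai4bharat/IndicBART"):
    # Per Note 3, the vocabulary is a sentencepiece model, so the
    # sentencepiece-backed AlbertTokenizer class is used instead of
    # MBartTokenizer. The kwargs below are illustrative assumptions.
    tokenizer = AlbertTokenizer.from_pretrained(
        model_id, do_lower_case=False, use_fast=False, keep_accents=True
    )
    model = MBartForConditionalGeneration.from_pretrained(model_id)
    return tokenizer, model


if __name__ == "__main__":
    # Downloads the checkpoint on first use (network required).
    tokenizer, model = load_model_and_tokenizer()
    input_ids = tokenizer("I am happy", return_tensors="pt").input_ids
    print(tokenizer.decode(model.generate(input_ids, num_beams=4, max_length=20)[0]))
```

Everything else (logits, loss, beam-search options) then works through the standard MBartForConditionalGeneration interface linked in Note 2.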