Update README.md
README.md
@@ -67,3 +67,7 @@ For tasks other than MARC-ja, the maximum length is short, so the attention_type
 | Waseda RoBERTa large (seq512) | 0.969 | 0.925 | 0.890 | 0.928 | 0.910 | 0.955 | 0.900 |
 | BigBird base (original_full) | 0.959 | 0.888 | 0.846 | 0.896 | 0.884 | 0.933 | 0.787 |
 | BigBird base (block_sparse) | 0.959 | - | - | - | - | - | - |
+
+## Acknowledgments
+
+This work was supported by the "Construction of a Japanese Large-Scale General-Purpose Language Model that Handles Long Sequences" at the 3rd ABCI Grand Challenge 2022.