mkmkmkmk committed · Commit 86d3899 · 1 Parent(s): 51ba6de

Update README.md

Files changed (1): README.md (+4 −0)
README.md CHANGED
@@ -67,3 +67,7 @@ For tasks other than MARC-ja, the maximum length is short, so the attention_type
  | Waseda RoBERTa large (seq512) | 0.969 | 0.925 | 0.890 | 0.928 | 0.910 | 0.955 | 0.900 |
  | BigBird base (original_full) | 0.959 | 0.888 | 0.846 | 0.896 | 0.884 | 0.933 | 0.787 |
  | BigBird base (block_sparse) | 0.959 | - | - | - | - | - | - |
+
+ ## Acknowledgments
+
+ This work was supported by the project "Construction of a Japanese Large-Scale General-Purpose Language Model that Handles Long Sequences" at the 3rd ABCI Grand Challenge 2022.
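The table above distinguishes BigBird runs by `attention_type`: `block_sparse` is the sparse long-sequence attention, while `original_full` falls back to standard full attention, which the diff context notes is used when maximum input lengths are short. A minimal sketch of toggling this setting via the Hugging Face `transformers` `BigBirdConfig` (no specific checkpoint is assumed; any BigBird model accepts this config field):

```python
# Sketch: selecting BigBird's attention mechanism via its config in
# Hugging Face transformers. block_sparse only pays off for long inputs;
# for short sequences original_full (standard attention) is used instead.
from transformers import BigBirdConfig

config_full = BigBirdConfig(attention_type="original_full")
config_sparse = BigBirdConfig(attention_type="block_sparse")

print(config_full.attention_type)    # original_full
print(config_sparse.attention_type)  # block_sparse
```

Note that `block_sparse` additionally requires the input sequence to be long enough relative to the configured block size; otherwise the model falls back to (or errors without) full attention at runtime.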