quanghuy123 committed
Commit 5b3a698 · verified · Parent: ed69fd1

Update README.md

Files changed (1): README.md (+21 −1)
README.md CHANGED

@@ -28,10 +28,30 @@ tags:
 
 ### Model Specifications
 - **Maximum Sequence Length**: 512 tokens
-- **Output Dimensionality**: 512 tokens
 - **Language**: Primarily focused on **Vietnamese** legal texts.
 - **License**: Apache-2.0 License
 
+
+@inproceedings{zaib-2021-bert-coqac,
+    title = "BERT-CoQAC: BERT-based Conversational Question Answering in Context",
+    author = "Zaib, Munazza and Tran, Dai Hoang and Sagar, Subhash and Mahmood, Adnan and Zhang, Wei E. and Sheng, Quan Z.",
+    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing",
+    month = "4",
+    year = "2021",
+    publisher = "Association for Computational Linguistics",
+    url = "https://arxiv.org/abs/2104.11394",
+    doi = "10.48550/arXiv.2104.11394"
+}
+
+@article{devlin-2018-bert,
+    title = "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding",
+    author = "Devlin, Jacob and Chang, Ming-Wei and Lee, Kenton and Toutanova, Kristina",
+    journal = "arXiv:1810.04805",
+    year = "2018",
+    url = "https://arxiv.org/abs/1810.04805",
+    doi = "10.48550/arXiv.1810.04805"
+}
+
 ## Usage
 
 This model is suitable for applications in legal domains, such as:
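The commit also drops the "Output Dimensionality: 512 tokens" bullet, leaving the 512-token cap only where it belongs: the maximum sequence length. A minimal pure-Python sketch of how such a cap is typically enforced before encoding (the constant and function names here are illustrative, not taken from the model card; a real pipeline would rely on the tokenizer's own truncation):

```python
# Maximum sequence length from the Model Specifications above.
MAX_SEQ_LEN = 512

def truncate_to_max_len(token_ids, max_len=MAX_SEQ_LEN):
    """Drop tokens beyond the model's maximum sequence length,
    mirroring what an encoder does with over-long legal documents."""
    return token_ids[:max_len]

# Stand-in for a tokenized Vietnamese legal document of 1,500 tokens.
doc = list(range(1500))
print(len(truncate_to_max_len(doc)))  # 512
```

Anything past the 512th token is silently discarded, which is why long statutes are usually split into chunks before being embedded.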