Mihir3009 committed · Commit 849e44a · 1 Parent(s): 4c6812f

Update README.md

Files changed (1): README.md (+17 -6)
In-BoXBART
An instruction-based unified model for performing various biomedical tasks.

You may want to check out:

* Our paper (NAACL 2022 Findings): [In-BoXBART: Get Instructions into Biomedical Multi-Task Learning](https://aclanthology.org/2022.findings-naacl.10/)
* GitHub: [Mihir3009/In-BoXBART](https://github.com/Mihir3009/In-BoXBART)

This work explores the impact of instructional prompts on biomedical multi-task learning. We introduce the BoX, a collection of 32 instruction tasks for biomedical NLP across various (X) categories. Using this meta-dataset, we propose a unified model, termed In-BoXBART, that can jointly learn all tasks of the BoX without any task-specific modules. To the best of our knowledge, this is the first attempt to
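As a rough sketch of what an instruction-style input looks like, the snippet below assembles a task definition and an instance into a single source string. The `Definition:`/`Input:` layout and the example task are illustrative assumptions, not the exact schema of the BoX meta-dataset:

```python
# Illustrative sketch only: the "Definition:"/"Input:" layout is an assumed
# instruction format, not the exact schema released with the BoX meta-dataset.
def build_prompt(definition: str, input_text: str) -> str:
    """Combine a task instruction with an instance into one source string
    that an instruction-tuned seq2seq model could consume."""
    return f"Definition: {definition} Input: {input_text}"

prompt = build_prompt(
    "Identify the disease mentioned in the given sentence.",
    "The patient was diagnosed with type 2 diabetes.",
)
print(prompt)
```

Because every task is phrased this way, one model can be trained on all 32 tasks jointly, with the instruction text itself telling the model which task to perform.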
 
BibTeX Entry and Citation Info

If you are using our model, please cite our paper:

```bibtex
@inproceedings{parmar-etal-2022-boxbart,
    title = "In-{B}o{XBART}: Get Instructions into Biomedical Multi-Task Learning",
    author = "Parmar, Mihir and
      Mishra, Swaroop and
      Purohit, Mirali and
      Luo, Man and
      Mohammad, Murad and
      Baral, Chitta",
    booktitle = "Findings of the Association for Computational Linguistics: NAACL 2022",
    month = jul,
    year = "2022",
    address = "Seattle, United States",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2022.findings-naacl.10",
    doi = "10.18653/v1/2022.findings-naacl.10",
    pages = "112--128",
}
```