Update README.md
README.md
CHANGED
@@ -24,8 +24,8 @@ OpenBezoar-HH-RLHF-SFT is an LLM that is built upon the OpenLLaMA 3B v2 architec
 
 ### Model Sources
 
-- **Repository:** [
-- **Paper :** [
+- **Repository:** [Bitbucket Project](https://bitbucket.org/paladinanalytics/workspace/projects/OP)
+- **Paper :** [Pre-Print](https://arxiv.org/abs/2404.12195)
 
 ## Instruction Format
 
@@ -92,7 +92,14 @@ Refer to our self-reported evaluations in our paper (Section 4).
 If you find our work useful, please cite our paper as follows:
 
 ```
-
+@misc{dissanayake2024openbezoar,
+      title={OpenBezoar: Small, Cost-Effective and Open Models Trained on Mixes of Instruction Data},
+      author={Chandeepa Dissanayake and Lahiru Lowe and Sachith Gunasekara and Yasiru Ratnayake},
+      year={2024},
+      eprint={2404.12195},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
+}
 ```
 
 ## Model Authors