Text Generation
Transformers
PyTorch
English
llama
text-generation-inference
Inference Endpoints
bleysg committed on
Commit
678c148
1 Parent(s): d120381

Update README.md

Files changed (1)
  1. README.md +9 -1
README.md CHANGED
@@ -72,13 +72,21 @@ We trained for 4 epochs and selected a snapshot at 3 epochs for peak performance
 
 Please await our full releases for further training details.
 
+# Prompting
+
+It uses the Alpaca format (see [FastChat implementation example](https://github.com/lm-sys/FastChat/blob/daa2b9abe20597ebf34dc5df164d450456610c74/fastchat/conversation.py#L198-L229)):
+```
+### Instruction:
+
+### Response:
+```
 
 # Citation
 
 ```bibtex
 @software{OpenOrca_Preview1,
   title = {OpenOrca_Preview1: A LLaMA-13B Model Fine-tuned on Small Portion of OpenOrcaV1 Dataset},
-  author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong` and "Teknium"},
+  author = {Wing Lian and Bleys Goodson and Eugene Pentland and Austin Cook and Chanvichet Vong and "Teknium"},
   year = {2023},
   publisher = {HuggingFace},
   journal = {HuggingFace repository},
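
For reference, a minimal sketch of how the Alpaca-style prompt added in this commit might be assembled and run with Transformers. The repository ID, the `build_prompt` helper, and the generation settings are illustrative assumptions, not part of this commit.

```python
# Sketch only: the model repo ID and generation settings below are assumptions
# for illustration; the prompt template follows the README's Alpaca format.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Open-Orca/OpenOrca-Preview1-13B"  # assumed repo ID

def build_prompt(instruction: str) -> str:
    # Alpaca-style template shown in the README: ### Instruction: / ### Response:
    return f"### Instruction:\n{instruction}\n\n### Response:\n"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# device_map="auto" requires the accelerate package; adjust to your hardware.
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

inputs = tokenizer(build_prompt("Summarize what the OpenOrca dataset contains."),
                   return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```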