LoupGarou committed
Commit 4808647
1 Parent(s): e7c33ea

Update README.md

Files changed (1):
README.md +6 -2
README.md CHANGED
@@ -72,5 +72,9 @@ if __name__ == "__main__":
  # Training Procedure
  The base WizardCoder model was finetuned with QLoRA on the Guanaco dataset, which was trimmed to within 2 standard deviations of token size for the question sets and randomized. All non-English data was also removed from this finetuning dataset.
  
- # Acknowledgements
- This model is the result of finetuning efforts based on the WizardCoder base model and the Guanaco model. Many thanks to the creators and the community around these models. Special thanks to the Hugging Face team for providing the transformers library which made this work possible.
+ ## Acknowledgements
+ This model, WizardCoder-Guanaco-15B-V1.0, simply builds on the efforts of two great teams to evaluate the performance of a combined model with the strengths of the [WizardCoder base model](https://huggingface.co/WizardLM/WizardCoder-15B-V1.0) and the [Guanaco dataset](https://huggingface.co/datasets/JosephusCheung/GuanacoDataset).
+ 
+ A sincere appreciation goes out to the developers and the community involved in the creation and refinement of these models. Their commitment to providing open-source tools and datasets has been instrumental in making this project a reality.
+ 
+ Moreover, a special note of thanks to the [Hugging Face](https://huggingface.co/) team, whose transformers library has not only streamlined the process of model creation and adaptation but also democratized access to state-of-the-art machine learning technologies. Their impact on the development of this project cannot be overstated.
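
For context, here is a minimal sketch of the dataset trimming described in the Training Procedure above: keep examples whose tokenized question length falls within 2 standard deviations of the mean, drop non-English rows, and shuffle. This is an illustration under stated assumptions, not the script behind this commit; the `instruction` column name, the `langdetect`-based English filter, and the shuffle seed are all hypothetical.

```python
# Sketch of the trimming described above (assumptions noted in comments):
# the actual preprocessing script is not included in this repository.
from datasets import load_dataset
from langdetect import LangDetectException, detect
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("WizardLM/WizardCoder-15B-V1.0")
dataset = load_dataset("JosephusCheung/GuanacoDataset", split="train")

# Record each example's token count (assumed question field: "instruction").
dataset = dataset.map(
    lambda ex: {"n_tokens": len(tokenizer(ex["instruction"]).input_ids)}
)

# Compute the 2-standard-deviation bounds from the dataset's own distribution.
lengths = dataset["n_tokens"]
mean = sum(lengths) / len(lengths)
std = (sum((x - mean) ** 2 for x in lengths) / len(lengths)) ** 0.5
low, high = mean - 2 * std, mean + 2 * std

def keep(example):
    # Trim by token size first, then keep English-only rows.
    if not low <= example["n_tokens"] <= high:
        return False
    try:
        return detect(example["instruction"]) == "en"
    except LangDetectException:
        # detect() raises on empty or feature-less text; drop such rows.
        return False

dataset = dataset.filter(keep).shuffle(seed=42)
print(f"{len(dataset)} examples kept within [{low:.0f}, {high:.0f}] tokens")
```

Deriving the bounds from the token-length distribution, rather than a fixed cap, keeps the 2-standard-deviation cutoff tied to the dataset actually being filtered.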