---
library_name: transformers
license: apache-2.0
datasets:
- Locutusque/hercules-v4.5
language:
- en
inference:
parameters:
do_sample: true
temperature: 1
top_p: 0.7
top_k: 4
max_new_tokens: 250
repetition_penalty: 1.1
---
# Hercules-phi-2
We fine-tuned Phi-2 on Locutusque's Hercules-v4.5 dataset.
## Model Details
### Model Description
This model has capabilities in math, coding, function calling, roleplay, and more. We fine-tuned it on all examples in Hercules-v4.5.
- **Developed by:** M4-ai
- **Language(s) (NLP):** English
- **License:** apache-2.0
## Uses
General-purpose assistant work, question answering, chain-of-thought reasoning, and similar tasks. A usage sketch follows below.
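Below is a minimal usage sketch with `transformers`, wired to the sampling parameters from the YAML header above. The repository id `M4-ai/hercules-phi-2` and the plain-text prompt format are assumptions, not something this card confirms; check the hub page for the canonical id and any prompt template.

```python
# Hedged usage sketch; the repo id below is an assumption.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "M4-ai/hercules-phi-2"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "Explain the difference between a list and a tuple in Python."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# These sampling parameters mirror the `inference` block in the header.
outputs = model.generate(
    **inputs,
    do_sample=True,
    temperature=1.0,
    top_p=0.7,
    top_k=4,
    max_new_tokens=250,
    repetition_penalty=1.1,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```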
## Evaluation
Coming soon
## Training Details
### Training Data
[Locutusque/hercules-v4.5](https://huggingface.co/datasets/Locutusque/hercules-v4.5)
### Training Hyperparameters
- **Training regime:** bf16 non-mixed precision
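"Non-mixed" bf16 means the weights themselves are kept in bfloat16, rather than fp32 master weights with bf16 autocast. A sketch of what that looks like when loading the base model (illustrative only; the actual TPU training code is not published here):

```python
# Illustrative: full (non-mixed) bf16 stores the parameters directly in bfloat16.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "microsoft/phi-2",           # base model that was fine-tuned
    torch_dtype=torch.bfloat16,  # weights in bf16, no fp32 copy
)
assert next(model.parameters()).dtype == torch.bfloat16
```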
## Technical Specifications
### Hardware
We trained on 8 Kaggle TPU cores at a global batch size of 1152, which works out to 144 examples per core per step, assuming no gradient accumulation.