# Model Card for LDCC-SOLAR-10.7B

Developed by: Wonchul Kim (Lotte Data Communication AI Technical Team)
## Hardware and Software
- Hardware: We trained this model on a single machine with four NVIDIA A100 GPUs.
- Training Factors: We fine-tuned this model using the DeepSpeed library together with the Hugging Face TRL Trainer and Hugging Face Accelerate; a minimal sketch follows below.
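
The exact training recipe is not published in this card. The following is only a minimal sketch of how DeepSpeed, the TRL `SFTTrainer`, and Accelerate are typically combined; the dataset path, DeepSpeed config file, and hyperparameters are hypothetical placeholders, not the values used for LDCC-SOLAR-10.7B.

```python
# Minimal sketch of a TRL + DeepSpeed fine-tuning script (hypothetical values,
# not the actual LDCC-SOLAR-10.7B recipe). Launch with Accelerate, e.g.:
#   accelerate launch sft.py
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder instruction-tuning data (JSONL with a "text" field);
# the real training data is not documented in this card.
train_dataset = load_dataset("json", data_files="train.jsonl", split="train")

args = SFTConfig(
    output_dir="ldcc-solar-10.7b-sft",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,
    num_train_epochs=1,
    bf16=True,
    deepspeed="ds_zero3.json",  # path to a DeepSpeed ZeRO config (hypothetical)
)

trainer = SFTTrainer(
    model="yanolja/KoSOLAR-10.7B-v0.1",  # base model listed below
    args=args,
    train_dataset=train_dataset,
)
trainer.train()
```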
## Method
- This model was trained using the learning method introduced in the SOLAR paper.
## Base Model
- yanolja/KoSOLAR-10.7B-v0.1 (This model is no longer supported due to a tokenizer issue.)
## Caution
- If you want to fine-tune this model, we recommend using the tokenizer.json and tokenizer_config.json files from revision v1.1 (see the example below).
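
For example, loading the tokenizer pinned to revision v1.1 with Hugging Face `transformers` could look like the sketch below. The repository id `LDCC/LDCC-SOLAR-10.7B` is assumed here and should be adjusted if the model is hosted under a different name.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub repository id; adjust if the model is hosted under a different name.
model_id = "LDCC/LDCC-SOLAR-10.7B"

# Pin the tokenizer files (tokenizer.json, tokenizer_config.json) to revision v1.1,
# as recommended in the Caution section above.
tokenizer = AutoTokenizer.from_pretrained(model_id, revision="v1.1")
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")
```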