This model was developed by the LLM research consortium of (주)미디어그룹사람과숲 (MediaGroup Saram-gwa-Soop Co., Ltd.) and (주)마커 (Marker Inc.).

The license is cc-by-nc-sa.
## Model Details
**Model Developers:** SeungyooLee (DopeorNope)

**Input:** Models input text only.

**Output:** Models generate text only.
### Model Architecture
pub-llama-13b-v6 is an auto-regressive language model based on the LLaMA2 transformer architecture.
**Base Model:** beomi/llama-2-koen-13b
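As a minimal usage sketch, the model should load through the standard `transformers` causal-LM classes like any other LLaMA-2 derivative. The repository id below is an assumption based on the model name and developer handle, and the prompt is only an example.

```python
# Minimal loading/generation sketch with Hugging Face transformers.
# The repository id is an assumption inferred from the model name.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "DopeorNope/pub-llama-13b-v6"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # 13B parameters fit on a single high-memory GPU in fp16
    device_map="auto",          # requires the accelerate package
)

prompt = "한국어로 자기소개를 해줘."  # example prompt; the base model is Korean/English bilingual
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```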
### Training Dataset
The DopeorNope/OpenOrca-near-dedup-v1 dataset was built with a near-deduplication algorithm to reduce similarity between samples. It will be released publicly soon.
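The exact deduplication pipeline is not described here. Purely as an illustrative sketch of what near-deduplication typically looks like (not the consortium's actual procedure), MinHash plus LSH via the `datasketch` library can drop samples whose shingle sets are nearly identical; the threshold and permutation count below are example values.

```python
# Illustrative near-deduplication sketch using MinHash + LSH (datasketch).
# NOT the actual pipeline used for OpenOrca-near-dedup-v1; parameters are examples.
from datasketch import MinHash, MinHashLSH

def minhash(text: str, num_perm: int = 128) -> MinHash:
    """Build a MinHash signature from 3-token shingles of the text."""
    m = MinHash(num_perm=num_perm)
    tokens = text.split()
    for i in range(max(len(tokens) - 2, 1)):
        shingle = " ".join(tokens[i:i + 3])
        m.update(shingle.encode("utf-8"))
    return m

def near_dedup(samples: list[str], threshold: float = 0.8) -> list[str]:
    """Keep only samples that do not collide with an earlier, near-identical one."""
    lsh = MinHashLSH(threshold=threshold, num_perm=128)
    kept = []
    for idx, text in enumerate(samples):
        sig = minhash(text)
        if not lsh.query(sig):       # no near-duplicate indexed so far
            lsh.insert(str(idx), sig)
            kept.append(text)
    return kept

print(near_dedup([
    "the quick brown fox jumps over the lazy dog",
    "the quick brown fox jumps over the lazy dogs",
    "a completely different instruction sample",
]))
```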