jingyeom committed on
Commit
e3091a5
1 Parent(s): 5d9e03f

Update README.md

Files changed (1)
  1. README.md +15 -57
README.md CHANGED
@@ -1,58 +1,16 @@
- ---
- license: apache-2.0
- base_model: yanolja/KoSOLAR-10.7B-v0.2
- tags:
- - generated_from_trainer
- model-index:
- - name: lion-32bit_FA2_neftune_5_leaderboard_inst_v1.3_dedup_KoSOLAR-10.7B-v0.2_L1024
-   results: []
- ---
-
- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->
-
- # lion-32bit_FA2_neftune_5_leaderboard_inst_v1.3_dedup_KoSOLAR-10.7B-v0.2_L1024
-
- This model is a fine-tuned version of [yanolja/KoSOLAR-10.7B-v0.2](https://huggingface.co/yanolja/KoSOLAR-10.7B-v0.2) on an unknown dataset.
-
- ## Model description
-
- More information needed
-
- ## Intended uses & limitations
-
- More information needed
-
- ## Training and evaluation data
-
- More information needed
-
- ## Training procedure
-
- ### Training hyperparameters
-
- The following hyperparameters were used during training:
- - learning_rate: 1e-06
- - train_batch_size: 2
- - eval_batch_size: 8
- - seed: 42
- - distributed_type: multi-GPU
- - num_devices: 6
- - gradient_accumulation_steps: 16
- - total_train_batch_size: 192
- - total_eval_batch_size: 48
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- - lr_scheduler_type: cosine
- - lr_scheduler_warmup_ratio: 0.03
- - num_epochs: 3
-
- ### Training results
-
-
-
- ### Framework versions
-
- - Transformers 4.35.2
- - Pytorch 2.0.1
- - Datasets 2.15.0
- - Tokenizers 0.15.0
+ ## Model
+ base_model: yanolja/KoSOLAR-10.7B-v0.2
+
+ ## Dataset
+ - Collected from publicly available data
+ - Deduplicated using the algorithm from *Deduplicating Training Data Makes Language Models Better*
+
+ ## Code
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer
+ import torch
+
+ model_name = "jingyeom/SOLAR_KO_1.3_deup"
+ model = AutoModelForCausalLM.from_pretrained(
+     model_name,
+ )
+ tokenizer = AutoTokenizer.from_pretrained(model_name)
+ ```
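
The new Code section only loads the model. For a quick smoke test, the sketch below generates from it; it assumes the README snippet above has already been run, and the prompt and decoding settings are illustrative values, not taken from the model card.

```python
# Continues from the README snippet above: `model`, `tokenizer`, and
# `torch` are already in scope. The prompt and generation settings
# are example values, not from the model card.
prompt = "대한민국의 수도는"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))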
```
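
The Dataset section cites *Deduplicating Training Data Makes Language Models Better* (Lee et al., 2021), which removes duplicated training text via suffix-array exact-substring matching and MinHash near-duplicate detection. The toy sketch below shows only the near-duplicate filtering criterion; the function names, n-gram size, and the 0.8 threshold are invented for illustration and are not from this model's actual pipeline.

```python
# Toy near-duplicate filtering sketch -- NOT the paper's suffix-array
# algorithm; all names and thresholds here are made up for illustration.
from typing import List, Set

def char_ngrams(text: str, n: int = 5) -> Set[str]:
    # Character n-gram set as a cheap document signature.
    return {text[i:i + n] for i in range(max(1, len(text) - n + 1))}

def jaccard(a: Set[str], b: Set[str]) -> float:
    # Jaccard similarity of two signature sets.
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def dedup(docs: List[str], threshold: float = 0.8) -> List[str]:
    # Greedily keep a document only if it is not too similar
    # (Jaccard >= threshold) to any document already kept.
    kept: List[str] = []
    sigs: List[Set[str]] = []
    for doc in docs:
        sig = char_ngrams(doc)
        if all(jaccard(sig, s) < threshold for s in sigs):
            kept.append(doc)
            sigs.append(sig)
    return kept

if __name__ == "__main__":
    docs = [
        "공개 데이터에서 수집한 문장입니다.",
        "공개 데이터에서 수집한 문장입니다.",  # exact duplicate, dropped
        "전혀 다른 내용의 문장입니다.",
    ]
    print(dedup(docs))  # two documents remain
```

A real pipeline would replace this quadratic loop with something scalable such as suffix arrays or MinHash-LSH, as in the paper; the sketch exists only to make the similarity-threshold criterion concrete.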