---
license: apache-2.0
pipeline_tag: text-generation
tags:
- multilingual
- PyTorch
- Transformers
- gpt3
- gpt2
- Deepspeed
- Megatron
datasets:
- mc4
- Wikipedia

widget:
- text: "I know you're tired, but can we go for another walk this evening?\npeter szemraj:\n\n"
  example_title: "walk"
- text: "What do you call an alligator who's just had surgery to remove his left arm?\npeter szemraj:\n\n"
  example_title: "alligator"
- text: "If you could live anywhere, where would it be?\npeter szemraj:\n\n"
  example_title: "dream living place"
- text: "What really makes you angry?\npeter szemraj:\n\n"
  example_title: "pet peeve"
- text: "My friend says that she knows every language, but she doesn't speak any of them.. what's wrong with her?\npeter szemraj:\n\n"
  example_title: "language"
- text: "What would you change about yourself if you could?\npeter szemraj:\n\n"
  example_title: "change"
- text: "My first is in Asia, my second is in Europe, my third is in North America, and my fourth is in South America. What am I?\npeter szemraj:\n\n"
  example_title: "continent"
- text: "Can you take me for dinner somewhere nice this time?\npeter szemraj:\n\n"
  example_title: "dinner"
- text: "Honey, I have clogged the toilet for the third time this month.. sorry..\npeter szemraj:\n\n"
  example_title: "overflow"
- text: "A man pushes his car to a hotel and tells the owner he's bankrupt. Why?\npeter szemraj:\n\n"
  example_title: "brain teaser"

inference:
  parameters:
    min_length: 2
    max_length: 64
    length_penalty: 0.4
    no_repeat_ngram_size: 3
    do_sample: true
    top_p: 0.95
    top_k: 30
    temperature: 0.65
    repetition_penalty: 3.5

---

# mGPT: fine-tuned on message data (MWE)

This model is a fine-tuned version of [sberbank-ai/mGPT](https://huggingface.co/sberbank-ai/mGPT) on 80k messages. It builds on the minimum-working-example checkpoint [here](https://huggingface.co/pszemraj/mGPT-Peter-mwe).

## Model description

- Tests whether personality traits learned from fine-tuning on message data "bleed over" into other languages the model was not explicitly trained on.

### Usage in Python

Install the `transformers` library if you don't have it:

```bash
pip install -U transformers
```

Load the model into a `pipeline` object:

```python
import torch
from transformers import pipeline

device = "cuda" if torch.cuda.is_available() else "cpu"
my_chatbot = pipeline(
    "text-generation",
    "pszemraj/mGPT-Peter-2E",
    device=0 if device == "cuda" else -1,
)
```
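The pipeline can then be called with a prompt in the same format as the widget examples above: the message, followed by `\npeter szemraj:\n\n`. The sketch below illustrates this; `format_prompt` is a hypothetical helper (not part of the model or library), and the generation kwargs mirror the inference parameters in this card's front matter:

```python
# Sketch: build a prompt in the "message\npeter szemraj:\n\n" format used by
# the widget examples, then generate with the card's inference parameters.
# `format_prompt` is a hypothetical helper name, not part of any API.

def format_prompt(message: str) -> str:
    # fine-tuning paired an incoming message with a "peter szemraj:" reply
    return f"{message}\npeter szemraj:\n\n"

gen_kwargs = dict(
    min_length=2,
    max_length=64,
    length_penalty=0.4,
    no_repeat_ngram_size=3,
    do_sample=True,
    top_p=0.95,
    top_k=30,
    temperature=0.65,
    repetition_penalty=3.5,
)

prompt = format_prompt("If you could live anywhere, where would it be?")
# uncomment once `my_chatbot` from the snippet above has been loaded:
# response = my_chatbot(prompt, **gen_kwargs)
# print(response[0]["generated_text"])
```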

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- distributed_type: multi-GPU
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.05
- num_epochs: 2
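As a quick sanity check, the listed `total_train_batch_size` follows from the other values (the product implies the gradients were accumulated on a single process, which is an inference on my part, not stated in the card):

```python
# total train batch size = per-device batch size * gradient accumulation steps
# * number of processes (assumed 1 here, inferred from the product)
train_batch_size = 4
gradient_accumulation_steps = 8
num_processes = 1  # assumption

total_train_batch_size = train_batch_size * gradient_accumulation_steps * num_processes
print(total_train_batch_size)  # 32
```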

### Framework versions

- Transformers 4.18.0
- PyTorch 1.11.0+cu113
- Datasets 2.1.0
- Tokenizers 0.12.1