---
base_model: google/pegasus-xsum
tags:
- generated_from_trainer
metrics:
- rouge
- precision
- recall
- f1
model-index:
- name: LLM_Teached_Pegasus_100k
  results: []
---


# LLM_Teached_Pegasus_100k

This model is a fine-tuned version of [google/pegasus-xsum](https://huggingface.co/google/pegasus-xsum) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5004
- Rouge1: 0.4923
- Rouge2: 0.2429
- Rougel: 0.4134
- Rougelsum: 0.4134
- Gen Len: 25.1335
- Precision: 0.9143
- Recall: 0.9124
- F1: 0.9132
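
The checkpoint can be used like any other Pegasus summarization model. Below is a minimal inference sketch, assuming the standard `transformers` seq2seq API; the repo id is a placeholder for wherever this checkpoint is hosted:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repo id -- substitute the actual path of this checkpoint.
model_id = "<user>/LLM_Teached_Pegasus_100k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and was the tallest man-made structure in the world for 41 years."
)

inputs = tokenizer(article, truncation=True, return_tensors="pt")
# The evaluation Gen Len averages ~25 tokens, so a modest generation budget suffices.
summary_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```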

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them to `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
- mixed_precision_training: Native AMP
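
As a rough guide, these settings map onto `Seq2SeqTrainingArguments` as sketched below. This is a reconstruction from the list above, not the original training script; `output_dir` is a placeholder, and the Adam betas/epsilon listed above are the library defaults:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="LLM_Teached_Pegasus_100k",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 total train batch size
    lr_scheduler_type="linear",
    num_train_epochs=16,
    fp16=True,                      # "Native AMP" mixed precision
    predict_with_generate=True,     # assumption: needed to compute ROUGE at eval time
)
```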

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | Precision | Recall | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|:---------:|:------:|:------:|
| 2.1501        | 1.0   | 781   | 1.7062          | 0.4566 | 0.209  | 0.3745 | 0.3744    | 25.4655 | 0.9082    | 0.9065 | 0.9072 |
| 1.7722        | 2.0   | 1562  | 1.6314          | 0.4712 | 0.2226 | 0.3906 | 0.3904    | 25.4298 | 0.9107    | 0.909  | 0.9097 |
| 1.7218        | 3.0   | 2343  | 1.5948          | 0.4776 | 0.2284 | 0.3965 | 0.3963    | 25.6569 | 0.9112    | 0.9103 | 0.9106 |
| 1.6668        | 4.0   | 3125  | 1.5708          | 0.481  | 0.2316 | 0.4002 | 0.4       | 25.3451 | 0.9122    | 0.9107 | 0.9112 |
| 1.6437        | 5.0   | 3906  | 1.5565          | 0.4844 | 0.2346 | 0.4034 | 0.4031    | 25.482  | 0.9127    | 0.9113 | 0.9118 |
| 1.6186        | 6.0   | 4687  | 1.5476          | 0.4852 | 0.236  | 0.4047 | 0.4044    | 25.4191 | 0.9129    | 0.9115 | 0.912  |
| 1.607         | 7.0   | 5468  | 1.5426          | 0.486  | 0.2367 | 0.4052 | 0.405     | 25.4949 | 0.9129    | 0.9118 | 0.9122 |
| 1.5972        | 8.0   | 6248  | 1.5380          | 0.4872 | 0.2387 | 0.407  | 0.4071    | 25.3836 | 0.9131    | 0.9118 | 0.9123 |
| 1.5836        | 9.0   | 7029  | 1.5273          | 0.4891 | 0.2399 | 0.4088 | 0.4089    | 25.4995 | 0.9133    | 0.9122 | 0.9126 |
| 1.5667        | 10.0  | 7810  | 1.5196          | 0.4906 | 0.2416 | 0.411  | 0.4112    | 25.3867 | 0.9135    | 0.9123 | 0.9127 |
| 1.5521        | 11.0  | 8592  | 1.5124          | 0.4899 | 0.2406 | 0.4102 | 0.4103    | 25.2191 | 0.9137    | 0.912  | 0.9127 |
| 1.5413        | 12.0  | 9373  | 1.5083          | 0.4914 | 0.2416 | 0.4118 | 0.412     | 25.3491 | 0.9137    | 0.9123 | 0.9128 |
| 1.5291        | 13.0  | 10154 | 1.5044          | 0.4913 | 0.2419 | 0.4118 | 0.4119    | 25.2082 | 0.914     | 0.9123 | 0.913  |
| 1.527         | 14.0  | 10935 | 1.5026          | 0.4917 | 0.2426 | 0.4126 | 0.4128    | 25.1069 | 0.9141    | 0.9123 | 0.913  |
| 1.5203        | 15.0  | 11717 | 1.5006          | 0.4921 | 0.243  | 0.4135 | 0.4136    | 25.1062 | 0.9143    | 0.9123 | 0.9131 |
| 1.5126        | 16.0  | 12496 | 1.5004          | 0.4923 | 0.2429 | 0.4134 | 0.4134    | 25.1335 | 0.9143    | 0.9124 | 0.9132 |
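
The ROUGE columns can be reproduced with the `evaluate` library. The card does not say which metric produced the Precision/Recall/F1 columns; their ~0.91 range is typical of BERTScore, which is assumed in this sketch:

```python
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")  # assumption: P/R/F1 above resemble BERTScore

predictions = ["the cat sat on the mat"]
references = ["a cat was sitting on the mat"]

print(rouge.compute(predictions=predictions, references=references))
scores = bertscore.compute(predictions=predictions, references=references, lang="en")
print(sum(scores["f1"]) / len(scores["f1"]))  # mean F1 over the batch
```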


### Framework versions

- Transformers 4.36.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.15.0