emilstabil committed
Commit 45db536 · 1 Parent(s): 75568bd

Update README.md

Files changed (1):
  1. README.md +17 -51
README.md CHANGED
@@ -5,71 +5,37 @@ tags:
  metrics:
  - rouge
  model-index:
- - name: DanSumT5-largeV_38143V_15157V_96478
  results: []
  ---
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->
 
- # DanSumT5-largeV_38143V_15157V_96478
-
- This model is a fine-tuned version of [emilstabil/DanSumT5-largeV_38143V_15157](https://huggingface.co/emilstabil/DanSumT5-largeV_38143V_15157) on the None dataset.
- It achieves the following results on the evaluation set:
- - Loss: 1.9819
- - Rouge1: 35.982
- - Rouge2: 12.5438
- - Rougel: 22.7137
- - Rougelsum: 33.5334
- - Gen Len: 124.173
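The ROUGE figures above come from the `rouge` metric. As a rough intuition for what they measure, a simplified ROUGE-1 F1 (plain unigram overlap, whitespace tokenisation, no stemming or bootstrap aggregation, so it will not reproduce the reported numbers exactly) can be sketched as:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Simplified ROUGE-1 F1: unigram overlap between candidate and reference."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f1("patienten har feber", "patienten har høj feber"), 4))  # 0.8571
```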
  ## Model description
 
- More information needed
-
- ## Intended uses & limitations
-
- More information needed
-
- ## Training and evaluation data
-
- More information needed
-
- ## Training procedure
-
- ### Training hyperparameters
-
- The following hyperparameters were used during training:
- - learning_rate: 1e-05
- - train_batch_size: 4
- - eval_batch_size: 4
- - seed: 42
- - gradient_accumulation_steps: 4
- - total_train_batch_size: 16
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- - lr_scheduler_type: linear
- - num_epochs: 11
-
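The hyperparameters above are internally consistent; a quick arithmetic check of the effective batch size, plus a sketch of the linear schedule (warmup-free decay is an assumption — the card does not state warmup steps):

```python
# Sanity check of the batch-size arithmetic implied by the hyperparameters.
train_batch_size = 4
gradient_accumulation_steps = 4

# Effective batch per optimizer step = per-device batch x accumulation steps.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 16, matching the reported total_train_batch_size

learning_rate = 1e-05

def lr_at(step: int, total_steps: int) -> float:
    """Linear decay from the initial LR to 0 (no warmup assumed)."""
    return learning_rate * max(0.0, 1 - step / total_steps)
```

With the 1298 optimizer steps shown in the results table, `lr_at(649, 1298)` gives 5e-06, i.e. the learning rate has halved at the midpoint of training.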
- ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
- |:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:--------:|
- | No log | 0.99 | 118 | 1.9875 | 35.6378 | 12.3785 | 22.4666 | 33.224 | 123.1814 |
- | No log | 2.0 | 237 | 1.9991 | 35.9161 | 12.5761 | 22.7594 | 33.6048 | 123.5865 |
- | No log | 3.0 | 356 | 1.9994 | 36.0651 | 12.7545 | 22.9642 | 33.6968 | 123.6203 |
- | No log | 4.0 | 475 | 1.9980 | 35.9273 | 12.6691 | 22.818 | 33.609 | 123.4515 |
- | 1.4198 | 4.99 | 593 | 2.0076 | 35.5438 | 12.2242 | 22.5019 | 33.237 | 123.7257 |
- | 1.4198 | 6.0 | 712 | 2.0032 | 36.0019 | 12.7386 | 22.9014 | 33.7588 | 124.5443 |
- | 1.4198 | 7.0 | 831 | 2.0001 | 35.8585 | 12.7149 | 22.8298 | 33.6196 | 124.4008 |
- | 1.4198 | 8.0 | 950 | 1.9945 | 35.6975 | 12.4727 | 22.6524 | 33.3949 | 124.5316 |
- | 1.4397 | 8.99 | 1068 | 1.9898 | 35.944 | 12.6829 | 22.9022 | 33.5212 | 124.1181 |
- | 1.4397 | 10.0 | 1187 | 1.9843 | 36.0341 | 12.5681 | 22.7855 | 33.5415 | 124.0084 |
- | 1.4397 | 10.93 | 1298 | 1.9819 | 35.982 | 12.5438 | 22.7137 | 33.5334 | 124.173 |
 
  ### Framework versions
 
  - Transformers 4.30.2
  - Pytorch 1.12.1+git7548e2f
  - Datasets 2.13.2
- - Tokenizers 0.13.3
 
  metrics:
  - rouge
  model-index:
+ - name: DaMedSum-large
  results: []
+ pipeline_tag: summarization
  ---
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->
 
+ ```
+  _____ ______ __ __ ______ _____ ______ __ __ __ __
+ /\ __-. /\ __ \ /\ "-./ \ /\ ___\ /\ __-. /\ ___\ /\ \/\ \ /\ "-./ \
+ \ \ \/\ \\ \ __ \\ \ \-./\ \\ \ __\ \ \ \/\ \\ \___ \\ \ \_\ \\ \ \-./\ \
+ \ \____- \ \_\ \_\\ \_\ \ \_\\ \_____\\ \____- \/\_____\\ \_____\\ \_\ \ \_\
+ \/____/ \/_/\/_/ \/_/ \/_/ \/_____/ \/____/ \/_____/ \/_____/ \/_/ \/_/
+
+ ```
 
 
  ## Model description
 
+ This repository contains a model for Danish abstractive summarisation of medical text.
+
+ This model is a fine-tuned version of DanSum-large on a Danish medical dataset.
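Given the `summarization` pipeline tag, the model can presumably be loaded via `transformers`. A minimal usage sketch, assuming the Hub repo id `emilstabil/DaMedSum-large` (inferred from this card's model name and committer, not confirmed — verify against the actual repository):

```python
from transformers import pipeline

# Repo id is an assumption inferred from this card; check before use.
MODEL_ID = "emilstabil/DaMedSum-large"

summarizer = pipeline("summarization", model=MODEL_ID)

text = "..."  # a Danish medical document goes here
result = summarizer(text, max_length=128, min_length=30, truncation=True)
print(result[0]["summary_text"])
```

`max_length=128` roughly matches the ~124-token average generation length reported in the evaluation above.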
+ ## Authors
+ Nicolaj Larsen,
+ Mikkel Kildeberg &
+ Emil Schledermann
  ### Framework versions
 
  - Transformers 4.30.2
  - Pytorch 1.12.1+git7548e2f
  - Datasets 2.13.2
+ - Tokenizers 0.13.3