---
pipeline_tag: summarization
datasets:
  - samsum
language:
  - en
metrics:
  - rouge
library_name: transformers
widget:
  - text: |
      Rita: I'm so bloody tired. Falling asleep at work. :-(
      Tina: I know what you mean.
      Tina: I keep on nodding off at my keyboard hoping that the boss doesn't notice..
      Rita: The time just keeps on dragging on and on and on.... 
      Rita: I keep on looking at the clock and there's still 4 hours of this drudgery to go.
      Tina: Times like these I really hate my work.
      Rita: I'm really not cut out for this level of boredom.
      Tina: Neither am I.
  - text: |
      Beatrice: I am in town, shopping. They have nice scarfs in the shop next to the church. Do you want one?
      Leo: No, thanks
      Beatrice: But you don't have a scarf.
      Leo: Because I don't need it.
      Beatrice: Last winter you had a cold all the time. A scarf could help.
      Leo: I don't like them.
      Beatrice: Actually, I don't care. You will get a scarf.
      Leo: How understanding of you!
      Beatrice: You were complaining the whole winter that you're going to die. I've had enough.
      Leo: Eh.
  - text: |
      Jack: Cocktails later?
      May: YES!!!
      May: You read my mind...
      Jack: Possibly a little tightly strung today?
      May: Sigh... without question.
      Jack: Thought so.
      May: A little drink will help!
      Jack: Maybe two!

model-index:
  - name: bart-finetuned-samsum
    results:
      - task:
          name: Text Summarization
          type: summarization
        dataset:
          name: SAMSum
          type: samsum
        metrics:
          - name: Validation ROUGE-1
            type: rouge-1
            value: 53.6163
          - name: Validation ROUGE-2
            type: rouge-2
            value: 28.914
          - name: Validation ROUGE-L
            type: rougeL
            value: 44.1443
          - name: Validation ROUGE-L Sum
            type: rougeLsum
            value: 49.2995
---

# Description

This model is a fine-tuned version of [facebook/bart-large-xsum](https://huggingface.co/facebook/bart-large-xsum), trained on the [samsum dataset](https://huggingface.co/datasets/samsum) with [these parameters](#training-parameters).

## Development

- Jupyter Notebook: [Text Summarization With BART](https://github.com/adedamola26/text-summarization/blob/main/Text_Summarization_with_BART.ipynb)

## Usage

```python
from transformers import pipeline

summarizer = pipeline("summarization", model="adedamolade26/bart-finetuned-samsum")

conversation = '''Jack: Cocktails later?
May: YES!!!
May: You read my mind...
Jack: Possibly a little tightly strung today?
May: Sigh... without question.
Jack: Thought so.
May: A little drink will help!
Jack: Maybe two!
'''

# The pipeline returns a list of dicts: [{'summary_text': '...'}]
print(summarizer(conversation))
```
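The model expects the dialogue as a single string with one `Speaker: utterance` line per turn, as in the widget examples above. A minimal helper for building that input from structured turns (the `format_dialogue` name is illustrative, not part of the model's API):

```python
def format_dialogue(turns):
    """Join (speaker, utterance) pairs into the newline-separated
    'Speaker: utterance' format the summarizer was trained on."""
    return "\n".join(f"{speaker}: {utterance}" for speaker, utterance in turns)

turns = [
    ("Jack", "Cocktails later?"),
    ("May", "YES!!!"),
]
print(format_dialogue(turns))
# Jack: Cocktails later?
# May: YES!!!
```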

## Training Parameters

The following arguments were passed to `Seq2SeqTrainingArguments`:

```python
evaluation_strategy="epoch",
save_strategy="epoch",
load_best_model_at_end=True,
metric_for_best_model="eval_loss",
seed=42,
learning_rate=2e-5,
per_device_train_batch_size=4,
per_device_eval_batch_size=4,
gradient_accumulation_steps=2,
weight_decay=0.01,
save_total_limit=2,
num_train_epochs=4,
predict_with_generate=True,
fp16=True,
report_to="none"
```
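Note that with gradient accumulation, each optimizer step effectively sees `per_device_train_batch_size × gradient_accumulation_steps` samples per device. A quick check with the values above:

```python
per_device_train_batch_size = 4
gradient_accumulation_steps = 2

# Samples contributing to each optimizer step, per device
effective_batch_size = per_device_train_batch_size * gradient_accumulation_steps
print(effective_batch_size)  # 8
```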

## References

The training process was adapted from Luis Fernando Torres's Kaggle notebook, [📝 Text Summarization with Large Language Models](https://www.kaggle.com/code/lusfernandotorres/text-summarization-with-large-language-models).