---
license: mit
language:
- de
tags:
- title generation
- headline generation
- teaser generation
- keyword generation
- tweet generation
- news
inference: false
---

# snip-igel-500

<!-- Provide a quick summary of what the model is/does. -->

snip-igel-500 
Version 1.0 / 13 April 2023

An adapter for [IGEL](https://huggingface.co/philschmid/instruct-igel-001) to generate German news snippets with human-written instructions.
For a usage example, see this [notebook](https://github.com/snipaid-nlg/igel-lora-finetune-news-snippets/blob/main/getting-started-with-igel-lora-finetuned.ipynb).
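Below is a minimal loading sketch using 🤗 Transformers and PEFT. The base model id is the one named on this card; the adapter repo id, dtype, and device settings are assumptions, so refer to the notebook above for the exact setup.

```python
# Minimal sketch: load the IGEL base model and apply the snip-igel-500 LoRA adapter.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "philschmid/instruct-igel-001"  # base model referenced on this card
adapter_id = "snipaid/snip-igel-500"      # assumed adapter repo id, adjust if it differs

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(model, adapter_id)  # attach the LoRA weights
model.eval()
```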

# Model Details

## Model Description

<!-- Provide a longer summary of what this model is. -->

Test the model's generation capabilities here: https://snipaid.tech

SNIP-IGEL is a LoRA adapter, obtained by continued instruction tuning of [IGEL](https://huggingface.co/philschmid/instruct-igel-001), that generates titles, teasers, summaries, tweets, and keywords from the text of a German news article. IGEL is an instruction-tuned model built on top of the pre-trained German version of BLOOM ([bloom-6b4-clp-german](https://huggingface.co/malteos/bloom-6b4-clp-german)). It was developed by fine-tuning on a machine-translated instruction dataset and aims to explore the potential of the BLOOM architecture for language-modeling tasks that require instruction-based responses.

- **Developed by:** snipaid
- **Model type:** bloom
- **Language(s) (NLP):** de
- **License:** MIT
- **Finetuned from model:** [IGEL](https://huggingface.co/philschmid/instruct-igel-001)

# Uses

<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

SNIP-IGEL is intended for generating snippets for German news articles. It can be used by researchers, journalists, content creators, and news agencies to automatically generate snippets for their articles in German.
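As an illustration only, a headline could be generated roughly as sketched below, reusing `model` and `tokenizer` from the loading sketch above. The German prompt template and the generation parameters are assumptions rather than the documented IGEL format; the linked notebook shows the prompts actually used.

```python
# Illustrative sketch: generate a headline for a German news article.
# The prompt template below is an assumption; see the linked notebook for the exact format.
article = "..."  # full German article text goes here

prompt = (
    "### Anweisung:\n"
    "Generiere eine Überschrift zu folgendem Nachrichtenartikel.\n\n"
    f"{article}\n\n"
    "### Antwort:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(
    **inputs, max_new_tokens=64, do_sample=True, top_p=0.9, temperature=0.7
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```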

# Bias, Risks, and Limitations

<!-- This section is meant to convey both technical and sociotechnical limitations. -->
Several common deficiencies can be observed, including hallucination, toxicity and stereotypes.

# Training Details

## Training Data

<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

SNIP-IGEL has been fine-tuned on [instruct-snippet-mlsum](https://huggingface.co/datasets/snipaid/instruct-snippet-mlsum). MLSUM is a dataset whose German subset contains text, title, and teaser for news articles from the newspaper "Süddeutsche Zeitung". This data has been augmented with snippet data generated via a composite prompt, which produces a SERP, keywords, and a tweet for each news article using a student-teacher approach. Also see [snippet-mlsum-500](https://huggingface.co/datasets/snipaid/snippet-mlsum-500) for the dataset without instructions and our [blog post](https://snipaid-nlg.github.io/2023/04/13/SNIP-IGEL.html) for more information about how the dataset was constructed.
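To inspect the training data directly, the dataset can be loaded with the `datasets` library. The dataset id is taken from this card; the split name and column layout are assumptions.

```python
# Peek at the instruction dataset used for fine-tuning.
from datasets import load_dataset

ds = load_dataset("snipaid/instruct-snippet-mlsum", split="train")  # split name assumed
print(ds)     # shows the available columns and number of rows
print(ds[0])  # inspect a single instruction/output example
```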

# Environmental Impact

Carbon emissions were estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact/#compute) presented in Lacoste et al. (2019).

- **Hardware Type:** RTX 4090
- **Hours used:** 1h 50min 21s
- **Cloud Provider:** Vast.ai
- **Compute Region:** Poland
- **Carbon Emitted:** ~0.54 kg of CO2e
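For context, a rough back-of-the-envelope check of this figure, using assumed values for average GPU power draw and regional grid carbon intensity (neither comes from the calculator's actual inputs):

```python
# Back-of-the-envelope check of the reported estimate; the power draw and carbon
# intensity below are illustrative assumptions, not the calculator's inputs.
gpu_power_kw = 0.45               # assumed average draw of an RTX 4090 under load
hours = 1 + 50 / 60 + 21 / 3600   # 1h 50min 21s ≈ 1.84 h
carbon_intensity = 0.65           # assumed kg CO2e per kWh for the compute region

energy_kwh = gpu_power_kw * hours             # ≈ 0.83 kWh
emissions_kg = energy_kwh * carbon_intensity  # ≈ 0.54 kg CO2e
print(f"~{emissions_kg:.2f} kg CO2e")
```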