---
base_model: meta-llama/Llama-2-7b-hf
datasets: stanfordnlp/imdb
library_name: transformers
model_name: llama2-7b-SFT
tags:
- generated_from_trainer
- trl
- sft
license: license
---

# Model Card for llama2-7b-SFT

This model is a fine-tuned version of [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf) on the [stanfordnlp/imdb](https://huggingface.co/datasets/stanfordnlp/imdb) dataset.
It has been trained using [TRL](https://github.com/huggingface/trl).

## Quick start

```python
from transformers import pipeline

# Load the fine-tuned model into a text-generation pipeline (expects a CUDA GPU).
question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
generator = pipeline("text-generation", model="mingxilei/llama2-7b-SFT", device="cuda")

# Pass the prompt in chat format and return only the newly generated text.
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
```
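If you prefer to work with the model and tokenizer directly rather than through `pipeline`, a minimal sketch follows; the dtype and `device_map` choices are assumptions to adjust for your hardware (`device_map="auto"` also requires `accelerate`):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mingxilei/llama2-7b-SFT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 to fit a 7B model on one GPU
    device_map="auto",
)

# Plain completion-style generation on an IMDB-flavored prompt.
inputs = tokenizer("This movie was", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```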

## Training procedure

This model was trained with supervised fine-tuning (SFT).
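
For reference, a hedged sketch of how such an SFT run can be launched with TRL's `SFTTrainer` on the dataset above; the hyperparameters shown are illustrative assumptions, not the actual training configuration:

```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# IMDB reviews live in the "text" column of this dataset.
dataset = load_dataset("stanfordnlp/imdb", split="train")

training_args = SFTConfig(
    output_dir="llama2-7b-SFT",
    dataset_text_field="text",
    max_seq_length=512,                 # assumption: truncate long reviews
    per_device_train_batch_size=2,      # assumption: small batch for a 7B model
    gradient_accumulation_steps=8,
    num_train_epochs=1,
)

trainer = SFTTrainer(
    model="meta-llama/Llama-2-7b-hf",
    args=training_args,
    train_dataset=dataset,
)
trainer.train()
```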

### Framework versions

- TRL: 0.12.2
- Transformers: 4.46.3
- Pytorch: 2.5.1+cu124
- Datasets: 3.1.0
- Tokenizers: 0.20.3
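
To check that a local environment matches these pins, a small helper (an assumption for convenience, not part of the original card) using only the standard library:

```python
from importlib.metadata import PackageNotFoundError, version

# Versions pinned in the card above, keyed by PyPI package name.
pins = {
    "trl": "0.12.2",
    "transformers": "4.46.3",
    "torch": "2.5.1+cu124",
    "datasets": "3.1.0",
    "tokenizers": "0.20.3",
}

for pkg, pinned in pins.items():
    try:
        print(f"{pkg}: installed {version(pkg)}, card pins {pinned}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed (card pins {pinned})")
```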

## Citations

Cite TRL as:

```bibtex
@misc{vonwerra2022trl,
	title        = {{TRL: Transformer Reinforcement Learning}},
	author       = {Leandro von Werra and Younes Belkada and Lewis Tunstall and Edward Beeching and Tristan Thrush and Nathan Lambert and Shengyi Huang and Kashif Rasul and Quentin Gallouédec},
	year         = 2020,
	journal      = {GitHub repository},
	publisher    = {GitHub},
	howpublished = {\url{https://github.com/huggingface/trl}}
}
```