---
license: mit
datasets:
- samsum
language:
- en
---
# Llama-2-7b Fine-Tuned Summarization Model
## Overview
The Llama-2-7b Fine-Tuned Summarization Model is a language model fine-tuned for text summarization using QLoRA.
It was fine-tuned on the samsum dataset, which contains a wide variety of messenger-style conversations paired with human-written summaries.
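For reference, a QLoRA fine-tuning run on samsum could be set up roughly as follows with `peft` and `bitsandbytes`. This is a minimal sketch; the LoRA rank, alpha, dropout, and target modules below are illustrative assumptions, not the exact configuration used to train this model.

```python
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base_id = "meta-llama/Llama-2-7b-chat-hf"

# Load the base model in 4-bit NF4 precision (the "Q" in QLoRA)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(base_id, quantization_config=bnb_config, device_map="auto")
model = prepare_model_for_kbit_training(model)
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Attach low-rank adapters; these hyperparameters are placeholder values
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["q_proj", "v_proj"],
)
model = get_peft_model(model, lora_config)

# Load samsum and format each sample as
# "Summarize the following conversation. ### Input: ... ### Summary: ..."
# before training with your preferred trainer (e.g. trl's SFTTrainer).
dataset = load_dataset("samsum", split="train")
```

With this setup only the small LoRA adapter weights are trained while the 4-bit base weights stay frozen, which keeps the memory footprint of fine-tuning a 7B model within a single GPU.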
## Model Details
- Base Model: [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)
- Fine-Tuned on: [samsum dataset](https://huggingface.co/datasets/samsum)
- Language: English
## How to Use
You can use this model for text summarization with the Hugging Face Transformers library. Here's a basic example in Python:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
model_id = "SalmanFaroz/Llama-2-7b-QLoRa-samsum"
# 4-bit NF4 quantization config matching the QLoRA setup
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token
tokenizer.padding_side = "right"
# Define the input prompt
prompt = """
Summarize the following conversation.
### Input:
Itachi: Kakashi, you must understand the gravity of the situation. The Akatsuki's plans are far more sinister than you can imagine.
Kakashi: Itachi, I need more than vague warnings. Tell me what you know.
Itachi: Very well. The Akatsuki seeks to capture Naruto for the power of the Nine-Tails sealed within him, but there's an even darker secret lurking within their goals.
Kakashi: Darker than that? What are they truly after?
Itachi: They're hunting the Tailed Beasts for a cataclysmic plan to reshape the world, and only we can stop them, together.
### Summary:
"""
# Tokenize the prompt and move it to the model's device
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a summary and decode it back to text
output = tokenizer.decode(
    model.generate(
        inputs["input_ids"],
        attention_mask=inputs["attention_mask"],
        max_new_tokens=100,
    )[0],
    skip_special_tokens=True,
)
print("Output:", output)
```
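Note that the decoded output contains the prompt as well as the generated text. If you only want the summary itself, one option is to slice off the prompt tokens before decoding. This is a small sketch, reusing the `model`, `tokenizer`, and `inputs` from the example above:

```python
# Generate as before, keeping the raw token ids
generated = model.generate(
    inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    max_new_tokens=100,
)

# Drop the prompt tokens so only the newly generated summary is decoded
summary = tokenizer.decode(
    generated[0][inputs["input_ids"].shape[-1]:],
    skip_special_tokens=True,
)
print("Summary:", summary.strip())
```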