|
--- |
|
language: |
|
- en |
|
tags: |
|
- marketing |
|
license: llama3 |
|
library_name: transformers |
|
pipeline_tag: text-generation |
|
--- |
|
# LLaMarketing: A Marketing Large Language Model |
|
|
|
LLaMarketing is an 8B-parameter, domain-specific Large Language Model (LLM).

It was adapted to the marketing domain from [LLaMA-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) through continued pretraining on a meticulously curated marketing corpus of more than 43B tokens.

LLaMarketing outperforms LLaMA-2 and LLaMA-3 on specific marketing tasks. We are releasing this **early checkpoint** of the model to the AI community.
|
|
|
![LLaMarketing/png](https://cdn-uploads.huggingface.co/production/uploads/65e468008629cedec7980db6/oYLBemG-elYEPJyWj2vi4.png) |
|
|
|
|
|
### Model Description |
|
|
|
LLaMarketing is a tool for generating high-quality marketing content and for supporting research in the marketing domain.

It is a resource for practitioners and researchers who want to work with a domain-adapted language model in a rapidly changing field.
|
|
|
While the model is designed to encode marketing knowledge, this checkpoint is not yet aligned to deliver that knowledge appropriately, safely, or within professional constraints.
|
We recommend against deploying LLaMarketing in real-world practice settings. |
|
|
|
### Model Details |
|
- Developed by: [Marketeam](https://www.marketeam.ai/) |
|
- Model type: Causal decoder-only transformer language model |
|
- Model License: LLAMA 3 COMMUNITY LICENSE AGREEMENT |
|
- Continued-pretrained from model: LLaMA-3-8B
|
- Context length: 3K (3,072) tokens
|
- Input & Output: Text-only |
|
- Language: English |
|
- Knowledge Cutoff: December 2023 |
|
|
|
## Uses |
|
|
|
LLaMarketing has been developed to further research on LLMs for marketing applications.

Potential use cases range from marketing question answering and general marketing information queries to actions (function calls) on marketing platforms.
|
|
|
LLaMarketing is a Foundation Language Model (FLM) without finetuning or instruction-tuning.

We recommend applying SFT or RLHF for specific downstream tasks, or alternatively using in-context learning with 1,000-1,500 tokens of examples added to the prompt, as sketched below.
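As an illustration of the in-context-learning route, the sketch below prepends a few worked Q/A examples to the query before generation; the example pairs and the `max_new_tokens` budget are our own assumptions, not part of the model release.

```python
import transformers
import torch

# Few-shot prompt: the base (non-instruction-tuned) checkpoint continues patterns,
# so we prepend worked examples before the real question. The Q/A pairs below are
# illustrative assumptions, not part of the model release.
few_shot_prefix = """Q: What is A/B testing in email marketing?
A: Sending two variants of an email to subsets of a list and keeping the better performer.

Q: What does CTR measure?
A: Click-through rate: the share of impressions that result in a click.

"""

question = "What are the key components of a digital marketing strategy?"

pipe = transformers.pipeline(
    "text-generation",
    model="marketeam/LLaMarketing",
    tokenizer="meta-llama/Meta-Llama-3-8B",
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

out = pipe(few_shot_prefix + f"Q: {question}\nA:", max_new_tokens=200)
print(out[0]["generated_text"])
```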
|
|
|
|
|
## Training Details |
|
|
|
### Training Data |
|
|
|
Marketing data from publicly available and **internal** sources such as: |
|
- Blogs |
|
- Books |
|
- Websites |
|
- Podcasts |
|
- Newsletters |
|
- Publications |
|
- Social Media |
|
- Ad-Campaigns |
|
- Landing Pages |
|
- Press Releases |
|
- Email-Campaigns |
|
- Brochures & Flyers |
|
- Product Descriptions
|
- Testimonials & Reviews |
|
- ... |
|
And roughly 10% of previously seen data to avoid *catastrophic forgetting*, as sketched below.
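As a rough sketch of such a replay mix (the actual sampling procedure is not published; the ratio handling and the corpora below are illustrative assumptions), general-domain documents can be interleaved into the marketing stream at about a 9:1 ratio:

```python
import random

def replay_mix(domain_docs, general_docs, replay_ratio=0.10, seed=0):
    """Interleave ~replay_ratio of previously seen general data into the
    domain-specific stream to mitigate catastrophic forgetting."""
    rng = random.Random(seed)
    general_iter = iter(general_docs)
    for doc in domain_docs:
        yield doc
        # p = r / (1 - r) so general docs make up ~replay_ratio of the output
        if rng.random() < replay_ratio / (1 - replay_ratio):
            try:
                yield next(general_iter)
            except StopIteration:
                pass  # replay pool exhausted; continue with domain data only

# Hypothetical corpora for illustration:
marketing_corpus = ["blog post ...", "ad campaign copy ...", "press release ..."]
general_corpus = ["wikipedia article ...", "news article ..."]
mixed_stream = list(replay_mix(marketing_corpus, general_corpus))
```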
|
|
|
|
|
### Training Procedure |
|
|
|
Training was run on AWS SageMaker with 4 NVIDIA A100 GPUs on a p4de.24xlarge instance.

Total training time was roughly 250 hours, at a total cost of roughly $10K.
|
This is an **early checkpoint** of the model that we are releasing to the community. |
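For reference, a continued-pretraining job on this instance type could be launched with the SageMaker Hugging Face estimator roughly as follows; the entry point, role, container versions, and hyperparameters here are illustrative assumptions, not the authors' actual configuration.

```python
from sagemaker.huggingface import HuggingFace

# Hypothetical launch script: entry_point, role, and versions are assumptions.
estimator = HuggingFace(
    entry_point="train.py",            # your continued-pretraining script
    source_dir="./scripts",
    instance_type="ml.p4de.24xlarge",  # 8x A100 80GB node (the card reports using 4 GPUs)
    instance_count=1,
    role="arn:aws:iam::<account-id>:role/<sagemaker-role>",
    transformers_version="4.36",
    pytorch_version="2.1",
    py_version="py310",
    hyperparameters={
        "model_name_or_path": "meta-llama/Meta-Llama-3-8B",
        "learning_rate": 1e-4,
        "num_train_epochs": 1,
        "bf16": True,
    },
)

estimator.fit({"train": "s3://<bucket>/marketing-corpus/"})
```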
|
|
|
#### Training Hyperparameters |
|
|
|
| Param         | Value    |
|---------------|----------|
| bf16          | true     |
| tf32          | true     |
| lr            | 1e-4     |
| optim         | adamw    |
| epochs        | 1        |
| lr scheduler  | constant |
| warmup ratio  | 0.03     |
| max grad norm | 0.3      |
| context len   | 3072     |
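These settings map naturally onto `transformers.TrainingArguments`; a minimal sketch follows (the output directory and batch size are assumptions, as they are not reported, and the 3,072-token context length is applied when tokenizing/packing the data rather than here):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter table above.
args = TrainingArguments(
    output_dir="llamarketing-cpt",             # assumption: not reported
    bf16=True,                                 # bf16 mixed precision
    tf32=True,                                 # TF32 matmuls on A100
    learning_rate=1e-4,
    optim="adamw_torch",
    num_train_epochs=1,
    lr_scheduler_type="constant_with_warmup",  # constant LR after the 3% warmup
    warmup_ratio=0.03,
    max_grad_norm=0.3,
    per_device_train_batch_size=4,             # assumption: not reported
)
```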
|
|
|
|
|
|
|
## How to use |
|
|
|
#### Using Transformers pipeline |
|
|
|
```python |
|
import transformers |
|
import torch |
|
|
|
model_id = "marketeam/LLaMarketing"
tokenizer_id = "meta-llama/Meta-Llama-3-8B"  # the checkpoint reuses the base LLaMA-3 tokenizer
token = "hf_token"  # your Hugging Face access token

# device_map="auto" places the model on available GPUs (requires `accelerate`)
pipe = transformers.pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    token=token,
    device_map="auto",
)

pipe("What are the key components of a digital marketing strategy?")
|
``` |
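Note that `meta-llama/Meta-Llama-3-8B` is a gated repository, so the token must belong to an account with access to it, and `device_map="auto"` requires the `accelerate` package to be installed.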
|
|
|
#### Using Transformers generate |
|
|
|
```python |
|
from transformers import AutoTokenizer, AutoModelForCausalLM |
|
import torch |
|
|
|
model_id = "marketeam/LLaMarketing" |
|
tokenizer_id = "meta-llama/Meta-Llama-3-8B" |
|
token = "hf_token" |
|
device = "cuda" if torch.cuda.is_available() else "cpu" |
|
|
|
tokenizer = AutoTokenizer.from_pretrained(tokenizer_id, token=token)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, token=token).to(device)

message = "How do I calculate customer lifetime value?"
inputs = tokenizer(message, return_tensors="pt").to(device)
# without max_new_tokens, generate() stops after ~20 new tokens by default
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
|
``` |
|
|
|
|
|
## Intended Usage |
|
|
|
LLaMarketing is now available for further testing and assessment. Potential use cases include, but are not limited to: |
|
- Text Generation: This model can produce creative text formats in the marketing domain. |
|
- Knowledge Exploration: It can assist marketing researchers by generating valuable marketing information or answering questions about marketing-specific topics. |
|
- Natural Language Processing (NLP) Research: This model can form the basis for researchers to experiment with NLP techniques, develop algorithms, and contribute to the advancement of the field. |
|
|
|
|
|
## Contributors
|
|
|
- [Sahar Millis](https://www.linkedin.com/in/sahar-millis/)
- [Coby Benveniste](https://www.linkedin.com/in/coby-benveniste/)
- [Nofar Sachs](https://www.linkedin.com/in/nofar-sachs-2146801b3/)
- [Eran Mazur](https://www.linkedin.com/in/eranmazur/)