
SetFit with thenlper/gte-large

This is a SetFit model for Text Classification, trained on the dvilasuero/banking77-topics-setfit dataset. It uses thenlper/gte-large as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves two steps (a conceptual code sketch follows the list):

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
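To make the two phases concrete, the sketch below spells them out by hand with sentence-transformers and scikit-learn. It is a conceptual sketch only: the example texts are taken from the label table below, the pair generation is simplified, and the actual checkpoint was trained with the SetFit Trainer using the hyperparameters listed under Training Hyperparameters.

# Conceptual sketch of the two SetFit training phases (illustrative only;
# not the exact script used to train this checkpoint).
from itertools import combinations

from sentence_transformers import InputExample, SentenceTransformer, losses
from sklearn.linear_model import LogisticRegression
from torch.utils.data import DataLoader

texts = [
    "I can not find my card pin.",
    "Are you able to unblock my pin?",
    "Can I exchange to EUR?",
    "Do you work with all fiat currencies?",
]
labels = [5, 5, 4, 4]

# Phase 1: contrastive fine-tuning of the Sentence Transformer body on
# sentence pairs labelled 1.0 (same class) or 0.0 (different class).
body = SentenceTransformer("thenlper/gte-large")
pairs = [
    InputExample(texts=[a, b], label=float(la == lb))
    for (a, la), (b, lb) in combinations(zip(texts, labels), 2)
]
loader = DataLoader(pairs, shuffle=True, batch_size=16)
body.fit(train_objectives=[(loader, losses.CosineSimilarityLoss(body))], epochs=1)

# Phase 2: fit a LogisticRegression head on embeddings from the fine-tuned body.
head = LogisticRegression()
head.fit(body.encode(texts), labels)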

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: thenlper/gte-large
  • Classification head: a LogisticRegression instance
  • Number of Classes: 8
  • Training Dataset: dvilasuero/banking77-topics-setfit

Model Sources

  • Repository: SetFit on GitHub (https://github.com/huggingface/setfit)
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)

Model Labels

Label 2
  • 'The money I transferred does not show in the balance.'
  • 'I was wondering how I could have two charges for the same item happen more than once in a 7 day period. Is there anyway I could get this corrected asap.'
  • 'What is the source of my available funds?'
Label 0
  • 'Do you support the EU?'
  • "Can you freeze my account? I just saw there are transactions on my account that I don't recognize. How can I fix this?"
  • 'Please close my account. I am unsatisfied with your service.'
Label 5
  • 'Are you able to unblock my pin?'
  • 'I can not find my card pin.'
  • 'If I need a PIN for my card, where is it located?'
Label 1
  • "I can't get money out of the ATM"
  • 'Where can I use this card at an ATM?'
  • 'Can I use my card at any ATMs?'
Label 3
  • 'Can I get cash with this card anywhere?'
  • 'Can you please show me where I can find the location to link my card?'
  • 'Am I able to get a card in EU?'
Label 6
  • 'My friends want to top up my account'
  • 'Can I be topped up once I hit a certain balance?'
  • 'Can you tell me why my top up was reverted?'
Label 7
  • 'How do I send my account money through transfer?'
  • 'How do I transfer money to my account?'
  • 'How can I transfer money from an outside bank?'
Label 4
  • 'Do you work with all fiat currencies?'
  • 'Can I exchange to EUR?'
  • 'Is my country supported'

Evaluation

Metrics

Label | Accuracy
all   | 0.9231
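The accuracy above could, in principle, be recomputed along the following lines; the split name ("test") and the column names ("text", "label") are assumptions about how the evaluation data is laid out.

# Hedged sketch of recomputing the reported accuracy; split and column
# names are assumptions about the dataset layout.
from datasets import load_dataset
from setfit import SetFitModel
from sklearn.metrics import accuracy_score

test = load_dataset("dvilasuero/banking77-topics-setfit", split="test")
model = SetFitModel.from_pretrained("HarshalBhg/gte-large-setfit-train-test2")

preds = model.predict(test["text"])
print(accuracy_score(test["label"], preds))  # reported label accuracy: 0.9231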

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("HarshalBhg/gte-large-setfit-train-test2")
# Run inference
preds = model("I have a 1 euro fee on my statement.")
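The model also accepts a batch of texts, and class probabilities are available from the classification head. A small usage sketch (the example sentences are taken from the label table above):

# Batched inference and class probabilities (usage sketch)
preds = model(["Can I exchange to EUR?", "I can not find my card pin."])
probas = model.predict_proba(["Am I able to get a card in EU?"])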

Training Details

Training Set Metrics

Training set | Min | Median  | Max
Word count   | 4   | 10.5833 | 40

Label | Training Sample Count
0     | 10
1     | 19
2     | 28
3     | 36
4     | 13
5     | 14
6     | 15
7     | 21
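The statistics above could be recomputed roughly as follows; the split and column names ("train", "text", "label") are assumptions about the dataset layout. Note that 10.5833 cannot be a strict median of integer word counts (such a median always ends in .0 or .5), so the middle figure looks like an average.

# Hedged sketch of recomputing the training-set statistics; split and
# column names are assumptions about the dataset layout.
import statistics
from collections import Counter

from datasets import load_dataset

train = load_dataset("dvilasuero/banking77-topics-setfit", split="train")
word_counts = [len(text.split()) for text in train["text"]]

print(min(word_counts), statistics.mean(word_counts), max(word_counts))
print(sorted(Counter(train["label"]).items()))  # training samples per label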

Training Hyperparameters

  • batch_size: (16, 16)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 20
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
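A hedged sketch of how these hyperparameters map onto SetFit's TrainingArguments and Trainer API is shown below. The dataset split name is an assumption (as are the "text" and "label" column names the Trainer expects), and distance_metric (cosine_distance) and margin (0.25) are left at their library defaults rather than passed explicitly.

# Hedged training sketch: maps the listed hyperparameters onto SetFit's
# TrainingArguments / Trainer API (not the exact script used for this run).
from datasets import load_dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, Trainer, TrainingArguments

dataset = load_dataset("dvilasuero/banking77-topics-setfit")
model = SetFitModel.from_pretrained("thenlper/gte-large")  # LogisticRegression head by default

args = TrainingArguments(
    batch_size=(16, 16),
    num_epochs=(1, 1),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=20,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    metric="accuracy",
)
trainer.train()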

Training Results

Epoch  | Step | Training Loss | Validation Loss
0.0026 | 1    | 0.3183        | -
0.1282 | 50   | 0.0614        | -
0.2564 | 100  | 0.0044        | -
0.3846 | 150  | 0.001         | -
0.5128 | 200  | 0.0008        | -
0.6410 | 250  | 0.001         | -
0.7692 | 300  | 0.0006        | -
0.8974 | 350  | 0.0012        | -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.1
  • Sentence Transformers: 2.2.2
  • Transformers: 4.35.2
  • PyTorch: 2.1.0+cu121
  • Datasets: 2.15.0
  • Tokenizers: 0.15.0

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}