
SetFit Aspect Model with sentence-transformers/all-MiniLM-L6-v2

This is a SetFit model that can be used for Aspect Based Sentiment Analysis (ABSA). This SetFit model uses sentence-transformers/all-MiniLM-L6-v2 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification. In particular, this model is in charge of filtering aspect span candidates.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.

This model was trained within the context of a larger system for ABSA, which works as follows (a minimal sketch of the full pipeline follows this list):

  1. Use a spaCy model to select possible aspect span candidates.
  2. Use this SetFit model to filter these possible aspect span candidates.
  3. Use a SetFit model to classify the filtered aspect span candidates.
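As a minimal, illustrative sketch of this pipeline, the snippet below wires the three stages together by hand. Several details are assumptions rather than facts from this card: the spaCy pipeline name (en_core_web_sm), the use of noun chunks as span candidates, the example sentence, and that the two sub-models can be loaded directly as SetFitModel instances returning the string labels listed under Model Labels. The supported end-to-end path is the AbsaModel API shown under Direct Use for Inference; note that the "span:sentence" input format mirrors the Label Examples below.

# Illustrative sketch only: candidate extraction -> aspect filtering -> polarity classification.
# Assumptions (not from this card): "en_core_web_sm" as the candidate proposer, noun chunks as
# candidates, and string labels ("aspect" / "no aspect") returned by predict().
import spacy
from setfit import SetFitModel

nlp = spacy.load("en_core_web_sm")  # 1. spaCy model that proposes aspect span candidates
aspect_filter = SetFitModel.from_pretrained("Funnyworld1412/ABSA_bert-base_MiniLM-L6-aspect")
polarity_clf = SetFitModel.from_pretrained("Funnyworld1412/ABSA_bert-base_MiniLM-L6-polarity")

text = "The food was great, but the venue is just way too busy."
candidates = [chunk.text for chunk in nlp(text).noun_chunks]

# 2. Filter candidates with this model, using the "span:sentence" format
#    shown in the Label Examples section.
pairs = [f"{span}:{text}" for span in candidates]
kept = [span for span, label in zip(candidates, aspect_filter.predict(pairs)) if label == "aspect"]

# 3. Classify the surviving spans with the companion polarity model.
polarities = polarity_clf.predict([f"{span}:{text}" for span in kept])
print(list(zip(kept, polarities)))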

Model Details

Model Description

  • Model Type: SetFit Aspect Model (filters aspect span candidates for ABSA)
  • Sentence Transformer body: sentence-transformers/all-MiniLM-L6-v2
  • Classification head: a LogisticRegression instance
  • SetFitABSA Aspect Model: Funnyworld1412/ABSA_bert-base_MiniLM-L6-aspect (this model)
  • SetFitABSA Polarity Model: Funnyworld1412/ABSA_bert-base_MiniLM-L6-polarity
  • Number of Classes: 2 ("aspect" and "no aspect")

Model Sources

  • Repository: SetFit on GitHub (https://github.com/huggingface/setfit)
  • Paper: Efficient Few-Shot Learning Without Prompts (https://arxiv.org/abs/2209.11055)
  • Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts (https://huggingface.co/blog/setfit)

Model Labels

Label Examples
aspect
  • 'pencarian lawan:kapada supercell game nya bagus seru tolong diperbaiki pencarian lawan bermain ketemu player trophy mahkotanya jaraknya dapet berpengaruh peleton akun perbedaan level'
  • 'game:kapada supercell game nya bagus seru tolong diperbaiki pencarian lawan bermain ketemu player trophy mahkotanya jaraknya dapet berpengaruh peleton akun perbedaan level'
  • 'bugnya:bugnya nakal banget y coc cr aja sukanya ngebug pas match suka hitam match relog kalo udah relog lawan udah 1 2 mahkota kecewa sih bintang nya 1 aja bug nya diurus bintang lawannya kadang g setara levelnya dahlah gk suka banget kalo main 2 vs 2 temen suka banget afk coba fitur report'
no aspect
  • 'player trophy mahkotanya jaraknya:kapada supercell game nya bagus seru tolong diperbaiki pencarian lawan bermain ketemu player trophy mahkotanya jaraknya dapet berpengaruh peleton akun perbedaan level'
  • 'peleton akun perbedaan level:kapada supercell game nya bagus seru tolong diperbaiki pencarian lawan bermain ketemu player trophy mahkotanya jaraknya dapet berpengaruh peleton akun perbedaan level'
  • 'y coc cr:bugnya nakal banget y coc cr aja sukanya ngebug pas match suka hitam match relog kalo udah relog lawan udah 1 2 mahkota kecewa sih bintang nya 1 aja bug nya diurus bintang lawannya kadang g setara levelnya dahlah gk suka banget kalo main 2 vs 2 temen suka banget afk coba fitur report'

Evaluation

Metrics

Label Accuracy
  • Accuracy (all labels): 0.8307

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import AbsaModel

# Download from the 🤗 Hub: the first repo is this aspect filter model,
# the second is its companion polarity classifier
model = AbsaModel.from_pretrained(
    "Funnyworld1412/ABSA_bert-base_MiniLM-L6-aspect",
    "Funnyworld1412/ABSA_bert-base_MiniLM-L6-polarity",
)
# Run inference
preds = model("The food was great, but the venue is just way too busy.")
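
Depending on the installed SetFit version, preds is expected to contain, for each input sentence, the aspect spans that survive this model's filtering step together with their predicted polarity (typically as {'span': ..., 'polarity': ...} dictionaries); the exact output structure is not documented in this card.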

Training Details

Training Set Metrics

Word count per training sample: min 2, median 29.9357, max 80
Training samples per label:
  • no aspect: 3834
  • aspect: 1266

Training Hyperparameters

  • batch_size: (4, 4)
  • num_epochs: (1, 1)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 5
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
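
For reference, here is a hedged sketch of how the hyperparameters above could be passed to the SetFit ABSA training API; it is not the original training script. The spaCy pipeline name, the two toy dataset rows, and their polarity labels are placeholders, and distance_metric and margin are left at the library defaults, which already correspond to cosine_distance and 0.25.

# Hedged sketch: mapping the hyperparameters above onto setfit's ABSA trainer.
# Placeholders (not from this card): the spaCy pipeline, the toy dataset rows, the polarity labels.
from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import AbsaModel, AbsaTrainer, TrainingArguments

model = AbsaModel.from_pretrained(
    "sentence-transformers/all-MiniLM-L6-v2",
    spacy_model="en_core_web_sm",  # placeholder; the actual spaCy pipeline is not listed in this card
)

# Toy dataset in the format AbsaTrainer expects (columns: text, span, label, ordinal).
train_dataset = Dataset.from_dict({
    "text": [
        "game nya bagus seru tolong diperbaiki pencarian lawan",
        "bugnya nakal banget pas match suka hitam",
    ],
    "span": ["pencarian lawan", "bugnya"],
    "label": ["positive", "negative"],  # placeholder polarity labels
    "ordinal": [0, 0],                  # which occurrence of the span in the text
})

args = TrainingArguments(
    batch_size=(4, 4),
    num_epochs=(1, 1),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=5,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    # distance_metric (cosine_distance) and margin (0.25) are left at their defaults
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)

trainer = AbsaTrainer(model, args=args, train_dataset=train_dataset)
trainer.train()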

Training Results

Epoch Step Training Loss Validation Loss
0.0001 1 0.2715 -
0.0039 50 0.2364 -
0.0078 100 0.1076 -
0.0118 150 0.3431 -
0.0157 200 0.2411 -
0.0196 250 0.361 -
0.0235 300 0.2227 -
0.0275 350 0.2087 -
0.0314 400 0.1956 -
0.0353 450 0.2815 -
0.0392 500 0.1844 -
0.0431 550 0.2053 -
0.0471 600 0.2884 -
0.0510 650 0.1043 -
0.0549 700 0.2074 -
0.0588 750 0.1627 -
0.0627 800 0.3 -
0.0667 850 0.1658 -
0.0706 900 0.1582 -
0.0745 950 0.2692 -
0.0784 1000 0.1823 -
0.0824 1050 0.4098 -
0.0863 1100 0.1992 -
0.0902 1150 0.0793 -
0.0941 1200 0.3924 -
0.0980 1250 0.0339 -
0.1020 1300 0.2236 -
0.1059 1350 0.2262 -
0.1098 1400 0.111 -
0.1137 1450 0.0223 -
0.1176 1500 0.3994 -
0.1216 1550 0.0417 -
0.1255 1600 0.3319 -
0.1294 1650 0.3223 -
0.1333 1700 0.2943 -
0.1373 1750 0.1273 -
0.1412 1800 0.2863 -
0.1451 1850 0.0988 -
0.1490 1900 0.1593 -
0.1529 1950 0.2209 -
0.1569 2000 0.5017 -
0.1608 2050 0.1392 -
0.1647 2100 0.1372 -
0.1686 2150 0.3491 -
0.1725 2200 0.2693 -
0.1765 2250 0.1988 -
0.1804 2300 0.2765 -
0.1843 2350 0.238 -
0.1882 2400 0.0577 -
0.1922 2450 0.2253 -
0.1961 2500 0.16 -
0.2 2550 0.0262 -
0.2039 2600 0.0099 -
0.2078 2650 0.0132 -
0.2118 2700 0.2356 -
0.2157 2750 0.2975 -
0.2196 2800 0.154 -
0.2235 2850 0.0308 -
0.2275 2900 0.0497 -
0.2314 2950 0.0523 -
0.2353 3000 0.158 -
0.2392 3050 0.0473 -
0.2431 3100 0.208 -
0.2471 3150 0.2126 -
0.2510 3200 0.081 -
0.2549 3250 0.0134 -
0.2588 3300 0.1107 -
0.2627 3350 0.0249 -
0.2667 3400 0.0259 -
0.2706 3450 0.1008 -
0.2745 3500 0.0335 -
0.2784 3550 0.0119 -
0.2824 3600 0.2982 -
0.2863 3650 0.1516 -
0.2902 3700 0.1217 -
0.2941 3750 0.1558 -
0.2980 3800 0.0359 -
0.3020 3850 0.0215 -
0.3059 3900 0.2906 -
0.3098 3950 0.0599 -
0.3137 4000 0.1528 -
0.3176 4050 0.0144 -
0.3216 4100 0.298 -
0.3255 4150 0.0174 -
0.3294 4200 0.0093 -
0.3333 4250 0.0329 -
0.3373 4300 0.1795 -
0.3412 4350 0.0712 -
0.3451 4400 0.3703 -
0.3490 4450 0.0873 -
0.3529 4500 0.3223 -
0.3569 4550 0.0045 -
0.3608 4600 0.2188 -
0.3647 4650 0.0085 -
0.3686 4700 0.2089 -
0.3725 4750 0.0052 -
0.3765 4800 0.1459 -
0.3804 4850 0.0711 -
0.3843 4900 0.4268 -
0.3882 4950 0.1842 -
0.3922 5000 0.1661 -
0.3961 5050 0.1028 -
0.4 5100 0.067 -
0.4039 5150 0.1708 -
0.4078 5200 0.1001 -
0.4118 5250 0.065 -
0.4157 5300 0.0279 -
0.4196 5350 0.1101 -
0.4235 5400 0.1923 -
0.4275 5450 0.5491 -
0.4314 5500 0.0726 -
0.4353 5550 0.0085 -
0.4392 5600 0.194 -
0.4431 5650 0.2527 -
0.4471 5700 0.7134 -
0.4510 5750 0.4542 -
0.4549 5800 0.2779 -
0.4588 5850 0.1024 -
0.4627 5900 0.2483 -
0.4667 5950 0.0163 -
0.4706 6000 0.0095 -
0.4745 6050 0.2902 -
0.4784 6100 0.0111 -
0.4824 6150 0.0296 -
0.4863 6200 0.3792 -
0.4902 6250 0.4387 -
0.4941 6300 0.1547 -
0.4980 6350 0.0617 -
0.5020 6400 0.1384 -
0.5059 6450 0.0677 -
0.5098 6500 0.0454 -
0.5137 6550 0.0074 -
0.5176 6600 0.1994 -
0.5216 6650 0.0168 -
0.5255 6700 0.0416 -
0.5294 6750 0.1898 -
0.5333 6800 0.0207 -
0.5373 6850 0.1046 -
0.5412 6900 0.1994 -
0.5451 6950 0.0435 -
0.5490 7000 0.0149 -
0.5529 7050 0.0067 -
0.5569 7100 0.0122 -
0.5608 7150 0.2406 -
0.5647 7200 0.4473 -
0.5686 7250 0.0469 -
0.5725 7300 0.1782 -
0.5765 7350 0.3386 -
0.5804 7400 0.2804 -
0.5843 7450 0.0072 -
0.5882 7500 0.0451 -
0.5922 7550 0.0188 -
0.5961 7600 0.01 -
0.6 7650 0.0048 -
0.6039 7700 0.2349 -
0.6078 7750 0.2052 -
0.6118 7800 0.0838 -
0.6157 7850 0.3052 -
0.6196 7900 0.3667 -
0.6235 7950 0.0044 -
0.6275 8000 0.3612 -
0.6314 8050 0.2082 -
0.6353 8100 0.3384 -
0.6392 8150 0.022 -
0.6431 8200 0.0764 -
0.6471 8250 0.2879 -
0.6510 8300 0.1827 -
0.6549 8350 0.1104 -
0.6588 8400 0.2096 -
0.6627 8450 0.2103 -
0.6667 8500 0.0742 -
0.6706 8550 0.2186 -
0.6745 8600 0.0109 -
0.6784 8650 0.0326 -
0.6824 8700 0.3056 -
0.6863 8750 0.0941 -
0.6902 8800 0.3731 -
0.6941 8850 0.2185 -
0.6980 8900 0.0228 -
0.7020 8950 0.0141 -
0.7059 9000 0.2242 -
0.7098 9050 0.3303 -
0.7137 9100 0.2383 -
0.7176 9150 0.0026 -
0.7216 9200 0.1718 -
0.7255 9250 0.053 -
0.7294 9300 0.0023 -
0.7333 9350 0.221 -
0.7373 9400 0.0021 -
0.7412 9450 0.2333 -
0.7451 9500 0.0565 -
0.7490 9550 0.0271 -
0.7529 9600 0.2156 -
0.7569 9650 0.2349 -
0.7608 9700 0.0047 -
0.7647 9750 0.1273 -
0.7686 9800 0.0139 -
0.7725 9850 0.0231 -
0.7765 9900 0.0048 -
0.7804 9950 0.0022 -
0.7843 10000 0.0026 -
0.7882 10050 0.0223 -
0.7922 10100 0.5488 -
0.7961 10150 0.0281 -
0.8 10200 0.0999 -
0.8039 10250 0.2154 -
0.8078 10300 0.0109 -
0.8118 10350 0.0019 -
0.8157 10400 0.1264 -
0.8196 10450 0.0029 -
0.8235 10500 0.3785 -
0.8275 10550 0.0366 -
0.8314 10600 0.0527 -
0.8353 10650 0.2355 -
0.8392 10700 0.0833 -
0.8431 10750 0.1612 -
0.8471 10800 0.0071 -
0.8510 10850 0.1128 -
0.8549 10900 0.2521 -
0.8588 10950 0.0403 -
0.8627 11000 0.2196 -
0.8667 11050 0.1441 -
0.8706 11100 0.0295 -
0.8745 11150 0.0047 -
0.8784 11200 0.3089 -
0.8824 11250 0.1055 -
0.8863 11300 0.0064 -
0.8902 11350 0.2119 -
0.8941 11400 0.2145 -
0.8980 11450 0.0128 -
0.9020 11500 0.0086 -
0.9059 11550 0.1803 -
0.9098 11600 0.2277 -
0.9137 11650 0.0204 -
0.9176 11700 0.0105 -
0.9216 11750 0.005 -
0.9255 11800 0.0099 -
0.9294 11850 0.004 -
0.9333 11900 0.1824 -
0.9373 11950 0.0021 -
0.9412 12000 0.2231 -
0.9451 12050 0.0017 -
0.9490 12100 0.0752 -
0.9529 12150 0.0129 -
0.9569 12200 0.1644 -
0.9608 12250 0.0305 -
0.9647 12300 0.0133 -
0.9686 12350 0.0687 -
0.9725 12400 0.0039 -
0.9765 12450 0.1179 -
0.9804 12500 0.1867 -
0.9843 12550 0.0225 -
0.9882 12600 0.1914 -
0.9922 12650 0.0592 -
0.9961 12700 0.0059 -
1.0 12750 0.1016 0.2295

Framework Versions

  • Python: 3.10.13
  • SetFit: 1.0.3
  • Sentence Transformers: 3.0.1
  • spaCy: 3.7.5
  • Transformers: 4.36.2
  • PyTorch: 2.1.2
  • Datasets: 2.19.2
  • Tokenizers: 0.15.2

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}