SetFit with mini1013/master_domain

This is a SetFit model that can be used for Text Classification. This SetFit model uses mini1013/master_domain as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves two steps (a minimal training sketch follows this list):

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
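
A minimal training sketch with the setfit library is shown below. It is an illustration only: the dataset contents and output path are placeholders (the actual training set had roughly 50 Korean product titles per label, as listed under Training Set Metrics), and the arguments mirror values reported under Training Hyperparameters.

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder few-shot data; the real training set is not reproduced here.
train_dataset = Dataset.from_dict({
    "text": [
        "placeholder nail remover title",
        "another nail remover title",
        "placeholder eyelash product title",
        "another eyelash product title",
    ],
    "label": [0, 0, 3, 3],
})

# Step 1 starts from the Sentence Transformer body named above.
model = SetFitModel.from_pretrained("mini1013/master_domain")

trainer = Trainer(
    model=model,
    args=TrainingArguments(batch_size=(64, 64), num_epochs=(30, 30)),  # see Training Hyperparameters
    train_dataset=train_dataset,
)
trainer.train()  # contrastive fine-tuning of the body, then fitting the LogisticRegression head
model.save_pretrained("setfit-sketch")  # hypothetical output path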

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: mini1013/master_domain
  • Classification head: a LogisticRegression instance
  • Number of Classes: 4

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Label Examples
Label 3
  • '네일팁 실크익스텐션 311160L1720771597 티타늄금 물방울 (풀값 ) LotteOn > 뷰티 > 네일케어 > 네일케어도구 > 손톱깎이 LotteOn > 뷰티 > 네일케어 > 네일케어도구 > 손톱깎이'
  • '엔비베베 어린이 화장품 선물세트 어린이 썬쿠션+키즈네일스티커+워시패드 1개 (#M)쿠팡 홈>뷰티>어린이화장품>세트/키트 Coupang > 뷰티 > 어린이화장품 > 세트/키트'
  • '래쉬톡 원터치 인조 속눈썹 섹시 걸 × 3개입 LotteOn > 뷰티 > 뷰티기기/소품 > 아이/브로우소품 > 속눈썹관리 LotteOn > 뷰티 > 뷰티기기/소품 > 아이/브로우소품 > 속눈썹관리'
Label 0
  • '오피아이 넌아세톤 리무버 빨강 30ml × 5개 (#M)쿠팡 홈>뷰티>네일>일반네일>리무버 Coupang > 뷰티 > 네일 > 일반네일 > 리무버'
  • '[OPI][리무버] 넌아세톤리무버 30ml ssg > 뷰티 > 메이크업 > 네일 ssg > 뷰티 > 메이크업 > 네일'
  • '포먼트 젤네일 O.4 블러쉬 뷰티 × 1개 (#M)쿠팡 홈>뷰티>네일>젤네일>컬러 젤 Coupang > 뷰티 > 네일 > 젤네일 > 컬러 젤'
Label 2
  • '오피아이 프로스파 오일투고 큐티클 오일2197877 1 7.5ml x 1개2197877 1 (#M)SSG.COM/메이크업/베이스메이크업/컨실러 ssg > 뷰티 > 메이크업 > 베이스메이크업 > 컨실러'
  • '구찌 뷰티 [구찌] 베르니 아 옹글 하이 샤인 네일 라커 712 멜린다 그린 × 선택완료 (#M)쿠팡 홈>뷰티>네일>일반네일>컬러 매니큐어 Coupang > 뷰티 > 네일 > 일반네일 > 컬러 매니큐어'
  • 'OPI ProSpa 각질 제거 큐티클 크림, 27ml SSG.COM/메이크업/베이스메이크업/메이크업베이스;ssg > 뷰티 > 메이크업 > 베이스메이크업 > 메이크업베이스 ssg > 뷰티 > 메이크업 > 베이스메이크업 > 메이크업베이스'
Label 1
  • '르 베르니 루쥬 느와르 DepartmentLotteOn > 뷰티 > 헤어/바디 > 핸드/풋케어 > 네일케어 DepartmentLotteOn > 뷰티 > 헤어/바디 > 핸드/풋케어 > 네일케어'
  • '베씨 베이스젤 + 탑젤 + 지브라파일 2p 세트 베이스젤, 탑젤, 지브라파일(100/150) × 1세트 LotteOn > 뷰티 > 네일 > 네일아트소품 LotteOn > 뷰티 > 네일 > 네일아트소품'
  • 'OPI OPI Chrome Effects Nail Lacquer Top Coat CPT31 - 0.5 oz 상세내용참조 × 상세내용참조 (#M)쿠팡 홈>뷰티>메이크업>베이스 메이크업>베이스/프라이머 Coupang > 뷰티 > 메이크업 > 베이스 메이크업 > 베이스/프라이머'

Evaluation

Metrics

Label: all
Accuracy: 0.5302
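
The accuracy above (0.5302 over all labels) can be recomputed in outline by predicting on a held-out split and comparing against the gold labels. A minimal sketch, assuming a hypothetical list of test texts and integer labels:

from setfit import SetFitModel

model = SetFitModel.from_pretrained("mini1013/master_cate_bt1_test_flat_top_cate")

# Hypothetical held-out split; substitute the actual evaluation data.
test_texts = ["placeholder product title A", "placeholder product title B"]
test_labels = [0, 2]

preds = model.predict(test_texts)
accuracy = sum(int(p) == int(y) for p, y in zip(preds, test_labels)) / len(test_labels)
print(f"accuracy: {accuracy:.4f}")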

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_bt1_test_flat_top_cate")
# Run inference
preds = model("디올 베르니 212 튀튀 LotteOn > 뷰티 > 메이크업 > 메이크업세트 LotteOn > 뷰티 > 메이크업 > 메이크업세트")
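
The call above returns the prediction for a single text. The model also accepts a list of texts, and since the head is a LogisticRegression instance, class probabilities can be requested as well. A short sketch reusing the loaded model (the second text is shortened from one of the Label 0 examples above):

texts = [
    "디올 베르니 212 튀튀 LotteOn > 뷰티 > 메이크업 > 메이크업세트 LotteOn > 뷰티 > 메이크업 > 메이크업세트",
    "오피아이 넌아세톤 리무버 빨강 30ml × 5개",
]
preds = model.predict(texts)        # one predicted label per text
probs = model.predict_proba(texts)  # per-class probabilities from the LogisticRegression head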

Training Details

Training Set Metrics

Training set   Min   Median    Max
Word count     13    22.7236   41

Label   Training Sample Count
0       49
1       50
2       50
3       50

Training Hyperparameters

  • batch_size: (64, 64)
  • num_epochs: (30, 30)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 100
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • l2_weight: 0.01
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
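
These bullet points correspond to fields of setfit.TrainingArguments. A sketch of how the same configuration could be reconstructed is shown below; the output directory is a placeholder, and distance_metric is omitted because cosine_distance is the library default:

from sentence_transformers.losses import CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    output_dir="checkpoints",          # placeholder
    batch_size=(64, 64),               # (embedding phase, classifier phase)
    num_epochs=(30, 30),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=100,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)

These arguments are passed to setfit.Trainer together with the model and the train/eval datasets, as in the training sketch earlier.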

Training Results

Epoch Step Training Loss Validation Loss
0.0032 1 0.4603 -
0.1608 50 0.4502 -
0.3215 100 0.4315 -
0.4823 150 0.3996 -
0.6431 200 0.365 -
0.8039 250 0.2954 -
0.9646 300 0.2647 -
1.1254 350 0.2378 -
1.2862 400 0.2257 -
1.4469 450 0.2165 -
1.6077 500 0.213 -
1.7685 550 0.1999 -
1.9293 600 0.1838 -
2.0900 650 0.1614 -
2.2508 700 0.1164 -
2.4116 750 0.0553 -
2.5723 800 0.0366 -
2.7331 850 0.0279 -
2.8939 900 0.0219 -
3.0547 950 0.0166 -
3.2154 1000 0.0111 -
3.3762 1050 0.0067 -
3.5370 1100 0.0084 -
3.6977 1150 0.0066 -
3.8585 1200 0.0048 -
4.0193 1250 0.0028 -
4.1801 1300 0.0005 -
4.3408 1350 0.0003 -
4.5016 1400 0.0004 -
4.6624 1450 0.0001 -
4.8232 1500 0.0001 -
4.9839 1550 0.0001 -
5.1447 1600 0.0001 -
5.3055 1650 0.0001 -
5.4662 1700 0.0002 -
5.6270 1750 0.0 -
5.7878 1800 0.0 -
5.9486 1850 0.0 -
6.1093 1900 0.0001 -
6.2701 1950 0.0 -
6.4309 2000 0.0 -
6.5916 2050 0.0 -
6.7524 2100 0.0 -
6.9132 2150 0.0002 -
7.0740 2200 0.0002 -
7.2347 2250 0.0 -
7.3955 2300 0.0 -
7.5563 2350 0.0 -
7.7170 2400 0.0 -
7.8778 2450 0.0 -
8.0386 2500 0.0 -
8.1994 2550 0.0 -
8.3601 2600 0.0 -
8.5209 2650 0.0 -
8.6817 2700 0.0 -
8.8424 2750 0.0 -
9.0032 2800 0.0 -
9.1640 2850 0.0 -
9.3248 2900 0.0 -
9.4855 2950 0.0 -
9.6463 3000 0.0 -
9.8071 3050 0.0 -
9.9678 3100 0.0 -
10.1286 3150 0.0 -
10.2894 3200 0.0 -
10.4502 3250 0.0 -
10.6109 3300 0.0 -
10.7717 3350 0.0 -
10.9325 3400 0.0 -
11.0932 3450 0.0 -
11.2540 3500 0.0 -
11.4148 3550 0.0 -
11.5756 3600 0.0 -
11.7363 3650 0.0 -
11.8971 3700 0.0 -
12.0579 3750 0.0004 -
12.2186 3800 0.0 -
12.3794 3850 0.0001 -
12.5402 3900 0.0001 -
12.7010 3950 0.0 -
12.8617 4000 0.0001 -
13.0225 4050 0.0002 -
13.1833 4100 0.0009 -
13.3441 4150 0.0037 -
13.5048 4200 0.0025 -
13.6656 4250 0.0009 -
13.8264 4300 0.0002 -
13.9871 4350 0.0002 -
14.1479 4400 0.0 -
14.3087 4450 0.0002 -
14.4695 4500 0.0001 -
14.6302 4550 0.0004 -
14.7910 4600 0.0008 -
14.9518 4650 0.0 -
15.1125 4700 0.0 -
15.2733 4750 0.0001 -
15.4341 4800 0.0 -
15.5949 4850 0.0 -
15.7556 4900 0.0002 -
15.9164 4950 0.0 -
16.0772 5000 0.0 -
16.2379 5050 0.0001 -
16.3987 5100 0.0 -
16.5595 5150 0.0 -
16.7203 5200 0.0 -
16.8810 5250 0.0 -
17.0418 5300 0.0 -
17.2026 5350 0.0 -
17.3633 5400 0.0 -
17.5241 5450 0.0 -
17.6849 5500 0.0 -
17.8457 5550 0.0 -
18.0064 5600 0.0 -
18.1672 5650 0.0 -
18.3280 5700 0.0 -
18.4887 5750 0.0 -
18.6495 5800 0.0 -
18.8103 5850 0.0 -
18.9711 5900 0.0 -
19.1318 5950 0.0 -
19.2926 6000 0.0 -
19.4534 6050 0.0 -
19.6141 6100 0.0 -
19.7749 6150 0.0 -
19.9357 6200 0.0 -
20.0965 6250 0.0 -
20.2572 6300 0.0 -
20.4180 6350 0.0 -
20.5788 6400 0.0 -
20.7395 6450 0.0 -
20.9003 6500 0.0 -
21.0611 6550 0.0 -
21.2219 6600 0.0 -
21.3826 6650 0.0 -
21.5434 6700 0.0 -
21.7042 6750 0.0 -
21.8650 6800 0.0 -
22.0257 6850 0.0 -
22.1865 6900 0.0 -
22.3473 6950 0.0 -
22.5080 7000 0.0 -
22.6688 7050 0.0 -
22.8296 7100 0.0 -
22.9904 7150 0.0 -
23.1511 7200 0.0 -
23.3119 7250 0.0 -
23.4727 7300 0.0 -
23.6334 7350 0.0 -
23.7942 7400 0.0 -
23.9550 7450 0.0 -
24.1158 7500 0.0 -
24.2765 7550 0.0 -
24.4373 7600 0.0 -
24.5981 7650 0.0 -
24.7588 7700 0.0 -
24.9196 7750 0.0 -
25.0804 7800 0.0 -
25.2412 7850 0.0 -
25.4019 7900 0.0 -
25.5627 7950 0.0 -
25.7235 8000 0.0 -
25.8842 8050 0.0 -
26.0450 8100 0.0 -
26.2058 8150 0.0 -
26.3666 8200 0.0 -
26.5273 8250 0.0 -
26.6881 8300 0.0 -
26.8489 8350 0.0 -
27.0096 8400 0.0 -
27.1704 8450 0.0 -
27.3312 8500 0.0 -
27.4920 8550 0.0 -
27.6527 8600 0.0 -
27.8135 8650 0.0 -
27.9743 8700 0.0 -
28.1350 8750 0.0 -
28.2958 8800 0.0 -
28.4566 8850 0.0 -
28.6174 8900 0.0 -
28.7781 8950 0.0 -
28.9389 9000 0.0 -
29.0997 9050 0.0 -
29.2605 9100 0.0 -
29.4212 9150 0.0 -
29.5820 9200 0.0 -
29.7428 9250 0.0 -
29.9035 9300 0.0 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.1.0
  • Sentence Transformers: 3.3.1
  • Transformers: 4.44.2
  • PyTorch: 2.2.0a0+81ea7a4
  • Datasets: 3.2.0
  • Tokenizers: 0.19.1
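
To approximate this environment, the listed versions can be pinned at install time. Note that the PyTorch version above is a pre-release container build (2.2.0a0+81ea7a4); pinning the nearest stable release instead is an assumption, not part of the original setup.

pip install "setfit==1.1.0" "sentence-transformers==3.3.1" "transformers==4.44.2" "datasets==3.2.0" "tokenizers==0.19.1" "torch==2.2.0"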

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}

Model size: 111M parameters (Safetensors, F32)
Base model: klue/roberta-base