SetFit with BAAI/bge-large-en-v1.5

This is a SetFit model trained on the nazhan/brahmaputra-full-datasets-iter-8 dataset for text classification. It uses BAAI/bge-large-en-v1.5 as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
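Step 1 relies on turning the few labeled examples into sentence pairs for contrastive learning. The following is a minimal illustrative sketch of that pair-construction idea (not the SetFit library's actual implementation; the function name and toy examples are hypothetical):

```python
from itertools import combinations

def make_contrastive_pairs(examples):
    """Build (text_a, text_b, label) pairs from labeled texts:
    label 1.0 if both texts share a class, 0.0 otherwise."""
    pairs = []
    for (text_a, label_a), (text_b, label_b) in combinations(examples, 2):
        pairs.append((text_a, text_b, 1.0 if label_a == label_b else 0.0))
    return pairs

examples = [
    ("Join Orders with Employees.", "Tablejoin"),
    ("Link the two KPI tables.", "Tablejoin"),
    ("What's your favorite color?", "Generalreply"),
]
pairs = make_contrastive_pairs(examples)
# 3 examples yield 3 pairs: one positive (same class), two negative.
```

The Sentence Transformer body is then fine-tuned so that positive pairs embed close together and negative pairs far apart, after which the classification head is trained on the resulting embeddings.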

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: BAAI/bge-large-en-v1.5
  • Classification head: a LogisticRegression instance
  • Number of Classes: 7
  • Training Dataset: nazhan/brahmaputra-full-datasets-iter-8

Model Sources

  • Paper: Efficient Few-Shot Learning Without Prompts (arXiv:2209.11055)
  • Repository: SetFit on GitHub

Model Labels

Label Examples
Lookup_1
  • 'Analyze product category revenue impact.'
  • 'Show me monthly EBIT by product.'
  • 'Visualize M&A deal size distribution.'
Tablejoin
  • 'Could you link the Orders and Employees tables to find out which departments are processing the most orders?'
  • 'Is it possible to combine the Employees and Orders tables to see which employees are assigned to specific order types?'
  • 'Join data_asset_kpi_cf with data_asset_001_kpm tables.'
Lookup
  • "Show me the details of employees with the last name 'Smith'."
  • "Filter by customers with the first name 'Emily' and show me their email addresses."
  • "Show me the products with 'Tablet' in the name and filter by price above 200."
Rejection
  • "Let's not worry about generating additional data."
  • "I'd prefer not to apply any filters."
  • "I don't want to sort or filter right now."
Viewtables
  • 'What is the inventory of tables held in the starhub_data_asset database?'
  • 'What tables are available in the starhub_data_asset database for performing basic data explorations?'
  • 'What is the complete list of all the tables stored in the starhub_data_asset database that require a join operation for data analysis?'
Generalreply
  • "Oh, I enjoy spending my free time doing a few different things! Sometimes I like to read, other times I might go for a walk or watch a movie. It really just depends on what I'm in the mood for. What about you, how do you like to spend your free time?"
  • 'What is your favorite color?'
  • "that's not good."
Aggregation
  • 'What’s the total number of products sold in the Electronics category?'
  • 'Determine the total number of orders placed during promotional periods.'
  • 'What’s the total sales amount recorded in the Orders table?'

Evaluation

Metrics

Label Accuracy
all   1.0
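The accuracy reported above is the fraction of evaluation examples whose predicted label matches the reference label, aggregated over all classes ("all"). A minimal sketch with hypothetical toy labels:

```python
def accuracy(preds, refs):
    """Fraction of predictions that exactly match the reference labels."""
    return sum(p == r for p, r in zip(preds, refs)) / len(refs)

print(accuracy(["Lookup", "Tablejoin"], ["Lookup", "Tablejoin"]))  # -> 1.0
print(accuracy(["Lookup", "Tablejoin"], ["Lookup", "Rejection"]))  # -> 0.5
```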

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference.

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("nazhan/bge-large-en-v1.5-brahmaputra-iter-8-2-epoch")
# Run inference
preds = model("How's your day going?")
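Since the labels correspond to distinct query intents, downstream code can route on the predicted label string. A hedged sketch of such a dispatcher (the handler names are hypothetical, not part of this model or the SetFit library):

```python
def route_intent(label: str) -> str:
    """Map a predicted intent label to a (hypothetical) downstream action name."""
    handlers = {
        "Tablejoin": "build_join_query",
        "Lookup": "build_filter_query",
        "Lookup_1": "build_analysis_query",
        "Aggregation": "build_aggregate_query",
        "Viewtables": "list_tables",
        "Generalreply": "chat_response",
        "Rejection": "acknowledge_and_stop",
    }
    # Fall back to a conversational response for unrecognized labels.
    return handlers.get(label, "chat_response")

print(route_intent("Aggregation"))  # -> build_aggregate_query
```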

Training Details

Training Set Metrics

Training set Min Median Max
Word count 3 11.0696 62
Label Training Sample Count
Tablejoin 112
Rejection 67
Aggregation 71
Lookup 56
Generalreply 69
Viewtables 73
Lookup_1 69
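The word-count statistics above can be reproduced with a few lines of standard-library Python (shown here on a small hypothetical sample, not the actual training set):

```python
from statistics import median

def word_count_stats(texts):
    """Return (min, median, max) whitespace-delimited word counts."""
    counts = [len(t.split()) for t in texts]
    return min(counts), median(counts), max(counts)

sample = [
    "Join data_asset_kpi_cf with data_asset_001_kpm tables.",
    "What is your favorite color?",
    "Show me monthly EBIT by product.",
]
print(word_count_stats(sample))  # -> (5, 5, 6)
```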

Training Hyperparameters

  • batch_size: (16, 16)
  • num_epochs: (2, 2)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: True
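CosineSimilarityLoss, used for the body fine-tuning above, fits the cosine similarity of each embedding pair to its pair label (1.0 for same-class pairs, 0.0 for different-class pairs) via mean squared error. A minimal sketch with plain Python lists standing in for sentence embeddings:

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def cosine_similarity_loss(pairs):
    """Mean squared error between cos(u, v) and the pair label."""
    errors = [(cosine_similarity(u, v) - label) ** 2 for u, v, label in pairs]
    return sum(errors) / len(errors)

# Identical vectors labeled "same" and orthogonal vectors labeled
# "different" both incur zero loss, so the total loss is 0.0.
pairs = [([1.0, 0.0], [1.0, 0.0], 1.0), ([1.0, 0.0], [0.0, 1.0], 0.0)]
print(cosine_similarity_loss(pairs))  # -> 0.0
```

Minimizing this loss pulls same-class embeddings toward cosine similarity 1 and pushes different-class embeddings toward 0, which is what makes the downstream LogisticRegression head separable from so few examples.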

Training Results

Epoch Step Training Loss Validation Loss
0.0001 1 0.1865 -
0.0035 50 0.1599 -
0.0070 100 0.1933 -
0.0106 150 0.1595 -
0.0141 200 0.0899 -
0.0176 250 0.1334 -
0.0211 300 0.0722 -
0.0246 350 0.0411 -
0.0282 400 0.0171 -
0.0317 450 0.0293 -
0.0352 500 0.0218 -
0.0387 550 0.0057 -
0.0422 600 0.0065 -
0.0458 650 0.0047 -
0.0493 700 0.0045 -
0.0528 750 0.0048 -
0.0563 800 0.0032 -
0.0599 850 0.0038 -
0.0634 900 0.0033 -
0.0669 950 0.0027 -
0.0704 1000 0.0025 -
0.0739 1050 0.0024 -
0.0775 1100 0.0021 -
0.0810 1150 0.0025 -
0.0845 1200 0.0016 -
0.0880 1250 0.0019 -
0.0915 1300 0.0017 -
0.0951 1350 0.0016 -
0.0986 1400 0.0025 -
0.1021 1450 0.0016 -
0.1056 1500 0.0015 -
0.1091 1550 0.0012 -
0.1127 1600 0.001 -
0.1162 1650 0.0012 -
0.1197 1700 0.0012 -
0.1232 1750 0.0013 -
0.1267 1800 0.0012 -
0.1303 1850 0.0009 -
0.1338 1900 0.0011 -
0.1373 1950 0.001 -
0.1408 2000 0.0009 -
0.1443 2050 0.0009 -
0.1479 2100 0.0008 -
0.1514 2150 0.0007 -
0.1549 2200 0.0008 -
0.1584 2250 0.0008 -
0.1619 2300 0.0008 -
0.1655 2350 0.0007 -
0.1690 2400 0.0008 -
0.1725 2450 0.0006 -
0.1760 2500 0.0005 -
0.1796 2550 0.0006 -
0.1831 2600 0.0005 -
0.1866 2650 0.0006 -
0.1901 2700 0.0005 -
0.1936 2750 0.0007 -
0.1972 2800 0.0006 -
0.2007 2850 0.0005 -
0.2042 2900 0.0006 -
0.2077 2950 0.0007 -
0.2112 3000 0.0006 -
0.2148 3050 0.0005 -
0.2183 3100 0.0005 -
0.2218 3150 0.0005 -
0.2253 3200 0.0006 -
0.2288 3250 0.0005 -
0.2324 3300 0.0006 -
0.2359 3350 0.0004 -
0.2394 3400 0.0005 -
0.2429 3450 0.0005 -
0.2464 3500 0.0004 -
0.2500 3550 0.0006 -
0.2535 3600 0.0004 -
0.2570 3650 0.0004 -
0.2605 3700 0.0004 -
0.2640 3750 0.0004 -
0.2676 3800 0.0003 -
0.2711 3850 0.0004 -
0.2746 3900 0.0005 -
0.2781 3950 0.0004 -
0.2817 4000 0.0004 -
0.2852 4050 0.0003 -
0.2887 4100 0.0004 -
0.2922 4150 0.0004 -
0.2957 4200 0.0004 -
0.2993 4250 0.0005 -
0.3028 4300 0.0004 -
0.3063 4350 0.0004 -
0.3098 4400 0.0003 -
0.3133 4450 0.0004 -
0.3169 4500 0.0004 -
0.3204 4550 0.0003 -
0.3239 4600 0.0003 -
0.3274 4650 0.0004 -
0.3309 4700 0.0003 -
0.3345 4750 0.0003 -
0.3380 4800 0.0003 -
0.3415 4850 0.0003 -
0.3450 4900 0.0004 -
0.3485 4950 0.0003 -
0.3521 5000 0.0003 -
0.3556 5050 0.0003 -
0.3591 5100 0.0003 -
0.3626 5150 0.0004 -
0.3661 5200 0.0002 -
0.3697 5250 0.0004 -
0.3732 5300 0.0003 -
0.3767 5350 0.0003 -
0.3802 5400 0.0002 -
0.3837 5450 0.0003 -
0.3873 5500 0.0003 -
0.3908 5550 0.0003 -
0.3943 5600 0.0002 -
0.3978 5650 0.0003 -
0.4014 5700 0.0003 -
0.4049 5750 0.0002 -
0.4084 5800 0.0003 -
0.4119 5850 0.0003 -
0.4154 5900 0.0003 -
0.4190 5950 0.0002 -
0.4225 6000 0.0002 -
0.4260 6050 0.0002 -
0.4295 6100 0.0003 -
0.4330 6150 0.0003 -
0.4366 6200 0.0002 -
0.4401 6250 0.0003 -
0.4436 6300 0.0003 -
0.4471 6350 0.0002 -
0.4506 6400 0.0002 -
0.4542 6450 0.0002 -
0.4577 6500 0.0002 -
0.4612 6550 0.0002 -
0.4647 6600 0.0002 -
0.4682 6650 0.0002 -
0.4718 6700 0.0002 -
0.4753 6750 0.0003 -
0.4788 6800 0.0003 -
0.4823 6850 0.0002 -
0.4858 6900 0.0003 -
0.4894 6950 0.0002 -
0.4929 7000 0.0003 -
0.4964 7050 0.0002 -
0.4999 7100 0.0002 -
0.5035 7150 0.0002 -
0.5070 7200 0.0003 -
0.5105 7250 0.0002 -
0.5140 7300 0.0003 -
0.5175 7350 0.0004 -
0.5211 7400 0.0002 -
0.5246 7450 0.0002 -
0.5281 7500 0.0002 -
0.5316 7550 0.0002 -
0.5351 7600 0.0002 -
0.5387 7650 0.0002 -
0.5422 7700 0.0002 -
0.5457 7750 0.0002 -
0.5492 7800 0.0003 -
0.5527 7850 0.0002 -
0.5563 7900 0.0002 -
0.5598 7950 0.0002 -
0.5633 8000 0.0002 -
0.5668 8050 0.0002 -
0.5703 8100 0.0002 -
0.5739 8150 0.0002 -
0.5774 8200 0.0003 -
0.5809 8250 0.0002 -
0.5844 8300 0.0002 -
0.5879 8350 0.0002 -
0.5915 8400 0.0002 -
0.5950 8450 0.0001 -
0.5985 8500 0.0001 -
0.6020 8550 0.0001 -
0.6055 8600 0.0001 -
0.6091 8650 0.0002 -
0.6126 8700 0.0002 -
0.6161 8750 0.0002 -
0.6196 8800 0.0002 -
0.6232 8850 0.0002 -
0.6267 8900 0.0001 -
0.6302 8950 0.0001 -
0.6337 9000 0.0002 -
0.6372 9050 0.0002 -
0.6408 9100 0.0002 -
0.6443 9150 0.0001 -
0.6478 9200 0.0002 -
0.6513 9250 0.0003 -
0.6548 9300 0.0002 -
0.6584 9350 0.0003 -
0.6619 9400 0.0001 -
0.6654 9450 0.0001 -
0.6689 9500 0.0001 -
0.6724 9550 0.0001 -
0.6760 9600 0.0001 -
0.6795 9650 0.0002 -
0.6830 9700 0.0002 -
0.6865 9750 0.0002 -
0.6900 9800 0.0001 -
0.6936 9850 0.0001 -
0.6971 9900 0.0002 -
0.7006 9950 0.0001 -
0.7041 10000 0.0001 -
0.7076 10050 0.0001 -
0.7112 10100 0.0002 -
0.7147 10150 0.0001 -
0.7182 10200 0.0002 -
0.7217 10250 0.0002 -
0.7252 10300 0.0001 -
0.7288 10350 0.0001 -
0.7323 10400 0.0001 -
0.7358 10450 0.0001 -
0.7393 10500 0.0002 -
0.7429 10550 0.0001 -
0.7464 10600 0.0002 -
0.7499 10650 0.0001 -
0.7534 10700 0.0001 -
0.7569 10750 0.0002 -
0.7605 10800 0.0002 -
0.7640 10850 0.0001 -
0.7675 10900 0.0001 -
0.7710 10950 0.0001 -
0.7745 11000 0.0001 -
0.7781 11050 0.0001 -
0.7816 11100 0.0001 -
0.7851 11150 0.0001 -
0.7886 11200 0.0001 -
0.7921 11250 0.0001 -
0.7957 11300 0.0001 -
0.7992 11350 0.0001 -
0.8027 11400 0.0002 -
0.8062 11450 0.0001 -
0.8097 11500 0.0001 -
0.8133 11550 0.0001 -
0.8168 11600 0.0001 -
0.8203 11650 0.0001 -
0.8238 11700 0.0001 -
0.8273 11750 0.0001 -
0.8309 11800 0.0001 -
0.8344 11850 0.0001 -
0.8379 11900 0.0001 -
0.8414 11950 0.0001 -
0.8450 12000 0.0001 -
0.8485 12050 0.0001 -
0.8520 12100 0.0001 -
0.8555 12150 0.0001 -
0.8590 12200 0.0001 -
0.8626 12250 0.0002 -
0.8661 12300 0.0002 -
0.8696 12350 0.0002 -
0.8731 12400 0.0002 -
0.8766 12450 0.0001 -
0.8802 12500 0.0001 -
0.8837 12550 0.0001 -
0.8872 12600 0.0001 -
0.8907 12650 0.0002 -
0.8942 12700 0.0001 -
0.8978 12750 0.0001 -
0.9013 12800 0.0001 -
0.9048 12850 0.0001 -
0.9083 12900 0.0001 -
0.9118 12950 0.0001 -
0.9154 13000 0.0001 -
0.9189 13050 0.0001 -
0.9224 13100 0.0001 -
0.9259 13150 0.0001 -
0.9294 13200 0.0001 -
0.9330 13250 0.0001 -
0.9365 13300 0.0001 -
0.9400 13350 0.0001 -
0.9435 13400 0.0001 -
0.9470 13450 0.0001 -
0.9506 13500 0.0001 -
0.9541 13550 0.0001 -
0.9576 13600 0.0001 -
0.9611 13650 0.0001 -
0.9647 13700 0.0001 -
0.9682 13750 0.0001 -
0.9717 13800 0.0001 -
0.9752 13850 0.0001 -
0.9787 13900 0.0001 -
0.9823 13950 0.0001 -
0.9858 14000 0.0001 -
0.9893 14050 0.0001 -
0.9928 14100 0.0001 -
0.9963 14150 0.0002 -
0.9999 14200 0.0001 -
1.0 14202 - 0.0082
1.0034 14250 0.0001 -
1.0069 14300 0.0001 -
1.0104 14350 0.0001 -
1.0139 14400 0.0001 -
1.0175 14450 0.0001 -
1.0210 14500 0.0001 -
1.0245 14550 0.0001 -
1.0280 14600 0.0001 -
1.0315 14650 0.0001 -
1.0351 14700 0.0001 -
1.0386 14750 0.0001 -
1.0421 14800 0.0001 -
1.0456 14850 0.0001 -
1.0491 14900 0.0001 -
1.0527 14950 0.0001 -
1.0562 15000 0.0001 -
1.0597 15050 0.0001 -
1.0632 15100 0.0001 -
1.0668 15150 0.0001 -
1.0703 15200 0.0001 -
1.0738 15250 0.0001 -
1.0773 15300 0.0001 -
1.0808 15350 0.0001 -
1.0844 15400 0.0001 -
1.0879 15450 0.0001 -
1.0914 15500 0.0001 -
1.0949 15550 0.0001 -
1.0984 15600 0.0001 -
1.1020 15650 0.0001 -
1.1055 15700 0.0001 -
1.1090 15750 0.0001 -
1.1125 15800 0.0001 -
1.1160 15850 0.0001 -
1.1196 15900 0.0001 -
1.1231 15950 0.0001 -
1.1266 16000 0.0001 -
1.1301 16050 0.0001 -
1.1336 16100 0.0001 -
1.1372 16150 0.0001 -
1.1407 16200 0.0001 -
1.1442 16250 0.0001 -
1.1477 16300 0.0001 -
1.1512 16350 0.0001 -
1.1548 16400 0.0001 -
1.1583 16450 0.0001 -
1.1618 16500 0.0001 -
1.1653 16550 0.0001 -
1.1688 16600 0.0001 -
1.1724 16650 0.0001 -
1.1759 16700 0.0001 -
1.1794 16750 0.0001 -
1.1829 16800 0.0001 -
1.1865 16850 0.0001 -
1.1900 16900 0.0001 -
1.1935 16950 0.0001 -
1.1970 17000 0.0001 -
1.2005 17050 0.0001 -
1.2041 17100 0.0001 -
1.2076 17150 0.0001 -
1.2111 17200 0.0001 -
1.2146 17250 0.0001 -
1.2181 17300 0.0001 -
1.2217 17350 0.0001 -
1.2252 17400 0.0001 -
1.2287 17450 0.0001 -
1.2322 17500 0.0001 -
1.2357 17550 0.0001 -
1.2393 17600 0.0001 -
1.2428 17650 0.0001 -
1.2463 17700 0.0001 -
1.2498 17750 0.0001 -
1.2533 17800 0.0001 -
1.2569 17850 0.0001 -
1.2604 17900 0.0001 -
1.2639 17950 0.0001 -
1.2674 18000 0.0001 -
1.2709 18050 0.0001 -
1.2745 18100 0.0001 -
1.2780 18150 0.0001 -
1.2815 18200 0.0001 -
1.2850 18250 0.0001 -
1.2886 18300 0.0001 -
1.2921 18350 0.0001 -
1.2956 18400 0.0001 -
1.2991 18450 0.0001 -
1.3026 18500 0.0001 -
1.3062 18550 0.0001 -
1.3097 18600 0.0001 -
1.3132 18650 0.0001 -
1.3167 18700 0.0001 -
1.3202 18750 0.0001 -
1.3238 18800 0.0001 -
1.3273 18850 0.0001 -
1.3308 18900 0.0001 -
1.3343 18950 0.0001 -
1.3378 19000 0.0001 -
1.3414 19050 0.0001 -
1.3449 19100 0.0001 -
1.3484 19150 0.0001 -
1.3519 19200 0.0001 -
1.3554 19250 0.0001 -
1.3590 19300 0.0001 -
1.3625 19350 0.0001 -
1.3660 19400 0.0001 -
1.3695 19450 0.0001 -
1.3730 19500 0.0001 -
1.3766 19550 0.0001 -
1.3801 19600 0.0001 -
1.3836 19650 0.0001 -
1.3871 19700 0.0001 -
1.3906 19750 0.0001 -
1.3942 19800 0.0001 -
1.3977 19850 0.0001 -
1.4012 19900 0.0001 -
1.4047 19950 0.0001 -
1.4083 20000 0.0001 -
1.4118 20050 0.0001 -
1.4153 20100 0.0001 -
1.4188 20150 0.0001 -
1.4223 20200 0.0001 -
1.4259 20250 0.0001 -
1.4294 20300 0.0001 -
1.4329 20350 0.0001 -
1.4364 20400 0.0 -
1.4399 20450 0.0001 -
1.4435 20500 0.0001 -
1.4470 20550 0.0001 -
1.4505 20600 0.0001 -
1.4540 20650 0.0001 -
1.4575 20700 0.0001 -
1.4611 20750 0.0001 -
1.4646 20800 0.0001 -
1.4681 20850 0.0 -
1.4716 20900 0.0001 -
1.4751 20950 0.0001 -
1.4787 21000 0.0 -
1.4822 21050 0.0001 -
1.4857 21100 0.0001 -
1.4892 21150 0.0001 -
1.4927 21200 0.0001 -
1.4963 21250 0.0001 -
1.4998 21300 0.0001 -
1.5033 21350 0.0 -
1.5068 21400 0.0001 -
1.5104 21450 0.0001 -
1.5139 21500 0.0001 -
1.5174 21550 0.0 -
1.5209 21600 0.0001 -
1.5244 21650 0.0001 -
1.5280 21700 0.0001 -
1.5315 21750 0.0001 -
1.5350 21800 0.0001 -
1.5385 21850 0.0001 -
1.5420 21900 0.0 -
1.5456 21950 0.0001 -
1.5491 22000 0.0001 -
1.5526 22050 0.0001 -
1.5561 22100 0.0001 -
1.5596 22150 0.0001 -
1.5632 22200 0.0001 -
1.5667 22250 0.0 -
1.5702 22300 0.0 -
1.5737 22350 0.0001 -
1.5772 22400 0.0001 -
1.5808 22450 0.0001 -
1.5843 22500 0.0001 -
1.5878 22550 0.0001 -
1.5913 22600 0.0001 -
1.5948 22650 0.0001 -
1.5984 22700 0.0 -
1.6019 22750 0.0001 -
1.6054 22800 0.0001 -
1.6089 22850 0.0001 -
1.6124 22900 0.0001 -
1.6160 22950 0.0001 -
1.6195 23000 0.0001 -
1.6230 23050 0.0001 -
1.6265 23100 0.0001 -
1.6301 23150 0.0 -
1.6336 23200 0.0001 -
1.6371 23250 0.0001 -
1.6406 23300 0.0 -
1.6441 23350 0.0001 -
1.6477 23400 0.0 -
1.6512 23450 0.0001 -
1.6547 23500 0.0 -
1.6582 23550 0.0001 -
1.6617 23600 0.0001 -
1.6653 23650 0.0 -
1.6688 23700 0.0 -
1.6723 23750 0.0001 -
1.6758 23800 0.0001 -
1.6793 23850 0.0 -
1.6829 23900 0.0001 -
1.6864 23950 0.0 -
1.6899 24000 0.0 -
1.6934 24050 0.0 -
1.6969 24100 0.0001 -
1.7005 24150 0.0001 -
1.7040 24200 0.0001 -
1.7075 24250 0.0001 -
1.7110 24300 0.0001 -
1.7145 24350 0.0001 -
1.7181 24400 0.0001 -
1.7216 24450 0.0 -
1.7251 24500 0.0001 -
1.7286 24550 0.0 -
1.7322 24600 0.0001 -
1.7357 24650 0.0001 -
1.7392 24700 0.0 -
1.7427 24750 0.0001 -
1.7462 24800 0.0001 -
1.7498 24850 0.0001 -
1.7533 24900 0.0 -
1.7568 24950 0.0 -
1.7603 25000 0.0001 -
1.7638 25050 0.0001 -
1.7674 25100 0.0001 -
1.7709 25150 0.0001 -
1.7744 25200 0.0 -
1.7779 25250 0.0001 -
1.7814 25300 0.0 -
1.7850 25350 0.0 -
1.7885 25400 0.0 -
1.7920 25450 0.0 -
1.7955 25500 0.0 -
1.7990 25550 0.0 -
1.8026 25600 0.0001 -
1.8061 25650 0.0 -
1.8096 25700 0.0001 -
1.8131 25750 0.0001 -
1.8166 25800 0.0 -
1.8202 25850 0.0 -
1.8237 25900 0.0 -
1.8272 25950 0.0 -
1.8307 26000 0.0001 -
1.8342 26050 0.0 -
1.8378 26100 0.0 -
1.8413 26150 0.0 -
1.8448 26200 0.0 -
1.8483 26250 0.0 -
1.8519 26300 0.0 -
1.8554 26350 0.0001 -
1.8589 26400 0.0 -
1.8624 26450 0.0 -
1.8659 26500 0.0 -
1.8695 26550 0.0 -
1.8730 26600 0.0 -
1.8765 26650 0.0 -
1.8800 26700 0.0 -
1.8835 26750 0.0001 -
1.8871 26800 0.0 -
1.8906 26850 0.0 -
1.8941 26900 0.0 -
1.8976 26950 0.0 -
1.9011 27000 0.0001 -
1.9047 27050 0.0 -
1.9082 27100 0.0 -
1.9117 27150 0.0 -
1.9152 27200 0.0001 -
1.9187 27250 0.0 -
1.9223 27300 0.0001 -
1.9258 27350 0.0 -
1.9293 27400 0.0 -
1.9328 27450 0.0 -
1.9363 27500 0.0 -
1.9399 27550 0.0 -
1.9434 27600 0.0 -
1.9469 27650 0.0 -
1.9504 27700 0.0 -
1.9540 27750 0.0001 -
1.9575 27800 0.0 -
1.9610 27850 0.0 -
1.9645 27900 0.0 -
1.9680 27950 0.0001 -
1.9716 28000 0.0 -
1.9751 28050 0.0 -
1.9786 28100 0.0001 -
1.9821 28150 0.0 -
1.9856 28200 0.0 -
1.9892 28250 0.0 -
1.9927 28300 0.0 -
1.9962 28350 0.0 -
1.9997 28400 0.0001 -
2.0 28404 - 0.0076 *
  • The row marked * denotes the saved checkpoint (with load_best_model_at_end enabled, the epoch with the lowest validation loss).

Framework Versions

  • Python: 3.11.9
  • SetFit: 1.1.0.dev0
  • Sentence Transformers: 3.0.1
  • Transformers: 4.44.2
  • PyTorch: 2.4.0+cu121
  • Datasets: 2.21.0
  • Tokenizers: 0.19.1

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
Model Size

  • 335M parameters (F32, Safetensors)