metadata
base_model: BAAI/bge-base-en-v1.5
datasets: []
language:
  - my
library_name: sentence-transformers
license: apache-2.0
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:389
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
widget:
  - source_sentence: >-
      Tukang kayu adalah individu yang bekerja dengan kayu untuk membina atau
      membaiki struktur dan perabot.
    sentences:
      - Apakah itu pakar latihan?
      - Apakah itu tukang kayu?
      - Apakah itu pakar mikrobiologi?
  - source_sentence: >-
      Pakar pemakanan adalah profesional yang memberi nasihat mengenai pemakanan
      dan diet untuk meningkatkan kesihatan.
    sentences:
      - Apakah itu penulis kreatif?
      - Apakah itu ahli geologi marin?
      - Apakah itu pakar pemakanan?
  - source_sentence: >-
      Dokter adalah profesional medis yang mendiagnosis dan merawat penyakit
      serta cedera pasien.
    sentences:
      - Apa itu dokter?
      - Apakah itu pengurus kargo?
      - Apakah itu pakar teknologi nano?
  - source_sentence: >-
      Juruteknik pembinaan kapal adalah individu yang terlibat dalam proses
      pembinaan dan pembaikan kapal, memastikan struktur dan sistem kapal dibina
      mengikut spesifikasi.
    sentences:
      - Apakah itu juruteknik pembinaan kapal?
      - Apakah itu pengurus projek IT?
      - Apakah itu pakar perkapalan?
  - source_sentence: >-
      Penyelaras kempen iklan adalah individu yang menyelaraskan semua aspek
      kempen iklan, termasuk jadual, pelaksanaan, dan laporan prestasi.
    sentences:
      - Apakah itu jurutera sistem propulsi?
      - Apakah itu pembuat roti?
      - Apakah itu penyelaras kempen iklan?
model-index:
  - name: BGE base Financial Matryoshka
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 768
          type: dim_768
        metrics:
          - type: cosine_accuracy@1
            value: 0.8226221079691517
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9768637532133676
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.987146529562982
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9974293059125964
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.8226221079691517
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.32562125107112255
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1974293059125964
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09974293059125963
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.8226221079691517
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9768637532133676
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.987146529562982
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9974293059125964
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9255252859780915
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.9009670706328802
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.9011023703216912
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 512
          type: dim_512
        metrics:
          - type: cosine_accuracy@1
            value: 0.8046272493573264
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.974293059125964
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.987146529562982
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9922879177377892
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.8046272493573264
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.324764353041988
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.1974293059125964
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0992287917737789
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.8046272493573264
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.974293059125964
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.987146529562982
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9922879177377892
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9158947182791948
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8895519647447668
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8900397092700132
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.7892030848329049
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9665809768637532
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.974293059125964
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.987146529562982
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7892030848329049
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.3221936589545844
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.19485861182519276
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0987146529562982
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7892030848329049
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9665809768637532
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.974293059125964
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.987146529562982
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.9046037741833534
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8764455053658137
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8770676096874822
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.7480719794344473
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.9408740359897172
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9537275064267352
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9691516709511568
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7480719794344473
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.31362467866323906
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.190745501285347
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09691516709511568
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7480719794344473
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.9408740359897172
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9537275064267352
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9691516709511568
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8765083941585068
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8449820459460564
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8461326502118156
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 64
          type: dim_64
        metrics:
          - type: cosine_accuracy@1
            value: 0.7223650385604113
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.897172236503856
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.9254498714652957
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.9434447300771208
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.7223650385604113
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.29905741216795206
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.18508997429305912
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.09434447300771207
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.7223650385604113
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.897172236503856
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.9254498714652957
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.9434447300771208
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.8455216956566762
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.8126851511812953
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.8145628077638951
            name: Cosine Map@100

BGE base Financial Matryoshka

This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-base-en-v1.5
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Language: my
  • License: apache-2.0

Model Sources

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
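
The pooling configuration above amounts to CLS-token pooling followed by L2 normalization. As a hedged illustration, the same encoding can be reproduced with the transformers library directly (this assumes the repository exposes the underlying BertModel weights in the usual sentence-transformers layout):

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

model_id = "IlhamEbdesk/bge-base-financial-matryoshka_test_my"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

sentences = ["Apa itu dokter?", "Apakah itu tukang kayu?"]
batch = tokenizer(sentences, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    outputs = model(**batch)

# CLS-token pooling (pooling_mode_cls_token=True), then L2 normalization (Normalize module)
embeddings = outputs.last_hidden_state[:, 0]
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([2, 768])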

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("IlhamEbdesk/bge-base-financial-matryoshka_test_my")
# Run inference
sentences = [
    'Penyelaras kempen iklan adalah individu yang menyelaraskan semua aspek kempen iklan, termasuk jadual, pelaksanaan, dan laporan prestasi.',
    'Apakah itu penyelaras kempen iklan?',
    'Apakah itu pembuat roti?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
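
Because the model was trained with MatryoshkaLoss at dimensions 768, 512, 256, 128 and 64, the embeddings can also be truncated to a smaller size with only a modest drop in retrieval quality (see the Evaluation section below). A minimal sketch using the truncate_dim argument of SentenceTransformer:

from sentence_transformers import SentenceTransformer

# Load the model so that encode() returns 256-dimensional embeddings
model = SentenceTransformer(
    "IlhamEbdesk/bge-base-financial-matryoshka_test_my",
    truncate_dim=256,
)
embeddings = model.encode([
    "Apakah itu tukang kayu?",
    "Tukang kayu adalah individu yang bekerja dengan kayu untuk membina atau membaiki struktur dan perabot.",
])
print(embeddings.shape)
# (2, 256)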

Evaluation

Metrics

Information Retrieval (dim_768)

Metric Value
cosine_accuracy@1 0.8226
cosine_accuracy@3 0.9769
cosine_accuracy@5 0.9871
cosine_accuracy@10 0.9974
cosine_precision@1 0.8226
cosine_precision@3 0.3256
cosine_precision@5 0.1974
cosine_precision@10 0.0997
cosine_recall@1 0.8226
cosine_recall@3 0.9769
cosine_recall@5 0.9871
cosine_recall@10 0.9974
cosine_ndcg@10 0.9255
cosine_mrr@10 0.901
cosine_map@100 0.9011

Information Retrieval (dim_512)

Metric Value
cosine_accuracy@1 0.8046
cosine_accuracy@3 0.9743
cosine_accuracy@5 0.9871
cosine_accuracy@10 0.9923
cosine_precision@1 0.8046
cosine_precision@3 0.3248
cosine_precision@5 0.1974
cosine_precision@10 0.0992
cosine_recall@1 0.8046
cosine_recall@3 0.9743
cosine_recall@5 0.9871
cosine_recall@10 0.9923
cosine_ndcg@10 0.9159
cosine_mrr@10 0.8896
cosine_map@100 0.89

Information Retrieval (dim_256)

Metric Value
cosine_accuracy@1 0.7892
cosine_accuracy@3 0.9666
cosine_accuracy@5 0.9743
cosine_accuracy@10 0.9871
cosine_precision@1 0.7892
cosine_precision@3 0.3222
cosine_precision@5 0.1949
cosine_precision@10 0.0987
cosine_recall@1 0.7892
cosine_recall@3 0.9666
cosine_recall@5 0.9743
cosine_recall@10 0.9871
cosine_ndcg@10 0.9046
cosine_mrr@10 0.8764
cosine_map@100 0.8771

Information Retrieval (dim_128)

Metric Value
cosine_accuracy@1 0.7481
cosine_accuracy@3 0.9409
cosine_accuracy@5 0.9537
cosine_accuracy@10 0.9692
cosine_precision@1 0.7481
cosine_precision@3 0.3136
cosine_precision@5 0.1907
cosine_precision@10 0.0969
cosine_recall@1 0.7481
cosine_recall@3 0.9409
cosine_recall@5 0.9537
cosine_recall@10 0.9692
cosine_ndcg@10 0.8765
cosine_mrr@10 0.845
cosine_map@100 0.8461

Information Retrieval (dim_64)

Metric Value
cosine_accuracy@1 0.7224
cosine_accuracy@3 0.8972
cosine_accuracy@5 0.9254
cosine_accuracy@10 0.9434
cosine_precision@1 0.7224
cosine_precision@3 0.2991
cosine_precision@5 0.1851
cosine_precision@10 0.0943
cosine_recall@1 0.7224
cosine_recall@3 0.8972
cosine_recall@5 0.9254
cosine_recall@10 0.9434
cosine_ndcg@10 0.8455
cosine_mrr@10 0.8127
cosine_map@100 0.8146
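
Each table above corresponds to one Matryoshka dimension, evaluated as an information-retrieval task. As a hedged sketch, such numbers can be reproduced with InformationRetrievalEvaluator, assuming queries, corpus and relevant_docs dictionaries built from the anchor and positive columns of the evaluation split (the toy data below is illustrative only):

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("IlhamEbdesk/bge-base-financial-matryoshka_test_my")

# Illustrative evaluation data: query ids -> text, document ids -> text,
# and the set of relevant document ids for each query.
queries = {"q1": "Apa itu dokter?"}
corpus = {"d1": "Dokter adalah profesional medis yang mendiagnosis dan merawat penyakit serta cedera pasien."}
relevant_docs = {"q1": {"d1"}}

# One evaluator per Matryoshka dimension; truncate_dim controls the embedding size used.
for dim in [768, 512, 256, 128, 64]:
    evaluator = InformationRetrievalEvaluator(
        queries=queries,
        corpus=corpus,
        relevant_docs=relevant_docs,
        name=f"dim_{dim}",
        truncate_dim=dim,
    )
    print(dim, evaluator(model))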

Training Details

Training Dataset

Unnamed Dataset

  • Size: 389 training samples
  • Columns: positive and anchor
  • Approximate statistics based on the first 1000 samples:
    • positive (string): min: 27 tokens, mean: 61.59 tokens, max: 139 tokens
    • anchor (string): min: 8 tokens, mean: 15.26 tokens, max: 24 tokens
  • Samples:
    • positive: Dokter adalah profesional medis yang mendiagnosis dan merawat penyakit serta cedera pasien.
      anchor: Apa itu dokter?
    • positive: Pereka sistem akuakultur adalah individu yang merancang dan membangunkan sistem untuk membiakkan ikan secara berkesan, termasuk reka bentuk kolam, sistem aliran air, dan pemantauan kualiti air.
      anchor: Apakah itu pereka sistem akuakultur?
    • positive: Ahli sejarah seni adalah individu yang mengkaji perkembangan seni sepanjang sejarah dan konteks sosial, politik, dan budaya yang mempengaruhi penciptaannya. Mereka bekerja di muzium, galeri, dan institusi akademik, menganalisis karya seni
      anchor: Apakah itu ahli sejarah seni?
  • Loss: MatryoshkaLoss with these parameters (a construction sketch follows the parameter dump):
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
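
The configuration above can be constructed as follows; this is a minimal sketch of the loss setup, not the full training script:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# Inner loss: in-batch negatives over (positive, anchor) pairs
inner_loss = MultipleNegativesRankingLoss(model)

# Apply the same objective at every Matryoshka dimension, with equal weights
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
    n_dims_per_step=-1,
)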
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 16
  • learning_rate: 2e-05
  • num_train_epochs: 4
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • tf32: False
  • load_best_model_at_end: True
  • batch_sampler: no_duplicates
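
As a hedged sketch, these non-default values map onto SentenceTransformerTrainingArguments roughly as follows (output_dir is a placeholder, and save_strategy="epoch" is an assumption needed for load_best_model_at_end with per-epoch evaluation):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-financial-matryoshka_test_my",  # placeholder
    num_train_epochs=4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    tf32=False,
    eval_strategy="epoch",
    save_strategy="epoch",  # assumption, see lead-in
    load_best_model_at_end=True,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)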

All Hyperparameters

Click to expand
  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 32
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 4
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: False
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch   Step  dim_128_cosine_map@100  dim_256_cosine_map@100  dim_512_cosine_map@100  dim_64_cosine_map@100  dim_768_cosine_map@100
1.0     1     0.6375                  0.7065                  0.7339                  0.5984                 0.7483
2.0     3     0.8282                  0.8712                  0.8821                  0.7994                 0.8929
2.4615  4     0.8461                  0.8771                  0.8900                  0.8146                 0.9011
  • The final row (epoch 2.4615, step 4) denotes the saved checkpoint; its map@100 values match the evaluation metrics reported above.

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.1.2+cu121
  • Accelerate: 0.32.1
  • Datasets: 2.19.1
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning}, 
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}