SentenceTransformer based on Snowflake/snowflake-arctic-embed-l

This is a sentence-transformers model finetuned from Snowflake/snowflake-arctic-embed-l. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: Snowflake/snowflake-arctic-embed-l
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
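
For reference, the three modules above amount to CLS-token pooling followed by L2 normalization. Below is a minimal sketch of the equivalent computation using the Hugging Face transformers library directly; the example texts are made up, and it assumes the underlying BertModel weights load with AutoModel.

import torch
from transformers import AutoModel, AutoTokenizer

model_id = "tabesink92/mg_alloy-snowflake-arctic-embed-l-ft-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
bert = AutoModel.from_pretrained(model_id)

texts = ["SDAS stands for Secondary Dendrite Arm Spacing.", "UTS stands for Ultimate Tensile Strength."]
batch = tokenizer(texts, padding=True, truncation=True, max_length=512, return_tensors="pt")

with torch.no_grad():
    token_embeddings = bert(**batch).last_hidden_state  # (batch, seq_len, 1024)

# Pooling: take the [CLS] token (pooling_mode_cls_token=True)
cls_embeddings = token_embeddings[:, 0]
# Normalize: L2-normalize each embedding (the Normalize() module)
embeddings = torch.nn.functional.normalize(cls_embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([2, 1024])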

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("tabesink92/mg_alloy-snowflake-arctic-embed-l-ft-v2")
# Run inference
sentences = [
    "content='1. What does MSE stand for in the context provided?\\n2. Which technique is abbreviated as OES?' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 24, 'prompt_tokens': 196, 'total_tokens': 220, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-mini-2024-07-18', 'system_fingerprint': 'fp_709714d124', 'finish_reason': 'stop', 'logprobs': None} id='run-798f8f65-4b52-4198-a267-dc6f61a192d0-0' usage_metadata={'input_tokens': 196, 'output_tokens': 24, 'total_tokens': 220, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}}",
    '# MSE – Mean Squared Error\n\n# NRMSE – Normalized Root Mean Squared Error\n\n# OES – Optical Emission Spectrometry\n\n# PSN – Particle Stimulated Nucleation\n\n# ReLU – Rectified linear unit\n\n# RT – Room Temperature\n\n# SDAS – Secondary Dendrite Arm Spacing\n\n# SEM – Scanning Electron Microscopy\n\n# SGD – Stochastic Gradient Descent\n\n# SVM – Support Vector Machine\n\n# TD – Transverse Direction\n\n# UTS – Ultimate Tensile Strength\n\n# VAE – Variational Autoencoder\n\n# XCT – X-ray Computed Tomography\n\n# YS – Yield Strength\n\n# xxi',
    'analytical or even empirical model that could relate a specific process parameter to a selected property. Also, compared to steel or aluminum alloys, there are still not enough experimental results that guide researchers and manufacturers in selecting proper processing routes and finding the optimum combination of process parameters.\n\nMachine learning techniques, as computational tools that use data gathered from experience to improve performance, have shown to be successful in establishing such relationships between interacting variables. Machine learning methods can be used to connect sparse findings on production of magnesium alloys to provide some insight into missing pieces of information. Such a link between different parameters not only help researchers to predict desired properties or select optimum parameters, but also can be used in theoretical studies of controlling mechanisms, which can finally give rise to more robust analytical or empirical models.\n\nIn the current study, the feasibility of using machine learning approaches in establishing process-microstructure-property relationship would be put to the test. For this purpose, cast-forging process of AZ80 magnesium alloy, as a cost-effective hybrid manufacturing method, is going to be studied through advanced characterization methods. The effect of different process parameters of casting, forging, and intermediate thermo-mechanical processes and microstructural and mechanical properties of material in various stages of manufacturing are parameters of interest which will also be linked through machine learning models.\n\n# 1.2 Challenges and Opportunities\n\nTo establish a link between process, microstructure, and property, a complete study of all the controlling parameters is required. Previous studies on the viability of production of magnesium-based structural components cover a variety of parameters and properties [3,4,7,8], but since these studies have focused on different aspects of production, a complete experimental study on the effect of parameters of casting and forging processes for AZ80 magnesium alloy is selected as a basis for the establishment of this relationship. It also contributes to broadening the knowledge of cast-forging manufacturing of magnesium alloys, where thorough experimental work is unavailable.\n\nSuch an experimental study allows us to measure all properties in a controlled, consistent approach, while it also allows us to incorporate different processing parameters and, if necessary, append new variables based on the performance of machine learning models.\n\nAs the experimental part progresses through different processing steps like wedge casting, cylinder casting, homogenization, I-beam forging, optimized preform casting, and forging the final structural components, data-driven models are developed to link process parameters to microstructural features.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
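
The same model can be used for simple retrieval over a small corpus. The snippet below is illustrative only; the corpus strings and query are made up and reuse the model loaded above.

# Illustrative retrieval over a tiny in-memory corpus (reuses `model` from above)
corpus = [
    "SDAS – Secondary Dendrite Arm Spacing",
    "UTS – Ultimate Tensile Strength",
    "SGD – Stochastic Gradient Descent",
]
query = "What does SDAS stand for?"

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# Cosine similarities between the query and every corpus entry
scores = model.similarity(query_embedding, corpus_embeddings)  # shape [1, 3]
best = scores.argmax().item()
print(corpus[best])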

Evaluation

Metrics

Information Retrieval

Metric Value
cosine_accuracy@1 0.9949
cosine_accuracy@3 1.0
cosine_accuracy@5 1.0
cosine_accuracy@10 1.0
cosine_precision@1 0.9949
cosine_precision@3 0.3333
cosine_precision@5 0.2
cosine_precision@10 0.1
cosine_recall@1 0.9949
cosine_recall@3 1.0
cosine_recall@5 1.0
cosine_recall@10 1.0
cosine_ndcg@10 0.9981
cosine_mrr@10 0.9974
cosine_map@100 0.9974
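
Metrics of this form are typically computed with the InformationRetrievalEvaluator from sentence-transformers. A minimal sketch is shown below; the queries, corpus, and relevance judgments are placeholders, not the actual evaluation set.

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("tabesink92/mg_alloy-snowflake-arctic-embed-l-ft-v2")

# Placeholder evaluation data: query id -> text, doc id -> text, query id -> relevant doc ids
queries = {"q1": "What does SDAS stand for?"}
corpus = {
    "d1": "SDAS – Secondary Dendrite Arm Spacing",
    "d2": "UTS – Ultimate Tensile Strength",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="example-ir")
results = evaluator(model)
print(results)  # includes cosine_accuracy@k, cosine_ndcg@10, cosine_mrr@10, cosine_map@100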

Training Details

Training Dataset

Unnamed Dataset

  • Size: 892 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 892 samples:
    • sentence_0: string; min: 340 tokens, mean: 371.01 tokens, max: 410 tokens
    • sentence_1: string; min: 29 tokens, mean: 439.26 tokens, max: 512 tokens
  • Samples:
    sentence_0 sentence_1
    content='1. What experimental method was used to investigate the fatigue properties of extruded AZ80 magnesium alloys in the study? \n2. How does the crack initiation mechanism change with varying stress amplitudes in the AZ80 magnesium alloys?' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 46, 'prompt_tokens': 731, 'total_tokens': 777, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-mini-2024-07-18', 'system_fingerprint': 'fp_7fcd609668', 'finish_reason': 'stop', 'logprobs': None} id='run-d649e508-ccc4-420e-b812-67719c92f2cb-0' usage_metadata={'input_tokens': 731, 'output_tokens': 46, 'total_tokens': 777, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}} # Fatigue behaviour and fractography of extruded AZ80 magnesium alloys in very high cycle regime

    # Kazuaki Shiozawaa *, Tomoki Kashiwagi b, Tutomu Murai c, Tooru Takahashi c

    b Tohoku Electric Power Co.Inc., Sendai980-8550, Japan

    c Sankyo-Tateyama Aluminum Industry Co. Ltd., Imizu934-8577, Japan

    Received 26 February 2010; revised 11 March 2010; accepted 15 March 2010

    # Abstract

    In order to investigate the fatigue properties of extruded magnesium alloy in very high-cycle regime, rotary bending fatigue test was performed in ambient atmosphere at room temperature using the hourglass shaped specimens of AZ80 alloys extruded (F-specimen) and treated by an artificial aging after extrusion (T5-specimen). From the experimental results, both specimens show a clear step-wise S-N curve on which two knees exists. Specific stress amplitude formed the knee corresponded to the 0.2% offset proof stress of 160MPa in compression. From the detailed observation of fracture surface, small facet-like s...
    content='1. What experimental method was used to investigate the fatigue properties of extruded AZ80 magnesium alloys in the study? \n2. How does the crack initiation mechanism change with varying stress amplitudes in the AZ80 magnesium alloys?' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 46, 'prompt_tokens': 731, 'total_tokens': 777, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-mini-2024-07-18', 'system_fingerprint': 'fp_7fcd609668', 'finish_reason': 'stop', 'logprobs': None} id='run-5e3c339c-30e6-4dc7-bacb-24b9143312ad-0' usage_metadata={'input_tokens': 731, 'output_tokens': 46, 'total_tokens': 777, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}} # Fatigue behaviour and fractography of extruded AZ80 magnesium alloys in very high cycle regime

    # Kazuaki Shiozawaa *, Tomoki Kashiwagi b, Tutomu Murai c, Tooru Takahashi c

    b Tohoku Electric Power Co.Inc., Sendai980-8550, Japan

    c Sankyo-Tateyama Aluminum Industry Co. Ltd., Imizu934-8577, Japan

    Received 26 February 2010; revised 11 March 2010; accepted 15 March 2010

    # Abstract

    In order to investigate the fatigue properties of extruded magnesium alloy in very high-cycle regime, rotary bending fatigue test was performed in ambient atmosphere at room temperature using the hourglass shaped specimens of AZ80 alloys extruded (F-specimen) and treated by an artificial aging after extrusion (T5-specimen). From the experimental results, both specimens show a clear step-wise S-N curve on which two knees exists. Specific stress amplitude formed the knee corresponded to the 0.2% offset proof stress of 160MPa in compression. From the detailed observation of fracture surface, small facet-like s...
    content='1. What is the chemical composition of the AZ80 magnesium alloy used in the study?\n2. How was the T5 treatment applied to the specimens, and what was its purpose in the experiment?' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 41, 'prompt_tokens': 617, 'total_tokens': 658, 'completion_tokens_details': {'accepted_prediction_tokens': 0, 'audio_tokens': 0, 'reasoning_tokens': 0, 'rejected_prediction_tokens': 0}, 'prompt_tokens_details': {'audio_tokens': 0, 'cached_tokens': 0}}, 'model_name': 'gpt-4o-mini-2024-07-18', 'system_fingerprint': 'fp_7fcd609668', 'finish_reason': 'stop', 'logprobs': None} id='run-1ea8b034-255f-4994-b459-394267605838-0' usage_metadata={'input_tokens': 617, 'output_tokens': 41, 'total_tokens': 658, 'input_token_details': {'audio': 0, 'cache_read': 0}, 'output_token_details': {'audio': 0, 'reasoning': 0}} # 2. Experimental Procedures

    # 2.1. Testing materials and specimen

    The material used in this study was commercial Mg-Al-Zn magnesium alloy, AZ80. The chemical composition of these materials (in mass percentage) is 8.24Al, 0.67Zn, 0.20Mn, 0.005Fe, 0.012Si, 0.0008Cu, 0.0007Ni and balanced Mg. The bar with 16mm in diameter was extruded from a billet of 160mm diameter (an extrusion ratio of 99.4:1) under the extrusion ram speed of 1.5m/min at temperature of 623K. Hour-glass shaped specimens with a grip diameter of 10 mm and minimum diameter of 5 mm (Fig. 1) was machined from the extruded bar with the loading axis parallel to their extrusion directions. The elastic stress concentration factor, Kt, of these specimens was 1.065. On the other hand, to investigate the effect of an aging treatment on fatigue behavior, specimens were prepared from the extruded bar which was heated at 473K for 32h and then air-cooled (T5 treatment). From now on, the specimen extruded and no heat-treated is refer...
  • Loss: MatryoshkaLoss with these parameters:
    {
        "loss": "MultipleNegativesRankingLoss",
        "matryoshka_dims": [
            768,
            512,
            256,
            128,
            64
        ],
        "matryoshka_weights": [
            1,
            1,
            1,
            1,
            1
        ],
        "n_dims_per_step": -1
    }
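
A minimal sketch of how this loss configuration could be set up in sentence-transformers is shown below, assuming the base model and the dimensions listed above (nothing here is taken from the actual training script). Because the model was trained with MatryoshkaLoss, embeddings can also be truncated to one of the listed dimensions at load time via truncate_dim.

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("Snowflake/snowflake-arctic-embed-l")
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],  # weights default to 1 for each dim
)

# At inference time, embeddings can be truncated to one of the trained dimensions:
small_model = SentenceTransformer(
    "tabesink92/mg_alloy-snowflake-arctic-embed-l-ft-v2", truncate_dim=256
)
print(small_model.encode(["example sentence"]).shape)  # (1, 256)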
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 25
  • multi_dataset_batch_sampler: round_robin
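
These non-default values map onto SentenceTransformerTrainingArguments roughly as in the sketch below; the output_dir is a made-up placeholder and the remaining arguments keep their defaults.

from sentence_transformers.training_args import (
    SentenceTransformerTrainingArguments,
    MultiDatasetBatchSamplers,
)

args = SentenceTransformerTrainingArguments(
    output_dir="models/mg_alloy-arctic-embed-l-ft",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=25,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)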

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 25
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Epoch Step Training Loss cosine_ndcg@10
0.8929 50 - 0.9525
1.0 56 - 0.9638
1.7857 100 - 0.9801
2.0 112 - 0.9743
2.6786 150 - 0.9849
3.0 168 - 0.9812
3.5714 200 - 0.9868
4.0 224 - 0.9962
4.4643 250 - 0.9925
5.0 280 - 0.9944
5.3571 300 - 0.9944
6.0 336 - 0.9944
6.25 350 - 0.9944
7.0 392 - 0.9981
7.1429 400 - 0.9962
8.0 448 - 0.9981
8.0357 450 - 0.9981
8.9286 500 0.3646 0.9981
9.0 504 - 0.9981
9.8214 550 - 0.9981
10.0 560 - 0.9981
10.7143 600 - 0.9981
11.0 616 - 0.9981
11.6071 650 - 0.9981
12.0 672 - 0.9944
12.5 700 - 0.9981
13.0 728 - 0.9981
13.3929 750 - 0.9981
14.0 784 - 0.9981
14.2857 800 - 0.9981
15.0 840 - 0.9981
15.1786 850 - 0.9981
16.0 896 - 0.9981
16.0714 900 - 0.9981
16.9643 950 - 0.9981
17.0 952 - 0.9981
17.8571 1000 0.064 0.9981
18.0 1008 - 0.9981
18.75 1050 - 0.9981
19.0 1064 - 0.9981
19.6429 1100 - 0.9981
20.0 1120 - 0.9981
20.5357 1150 - 0.9981
21.0 1176 - 0.9981
21.4286 1200 - 0.9981
22.0 1232 - 0.9981
22.3214 1250 - 0.9981
23.0 1288 - 0.9981
23.2143 1300 - 0.9981
24.0 1344 - 0.9981
24.1071 1350 - 0.9981
25.0 1400 - 0.9981

Framework Versions

  • Python: 3.11.11
  • Sentence Transformers: 3.4.1
  • Transformers: 4.48.3
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.3.0
  • Datasets: 3.3.2
  • Tokenizers: 0.21.0

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning},
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}