---
base_model: microsoft/deberta-v3-small
datasets:
- sentence-transformers/all-nli
- sentence-transformers/stsb
- tals/vitaminc
- nyu-mll/glue
- allenai/scitail
- sentence-transformers/xsum
- sentence-transformers/sentence-compression
- allenai/sciq
- allenai/qasc
- sentence-transformers/msmarco-msmarco-distilbert-base-v3
- sentence-transformers/natural-questions
- sentence-transformers/trivia-qa
- sentence-transformers/quora-duplicates
- sentence-transformers/gooaq
language:
- en
library_name: sentence-transformers
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:267363
- loss:AdaptiveLayerLoss
- loss:CoSENTLoss
- loss:GISTEmbedLoss
- loss:OnlineContrastiveLoss
- loss:MultipleNegativesSymmetricRankingLoss
- loss:MultipleNegativesRankingLoss
widget:
- source_sentence: The term technology describes the application of knowledge to real-world problems and is practiced by engineers.
  sentences:
  - What term describes the application of knowledge to real-world problems and is practiced by engineers?
  - What bodily function do the triceps help perform?
  - Exposure to what can increase the amount of pigment in the skin and make it appear darker?
- source_sentence: Most of food's chemical energy ultimately comes from sunlight.
  sentences:
  - Catecholamines are a class of amine hormones synthesised form which amino acid?
  - Most of food's chemical energy ultimately comes from what source?
  - How many types of bosons are there?
- source_sentence: Someone is shredding cabbage leaves with a knife.
  sentences:
  - The man is erasing the chalk board.
  - Kittens are eating from dishes.
  - Someone is chopping some cabbage leaves.
- source_sentence: Three men are dancing.
  sentences:
  - Women are dancing.
  - The woman is pouring oil into the pan.
  - A man is dancing.
- source_sentence: What percentage of Warsaw's population was Protestant in 1901?
  sentences:
  - (The name of the theater refers to a well-known landmark water tower, which is actually in another nearby area).
  - This is primarily accomplished through normal faulting and through the ductile stretching and thinning.
  - After the war, the new communist authorities of Poland discouraged church construction and only a small number were rebuilt.
model-index:
- name: SentenceTransformer based on microsoft/deberta-v3-small
  results:
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts test
      type: sts-test
    metrics:
    - type: pearson_cosine
      value: 0.7002978854888552
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.6718756239728468
      name: Spearman Cosine
    - type: pearson_manhattan
      value: 0.7050517306003169
      name: Pearson Manhattan
    - type: spearman_manhattan
      value: 0.6824201536078427
      name: Spearman Manhattan
    - type: pearson_euclidean
      value: 0.6963744527231541
      name: Pearson Euclidean
    - type: spearman_euclidean
      value: 0.6742379556154348
      name: Spearman Euclidean
    - type: pearson_dot
      value: 0.5685392445320393
      name: Pearson Dot
    - type: spearman_dot
      value: 0.5416448961602434
      name: Spearman Dot
    - type: pearson_max
      value: 0.7050517306003169
      name: Pearson Max
    - type: spearman_max
      value: 0.6824201536078427
      name: Spearman Max
---

# SentenceTransformer based on microsoft/deberta-v3-small

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on the [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli), [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb), [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc), [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue), [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail), [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail), [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum), [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression), [compression-pairs2](https://huggingface.co/datasets/sentence-transformers/sentence-compression), [compression-pairs3](https://huggingface.co/datasets/sentence-transformers/sentence-compression), [sciq_pairs](https://huggingface.co/datasets/allenai/sciq), [qasc_pairs](https://huggingface.co/datasets/allenai/qasc), openbookqa_pairs, [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3), [msmarco_pairs2](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3), [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions), [nq_pairs2](https://huggingface.co/datasets/sentence-transformers/natural-questions), [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa), [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates), [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq), [gooaq_pairs2](https://huggingface.co/datasets/sentence-transformers/gooaq) and [mrpc_pairs](https://huggingface.co/datasets/nyu-mll/glue) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
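As a quick illustration of the semantic-search use case, the sketch below encodes a query and a few candidate sentences with this model and ranks the candidates by cosine similarity. It is a minimal sketch, not part of the original training or evaluation setup: it reuses the checkpoint name from the Usage section further down, the query and corpus strings are made-up examples, and `model.similarity` assumes a reasonably recent Sentence Transformers release (the same method used in the Usage example below).

```python
from sentence_transformers import SentenceTransformer

# Load the same checkpoint shown in the Usage section below.
model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp")

# Illustrative query and corpus (not taken from the training data).
query = "How do vaccines protect against disease?"
corpus = [
    "Vaccines expose the immune system to a harmless form of a pathogen.",
    "The stock market closed higher on Friday.",
    "Antibodies produced after immunization provide protection against infection.",
]

# Encode query and corpus into 768-dimensional embeddings.
query_emb = model.encode([query])      # shape: (1, 768)
corpus_emb = model.encode(corpus)      # shape: (3, 768)

# Cosine similarity between the query and each corpus sentence, then rank.
scores = model.similarity(query_emb, corpus_emb)[0]
for sentence, score in sorted(zip(corpus, scores.tolist()), key=lambda x: x[1], reverse=True):
    print(f"{score:.3f}  {sentence}")
```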
## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Datasets:**
    - [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli)
    - [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb)
    - [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc)
    - [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue)
    - [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail)
    - [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail)
    - [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum)
    - [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression)
    - [compression-pairs2](https://huggingface.co/datasets/sentence-transformers/sentence-compression)
    - [compression-pairs3](https://huggingface.co/datasets/sentence-transformers/sentence-compression)
    - [sciq_pairs](https://huggingface.co/datasets/allenai/sciq)
    - [qasc_pairs](https://huggingface.co/datasets/allenai/qasc)
    - openbookqa_pairs
    - [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3)
    - [msmarco_pairs2](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3)
    - [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions)
    - [nq_pairs2](https://huggingface.co/datasets/sentence-transformers/natural-questions)
    - [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa)
    - [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates)
    - [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq)
    - [gooaq_pairs2](https://huggingface.co/datasets/sentence-transformers/gooaq)
    - [mrpc_pairs](https://huggingface.co/datasets/nyu-mll/glue)
- **Language:** en

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp")
# Run inference
sentences = [
    "What percentage of Warsaw's population was Protestant in 1901?",
    'After the war, the new communist authorities of Poland discouraged church construction and only a small number were rebuilt.',
    '(The name of the theater refers to a well-known landmark water tower, which is actually in another nearby area).',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

## Evaluation

### Metrics

#### Semantic Similarity

* Dataset: `sts-test`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| pearson_cosine      | 0.7003     |
| **spearman_cosine** | **0.6719** |
| pearson_manhattan   | 0.7051     |
| spearman_manhattan  | 0.6824     |
| pearson_euclidean   | 0.6964     |
| spearman_euclidean  | 0.6742     |
| pearson_dot         | 0.5685     |
| spearman_dot        | 0.5416     |
| pearson_max         | 0.7051     |
| spearman_max        | 0.6824     |

## Training Details

### Training Datasets

#### nli-pairs

* Dataset: [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab)
* Size: 25,000 training samples
* Columns: sentence1 and sentence2
* Approximate statistics based on the first 1000 samples:
  |         | sentence1 | sentence2 |
  |:--------|:----------|:----------|
  | type    | string    | string    |
  | details |           |           |
* Samples:
  | sentence1                                                      | sentence2                           |
  |:---------------------------------------------------------------|:------------------------------------|
  | A person on a horse jumps over a broken down airplane.         | A person is outdoors, on a horse.   |
  | Children smiling and waving at camera                          | There are children present          |
  | A boy is jumping on skateboard in the middle of a red bridge.  | The boy does a skateboarding trick.
| * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### sts-label * Dataset: [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb) at [ab7a5ac](https://huggingface.co/datasets/sentence-transformers/stsb/tree/ab7a5ac0e35aa22088bdcf23e7fd99b220e53308) * Size: 5,749 training samples * Columns: sentence1, sentence2, and score * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | score | |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------| | type | string | string | float | | details | | | | * Samples: | sentence1 | sentence2 | score | |:-----------------------------------------------------------|:----------------------------------------------------------------------|:------------------| | A plane is taking off. | An air plane is taking off. | 1.0 | | A man is playing a large flute. | A man is playing a flute. | 0.76 | | A man is spreading shreded cheese on a pizza. | A man is spreading shredded cheese on an uncooked pizza. | 0.76 | * Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "pairwise_cos_sim" } ``` #### vitaminc-pairs * Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0) * Size: 25,000 training samples * Columns: claim and evidence * Approximate statistics based on the first 1000 samples: | | claim | evidence | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | claim | evidence | |:-------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | Manchester had a population of more than 540,000 in 2017 and was the 5th most populous English district . | Manchester ( ) is a major city and metropolitan borough in Greater Manchester , England , with a population of 545,500 as of 2017 ( 5th most populous English district ) . | | Manchester had a population of less than 540,000 in 2018 and was the 4th most populous English district . | Manchester ( ) is a major city and metropolitan borough in Greater Manchester , England , with a population of 534,982 as of 2018 ( 4th most populous English district ) . 
| | Traditional Chinese medicine is founded on more than 4000 years of ancient Chinese medical science and practice . | Traditional Chinese medicine ( TCM ; ) is an ancient system of medical diagnosis and treatment of illnesses with a holistic focus on disease prevention through diet , healthy lifestyle changes , exercise and is built on a patient centered clinically oriented foundation of more than 6,500 years of ancient Chinese medical science and practice that includes various forms of herbal medicine , acupuncture , massage ( tui na ) , exercise ( qigong ) , and dietary therapy , but recently also influenced by modern Western medicine . | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### qnli-contrastive * Dataset: [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) * Size: 22,500 training samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | label | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------| | type | string | string | int | | details | | | | * Samples: | sentence1 | sentence2 | label | |:---------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| | What is HMG-CoA responsible for producing? | In 1971, Akira Endo, a Japanese biochemist working for the pharmaceutical company Sankyo, identified mevastatin (ML-236B), a molecule produced by the fungus Penicillium citrinum, as an inhibitor of HMG-CoA reductase, a critical enzyme used by the body to produce cholesterol. | 0 | | What seperates Tajikistan and Pakistan? | Pakistan lies to the south, separated by the narrow Wakhan Corridor. | 0 | | Where was the Agucadoura Wave Farm located? | Since the turn of the 21st century, there has been a trend towards the development of a renewable resource industry and reduction of both consumption and use of fossil fuel resources. 
| 0 | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "OnlineContrastiveLoss", "n_layers_per_step": -1, "last_layer_weight": 0.75, "prior_layers_weight": 1, "kl_div_weight": 0.9, "kl_temperature": 0.75 } ``` #### scitail-pairs-qa * Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44) * Size: 14,537 training samples * Columns: sentence2 and sentence1 * Approximate statistics based on the first 1000 samples: | | sentence2 | sentence1 | |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence2 | sentence1 | |:------------------------------------------------------------------------|:-----------------------------------------------------------------------| | Lithium has three electrons. | How many electrons does lithium have? | | Mammals may either be herbivores, omnivores or herbivores. | Mammals may either be herbivores, omnivores or what else? | | The fetal period lasts approximately 30 weeks weeks. | Approximately how many weeks does the fetal period last? | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### scitail-pairs-pos * Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44) * Size: 8,600 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:-------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------| | Ionic bonds are formed when atoms become ions by gaining or losing electrons. | When an atom gains or loses an electron it becames a(n) ion. | | For example, all atoms of carbon contain six protons, all atoms of oxygen contain eight protons. | All carbon atoms have six protons. | | form of energy : Forms of energy include heat, light, electrical, mechanical, nuclear, sound and chemical. | Heat, light, and sound are all different forms of energy. 
| * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### xsum-pairs * Dataset: [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum) at [788ddaf](https://huggingface.co/datasets/sentence-transformers/xsum/tree/788ddafe04e539956d56b567bc32a036ee7b9206) * Size: 10,000 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:-------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------| | The boat lost contact after departing the eastern state of Sabah on Saturday.
The Malaysian Maritime Enforcement Agency said search and rescue efforts were being hampered by bad weather.
The incident coincides with the first day of China's week-long Lunar New Year celebration, which is also marked by ethnic Chinese in Malaysia.
The catamaran boat left Kota Kinabalu on Saturday at 09:00 local time (01:00 GMT) and was heading towards Pulau Mengalum, a popular tourist island 60km (37 miles) west of the city.
The Malaysian Maritime Enforcement Agency said it received a distress call from the boat but contact was lost soon after.
"I, like all the relatives of those on board, am hoping for progress in the search and rescue operation," the tourism minister for Sabah state, Masidi Manun, told the AFP news agency.
The search area covers 400 nautical square miles between Kota Kinabalu and Pulau Mengalum, according to the New Strait Times.
Storms are common in the area at this time of year.
Three crew members were on board the vessel, alongside the 31 passengers.
| A boat carrying 31 people, including at least 28 Chinese tourists, has gone missing off the Malaysian coast, maritime authorities say. | | The midfielder, 24, tested positive for Benzoylecgonine, a metabolite of cocaine, after the club's match against Hartlepool United on 22 November.
He admitted breaking anti-doping rules.
The suspension, which followed an Independent Regulatory Commission Hearing, is effective from 15 December 2016 to 14 February 2018.
Lacey, who joined Accrington in July 2016, made 17 appearances in 2016-17 but has not played since 10 December having been immediately suspended by the Lancashire club pending the outcome of the disciplinary action.
"Accrington Stanley has strong values on anyone taking any prohibited substances and will always act in the strongest possible way to protect the integrity of the football club," read a club statement.
| League Two club Accrington Stanley have terminated the contract of Paddy Lacey after he was given a 14-month drugs ban by the Football Association. | | They declined to give the reason, saying it was confidential.
Mr Assange, an Australian national, had hoped to create a base for Wikileaks in the Nordic country due to its laws protecting whistle-blowers.
The rejection comes ahead of the expected publication of some 400,000 Iraq war documents on Wikileaks.
The US military has already assembled a 120-member team to prepare for the publication of the documents which are thought to concern battle activity, Iraqi security forces and civilian casualties.
Wikileaks' release in July of thousands of documents on the war in Afghanistan prompted US military officials to warn that the whistleblower website might cause the deaths of US soldiers and Afghan civilians because some of the documents contained the names of locals who had helped coalition forces.
"We have decided not to grant him (Mr Assange) a residence permit," Sweden's Migration Board official Gunilla Wikstroem told the AFP news agency.
"He did not fulfil the requirements," she added without giving any further details.
Mr Assange applied for a residence permit on 18 August.
He is currently being investigated in Sweden over an alleged sex crime.
Mr Assange denies any wrongdoing and says the allegations are part of a smear campaign by opponents of his whistle-blowing website.
| The founder of the Wikileaks website, Julian Assange, has been denied residency in Sweden, the country's migration board officials say. | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "MultipleNegativesSymmetricRankingLoss", "n_layers_per_step": -1, "last_layer_weight": 0.75, "prior_layers_weight": 1, "kl_div_weight": 0.9, "kl_temperature": 0.75 } ``` #### compression-pairs * Dataset: [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90) * Size: 14,062 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:-----------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------| | A YOUNG Irish construction worker murdered in a suspected Mozambique car-jacking was beaten to death. | Irish worker was beaten to death | | Virgin Media has turned its first annual profit since its 2006 launch, the ISP said on Wednesday. | Virgin Media turns its first annual profit | | Videos and reports are surfacing on the Internet that show that two men were thrown out of a Santorum rally for standing up and kissing each other. | Two men thrown out of Santorum rally for kissing | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "MultipleNegativesSymmetricRankingLoss", "n_layers_per_step": -1, "last_layer_weight": 0.75, "prior_layers_weight": 1, "kl_div_weight": 0.9, "kl_temperature": 0.75 } ``` #### compression-pairs2 * Dataset: [compression-pairs2](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90) * Size: 6,631 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------| | Iran rules out pulling out of the Nuclear Non-Proliferation Treaty but insists it will not abandon the right to peaceful nuclear technology. 
| Iran rules out pulling out of nuclear Non-Proliferation Treaty | | On the heels of the Walsh Brothers' epic planking around the city, Jimmy Kimmel jumped on the planking bandwagon last night when he ``sent'' old Uncle Frank to plank around Los Angeles. | Jimmy Kimmel's Uncle Frank planks around Los Angeles | | Israel carried out seven airstrikes in Gaza overnight, killing at least one Palestinian and injuring 33, the official Palestinian Wafa news agency reported Monday. | Israeli airstrikes kill at least one Palestinian, | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": 2, "last_layer_weight": 0.25, "prior_layers_weight": 2.5, "kl_div_weight": 0.75, "kl_temperature": 0.75 } ``` #### compression-pairs3 * Dataset: [compression-pairs3](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90) * Size: 6,631 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------| | US President Barack Obama, joined by visiting German Chancellor Angela Merkel, touted a climate change bill being debated by Congress Friday as representing ``enormous progress'' but said more needed to be done. | Obama touts climate change bill | | Pakistan fast bowler Mohammad Asif on Thursday released from a British jail after completing half of his one-year sentence for his role in a spot-fixing scandal. | Pakistan cricketer Asif released from jail | | Mostly sunny weather is expected in the morning, partly cloudy after midday, rains and thunderstorm are expected in the western regions, shower is possible in Syunik and Artsakh in the evening. 
| Mostly sunny weather expected | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "MultipleNegativesRankingLoss", "n_layers_per_step": 4, "last_layer_weight": 0.1, "prior_layers_weight": 10, "kl_div_weight": 3, "kl_temperature": 0.25 } ``` #### sciq_pairs * Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815) * Size: 11,328 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:---------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:--------------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | What is the capacity to cause change? | | | The wavelength of a wave is the distance between corresponding points on what? | Q: The wavelength of a wave is the distance between corresponding points on adjacent waves. For example, it is the distance between two adjacent crests in the transverse waves in the diagram. Infer how wave frequency is related to wavelength. | | What are modified leaves that bear sporangia? | | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### qasc_pairs * Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070) * Size: 7,889 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:---------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | Deltas are formed by deposition of what at the mouth of a river by water fanning out? | a delta is formed by deposition of sediment at the mouth of a river by water fanning out. Sand is an example of a clastic sediment.. a delta is formed by deposition of sand at the mouth of a river by water fanning out | | What is important in preventing heat loss from the body? | Head hair is especially important in preventing heat loss from the body.. Hair is actually composed of a protein called keratin.. 
Keratin is especially important in preventing heat loss from the body. | | What can happen when a body of water receives more water then it can hold? | when a body of water receives more water than it can hold , a flood occurs. Flooding can wipe out an entire crop.. when a body of water receives more water then it can hold, it can destroy crops | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### openbookqa_pairs * Dataset: openbookqa_pairs * Size: 4,505 training samples * Columns: question and fact * Approximate statistics based on the first 1000 samples: | | question | fact | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | question | fact | |:-----------------------------------------------------------------------------|:--------------------------------------------------------------------------------------| | What is animal competition? | if two animals eat the same prey then those animals compete for that pey | | If you wanted to make a metal bed frame, where would you start? | alloys are made of two or more metals | | Places lacking warmth have few what | cold environments contain few organisms | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### msmarco_pairs * Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9) * Size: 13,337 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:----------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | what is the connection between the central nervous system and naegleria fowleri? | N. fowleri is the causative agent of primary amoebic meningoencephalitis (PAM), a rare but nearly always fatal disease of the central nervous system. Cases of PAM resulting from infection with Naegleria fowleri have been reported in over fifteen countries in Africa, Asia, Europe, and North and South America. | | cortana how many pounds in a ton | ›› Definition: Ton. The short ton is a unit of mass equal to 2000 lb (exactly 907.18474 kg). 
In the United States it is often called simply ton without distinguishing it from the metric ton (or tonne) and the long ton rather, the other two are specifically noted. | | does duloxetine cause anxiety? | The most common side effects of Cymbalta are nausea, dry mouth, constipation, diarrhea, fatigue, drowsiness, difficulty sleeping, loss of appetite, and dizziness. Some patients may experience withdrawal reactions such anxiety, nausea, nervousness, and insomnia. | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### msmarco_pairs2 * Dataset: [msmarco_pairs2](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9) * Size: 10,913 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:---------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | how long to book driving test | How long do I have to wait. The good news is that waiting lists are getting shorter. We aim to have a national average waiting time for a driving test of 10 weeks.Our ability to meet this target depends on the number of applications received. Accordingly, the average waiting time in driving test centres may vary above or below this 10-week target.ow long do I have to wait. The good news is that waiting lists are getting shorter. We aim to have a national average waiting time for a driving test of 10 weeks. | | how much does insurance pay a dietitian | 1 For patients not covered by health insurance, a one-hour initial consultation with a registered dietitian, or RD, typically costs about $100 to $200 -- usually on the higher end if the dietitian comes to your home. If follow up visits are required, they typically cost $50 to $150 each, depending on length of consultation and whether the dietitian comes to your home. | | is hydrogen the most flammable gas on earth | Properties: Common hydrogen has a molecular weight of 2,01594 g. As a gas it has a density of 0.071 g/l at 0ºC and 1 atm. Its relative density, compared with that of the air, is 0.0695. Hydrogen is the most flammable of all the known substances. Hydrogen is slightly more soluble in organic solvents than in water. Many metals absorb hydrogen. 
| * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": 2, "last_layer_weight": 0.25, "prior_layers_weight": 2.5, "kl_div_weight": 0.75, "kl_temperature": 0.75 } ``` #### nq_pairs * Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17) * Size: 18,187 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:-------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | when does lucas find out about peyton and julian | Julian Baker Julian Baker is first introduced as a director eager to make Lucas's first novel into a movie. He introduces himself as the person interested in making a film out of his book and tells him he wants to get a feel for the place where the book takes place. After a while Lucas agrees. Julian then says it would pay $300,000 grand and Lucas suddenly rethinks. Julian then asks about Peyton. After persuading Lucas to option of his movie, he ran into Peyton (leaving the restrooms at TRIC after he and Lucas sign the contract) and she realizes Julian is the one making the movie and it is soon revealed that he is her ex-boyfriend. Lucas finds out about Julian and Peyton, and punches Julian although they still carry on with the movie although with heated tension between the two men. | | when did the capitals last won the stanley cup | Washington Capitals The Capitals were founded in 1974 as an expansion franchise, alongside the Kansas City Scouts. Since purchasing the team in 1999, Leonsis revitalized the franchise by drafting star players such as Alexander Ovechkin, Nicklas Backstrom, Mike Green and Braden Holtby. The 2009–10 Capitals won the franchise's first-ever Presidents' Trophy for being the team with the most points at the end of the regular season. They won it a second time in 2015–16, and did so for a third time the following season in 2016–17. In addition to eleven division titles and three Presidents' Trophies, the Capitals have reached the Stanley Cup Finals twice (in 1998 and 2018), winning in 2018. 
| | where are the powers of congress listed in the constitution | Powers of the United States Congress Article I of the Constitution sets forth most of the powers of Congress, which include numerous explicit powers enumerated in Section 8. Constitutional amendments have granted Congress additional powers. Congress also has implied powers derived from the Necessary and Proper Clause of the Constitution. | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### nq_pairs2 * Dataset: [nq_pairs2](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17) * Size: 6,063 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:----------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | have messi and ronaldo ever played against each other | Messi–Ronaldo rivalry At club level, Messi and Ronaldo represented rivals FC Barcelona and Real Madrid C.F., the two players facing each other at least twice every season in the world's most popular regular-season club game, El Clásico (among the most viewed annual sporting events), until Ronaldo's transfer to Italian club Juventus F.C. in 2018.[18][19] Off the field, they are the face of two rival sportswear manufacturers, Messi of Adidas and Ronaldo of Nike, which are also the kit suppliers of their national teams and the opposite for their clubs.[20][21][22] The two highest paid players in football, Ronaldo and Messi are among the world's best paid sports' stars in combined income from salaries, bonuses and off-field earnings. 
In 2016, Ronaldo was first on Forbes list of the best paid athletes, earning $88 million, with Messi being second at $81.4 million.[23] They have the two biggest social media followings in the world among sportspeople with a combined 211 million Facebook fans, with Ronaldo having 122 million and Messi having 89 million.[24] With a combined Facebook, Instagram and Twitter, Ronaldo has 321 million to be the most famous celebrity on social media and Messi has 181 million excluding Twitter.[25][26] | | what year did otis redding song sitting on the dock of the bay | (Sittin' On) The Dock of the Bay "(Sittin' On) The Dock of the Bay" is a song co-written by soul singer Otis Redding and guitarist Steve Cropper. It was recorded by Redding twice in 1967, including once just days before his death in a plane crash. The song was released on Stax Records' Volt label in 1968,[2] becoming the first posthumous single to top the charts in the US.[3] It reached number 3 on the UK Singles Chart. | | why does kansas city have a texas logo | Logos and uniforms of the Kansas City Chiefs When the Texans began playing in 1960, the team's logo consisted of the state of Texas in white with a yellow star marking the location of the city of Dallas. Originally, Hunt chose Columbia Blue and Orange for the Texans' uniforms, but Bud Adams chose the colors for his Houston Oilers franchise.[1] Hunt reverted to red and gold for the Texans' uniforms, which even after the team relocated to Kansas City, remain as the franchise's colors to this day.[1] | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": 2, "last_layer_weight": 0.25, "prior_layers_weight": 2.5, "kl_div_weight": 0.75, "kl_temperature": 0.75 } ``` #### trivia_pairs * Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0) * Size: 24,250 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | 
|:-----------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | Who played the title role in the UK television series ‘Dr Quinn, Medicine Woman’? | Dr. Quinn, Medicine Woman (TV Series 1993–1998) - IMDb IMDb Doctor Strange Confirmed to Appear in ‘Thor: Ragnarok’ 7 hours ago There was an error trying to load your rating for this title. Some parts of this page won't work property. Please reload or try later. X Beta I'm Watching This! Keep track of everything you watch; tell your friends. Error The trials and adventures of a female doctor in a small wild west town. Creator: Bolstered by the long-awaited arrival of the railroad, the citizens of Colorado Springs busily prepare for the biggest social event in the town's history, the wedding of Dr. Mike and Sully. The first... 8.7 Robert E and Grace by a house in town. A leader of the KKK comes to town and causes trouble amongst the citizens. 8.7 Brian's dog Pup gets bitten by a rabid raccoon. Then Matthew's fiancée gives the dog water. She also gets bitten. 8.6 "No Small Parts" IMDb Exclusive: "Westworld" Star Thandie Newton Actress Thandie Newton has been nominated for a Golden Globe Award for her performance as Maeve in the HBO's " Westworld ." What other significant parts has she played over the years? Visit IMDb's Golden Globes section for red-carpet photos, videos, and more. a list of 34 titles created 03 Apr 2012 a list of 31 titles created 03 Apr 2013 a list of 34 titles created 05 Apr 2013 a list of 27 images created 01 Jun 2014 a list of 31 titles created 6 months ago Title: Dr. Quinn, Medicine Woman (1993–1998) 6.7/10 Want to share IMDb's rating on your own site? Use the HTML below. You must be a registered user to use the IMDb rating plugin. Won 1 Golden Globe. Another 15 wins & 43 nominations. See more awards  » Photos Edit Storyline The fifth daughter of a wealthy Boston physician, Michaela Quinn defies the conventions of post-Civil War society by following in her father's footsteps. After his death, 'Dr. Mike' leaves Boston and moves to the frontier town of Colorado Springs, where she finds the citizens less than thrilled by the concept of a woman doctor. While she struggles to earn their trust, Mike's life is complicated by a growing relationship with mountain man Byron Sully, and the unexpected responsibility of raising three orphaned children. Written by Marg Baskin 1 January 1993 (USA) See more  » Also Known As: Docteur Quinn, femme médecin See more  » Filming Locations: Did You Know? Trivia Jane Seymour, Joe Lando, Chad Allen and Shawn Toovey, in their DVD commentaries, all expressed interest in a future "Dr. Quinn" project. Only Allen declined to reprise his role in the two movies that continued the series after its cancellation. See more » Goofs Mr Bray's store doors change from having windows to solid wood and back to having windows. See more » Quotes Dr. Michaela 'Mike' Quinn : You can't close your heart and give up on people. 
When you shut down the door, no one can get in. (United Kingdom) – See all my reviews Despite being such a simple series, It is probably one of the best for that reason. The 'EastEnders' of nowadays is becoming way too stereotypical and predictable that this masterpiece of a series has a somewhat timelessness to it. I mean, I'm a 16 year old, and I'M ADDICTED TO IT! The acting is of a high standard and there is no part of it I can deem as typical. The best thing is, that if you miss a few episodes and pick it up from a random episode, it still makes sense and you still enjoy watching it. It's not like one of those series that if you miss one episode, you don't really understand what's going on in all the episodes following it. That is why this series is so viewer-friendly. I don't know about you, but I'm considering buying the entire box-set! 27 of 28 people found this review helpful.  Was this review helpful to you? Yes | | Which medieval city was the centre of the Italian earthquake of April 2009? | The Frame: Earthquake in Italy Earthquake in Italy L'AQUILA, Italy (AP) -- A powerful earthquake in mountainous central Italy knocked down whole blocks of buildings early Monday as residents slept, killing more than 70 people in the country's deadliest quake in nearly three decades, officials said. Tens of thousands were homeless and 1,500 were injured. Ambulances screamed through the medieval city L'Aquila as firefighters with dogs worked feverishly to reach people trapped in fallen buildings, including a dormitory where half a dozen university students were believed still inside. Outside the half-collapsed building, tearful young people huddled together, wrapped in blankets, some still in their slippers after being roused from sleep by the quake. Dozens managed to escape as the walls fell around them. L'Aquila, capital of the Abruzzo region, was near the epicenter about 70 miles northeast of Rome. (29 images)   updated 6:30 p.m., April 6 Follow The Frame on Twitter at sacbee_theframe Police, volunteers and rescuers work on a collapsed house on April 6, 2009 in the center of the Abruzzo capital L'Aquila, the epicenter of an earthquake measuring 5.8-magnitude on the open-ended Richter scale. At least 27 people were killed in an earthquake that struck central Italy as most people lay sleeping early on April 6, and the death toll was rising steadily after many homes collapsed in the Abruzzo region. AFP / Getty Images / Vincenzo Pinto MORE IMAGES Two men hug each other as people and volunteers stand amidst debris in the city of L'Aquila, after a strong earthquake rocked central Italy early Monday, April 6, 2009. A powerful earthquake struck central Italy early Monday, killing at least 16 people, collapsing buildings and leaving thousands of people homeless, officials and news reports said. Officials said the death toll was likely to rise as rescue crews made their way through the debris. AP / Pier Paolo Cito Rescuers carry a stretcher in the village of Onna, central Italy, Monday, April 6, 2009. A powerful earthquake in mountainous central Italy knocked down whole blocks of buildings early Monday as residents slept, killing at least 50 people and trapping many more, officials said. AP / Sandro Perozzi Rescuers at work following a strong earthquake, in the village of Onna, central Italy, Monday, April 6, 2009. A powerful earthquake in mountainous central Italy knocked down whole blocks of buildings early Monday as residents slept, killing at least 50 people and trapping many more, officials said. 
The earthquake's epicenter was about 70 miles (110 kilometers) northeast of Rome near the medieval city of L'Aquila. It struck at 3:32 a.m. local time (0132 GMT, EDT Sunday) in a quake-prone region that has had at least nine smaller jolts since the beginning of April. AP / Sandro Perozzi Cars are covered with debris and rubble following a strong earthquake, in the village of Onna, central Italy, Monday, April 6, 2009. A powerful earthquake in mountainous central Italy knocked down whole blocks of buildings early Monday as residents slept, killing at least 50 people and trapping many more, officials said. The earthquake's epicenter was about 70 miles (110 kilometers) northeast of Rome near the medieval city of L'Aquila. It struck at 3:32 a.m. local time (0132 GMT, EDT Sunday) in a quake-prone region that has had at least nine smaller jolts since the beginning of April. AP / Sandro Perozzi A doctors unpacks a rescue pack beside a collapsed building in the centre of L'Aquila on April 6, 2009. A powerful earthquake tore through central Italy devastating historic mountain towns and killing at least 40 people, authorities said. AFP / Getty Images / Filippo Monteforte Antonello Colangeli reacts as rescuers work to remove his son Giulio from the rubbles, in the city of L'Aquila, after a strong earthquake rocked central Italy, early Monday, April 6, 2009. A powerful earthquake struck central Italy early Monday, killing at least 20 people, collapsing buildings and leaving thousands of people homeless, officials and news reports said. AP / Pie | | Colin Blunstone fronted which 1960s group? | Colin Blunstone - Music on Google Play Colin Blunstone About the artist Colin Edward Michael Blunstone is an English singer-songwriter and musician. In a career spanning more than 50 years, Blunstone came to prominence in the mid 1960s as the lead singer of the English rock band The Zombies, which released four singles that entered the Top 75 charts in the United States during the 1960s, including "She's Not There", "Tell Her No", "She's Coming Home", and "Time of the Season". Blunstone began his solo career in 1969, releasing three singles under a pseudonym of Neil MacArthur. Since then, he has released ten studio albums, and one live album under his real name. His solo hits include "She's Not There", "Say You Don't Mind", "I Don't Believe in Miracles", "How Could We Dare to Be Wrong", "What Becomes of the Brokenhearted", and "The Tracks of My Tears". He is also known for his participation on various albums with the Alan Parsons Project. 
| * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### quora_pairs * Dataset: [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) at [451a485](https://huggingface.co/datasets/sentence-transformers/quora-duplicates/tree/451a4850bd141edb44ade1b5828c259abd762cdb) * Size: 5,457 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------| | How do you write questions that attract more and better answers? | How can I write more popular questions and answers? | | How can we take back Pakistan and China occupied kashmir? | Can India get its occupied land back from neighbors like Pakistan and China? If yes, how? | | How should I prepare for SBI SO Assistant Manager System? | What and how to study for the post of assistant manager (system) in SBI? | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": 2, "last_layer_weight": 0.25, "prior_layers_weight": 2.5, "kl_div_weight": 0.75, "kl_temperature": 0.75 } ``` #### gooaq_pairs * Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c) * Size: 18,187 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:--------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | what is the health benefits of eating jackfruit? | For example, it's rich in vitamin C and one of the few fruits that's high in B vitamins. Jackfruit also contains folate, niacin, riboflavin, potassium, and magnesium. | | fscs how is it funded? | The FSCS is funded by the financial services industry. Every firm authorised by the UK regulators is obliged to pay an annual levy, which goes towards our running costs and the compensation payments we make. This levy is split into six broad classes covering each type of firm: deposits. | | can you get a home equity loan to pay off debt? 
| Debt consolidation A HELOC or home equity loan can be used to consolidate high-interest debts to a lower interest rate. Homeowners sometimes use home equity to pay off other personal debts such as a car loan or a credit card. | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### gooaq_pairs2 * Dataset: [gooaq_pairs2](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c) * Size: 6,063 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:---------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | is going to the gym a hobby? | A hobby is something you are passionate about and pursue seriously. ... Just like reading,playing a specific game or trekking, going to gym can also be listed as hobby. But I would suggest you to use some professional word such as “bodybuilding” “working out/exercising”. | | are fruit enzymes good for you? | The main functions of drinkable fruit based probiotic enzyme Maintain a high alkaline blood ph. Improve the immune system. Improve the digestive system and relieve constipation. An antioxidant that protects body cells from oxidation by neutralising free radicals. | | are all haze strains sativa? | HAZE STRAINS — THEIR HISTORY AND ORIGINS. Haze strains are among the most popular sativa-dominant, or nearly pure sativa hybrids. ... The flowering time of Hazes can also be excruciatingly long: some haze strains take up to 16 weeks for the plants to mature. 
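
Most of the pair datasets in this card follow the same recipe: GISTEmbedLoss wrapped in AdaptiveLayerLoss, with one of the two hyperparameter sets shown in the JSON blocks (either `n_layers_per_step: -1` with an up-weighted last layer, or `n_layers_per_step: 2` with up-weighted prior layers). The snippet below is a minimal sketch of how such a configuration could be assembled with the sentence-transformers API; it is not the exact training script, and the guide model name is only a placeholder since the card does not state which guide was used.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import AdaptiveLayerLoss, GISTEmbedLoss

model = SentenceTransformer("microsoft/deberta-v3-small")  # base checkpoint before fine-tuning
# GISTEmbedLoss needs a guide model to filter false in-batch negatives;
# the guide used for this card is not stated, so this one is only a placeholder.
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

inner_loss = GISTEmbedLoss(model, guide)

# Mirrors the first JSON configuration above: all layers contribute each step
# (n_layers_per_step=-1), the final layer is weighted up, and a KL term pulls
# earlier layers toward the last layer's similarity distribution.
loss = AdaptiveLayerLoss(
    model,
    inner_loss,
    n_layers_per_step=-1,
    last_layer_weight=1.5,
    prior_layers_weight=0.5,
    kl_div_weight=1.25,
    kl_temperature=1.1,
)
```

The second variant, used for quora_pairs and gooaq_pairs2, swaps in `n_layers_per_step=2`, `last_layer_weight=0.25`, `prior_layers_weight=2.5`, `kl_div_weight=0.75` and `kl_temperature=0.75`.
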
| * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": 2, "last_layer_weight": 0.25, "prior_layers_weight": 2.5, "kl_div_weight": 0.75, "kl_temperature": 0.75 } ``` #### mrpc_pairs * Dataset: [mrpc_pairs](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) * Size: 2,474 training samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:--------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------| | Not only is this the oldest known planet , it 's also the most distant . | Astronomers have found the oldest and most distant planet known in the universe . | | Mohcine Douali , who lives in the centre of Algiers , said : " It was a great shock . | " It was a great shock , " said Mohcine Douali , who lives in central Algiers . | | Malaysia has launched an aggressive media campaign over its water dispute with Singapore . | MALAYSIA will launch a publicity campaign in local newspapers today giving its version of the water dispute with Singapore . | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "MultipleNegativesSymmetricRankingLoss", "n_layers_per_step": -1, "last_layer_weight": 0.75, "prior_layers_weight": 1, "kl_div_weight": 0.9, "kl_temperature": 0.75 } ``` ### Evaluation Datasets #### nli-pairs * Dataset: [nli-pairs](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab) * Size: 200 evaluation samples * Columns: anchor and positive * Approximate statistics based on the first 1000 samples: | | anchor | positive | |:--------|:---------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | anchor | positive | |:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------| | Two women are embracing while holding to go packages. | Two woman are holding packages. | | Two young children in blue jerseys, one with the number 9 and one with the number 2 are standing on wooden steps in a bathroom and washing their hands in a sink. | Two kids in numbered jerseys wash their hands. | | A man selling donuts to a customer during a world exhibition event held in the city of Angeles | A man selling donuts to a customer. 
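
mrpc_pairs above (and, further down, xsum-pairs and compression-pairs) wrap MultipleNegativesSymmetricRankingLoss in the same adaptive-layer scheme instead of GISTEmbedLoss. A minimal sketch under the same assumptions:

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import (
    AdaptiveLayerLoss,
    MultipleNegativesSymmetricRankingLoss,
)

model = SentenceTransformer("microsoft/deberta-v3-small")

# Symmetric in-batch ranking: both (sentence1 -> sentence2) and
# (sentence2 -> sentence1) are scored against the other pairs in the batch.
inner_loss = MultipleNegativesSymmetricRankingLoss(model)

# Parameters as listed for mrpc_pairs / xsum-pairs / compression-pairs.
loss = AdaptiveLayerLoss(
    model,
    inner_loss,
    n_layers_per_step=-1,
    last_layer_weight=0.75,
    prior_layers_weight=1,
    kl_div_weight=0.9,
    kl_temperature=0.75,
)
```
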
| * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### vitaminc-pairs * Dataset: [vitaminc-pairs](https://huggingface.co/datasets/tals/vitaminc) at [be6febb](https://huggingface.co/datasets/tals/vitaminc/tree/be6febb761b0b2807687e61e0b5282e459df2fa0) * Size: 166 evaluation samples * Columns: claim and evidence * Approximate statistics based on the first 1000 samples: | | claim | evidence | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | claim | evidence | |:------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | Dragon Con had over 5000 guests . | Among the more than 6000 guests and musical performers at the 2009 convention were such notables as Patrick Stewart , William Shatner , Leonard Nimoy , Terry Gilliam , Bruce Boxleitner , James Marsters , and Mary McDonnell . | | COVID-19 has reached more than 185 countries . | As of , more than cases of COVID-19 have been reported in more than 190 countries and 200 territories , resulting in more than deaths . | | In March , Italy had 3.6x times more cases of coronavirus than China . | As of 12 March , among nations with at least one million citizens , Italy has the world 's highest per capita rate of positive coronavirus cases at 206.1 cases per million people ( 3.6x times the rate of China ) and is the country with the second-highest number of positive cases as well as of deaths in the world , after China . | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### sts-label * Dataset: [sts-label](https://huggingface.co/datasets/sentence-transformers/stsb) at [ab7a5ac](https://huggingface.co/datasets/sentence-transformers/stsb/tree/ab7a5ac0e35aa22088bdcf23e7fd99b220e53308) * Size: 200 evaluation samples * Columns: sentence1, sentence2, and score * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | score | |:--------|:---------------------------------------------------------------------------------|:--------------------------------------------------------------------------------|:---------------------------------------------------------------| | type | string | string | float | | details | | | | * Samples: | sentence1 | sentence2 | score | |:--------------------------------------------------|:------------------------------------------------------|:------------------| | A man with a hard hat is dancing. | A man wearing a hard hat is dancing. | 1.0 | | A young child is riding a horse. | A child is riding a horse. 
| 0.95 | | A man is feeding a mouse to a snake. | The man is feeding a mouse to the snake. | 1.0 | * Loss: [CoSENTLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosentloss) with these parameters: ```json { "scale": 20.0, "similarity_fct": "pairwise_cos_sim" } ``` #### qnli-contrastive * Dataset: [qnli-contrastive](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c) * Size: 200 evaluation samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | label | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------| | type | string | string | int | | details | | | | * Samples: | sentence1 | sentence2 | label | |:--------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| | What came into force after the new constitution was herald? | As of that day, the new constitution heralding the Second Republic came into force. | 0 | | What is the first major city in the stream of the Rhine? | The most important tributaries in this area are the Ill below of Strasbourg, the Neckar in Mannheim and the Main across from Mainz. | 0 | | What is the minimum required if you want to teach in Canada? | In most provinces a second Bachelor's Degree such as a Bachelor of Education is required to become a qualified teacher. | 0 | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "OnlineContrastiveLoss", "n_layers_per_step": -1, "last_layer_weight": 0.75, "prior_layers_weight": 1, "kl_div_weight": 0.9, "kl_temperature": 0.75 } ``` #### scitail-pairs-qa * Dataset: [scitail-pairs-qa](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44) * Size: 200 evaluation samples * Columns: sentence2 and sentence1 * Approximate statistics based on the first 1000 samples: | | sentence2 | sentence1 | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence2 | sentence1 | |:--------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------| | Anions are formed by atoms gaining electrons. | What are formed by atoms gaining electrons? | | Seed dormancy ensures that seeds germinate only when conditions for seedling survival are optimal. | What ensures that seeds germinate only when conditions for seedling survival are optimal? | | A blizzard usually includes heavy precipitation, strong winds, and surface air temperatures below 0°c. | Which weather event usually includes heavy precipitation, strong winds, and surface air temperatures below 0°C? 
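
Two datasets above depart from the GISTEmbedLoss recipe: sts-label is trained with a plain CoSENTLoss on float similarity scores, and qnli-contrastive wraps OnlineContrastiveLoss on binary labels. A minimal sketch of both, assuming the standard sentence-transformers loss API:

```python
from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import (
    AdaptiveLayerLoss,
    CoSENTLoss,
    OnlineContrastiveLoss,
)

model = SentenceTransformer("microsoft/deberta-v3-small")

# sts-label: (sentence1, sentence2, score) with a float score; CoSENT ranks
# pairwise cosine similarities so higher-scored pairs come out more similar.
sts_loss = CoSENTLoss(model, scale=20.0, similarity_fct=util.pairwise_cos_sim)

# qnli-contrastive: (sentence1, sentence2, label) with 0/1 labels; the online
# contrastive loss keeps only the hard positives and negatives in each batch,
# again spread across layers by AdaptiveLayerLoss.
qnli_loss = AdaptiveLayerLoss(
    model,
    OnlineContrastiveLoss(model),
    n_layers_per_step=-1,
    last_layer_weight=0.75,
    prior_layers_weight=1,
    kl_div_weight=0.9,
    kl_temperature=0.75,
)
```
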
| * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### scitail-pairs-pos * Dataset: [scitail-pairs-pos](https://huggingface.co/datasets/allenai/scitail) at [0cc4353](https://huggingface.co/datasets/allenai/scitail/tree/0cc4353235b289165dfde1c7c5d1be983f99ce44) * Size: 200 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:----------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------| | An introduction to atoms and elements, compounds, atomic structure and bonding, the molecule and chemical reactions. | Replace another in a molecule happens to atoms during a substitution reaction. | | Wavelength The distance between two consecutive points on a sinusoidal wave that are in phase; | Wavelength is the distance between two corresponding points of adjacent waves called. | | humans normally have 23 pairs of chromosomes. | Humans typically have 23 pairs pairs of chromosomes. | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### xsum-pairs * Dataset: [xsum-pairs](https://huggingface.co/datasets/sentence-transformers/xsum) at [788ddaf](https://huggingface.co/datasets/sentence-transformers/xsum/tree/788ddafe04e539956d56b567bc32a036ee7b9206) * Size: 200 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:-------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | 
|:---|:---|
| Should Christensen remain out, Michael Kightly is likely to keep his place, while former Ipswich striker Luke Varney will hope for a recall.
Ipswich midfielder Cole Skuse is set to return to action, having missed two matches after suffering concussion.
Forward Tom Lawrence is expected to start after a groin injury.
| Burton will check on midfielder Lasse Vigen Christensen, who missed the draw with Aston Villa because of a tight gluteal muscle. | | The UK's benchmark index closed down 203.2 points at 5673.58, and has now entered a "bear market" having fallen 20% from its record high in April.
Indexes across Europe also tumbled, with Germany's Dax down 2.8% and the Cac-40 in Paris dropping 3.5%.
Shares in Shell were down about 7% after it said that annual profits would be slightly below City expectations.
Oil shares were also hit by the continued fall in crude prices. Brent crude fell 4% $1.16 to $27.60, while US crude dropped more than 5% to $27.01.
Crude oil prices have been falling since 2014 but despite that fall, producer countries have maintained output.
On Tuesday, the International Energy Agency warned that oil markets could "drown in oversupply" in 2016.
Mining shares were also hit hard. Glencore shares fell nearly 10% while BHP Billiton fell more than 7%.
BHP Billiton released a production report containing what investors interpreted as gloomy comments about the outlook for commodity prices.
Shares in WH Smith led the FTSE 250 higher, with a 5.8% gain. The company said it expects annual profits to be "slightly ahead" of expectations, due to strong sales over the five-week Christmas period.
On the currency markets the pound was one fifth of a cent higher against the dollar at $1.4178, and one tenth of a euro cent higher against the euro at €1.2990.
| The FTSE 100 slumped 3.5% as investors fretted over global growth prospects and falling oil prices. | | Alexandra Kinova had four boys and a girl by caesarean section on Sunday, they say.
The births took place "without any complications", according to doctors at Prague's Institute for the Care of Mother and Child.
The mother and babies were placed in an intensive care unit but are believed to be in a good condition.
The Czech Republic's first quintuplets, who were conceived naturally without IVF, have a 95% chance of growing up healthy, the Associated Press quoted Zbynek Stranak, chief doctor at the neonatal section of the institute, as saying.
The boys' names are reportedly Deniel, Michael, Alex and Martin, while the girl is called Terezka.
Their mother, who is from the town of Milovice, about 20km (12 miles) north-east of the capital, Prague, already had one son.
She originally believed she was pregnant with twins, but in March doctors upped it to four - and then five in April.
The father of the quintuplets was present at the delivery despite his train being delayed, according to the newspaper Ceske Noviny.
"I was crying all the way since I feared I would not manage it," he said.
| A 23-year-old woman has given birth to quintuplets in the Czech Republic, officials say, a first for the country. | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "MultipleNegativesSymmetricRankingLoss", "n_layers_per_step": -1, "last_layer_weight": 0.75, "prior_layers_weight": 1, "kl_div_weight": 0.9, "kl_temperature": 0.75 } ``` #### compression-pairs * Dataset: [compression-pairs](https://huggingface.co/datasets/sentence-transformers/sentence-compression) at [605bc91](https://huggingface.co/datasets/sentence-transformers/sentence-compression/tree/605bc91d95631895ba25b6eda51a3cb596976c90) * Size: 200 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:-----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------| | Keira Knightley tells us about being spanked by Michael Fassbender for new movie A Dangerous Method. | Keira Knightley spanked by Michael Fassbender | | The owners of a Segway tour operator in Washington DC sued the city last week in federal court to protest a requirement that their tours guides be licensed, arguing that the city's new regulations infringe upon their right to free speech. | Should tour guides be licensed? | | A NEW memorial to one of the pioneers of the water cure is being proposed by Malvern Spa Association. 
| NEW memorial to pioneers of the water cure proposed | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "MultipleNegativesSymmetricRankingLoss", "n_layers_per_step": -1, "last_layer_weight": 0.75, "prior_layers_weight": 1, "kl_div_weight": 0.9, "kl_temperature": 0.75 } ``` #### sciq_pairs * Dataset: [sciq_pairs](https://huggingface.co/datasets/allenai/sciq) at [2c94ad3](https://huggingface.co/datasets/allenai/sciq/tree/2c94ad3e1aafab77146f384e23536f97a4849815) * Size: 200 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:---------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | An electric transformer connects two circuits with an iron core that becomes what? 
| An electric transformer connects two circuits with an iron core that becomes an electromagnet. | | What is the quantity of force multiplied by the time it is applied called? | The quantity of force multiplied by the time it is applied is called impulse. | | After the amino acid molecule has been bound to its what, protein synthesis can take place? | How can a molecule containing just 4 different nucleotides specify the sequence of the 20 amino acids that occur in proteins? If each nucleotide coded for 1 amino acid, then obviously the nucleic acids could code for only 4 amino acids. What if amino acids were coded for by groups of 2 nucleotides? There are 42, or 16, different combinations of 2 nucleotides (AA, AU, AC, AG, UU, and so on). Such a code is more extensive but still not adequate to code for 20 amino acids. However, if the nucleotides are arranged in groups of 3, the number of different possible combinations is 43, or 64. Here we have a code that is extensive enough to direct the synthesis of the primary structure of a protein molecule. The genetic code can therefore be described as the identification of each group of three nucleotides and its particular amino acid. The sequence of these triplet groups in the mRNA dictates the sequence of the amino acids in the protein. Each individual three-nucleotide coding unit, as we have seen, is called a codon. Protein synthesis is accomplished by orderly interactions between mRNA and the other ribonucleic acids (transfer RNA [tRNA] and ribosomal RNA [rRNA]), the ribosome, and more than 100 enzymes. The mRNA formed in the nucleus during transcription is transported across the nuclear membrane into the cytoplasm to the ribosomes—carrying with it the genetic instructions. The process in which the information encoded in the mRNA is used to direct the sequencing of amino acids and thus ultimately to synthesize a protein is referred to as translation. Before an amino acid can be incorporated into a polypeptide chain, it must be attached to its unique tRNA. This crucial process requires an enzyme known as aminoacyl-tRNA synthetase (Figure 19.12 "Binding of an Amino Acid to Its tRNA"). There is a specific aminoacyl-tRNA synthetase for each amino acid. This high degree of specificity is vital to the incorporation of the correct amino acid into a protein. After the amino acid molecule has been bound to its tRNA carrier, protein synthesis can take place. Figure 19.13 "The Elongation Steps in Protein Synthesis" depicts a schematic stepwise representation of this all-important process. Figure 19.12 Binding of an Amino Acid to Its tRNA Saylor URL: http://www. saylor. org/books. 
| * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### qasc_pairs * Dataset: [qasc_pairs](https://huggingface.co/datasets/allenai/qasc) at [a34ba20](https://huggingface.co/datasets/allenai/qasc/tree/a34ba204eb9a33b919c10cc08f4f1c8dae5ec070) * Size: 200 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:-------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | what requires water? | Fertilization occurs when sperm swim to an egg inside an archegonium.. Water is needed to transport flagellated sperm to archegonium.. fertilization requires water | | When does the northern hemisphere experience cooler temperatures? | when a hemisphere is tilted away from the sun , that hemisphere receives less direct sunlight. In winter, the northern hemisphere is tilted away from the sun.. the northern hemisphere gets less sun in the winter | | what does smoking accelerate? | Aging is associated with the death of cells.. Smoking accelerates the aging process.. smoking accelerates the death of cells | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### openbookqa_pairs * Dataset: openbookqa_pairs * Size: 200 evaluation samples * Columns: question and fact * Approximate statistics based on the first 1000 samples: | | question | fact | |:--------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | question | fact | |:-----------------------------------------------------------------------|:-----------------------------------------------------------------------------| | The thermal production of a stove is generically used for | a stove generates heat for cooking usually | | What creates a valley? | a valley is formed by a river flowing | | when it turns day and night on a planet, what cause this? 
| a planet rotating causes cycles of day and night on that planet | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### msmarco_pairs * Dataset: [msmarco_pairs](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3) at [28ff31e](https://huggingface.co/datasets/sentence-transformers/msmarco-msmarco-distilbert-base-v3/tree/28ff31e4c97cddd53d298497f766e653f1e666f9) * Size: 200 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:---------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | how tall is renamon | (Note from anonymous publisher): According to Digimon Masters Online, Renamon is 222.3529411765 centimeters which is 73.54052802' in feet. All digimon have height information on them. If I were to get a Renamon with that is 180 in height, It also provides me the percentage of it's height compared to real height. So I would calculate to get it's real height. — Preceding unsigned comment added by 173.22.75.44 (talk) 20:41, 1 June 2014 (UTC) | | what is alt level? | The normal level of ALT in the bloodstream is 5 to 45 U/L (units per liter). This range can slightly vary depending where you take the blood test. When a blood test shows elevated ALT levels outside the normal range, even a small amount, fatty liver and other liver disorders might be the cause. If the cause of the ALT enzyme level increase is due to a severe liver disease (like Cirrhosis), the levels would be higher than what’s found in fatty livers patients. Please note that some labs name ALT as “Alanine Transaminase”, “Alanine Aminotransferase” or “SGPT”. | | where is the decussations of pyramids located and what functions is it involved in | Section of the medulla oblongata at the level of the decussation of the pyramids. The two pyramids contain the motor fibers that pass from the brain to the medulla oblongata and spinal cord. These are the corticobulbar and corticospinal fibers that make up the pyramidal tracts.ibers of the posterior column, which transmit sensory and proprioceptive information, are located behind the pyramids on the medulla oblongata. The medullary pyramids contain motor fibers that are known as the corticobulbar and corticospinal tracts. The corticospinal tracts are on the anterior surface of the pyramids. 
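
Since every dataset in this card is paired with its own loss object, the training run presumably passes parallel dictionaries of datasets and losses to the trainer. The sketch below shows that pattern with just two of the roughly twenty datasets; the split sizes, guide model and training arguments are illustrative only and are not taken from this card.

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import AdaptiveLayerLoss, CoSENTLoss, GISTEmbedLoss

model = SentenceTransformer("microsoft/deberta-v3-small")
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # placeholder guide

# Dataset names here match the section headings in this card.
train_datasets = {
    "nli-pairs": load_dataset("sentence-transformers/all-nli", "pair", split="train[:10000]"),
    "sts-label": load_dataset("sentence-transformers/stsb", split="train"),
}

# One loss per dataset name, configured as documented in the sections above.
losses = {
    "nli-pairs": AdaptiveLayerLoss(
        model,
        GISTEmbedLoss(model, guide),
        n_layers_per_step=-1,
        last_layer_weight=1.5,
        prior_layers_weight=0.5,
        kl_div_weight=1.25,
        kl_temperature=1.1,
    ),
    "sts-label": CoSENTLoss(model, scale=20.0),
}

trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset=train_datasets,
    loss=losses,
)
trainer.train()
```
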
| * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### nq_pairs * Dataset: [nq_pairs](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17) * Size: 200 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:-------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | which team won the world cup 2015 who did they defeat | 2015 Cricket World Cup Final It was New Zealand's first World Cup Final.[5] They had previously lost the semi-final on six occasions between 1975 and 2011. Australia played in their record seventh final, having won four (1987, 1999, 2003 and 2007) and lost two (1975, 1996). | | when did i can only imagine get released | I Can Only Imagine (MercyMe song) "I Can Only Imagine" was released in 2001 as the album's lead single. It gained significant airplay on Christian radio formats before crossing over to mainstream radio formats such as adult contemporary and Top 40 in late 2003 and into 2004; to aid in promotion to these markets, a double A-side physical single (combined with "Word of God Speak") was released in 2003. It charted on several formats, including the Billboard Adult Contemporary (where it peaked at No. 5) and the Hot 100 (where it peaked at No. 71). In 2002, "I Can Only Imagine" earned the Dove Awards for 'Pop/Contemporary Recorded Song of the Year' and 'Song of the Year'; Millard earned the Dove Award 'Songwriter of the Year' at the same ceremony. With 2.5 million copies sold, it is the best-selling Christian single of all time, having been certified 3x platinum by the RIAA. As of 2018, it is the only Christian song to reach that milestone. | | how did the catherine wheel get its name | Catherine wheel (firework) The firework is named after Saint Catherine of Alexandria who, according to Christian tradition, was condemned to death by “breaking on the wheel”. 
When she touched the wheel it miraculously flew to pieces. | * Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters: ```json { "loss": "GISTEmbedLoss", "n_layers_per_step": -1, "last_layer_weight": 1.5, "prior_layers_weight": 0.5, "kl_div_weight": 1.25, "kl_temperature": 1.1 } ``` #### trivia_pairs * Dataset: [trivia_pairs](https://huggingface.co/datasets/sentence-transformers/trivia-qa) at [a7c36e3](https://huggingface.co/datasets/sentence-transformers/trivia-qa/tree/a7c36e3c8c8c01526bc094d79bf80d4c848b0ad0) * Size: 200 evaluation samples * Columns: sentence1 and sentence2 * Approximate statistics based on the first 1000 samples: | | sentence1 | sentence2 | |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| | type | string | string | | details | | | * Samples: | sentence1 | sentence2 | |:------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
--------------------|
| Situated in Paris, what was the original name of the ‘Place Charles de Gaulle’? | Place Charles-de-Gaulle was merged with this page ... |
| Which French phrase commonly used in English means literally 'bottom of the bag' ? | French Words and Expressions in English: learn the true meanings of French words and expressions commonly used in English ... cul-de-sac, "bottom (butt) of the bag": dead-end street ... |
| Developed in 1941, the Nickel-Strunz classification scheme categorizes? | Sabine Grunwald, Soil Classification: classification of natural phenomena like soils is typically done to organize knowledge about the subject, provide a framework for establishing relationships among soils and their environment, and establish groupings of soils for which useful and reliable interpretations can be made ... |
* Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### quora_pairs

* Dataset: [quora_pairs](https://huggingface.co/datasets/sentence-transformers/quora-duplicates) at [451a485](https://huggingface.co/datasets/sentence-transformers/quora-duplicates/tree/451a4850bd141edb44ade1b5828c259abd762cdb)
* Size: 200 evaluation samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
  |         | sentence1 | sentence2 |
  |:--------|:----------|:----------|
  | type    | string    | string    |
  | details |           |           |
* Samples:
  | sentence1 | sentence2 |
  |:----------|:----------|
  | What animals prey on meerkats? | What animals prey of the meerkats? |
  | How do you become the top writer on Quora? | If I wanted to make it a goal to be a top writer on Quora, how can I accomplish this? |
  | Who do you think would win the 2016 USA Election? | Who will win the Election? Trump or Clinton? |
* Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": 2,
      "last_layer_weight": 0.25,
      "prior_layers_weight": 2.5,
      "kl_div_weight": 0.75,
      "kl_temperature": 0.75
  }
  ```
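Each of these loss entries wraps an inner training loss in [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss). As a rough illustration (not the exact training script), the quora_pairs configuration above could be assembled with `sentence-transformers` as follows; the guide encoder passed to `GISTEmbedLoss` is a hypothetical stand-in, since this card does not record which guide model was actually used:

```python
# Hedged sketch of the quora_pairs loss configuration shown above.
# The guide encoder is an assumption: the card does not name the guide model used.
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import AdaptiveLayerLoss, GISTEmbedLoss

model = SentenceTransformer("microsoft/deberta-v3-small")
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")  # hypothetical guide

# GISTEmbedLoss uses the guide model to filter out false in-batch negatives.
inner_loss = GISTEmbedLoss(model=model, guide=guide)

# AdaptiveLayerLoss also supervises embeddings from earlier transformer layers,
# weighted as reported in the JSON block above.
quora_loss = AdaptiveLayerLoss(
    model=model,
    loss=inner_loss,
    n_layers_per_step=2,
    last_layer_weight=0.25,
    prior_layers_weight=2.5,
    kl_div_weight=0.75,
    kl_temperature=0.75,
)
```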
#### gooaq_pairs

* Dataset: [gooaq_pairs](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
* Size: 200 evaluation samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
  |         | sentence1 | sentence2 |
  |:--------|:----------|:----------|
  | type    | string    | string    |
  | details |           |           |
* Samples:
  | sentence1 | sentence2 |
  |:----------|:----------|
  | what does it mean when you keep having the same person in your dream? | According to a clinical psychologist, having recurring dreams about the same person shouldn't be taken too literally, whether they are your best friend or a sworn enemy. ... These dreams may not mean that you are obsessed with this individual, but may symbolise your feelings and worries. |
  | is listerine good for a sore throat? | Can LISTERINE® mouthwash prevent sore throat? No. LISTERINE® mouthwash products are only intended to be used to help prevent common oral health problems like bad breath, plaque, cavities, gingivitis and tooth stains. Please consult with your doctor on how to treat, prevent or relieve the pain of a sore throat. |
  | what is the difference between a bb clarinet and a eb clarinet? | The E-flat Clarinet, also called as Piccolo Clarinet, is the small brother of the Clarinet and is the highest instruments of the Clarinet family. The only difference is that it is smaller than the B-flat Clarinet. It is played like the Clarinet and is made of the same materials. |
* Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "GISTEmbedLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 1.5,
      "prior_layers_weight": 0.5,
      "kl_div_weight": 1.25,
      "kl_temperature": 1.1
  }
  ```

#### mrpc_pairs

* Dataset: [mrpc_pairs](https://huggingface.co/datasets/nyu-mll/glue) at [bcdcba7](https://huggingface.co/datasets/nyu-mll/glue/tree/bcdcba79d07bc864c1c254ccfcedcce55bcc9a8c)
* Size: 200 evaluation samples
* Columns: `sentence1` and `sentence2`
* Approximate statistics based on the first 1000 samples:
  |         | sentence1 | sentence2 |
  |:--------|:----------|:----------|
  | type    | string    | string    |
  | details |           |           |
* Samples:
  | sentence1 | sentence2 |
  |:----------|:----------|
  | Eric Gagne pitched a perfect ninth for his 23rd save in as many opportunities . | Gagne struck out two in a perfect ninth inning for his 23rd save . |
  | University of Michigan President Mary Sue Coleman said in a statement on the university 's Web site , " Our fundamental values haven 't changed . | " Our fundamental values haven 't changed , " Mary Sue Coleman , president of the university , said in a statement in Ann Arbor . |
  | Mr Annan also warned the US should not use the war on terror as an excuse to suppress " long-cherished freedoms " . | Annan warned that the dangers of extremism after September 11 should not be used as an excuse to suppress " long-cherished " freedoms . |
* Loss: [AdaptiveLayerLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#adaptivelayerloss) with these parameters:
  ```json
  {
      "loss": "MultipleNegativesSymmetricRankingLoss",
      "n_layers_per_step": -1,
      "last_layer_weight": 0.75,
      "prior_layers_weight": 1,
      "kl_div_weight": 0.9,
      "kl_temperature": 0.75
  }
  ```

### Training Hyperparameters

#### Non-Default Hyperparameters

The following hyperparameters differ from the trainer defaults (a minimal configuration sketch follows this list):

- `eval_strategy`: steps
- `per_device_train_batch_size`: 42
- `per_device_eval_batch_size`: 42
- `learning_rate`: 3.5e-05
- `weight_decay`: 0.0001
- `lr_scheduler_type`: cosine_with_restarts
- `lr_scheduler_kwargs`: {'num_cycles': 2}
- `warmup_ratio`: 0.25
- `save_safetensors`: False
- `fp16`: True
- `push_to_hub`: True
- `hub_model_id`: bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `batch_sampler`: no_duplicates
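Assuming training used the standard `SentenceTransformerTrainer` API from sentence-transformers 3.x, the non-default values above map onto `SentenceTransformerTrainingArguments` roughly as in this sketch; the `output_dir` is a placeholder, not the path actually used:

```python
# Hedged sketch: the non-default hyperparameters listed above expressed as
# SentenceTransformerTrainingArguments. output_dir is a placeholder value.
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

training_args = SentenceTransformerTrainingArguments(
    output_dir="output/deberta-v3-small-st",  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=42,
    per_device_eval_batch_size=42,
    learning_rate=3.5e-05,
    weight_decay=0.0001,
    lr_scheduler_type="cosine_with_restarts",
    lr_scheduler_kwargs={"num_cycles": 2},
    warmup_ratio=0.25,
    save_safetensors=False,
    fp16=True,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)
```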
#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 42
- `per_device_eval_batch_size`: 42
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 3.5e-05
- `weight_decay`: 0.0001
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3
- `max_steps`: -1
- `lr_scheduler_type`: cosine_with_restarts
- `lr_scheduler_kwargs`: {'num_cycles': 2}
- `warmup_ratio`: 0.25
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: False
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: bobox/DeBERTa-ST-AllLayers-v3-checkpoints-tmp
- `hub_strategy`: all_checkpoints
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>
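Because `lr_scheduler_type` is `cosine_with_restarts` with `num_cycles: 2` and `warmup_ratio: 0.25`, the learning rate ramps up linearly over the first quarter of training and then follows two cosine decay cycles separated by a hard restart. The standalone sketch below (with a made-up step count, not this run's actual length) shows the schedule that Transformers constructs for this setting:

```python
# Illustration only: the LR curve implied by cosine_with_restarts with
# num_cycles=2 and warmup_ratio=0.25. The total step count is hypothetical.
import torch
from transformers import get_cosine_with_hard_restarts_schedule_with_warmup

total_steps = 1000                      # hypothetical, not this run's length
warmup_steps = int(0.25 * total_steps)  # warmup_ratio = 0.25

optimizer = torch.optim.AdamW([torch.nn.Parameter(torch.zeros(1))], lr=3.5e-05)
scheduler = get_cosine_with_hard_restarts_schedule_with_warmup(
    optimizer,
    num_warmup_steps=warmup_steps,
    num_training_steps=total_steps,
    num_cycles=2,
)

lrs = []
for _ in range(total_steps):
    lrs.append(scheduler.get_last_lr()[0])
    optimizer.step()
    scheduler.step()

# lrs rises linearly to 3.5e-05 over the first 250 steps, then traces two
# cosine decays toward 0 with a hard restart between them.
```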
### Training Logs

| Epoch | Step | Training Loss | openbookqa pairs loss | mrpc pairs loss | scitail-pairs-pos loss | compression-pairs loss | nli-pairs loss | sts-label loss | vitaminc-pairs loss | gooaq pairs loss | quora pairs loss | trivia pairs loss | qasc pairs loss | scitail-pairs-qa loss | sciq pairs loss | msmarco pairs loss | nq pairs loss | xsum-pairs loss | qnli-contrastive loss | sts-test_spearman_cosine |
|:------:|:----:|:-------------:|:---------------------:|:---------------:|:----------------------:|:----------------------:|:--------------:|:--------------:|:-------------------:|:----------------:|:----------------:|:-----------------:|:---------------:|:---------------------:|:---------------:|:------------------:|:-------------:|:---------------:|:---------------------:|:------------------------:|
| 0.0075 | 48 | 12.3074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0151 | 96 | 15.7221 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0226 | 144 | 10.8027 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0301 | 192 | 8.9559 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0376 | 240 | 8.8511 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0452 | 288 | 9.3478 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0527 | 336 | 8.8892 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0602 | 384 | 8.3008 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0678 | 432 | 7.3455 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0751 | 479 | - | 7.6210 | 2.1153 | 4.2950 | 2.8060 | 6.5910 | 3.3717 | 6.2479 | 7.4574 | 1.1983 | 8.1779 | 7.7198 | 5.4017 | 10.6215 | 8.3536 | 7.9954 | 3.5407 | 3.5311 | 0.3998 |
| 0.0753 | 480 | 8.0369 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0828 | 528 | 6.2732 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0903 | 576 | 7.8529 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.0979 | 624 | 5.8643 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1054 | 672 | 6.3179 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1129 | 720 | 6.1175 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1205 | 768 | 5.2392 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1280 | 816 | 5.8324 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1355 | 864 | 5.1523 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1430 | 912 | 6.0303 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1503 | 958 | - | 5.0058 | 1.0854 | 2.4681 | 1.6288 | 4.3176 | 3.7677 | 6.3070 | 3.9322 | 0.9212 | 4.9134 | 3.7815 | 1.4301 | 9.7878 | 4.5749 | 5.2812 | 2.1537 | 2.8634 | 0.6459 |
| 0.1506 | 960 | 5.7748 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1581 | 1008 | 4.8728 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1656 | 1056 | 4.7375 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1731 | 1104 | 4.6766 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1807 | 1152 | 4.3209 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1882 | 1200 | 3.7761 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.1957 | 1248 | 4.2161 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2033 | 1296 | 4.9089 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2108 | 1344 | 4.3406 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2183 | 1392 | 3.5664 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2254 | 1437 | - | 4.3604 | 0.7371 | 1.9232 | 1.1713 | 3.2246 | 4.0164 | 6.7152 | 3.0770 | 0.8747 | 3.7819 | 2.9044 | 1.0403 | 9.4438 | 3.5160 | 4.0163 | 1.6821 | 1.9995 | 0.6719 |
| 0.2258 | 1440 | 4.7194 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2334 | 1488 | 3.6345 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2409 | 1536 | 3.5947 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2484 | 1584 | 4.0526 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2560 | 1632 | 3.7962 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2635 | 1680 | 4.1927 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2710 | 1728 | 3.6351 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2785 | 1776 | 3.4256 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2861 | 1824 | 3.3175 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
| 0.2936 | 1872 | 3.4984 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |

### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.3.0+cu121
- Accelerate: 0.32.1
- Datasets: 2.20.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### AdaptiveLayerLoss
```bibtex
@misc{li20242d,
    title={2D Matryoshka Sentence Embeddings},
    author={Xianming Li and Zongxi Li and Jing Li and Haoran Xie and Qing Li},
    year={2024},
    eprint={2402.14776},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

#### CoSENTLoss
```bibtex
@online{kexuefm-8847,
    title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
    author={Su Jianlin},
    year={2022},
    month={Jan},
    url={https://kexue.fm/archives/8847},
}
```

#### GISTEmbedLoss
```bibtex
@misc{solatorio2024gistembed,
    title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
    author={Aivin V. Solatorio},
    year={2024},
    eprint={2402.16829},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```