|
--- |
|
base_model: lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter11-final |
|
datasets: [] |
|
language: [] |
|
library_name: sentence-transformers |
|
metrics: |
|
- pearson_cosine |
|
- spearman_cosine |
|
- pearson_manhattan |
|
- spearman_manhattan |
|
- pearson_euclidean |
|
- spearman_euclidean |
|
- pearson_dot |
|
- spearman_dot |
|
- pearson_max |
|
- spearman_max |
|
- cosine_accuracy@1 |
|
- cosine_accuracy@3 |
|
- cosine_accuracy@5 |
|
- cosine_accuracy@10 |
|
- cosine_precision@1 |
|
- cosine_precision@3 |
|
- cosine_precision@5 |
|
- cosine_precision@10 |
|
- cosine_recall@1 |
|
- cosine_recall@3 |
|
- cosine_recall@5 |
|
- cosine_recall@10 |
|
- cosine_ndcg@10 |
|
- cosine_mrr@10 |
|
- cosine_map@100 |
|
- dot_accuracy@1 |
|
- dot_accuracy@3 |
|
- dot_accuracy@5 |
|
- dot_accuracy@10 |
|
- dot_precision@1 |
|
- dot_precision@3 |
|
- dot_precision@5 |
|
- dot_precision@10 |
|
- dot_recall@1 |
|
- dot_recall@3 |
|
- dot_recall@5 |
|
- dot_recall@10 |
|
- dot_ndcg@10 |
|
- dot_mrr@10 |
|
- dot_map@100 |
|
pipeline_tag: sentence-similarity |
|
tags: |
|
- sentence-transformers |
|
- sentence-similarity |
|
- feature-extraction |
|
- generated_from_trainer |
|
- dataset_size:480616 |
|
- loss:MSELoss |
|
widget: |
|
- source_sentence: 'query: 맨체스터 해운 운하 프로젝트를 위한 두 가지 계획은 무엇이었습니까?' |
|
sentences: |
|
- 'passage: 수업에 적극적으로 참여하는 것은 교사에게 당신의 동기와 호기심을 보여주는 중요한 방법입니다. 이는 교사가 학생에게 더 많은 |
|
관심을 기울이도록 격려하는 효과를 가져옵니다. 수업에 적극적으로 참여하면 수업 내용을 더 깊이 이해할 수 있으며, 교사와의 소통을 원활하게 |
|
할 수 있습니다. |
|
|
|
|
|
수업 시간에는 교사의 말에 귀 기울이고 중요한 내용을 꼼꼼하게 적어두는 것이 중요합니다. 또한 토론과 그룹 활동에 참여하고 질문을 통해 궁금한 |
|
점을 해결하는 적극적인 태도를 보여주세요. 이를 통해 교사는 학생이 수업에 적극적인 태도를 가지고 있으며, 학습에 진지하게 임한다는 것을 알 |
|
수 있습니다.' |
|
- 'passage: Water is a precious resource, so it''s important to conserve it. |
|
|
|
|
|
Reduce water waste by taking shorter showers, ideally no longer than five minutes. |
|
Install a low-flow showerhead to further decrease water consumption while maintaining |
|
adequate pressure.' |
|
- 'passage: 어수선함은 스트레스, 무질서, 생산성 저하로 이어질 수 있습니다. 반면 깔끔함은 집중력, 정신 건강, 삶의 질 향상에 도움이 |
|
될 수 있습니다. 이 포괄적인 가이드는 명확한 단계, 유용한 팁, 실용적인 통찰력을 통해 어수선한 습관을 깔끔한 습관으로 바꾸는 과정을 안내합니다. |
|
|
|
|
|
첫 번째 단계는 현재 습관을 파악하는 것입니다. 일상 생활을 분석하고 무질서로 인해 어수선해지거나 작업이 미완료되는 영역을 파악합니다. 흔한 |
|
예로는 어수선한 작업 공간, 쌓인 빨래, 미루어둔 심부름, 과도하게 채워진 냉장고 등이 있습니다. 이러한 패턴을 이해하는 것은 효과적으로 해결하는 |
|
데 매우 중요합니다. |
|
|
|
|
|
핵심 팁: 습관을 바꾸는 데는 시간이 걸린다는 것을 기억하십시오. 이 과정 전반에 걸쳐 자신에게 인내심을 가지십시오.' |
|
- source_sentence: 'query: 부틀 전쟁 기념비에 어떤 특별한 점이 있습니까?' |
|
sentences: |
|
- 'query: What type of companies are single-entry bookkeeping best suited for?' |
|
- 'passage: 오토바이 브레이크 블리딩은 숙련된 기술을 요구하는 작업입니다. 부상 위험을 최소화하고 안전하게 작업을 진행하려면 다음 안전 |
|
지침을 숙지해야 합니다. |
|
|
|
|
|
1. 작업 중에는 항상 장갑과 안전 고글을 착용하십시오. 유압 유체와 도구를 취급할 때는 눈이나 입에 닿지 않도록 주의해야 합니다. 유압 유체는 |
|
피부 자극을 유발할 수 있습니다. |
|
|
|
2. 작업을 시작하기 전에 작업 공간을 환기시켜 유압 브레이크 유체에서 나오는 유해한 연기를 흡입하지 않도록 하십시오. |
|
|
|
3. 오토바이를 작업대 또는 패드 스탠드에 안전하게 고정시키고, 모든 도구를 정리하여 실수로 떨어뜨리거나 미끄러지지 않도록 주의하십시오. |
|
|
|
4. 사용한 유압 유체는 현지 규정에 따라 적절히 처리해야 합니다.' |
|
- 'passage: 에키노칵투스는 황금 술통 선인장 (Echinocactus grusonii)과 할머니 선인장 (Echinocactus polycephalus)과 |
|
같은 여러 인기 종을 포함하는 선인장 속입니다. 이 선인장들은 독특한 모양, 아름다운 가시와 낮은 관리 요구 사항으로 유명합니다. 그러나 번창하려면 |
|
여전히 적절한 관리가 필요합니다. 이 가이드는 에키노칵투스를 관리하는 방법에 대한 자세한 단계를 제공하여 건강과 장수를 보장합니다. |
|
|
|
|
|
첫째, 적합한 화분과 흙을 선택하는 것이 중요합니다. 과도한 수분으로 인한 뿌리 부패를 방지하기 위해 배수구가 있는 테라코타 또는 무광택 도자기 |
|
화분을 선택하십시오. 화분은 선인장의 현재 크기보다 약간만 크게 해야 합니다. 너무 큰 화분은 물이 고인 흙으로 이어질 수 있습니다. |
|
|
|
|
|
흙은 배수가 잘 되는 선인장 혼합물을 선택하거나 펄라이트, 거친 모래, 피트모스 또는 퇴비를 같은 비율로 섞어서 직접 만들 수 있습니다. 이 |
|
혼합물은 식물의 뿌리에 적절한 통기와 배수를 제공하면서도 약간의 수분을 유지합니다.' |
|
- source_sentence: 'query: Why did Kerouac not publish "The Sea Is My Brother"?' |
|
sentences: |
|
- 'query: 미국 독립 전쟁에서 게릴라 전쟁이 사용된 가장 유명한 사례는 무엇입니까?' |
|
- 'query: Who were some artists who died in October 1996?' |
|
- 'query: Hypericum terrae-firmae는 어떤 형태의 식물인가요?' |
|
- source_sentence: 'query: 조지 매시 터널의 길이는 얼마나 되나요?' |
|
sentences: |
|
- 'passage: 주식 투자를 위한 목표와 투자 대상을 정했다면, 이제 매매 전략을 개발해야 합니다. 재정 목표, 시간 제약, 위험 감수 수준에 |
|
맞는 전략을 세우세요. 단기 매매(데이트레이딩 또는 스윙 트레이딩)에 집중할지, 장기 투자(장기 보유 전략)에 집중할지 결정하십시오. 또한 |
|
포지션 진입 및 청산을 위한 목표 가격을 설정하십시오. |
|
|
|
|
|
감정적인 의사 결정은 종종 투자 결과를 악화시킵니다. 시장 변동성 동안 감정을 관리하기 위한 규칙을 정하고 사전에 정의한 매매 전략을 고수하십시오.' |
|
- 'query: Why was the 1999 Le Mans Fuji 1000km race not part of the JGTC season?' |
|
- 'passage: Tibia is a popular massively multiplayer online role-playing game (MMORPG) |
|
with various aspects that allow players to engage in different playstyles. One |
|
such playstyle is being a player killer (PK), where you hunt other characters |
|
instead of monsters. This guide will provide an in-depth explanation of becoming |
|
a PK in Tibia while ensuring clarity, practicality, and adherence to essential |
|
tips and guidelines. Please note that engaging in PKing can have consequences, |
|
including losing items upon death or account bans if rules are violated. Proceed |
|
at your own risk. |
|
|
|
|
|
The first step in becoming a PK in Tibia is to understand the basics of the game. |
|
Learn about character creation, skills, spells, equipment, and the user interface. |
|
Spend time exploring the game''s features, completing quests, and understanding |
|
its mechanics. A solid foundation in these areas will be beneficial when transitioning |
|
into PKing.' |
|
- source_sentence: 'query: 오스트리아의 총리는 어떻게 선출되나요?' |
|
sentences: |
|
- 'query: What battle did Castle get injured in?' |
|
- 'query: How many stories did the first building of St. Stephen''s Church have?' |
|
- 'passage: 가족 구성원을 내쫓는 것은 모든 당사자에게 심각한 감정적 결과를 초래할 수 있는 극단적인 조치이므로 신중하게 고려해야 합니다. |
|
이 가이드는 내쫓는 것이 무엇을 의미하는지 그리고 상황에 맞는 선택인지 이해하는 데 도움이 되는 포괄적인 단계를 제공합니다. 이 기사는 가족 |
|
구성원을 내쫓는 것을 옹호하거나 권장하지 않으며 사용자의 요청에 따라 정보를 제공합니다. |
|
|
|
|
|
내쫓는 것을 고려하는 첫 번째 단계는 가족 구성원을 내쫓고 싶은 이유를 평가하는 것입니다. 일반적인 이유로는 지속적인 학대(신체적, 정신적 |
|
또는 재정적), 중독 문제, 심각한 성격 차이 또는 유해한 행동이 있습니다. 치료사, 상담사 또는 신뢰할 수 있는 종교 지도자와 같은 전문가의 |
|
조언을 구하여 대안적인 관점과 대처 전략을 제공받을 수 있습니다. 내쫓는 것이 근본적인 문제를 해결하지 못하고 오히려 악화될 수 있다는 점을 |
|
기억하십시오.' |
|
model-index: |
|
- name: SentenceTransformer based on lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter11-final |
|
results: |
|
- task: |
|
type: semantic-similarity |
|
name: Semantic Similarity |
|
dataset: |
|
name: sts dev |
|
type: sts-dev |
|
metrics: |
|
- type: pearson_cosine |
|
value: 0.7842546893200867 |
|
name: Pearson Cosine |
|
- type: spearman_cosine |
|
value: 0.7917116487346876 |
|
name: Spearman Cosine |
|
- type: pearson_manhattan |
|
value: 0.7898916079720373 |
|
name: Pearson Manhattan |
|
- type: spearman_manhattan |
|
value: 0.7910888301816695 |
|
name: Spearman Manhattan |
|
- type: pearson_euclidean |
|
value: 0.7904666983590452 |
|
name: Pearson Euclidean |
|
- type: spearman_euclidean |
|
value: 0.7917116487346876 |
|
name: Spearman Euclidean |
|
- type: pearson_dot |
|
value: 0.7842546935611145 |
|
name: Pearson Dot |
|
- type: spearman_dot |
|
value: 0.7917116487346876 |
|
name: Spearman Dot |
|
- type: pearson_max |
|
value: 0.7904666983590452 |
|
name: Pearson Max |
|
- type: spearman_max |
|
value: 0.7917116487346876 |
|
name: Spearman Max |
|
- task: |
|
type: information-retrieval |
|
name: Information Retrieval |
|
dataset: |
|
name: Ko StrategyQA dev |
|
type: Ko-StrategyQA-dev |
|
metrics: |
|
- type: cosine_accuracy@1 |
|
value: 0.47635135135135137 |
|
name: Cosine Accuracy@1 |
|
- type: cosine_accuracy@3 |
|
value: 0.6233108108108109 |
|
name: Cosine Accuracy@3 |
|
- type: cosine_accuracy@5 |
|
value: 0.6858108108108109 |
|
name: Cosine Accuracy@5 |
|
- type: cosine_accuracy@10 |
|
value: 0.7263513513513513 |
|
name: Cosine Accuracy@10 |
|
- type: cosine_precision@1 |
|
value: 0.47635135135135137 |
|
name: Cosine Precision@1 |
|
- type: cosine_precision@3 |
|
value: 0.27815315315315314 |
|
name: Cosine Precision@3 |
|
- type: cosine_precision@5 |
|
value: 0.19797297297297298 |
|
name: Cosine Precision@5 |
|
- type: cosine_precision@10 |
|
value: 0.11233108108108109 |
|
name: Cosine Precision@10 |
|
- type: cosine_recall@1 |
|
value: 0.30356338481338485 |
|
name: Cosine Recall@1 |
|
- type: cosine_recall@3 |
|
value: 0.48588320463320456 |
|
name: Cosine Recall@3 |
|
- type: cosine_recall@5 |
|
value: 0.5615307271557272 |
|
name: Cosine Recall@5 |
|
- type: cosine_recall@10 |
|
value: 0.6275217181467181 |
|
name: Cosine Recall@10 |
|
- type: cosine_ndcg@10 |
|
value: 0.5339919109688704 |
|
name: Cosine Ndcg@10 |
|
- type: cosine_mrr@10 |
|
value: 0.5620790433290432 |
|
name: Cosine Mrr@10 |
|
- type: cosine_map@100 |
|
value: 0.48197989758767446 |
|
name: Cosine Map@100 |
|
- type: dot_accuracy@1 |
|
value: 0.47635135135135137 |
|
name: Dot Accuracy@1 |
|
- type: dot_accuracy@3 |
|
value: 0.6233108108108109 |
|
name: Dot Accuracy@3 |
|
- type: dot_accuracy@5 |
|
value: 0.6858108108108109 |
|
name: Dot Accuracy@5 |
|
- type: dot_accuracy@10 |
|
value: 0.7263513513513513 |
|
name: Dot Accuracy@10 |
|
- type: dot_precision@1 |
|
value: 0.47635135135135137 |
|
name: Dot Precision@1 |
|
- type: dot_precision@3 |
|
value: 0.27815315315315314 |
|
name: Dot Precision@3 |
|
- type: dot_precision@5 |
|
value: 0.19797297297297298 |
|
name: Dot Precision@5 |
|
- type: dot_precision@10 |
|
value: 0.11233108108108109 |
|
name: Dot Precision@10 |
|
- type: dot_recall@1 |
|
value: 0.30356338481338485 |
|
name: Dot Recall@1 |
|
- type: dot_recall@3 |
|
value: 0.48588320463320456 |
|
name: Dot Recall@3 |
|
- type: dot_recall@5 |
|
value: 0.5615307271557272 |
|
name: Dot Recall@5 |
|
- type: dot_recall@10 |
|
value: 0.6275217181467181 |
|
name: Dot Recall@10 |
|
- type: dot_ndcg@10 |
|
value: 0.5339919109688704 |
|
name: Dot Ndcg@10 |
|
- type: dot_mrr@10 |
|
value: 0.5620790433290432 |
|
name: Dot Mrr@10 |
|
- type: dot_map@100 |
|
value: 0.48197989758767446 |
|
name: Dot Map@100 |
|
- task: |
|
type: semantic-similarity |
|
name: Semantic Similarity |
|
dataset: |
|
name: sts test |
|
type: sts-test |
|
metrics: |
|
- type: pearson_cosine |
|
value: 0.7120621529416492 |
|
name: Pearson Cosine |
|
- type: spearman_cosine |
|
value: 0.714291604968088 |
|
name: Spearman Cosine |
|
- type: pearson_manhattan |
|
value: 0.7236162123367763 |
|
name: Pearson Manhattan |
|
- type: spearman_manhattan |
|
value: 0.7139844146464364 |
|
name: Spearman Manhattan |
|
- type: pearson_euclidean |
|
value: 0.7236232398995261 |
|
name: Pearson Euclidean |
|
- type: spearman_euclidean |
|
value: 0.714291604968088 |
|
name: Spearman Euclidean |
|
- type: pearson_dot |
|
value: 0.7120621594492764 |
|
name: Pearson Dot |
|
- type: spearman_dot |
|
value: 0.714291604968088 |
|
name: Spearman Dot |
|
- type: pearson_max |
|
value: 0.7236232398995261 |
|
name: Pearson Max |
|
- type: spearman_max |
|
value: 0.714291604968088 |
|
name: Spearman Max |
|
--- |
|
|
|
# SentenceTransformer based on lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter11-final |
|
|
|
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter11-final](https://huggingface.co/lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter11-final). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. |
|
|
|
## Model Details |
|
|
|
### Model Description |
|
- **Model Type:** Sentence Transformer |
|
- **Base model:** [lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter11-final](https://huggingface.co/lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter11-final) <!-- at revision f1d98436c043d05c54345b904d8db2b3279f4f5f --> |
|
- **Maximum Sequence Length:** 512 tokens |
|
- **Output Dimensionality:** 384 dimensions
|
- **Similarity Function:** Cosine Similarity |
|
<!-- - **Training Dataset:** Unknown --> |
|
<!-- - **Language:** Unknown --> |
|
<!-- - **License:** Unknown --> |
|
|
|
### Model Sources |
|
|
|
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net) |
|
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers) |
|
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers) |
|
|
|
### Full Model Architecture |
|
|
|
``` |
|
SentenceTransformer( |
|
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: XLMRobertaModel |
|
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) |
|
(2): Normalize() |
|
) |
|
``` |
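
The pooling configuration above mean-pools the token embeddings (`pooling_mode_mean_tokens: True`) and the final `Normalize()` module L2-normalizes the result, which is why the `dot_*` and `cosine_*` metrics reported later coincide: on unit vectors the dot product *is* the cosine similarity. A minimal NumPy sketch of those two modules, using random token embeddings as stand-ins for the transformer output:

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    # Average token embeddings, ignoring padding positions (pooling_mode_mean_tokens=True).
    mask = attention_mask[..., None].astype(token_embeddings.dtype)  # (batch, seq, 1)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)
    return summed / counts

def l2_normalize(embeddings: np.ndarray) -> np.ndarray:
    # The Normalize() module: unit-length vectors, so dot product == cosine similarity.
    return embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)

rng = np.random.default_rng(0)
tokens = rng.normal(size=(2, 5, 384))              # (batch, seq_len, hidden=384)
mask = np.array([[1, 1, 1, 0, 0], [1, 1, 1, 1, 1]])
emb = l2_normalize(mean_pool(tokens, mask))
print(emb.shape)                                   # (2, 384)
print(np.allclose(np.linalg.norm(emb, axis=1), 1.0))  # True
```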
|
|
|
## Usage |
|
|
|
### Direct Usage (Sentence Transformers) |
|
|
|
First install the Sentence Transformers library: |
|
|
|
```bash |
|
pip install -U sentence-transformers |
|
``` |
|
|
|
Then you can load this model and run inference. |
|
```python |
|
from sentence_transformers import SentenceTransformer |
|
|
|
# Download from the 🤗 Hub |
|
model = SentenceTransformer("lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter12-final") |
|
# Run inference |
|
sentences = [ |
|
'query: 오스트리아의 총리는 어떻게 선출되나요?', |
|
'query: What battle did Castle get injured in?', |
|
"query: How many stories did the first building of St. Stephen's Church have?", |
|
] |
|
embeddings = model.encode(sentences) |
|
print(embeddings.shape) |
|
# (3, 384)
|
|
|
# Get the similarity scores for the embeddings |
|
similarities = model.similarity(embeddings, embeddings) |
|
print(similarities.shape) |
|
# torch.Size([3, 3])
|
``` |
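
For retrieval, the widget examples above suggest the training data prefixes texts with `query: ` and `passage: `, so inference inputs should likely follow the same format. The ranking step itself is a dot product, since this model's embeddings are L2-normalized. A sketch of that step with synthetic unit vectors standing in for model outputs (so it runs without downloading the model); the example texts in the comments are taken from this card's widget:

```python
import numpy as np

def rank_passages(query_emb: np.ndarray, passage_embs: np.ndarray) -> np.ndarray:
    # Embeddings from this model are L2-normalized, so the dot product is the cosine score.
    scores = passage_embs @ query_emb
    return np.argsort(-scores)  # best-first passage indices

# With the real model you would embed prefixed texts, mirroring the training format:
#   query_emb    = model.encode("query: 조지 매시 터널의 길이는 얼마나 되나요?")
#   passage_embs = model.encode(["passage: ...", "passage: ...", ...])
# Here, synthetic unit vectors stand in for those embeddings:
q = np.array([1.0, 0.0, 0.0])
passages = np.array([
    [0.0, 1.0, 0.0],   # orthogonal to the query
    [0.8, 0.6, 0.0],   # closest to the query
    [0.6, 0.8, 0.0],
])
print(rank_passages(q, passages))  # [1 2 0]
```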
|
|
|
<!-- |
|
### Direct Usage (Transformers) |
|
|
|
<details><summary>Click to see the direct usage in Transformers</summary> |
|
|
|
</details> |
|
--> |
|
|
|
<!-- |
|
### Downstream Usage (Sentence Transformers) |
|
|
|
You can finetune this model on your own dataset. |
|
|
|
<details><summary>Click to expand</summary> |
|
|
|
</details> |
|
--> |
|
|
|
<!-- |
|
### Out-of-Scope Use |
|
|
|
*List how the model may foreseeably be misused and address what users ought not to do with the model.* |
|
--> |
|
|
|
## Evaluation |
|
|
|
### Metrics |
|
|
|
#### Semantic Similarity |
|
* Dataset: `sts-dev` |
|
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator) |
|
|
|
| Metric | Value | |
|
|:--------------------|:-----------| |
|
| pearson_cosine | 0.7843 | |
|
| **spearman_cosine** | **0.7917** | |
|
| pearson_manhattan | 0.7899 | |
|
| spearman_manhattan | 0.7911 | |
|
| pearson_euclidean | 0.7905 | |
|
| spearman_euclidean | 0.7917 | |
|
| pearson_dot | 0.7843 | |
|
| spearman_dot | 0.7917 | |
|
| pearson_max | 0.7905 | |
|
| spearman_max | 0.7917 | |
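
The `pearson_*` and `spearman_*` values above measure how well the model's similarity scores correlate with gold similarity labels; Spearman is simply Pearson computed on the rank-transformed values. A self-contained sketch of both correlations (toy scores, not the actual STS data; ties are not handled):

```python
import numpy as np

def pearson(x, y) -> float:
    x, y = np.asarray(x, float), np.asarray(y, float)
    return float(np.corrcoef(x, y)[0, 1])

def spearman(x, y) -> float:
    # Spearman = Pearson correlation of the ranks (no tie correction in this sketch).
    to_ranks = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(to_ranks(x), to_ranks(y))

gold = [0.0, 1.0, 2.0, 3.0, 4.0]   # hypothetical similarity labels
pred = [0.1, 0.4, 0.2, 0.8, 0.9]   # hypothetical cosine scores
print(round(pearson(gold, pred), 4))
print(round(spearman(gold, pred), 4))  # 0.9 — one swapped pair in the ranking
```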
|
|
|
#### Information Retrieval |
|
* Dataset: `Ko-StrategyQA-dev` |
|
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) |
|
|
|
| Metric | Value | |
|
|:--------------------|:----------| |
|
| cosine_accuracy@1 | 0.4764 | |
|
| cosine_accuracy@3 | 0.6233 | |
|
| cosine_accuracy@5 | 0.6858 | |
|
| cosine_accuracy@10 | 0.7264 | |
|
| cosine_precision@1 | 0.4764 | |
|
| cosine_precision@3 | 0.2782 | |
|
| cosine_precision@5 | 0.198 | |
|
| cosine_precision@10 | 0.1123 | |
|
| cosine_recall@1 | 0.3036 | |
|
| cosine_recall@3 | 0.4859 | |
|
| cosine_recall@5 | 0.5615 | |
|
| cosine_recall@10 | 0.6275 | |
|
| cosine_ndcg@10 | 0.534 | |
|
| cosine_mrr@10 | 0.5621 | |
|
| **cosine_map@100** | **0.482** | |
|
| dot_accuracy@1 | 0.4764 | |
|
| dot_accuracy@3 | 0.6233 | |
|
| dot_accuracy@5 | 0.6858 | |
|
| dot_accuracy@10 | 0.7264 | |
|
| dot_precision@1 | 0.4764 | |
|
| dot_precision@3 | 0.2782 | |
|
| dot_precision@5 | 0.198 | |
|
| dot_precision@10 | 0.1123 | |
|
| dot_recall@1 | 0.3036 | |
|
| dot_recall@3 | 0.4859 | |
|
| dot_recall@5 | 0.5615 | |
|
| dot_recall@10 | 0.6275 | |
|
| dot_ndcg@10 | 0.534 | |
|
| dot_mrr@10 | 0.5621 | |
|
| dot_map@100 | 0.482 | |
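
The table's dataset-level numbers are averages of per-query values. A sketch of how `accuracy@k`, `precision@k`, `recall@k`, and the reciprocal rank behind `mrr@k` are computed for a single query, given a best-first ranking (document ids are hypothetical):

```python
def ir_metrics(ranked_ids, relevant_ids, k):
    """Per-query @k metrics for one best-first ranking."""
    top_k = ranked_ids[:k]
    hits = [doc for doc in top_k if doc in relevant_ids]
    accuracy = 1.0 if hits else 0.0          # any relevant doc in the top k?
    precision = len(hits) / k
    recall = len(hits) / len(relevant_ids)
    rr = 0.0                                 # reciprocal rank (averaged into MRR@k)
    for rank, doc in enumerate(top_k, start=1):
        if doc in relevant_ids:
            rr = 1.0 / rank
            break
    return accuracy, precision, recall, rr

ranked = ["d3", "d1", "d7", "d2", "d5"]      # hypothetical retrieval order
relevant = {"d1", "d2"}
print(ir_metrics(ranked, relevant, k=5))     # (1.0, 0.4, 1.0, 0.5)
```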
|
|
|
#### Semantic Similarity |
|
* Dataset: `sts-test` |
|
* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator) |
|
|
|
| Metric | Value | |
|
|:--------------------|:-----------| |
|
| pearson_cosine | 0.7121 | |
|
| **spearman_cosine** | **0.7143** | |
|
| pearson_manhattan | 0.7236 | |
|
| spearman_manhattan | 0.714 | |
|
| pearson_euclidean | 0.7236 | |
|
| spearman_euclidean | 0.7143 | |
|
| pearson_dot | 0.7121 | |
|
| spearman_dot | 0.7143 | |
|
| pearson_max | 0.7236 | |
|
| spearman_max | 0.7143 | |
|
|
|
<!-- |
|
## Bias, Risks and Limitations |
|
|
|
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* |
|
--> |
|
|
|
<!-- |
|
### Recommendations |
|
|
|
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* |
|
--> |
|
|
|
## Training Details |
|
|
|
### Training Hyperparameters |
|
#### Non-Default Hyperparameters |
|
|
|
- `eval_strategy`: steps |
|
- `per_device_train_batch_size`: 52 |
|
- `per_device_eval_batch_size`: 4 |
|
- `learning_rate`: 0.0001 |
|
- `num_train_epochs`: 1 |
|
- `warmup_ratio`: 0.05 |
|
- `fp16`: True |
|
- `push_to_hub`: True |
|
- `hub_model_id`: lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter12 |
|
- `hub_strategy`: checkpoint |
|
- `hub_private_repo`: True |
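
The `loss:MSELoss` tag and the iterative model naming suggest embedding distillation: the student is trained so its sentence embeddings match a teacher's, and the per-step training losses of ~0.0007 in the logs below are this mean squared error. A minimal NumPy sketch of the objective only (the teacher embeddings here are random stand-ins, not from any real model):

```python
import numpy as np

def mse_distillation_loss(student_embs: np.ndarray, teacher_embs: np.ndarray) -> float:
    # MSELoss-style objective: mean squared error between student and teacher embeddings.
    return float(np.mean((student_embs - teacher_embs) ** 2))

rng = np.random.default_rng(0)
teacher = rng.normal(size=(52, 384))   # one batch (per_device_train_batch_size=52, dim=384)
student = teacher + rng.normal(scale=0.03, size=teacher.shape)  # a nearly converged student
print(round(mse_distillation_loss(student, teacher), 4))        # ~0.0009
```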
|
|
|
#### All Hyperparameters |
|
<details><summary>Click to expand</summary> |
|
|
|
- `overwrite_output_dir`: False |
|
- `do_predict`: False |
|
- `eval_strategy`: steps |
|
- `prediction_loss_only`: True |
|
- `per_device_train_batch_size`: 52 |
|
- `per_device_eval_batch_size`: 4 |
|
- `per_gpu_train_batch_size`: None |
|
- `per_gpu_eval_batch_size`: None |
|
- `gradient_accumulation_steps`: 1 |
|
- `eval_accumulation_steps`: None |
|
- `learning_rate`: 0.0001 |
|
- `weight_decay`: 0.0 |
|
- `adam_beta1`: 0.9 |
|
- `adam_beta2`: 0.999 |
|
- `adam_epsilon`: 1e-08 |
|
- `max_grad_norm`: 1.0 |
|
- `num_train_epochs`: 1 |
|
- `max_steps`: -1 |
|
- `lr_scheduler_type`: linear |
|
- `lr_scheduler_kwargs`: {} |
|
- `warmup_ratio`: 0.05 |
|
- `warmup_steps`: 0 |
|
- `log_level`: passive |
|
- `log_level_replica`: warning |
|
- `log_on_each_node`: True |
|
- `logging_nan_inf_filter`: True |
|
- `save_safetensors`: True |
|
- `save_on_each_node`: False |
|
- `save_only_model`: False |
|
- `restore_callback_states_from_checkpoint`: False |
|
- `no_cuda`: False |
|
- `use_cpu`: False |
|
- `use_mps_device`: False |
|
- `seed`: 42 |
|
- `data_seed`: None |
|
- `jit_mode_eval`: False |
|
- `use_ipex`: False |
|
- `bf16`: False |
|
- `fp16`: True |
|
- `fp16_opt_level`: O1 |
|
- `half_precision_backend`: auto |
|
- `bf16_full_eval`: False |
|
- `fp16_full_eval`: False |
|
- `tf32`: None |
|
- `local_rank`: 0 |
|
- `ddp_backend`: None |
|
- `tpu_num_cores`: None |
|
- `tpu_metrics_debug`: False |
|
- `debug`: [] |
|
- `dataloader_drop_last`: False |
|
- `dataloader_num_workers`: 0 |
|
- `dataloader_prefetch_factor`: None |
|
- `past_index`: -1 |
|
- `disable_tqdm`: False |
|
- `remove_unused_columns`: True |
|
- `label_names`: None |
|
- `load_best_model_at_end`: False |
|
- `ignore_data_skip`: False |
|
- `fsdp`: [] |
|
- `fsdp_min_num_params`: 0 |
|
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False} |
|
- `fsdp_transformer_layer_cls_to_wrap`: None |
|
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None} |
|
- `deepspeed`: None |
|
- `label_smoothing_factor`: 0.0 |
|
- `optim`: adamw_torch |
|
- `optim_args`: None |
|
- `adafactor`: False |
|
- `group_by_length`: False |
|
- `length_column_name`: length |
|
- `ddp_find_unused_parameters`: None |
|
- `ddp_bucket_cap_mb`: None |
|
- `ddp_broadcast_buffers`: False |
|
- `dataloader_pin_memory`: True |
|
- `dataloader_persistent_workers`: False |
|
- `skip_memory_metrics`: True |
|
- `use_legacy_prediction_loop`: False |
|
- `push_to_hub`: True |
|
- `resume_from_checkpoint`: None |
|
- `hub_model_id`: lemon-mint/mMiniLMv2-L12-H384-Distilled-Iter12 |
|
- `hub_strategy`: checkpoint |
|
- `hub_private_repo`: True |
|
- `hub_always_push`: False |
|
- `gradient_checkpointing`: False |
|
- `gradient_checkpointing_kwargs`: None |
|
- `include_inputs_for_metrics`: False |
|
- `eval_do_concat_batches`: True |
|
- `fp16_backend`: auto |
|
- `push_to_hub_model_id`: None |
|
- `push_to_hub_organization`: None |
|
- `mp_parameters`: |
|
- `auto_find_batch_size`: False |
|
- `full_determinism`: False |
|
- `torchdynamo`: None |
|
- `ray_scope`: last |
|
- `ddp_timeout`: 1800 |
|
- `torch_compile`: False |
|
- `torch_compile_backend`: None |
|
- `torch_compile_mode`: None |
|
- `dispatch_batches`: None |
|
- `split_batches`: None |
|
- `include_tokens_per_second`: False |
|
- `include_num_input_tokens_seen`: False |
|
- `neftune_noise_alpha`: None |
|
- `optim_target_modules`: None |
|
- `batch_eval_metrics`: False |
|
- `eval_on_start`: False |
|
- `batch_sampler`: batch_sampler |
|
- `multi_dataset_batch_sampler`: proportional |
|
|
|
</details> |
|
|
|
### Training Logs |
|
<details><summary>Click to expand</summary> |
|
|
|
| Epoch | Step | Training Loss | Validation Loss | Ko-StrategyQA-dev_cosine_map@100 | sts-dev_spearman_cosine | sts-test_spearman_cosine |
|
|:------:|:----:|:-------------:|:------:|:--------------------------------:|:-----------------------:|:------------------------:| |
|
| 0 | 0 | - | - | 0.4732 | 0.7833 | - | |
|
| 0.0011 | 10 | 0.0008 | - | - | - | - | |
|
| 0.0022 | 20 | 0.0007 | - | - | - | - | |
|
| 0.0032 | 30 | 0.0007 | - | - | - | - | |
|
| 0.0043 | 40 | 0.0007 | - | - | - | - | |
|
| 0.0054 | 50 | 0.0007 | - | - | - | - | |
|
| 0.0065 | 60 | 0.0007 | - | - | - | - | |
|
| 0.0076 | 70 | 0.0007 | - | - | - | - | |
|
| 0.0087 | 80 | 0.0007 | - | - | - | - | |
|
| 0.0097 | 90 | 0.0007 | - | - | - | - | |
|
| 0.0108 | 100 | 0.0007 | - | - | - | - | |
|
| 0.0119 | 110 | 0.0007 | - | - | - | - | |
|
| 0.0130 | 120 | 0.0007 | - | - | - | - | |
|
| 0.0141 | 130 | 0.0007 | - | - | - | - | |
|
| 0.0151 | 140 | 0.0007 | - | - | - | - | |
|
| 0.0162 | 150 | 0.0007 | - | - | - | - | |
|
| 0.0173 | 160 | 0.0007 | - | - | - | - | |
|
| 0.0184 | 170 | 0.0007 | - | - | - | - | |
|
| 0.0195 | 180 | 0.0007 | - | - | - | - | |
|
| 0.0206 | 190 | 0.0007 | - | - | - | - | |
|
| 0.0216 | 200 | 0.0007 | - | - | - | - | |
|
| 0.0227 | 210 | 0.0007 | - | - | - | - | |
|
| 0.0238 | 220 | 0.0007 | - | - | - | - | |
|
| 0.0249 | 230 | 0.0007 | - | - | - | - | |
|
| 0.0260 | 240 | 0.0007 | - | - | - | - | |
|
| 0.0270 | 250 | 0.0007 | - | - | - | - | |
|
| 0.0281 | 260 | 0.0007 | - | - | - | - | |
|
| 0.0292 | 270 | 0.0007 | - | - | - | - | |
|
| 0.0303 | 280 | 0.0007 | - | - | - | - | |
|
| 0.0314 | 290 | 0.0007 | - | - | - | - | |
|
| 0.0325 | 300 | 0.0007 | - | - | - | - | |
|
| 0.0335 | 310 | 0.0007 | - | - | - | - | |
|
| 0.0346 | 320 | 0.0007 | - | - | - | - | |
|
| 0.0357 | 330 | 0.0007 | - | - | - | - | |
|
| 0.0368 | 340 | 0.0007 | - | - | - | - | |
|
| 0.0379 | 350 | 0.0007 | - | - | - | - | |
|
| 0.0389 | 360 | 0.0007 | - | - | - | - | |
|
| 0.0400 | 370 | 0.0007 | - | - | - | - | |
|
| 0.0411 | 380 | 0.0007 | - | - | - | - | |
|
| 0.0422 | 390 | 0.0007 | - | - | - | - | |
|
| 0.0433 | 400 | 0.0007 | - | - | - | - | |
|
| 0.0444 | 410 | 0.0007 | - | - | - | - | |
|
| 0.0454 | 420 | 0.0007 | - | - | - | - | |
|
| 0.0465 | 430 | 0.0007 | - | - | - | - | |
|
| 0.0476 | 440 | 0.0007 | - | - | - | - | |
|
| 0.0487 | 450 | 0.0007 | - | - | - | - | |
|
| 0.0498 | 460 | 0.0007 | - | - | - | - | |
|
| 0.0508 | 470 | 0.0008 | - | - | - | - | |
|
| 0.0519 | 480 | 0.0007 | - | - | - | - | |
|
| 0.0530 | 490 | 0.0007 | - | - | - | - | |
|
| 0.0541 | 500 | 0.0007 | - | - | - | - | |
|
| 0.0552 | 510 | 0.0007 | - | - | - | - | |
|
| 0.0563 | 520 | 0.0007 | - | - | - | - | |
|
| 0.0573 | 530 | 0.0007 | - | - | - | - | |
|
| 0.0584 | 540 | 0.0007 | - | - | - | - | |
|
| 0.0595 | 550 | 0.0007 | - | - | - | - | |
|
| 0.0606 | 560 | 0.0007 | - | - | - | - | |
|
| 0.0617 | 570 | 0.0007 | - | - | - | - | |
|
| 0.0628 | 580 | 0.0007 | - | - | - | - | |
|
| 0.0638 | 590 | 0.0007 | - | - | - | - | |
|
| 0.0649 | 600 | 0.0007 | - | - | - | - | |
|
| 0.0660 | 610 | 0.0008 | - | - | - | - | |
|
| 0.0671 | 620 | 0.0007 | - | - | - | - | |
|
| 0.0682 | 630 | 0.0007 | - | - | - | - | |
|
| 0.0692 | 640 | 0.0007 | - | - | - | - | |
|
| 0.0703 | 650 | 0.0007 | - | - | - | - | |
|
| 0.0714 | 660 | 0.0007 | - | - | - | - | |
|
| 0.0725 | 670 | 0.0007 | - | - | - | - | |
|
| 0.0736 | 680 | 0.0007 | - | - | - | - | |
|
| 0.0747 | 690 | 0.0007 | - | - | - | - | |
|
| 0.0757 | 700 | 0.0007 | - | - | - | - | |
|
| 0.0768 | 710 | 0.0007 | - | - | - | - | |
|
| 0.0779 | 720 | 0.0007 | - | - | - | - | |
|
| 0.0790 | 730 | 0.0007 | - | - | - | - | |
|
| 0.0801 | 740 | 0.0007 | - | - | - | - | |
|
| 0.0811 | 750 | 0.0007 | - | - | - | - | |
|
| 0.0822 | 760 | 0.0007 | - | - | - | - | |
|
| 0.0833 | 770 | 0.0007 | - | - | - | - | |
|
| 0.0844 | 780 | 0.0007 | - | - | - | - | |
|
| 0.0855 | 790 | 0.0007 | - | - | - | - | |
|
| 0.0866 | 800 | 0.0007 | - | - | - | - | |
|
| 0.0876 | 810 | 0.0007 | - | - | - | - | |
|
| 0.0887 | 820 | 0.0007 | - | - | - | - | |
|
| 0.0898 | 830 | 0.0007 | - | - | - | - | |
|
| 0.0909 | 840 | 0.0007 | - | - | - | - | |
|
| 0.0920 | 850 | 0.0007 | - | - | - | - | |
|
| 0.0930 | 860 | 0.0007 | - | - | - | - | |
|
| 0.0941 | 870 | 0.0007 | - | - | - | - | |
|
| 0.0952 | 880 | 0.0007 | - | - | - | - | |
|
| 0.0963 | 890 | 0.0007 | - | - | - | - | |
|
| 0.0974 | 900 | 0.0007 | - | - | - | - | |
|
| 0.0985 | 910 | 0.0007 | - | - | - | - | |
|
| 0.0995 | 920 | 0.0007 | - | - | - | - | |
|
| 0.1006 | 930 | 0.0007 | - | - | - | - | |
|
| 0.1017 | 940 | 0.0007 | - | - | - | - | |
|
| 0.1028 | 950 | 0.0007 | - | - | - | - | |
|
| 0.1039 | 960 | 0.0007 | - | - | - | - | |
|
| 0.1049 | 970 | 0.0007 | - | - | - | - | |
|
| 0.1060 | 980 | 0.0007 | - | - | - | - | |
|
| 0.1071 | 990 | 0.0007 | - | - | - | - | |
|
| 0.1082 | 1000 | 0.0007 | 0.0007 | 0.4594 | 0.7819 | - | |
|
| 0.1093 | 1010 | 0.0007 | - | - | - | - | |
|
| 0.1104 | 1020 | 0.0007 | - | - | - | - | |
|
| 0.1114 | 1030 | 0.0007 | - | - | - | - | |
|
| 0.1125 | 1040 | 0.0007 | - | - | - | - | |
|
| 0.1136 | 1050 | 0.0007 | - | - | - | - | |
|
| 0.1147 | 1060 | 0.0007 | - | - | - | - | |
|
| 0.1158 | 1070 | 0.0007 | - | - | - | - | |
|
| 0.1168 | 1080 | 0.0007 | - | - | - | - | |
|
| 0.1179 | 1090 | 0.0007 | - | - | - | - | |
|
| 0.1190 | 1100 | 0.0007 | - | - | - | - | |
|
| 0.1201 | 1110 | 0.0007 | - | - | - | - | |
|
| 0.1212 | 1120 | 0.0007 | - | - | - | - | |
|
| 0.1223 | 1130 | 0.0008 | - | - | - | - | |
|
| 0.1233 | 1140 | 0.0007 | - | - | - | - | |
|
| 0.1244 | 1150 | 0.0007 | - | - | - | - | |
|
| 0.1255 | 1160 | 0.0007 | - | - | - | - | |
|
| 0.1266 | 1170 | 0.0007 | - | - | - | - | |
|
| 0.1277 | 1180 | 0.0007 | - | - | - | - | |
|
| 0.1287 | 1190 | 0.0007 | - | - | - | - | |
|
| 0.1298 | 1200 | 0.0007 | - | - | - | - | |
|
| 0.1309 | 1210 | 0.0007 | - | - | - | - | |
|
| 0.1320 | 1220 | 0.0007 | - | - | - | - | |
|
| 0.1331 | 1230 | 0.0007 | - | - | - | - | |
|
| 0.1342 | 1240 | 0.0007 | - | - | - | - | |
|
| 0.1352 | 1250 | 0.0007 | - | - | - | - | |
|
| 0.1363 | 1260 | 0.0007 | - | - | - | - | |
|
| 0.1374 | 1270 | 0.0007 | - | - | - | - | |
|
| 0.1385 | 1280 | 0.0007 | - | - | - | - | |
|
| 0.1396 | 1290 | 0.0007 | - | - | - | - | |
|
| 0.1406 | 1300 | 0.0007 | - | - | - | - | |
|
| 0.1417 | 1310 | 0.0007 | - | - | - | - | |
|
| 0.1428 | 1320 | 0.0007 | - | - | - | - | |
|
| 0.1439 | 1330 | 0.0007 | - | - | - | - | |
|
| 0.1450 | 1340 | 0.0007 | - | - | - | - | |
|
| 0.1461 | 1350 | 0.0007 | - | - | - | - | |
|
| 0.1471 | 1360 | 0.0007 | - | - | - | - | |
|
| 0.1482 | 1370 | 0.0007 | - | - | - | - | |
|
| 0.1493 | 1380 | 0.0007 | - | - | - | - | |
|
| 0.1504 | 1390 | 0.0007 | - | - | - | - | |
|
| 0.1515 | 1400 | 0.0007 | - | - | - | - | |
|
| 0.1525 | 1410 | 0.0007 | - | - | - | - | |
|
| 0.1536 | 1420 | 0.0007 | - | - | - | - | |
|
| 0.1547 | 1430 | 0.0007 | - | - | - | - | |
|
| 0.1558 | 1440 | 0.0007 | - | - | - | - | |
|
| 0.1569 | 1450 | 0.0007 | - | - | - | - | |
|
| 0.1580 | 1460 | 0.0007 | - | - | - | - | |
|
| 0.1590 | 1470 | 0.0007 | - | - | - | - | |
|
| 0.1601 | 1480 | 0.0007 | - | - | - | - | |
|
| 0.1612 | 1490 | 0.0007 | - | - | - | - | |
|
| 0.1623 | 1500 | 0.0007 | - | - | - | - | |
|
| 0.1634 | 1510 | 0.0007 | - | - | - | - | |
|
| 0.1644 | 1520 | 0.0007 | - | - | - | - | |
|
| 0.1655 | 1530 | 0.0007 | - | - | - | - | |
|
| 0.1666 | 1540 | 0.0007 | - | - | - | - | |
|
| 0.1677 | 1550 | 0.0007 | - | - | - | - | |
|
| 0.1688 | 1560 | 0.0007 | - | - | - | - | |
|
| 0.1699 | 1570 | 0.0007 | - | - | - | - | |
|
| 0.1709 | 1580 | 0.0007 | - | - | - | - | |
|
| 0.1720 | 1590 | 0.0007 | - | - | - | - | |
|
| 0.1731 | 1600 | 0.0007 | - | - | - | - | |
|
| 0.1742 | 1610 | 0.0007 | - | - | - | - | |
|
| 0.1753 | 1620 | 0.0007 | - | - | - | - | |
|
| 0.1763 | 1630 | 0.0007 | - | - | - | - | |
|
| 0.1774 | 1640 | 0.0007 | - | - | - | - | |
|
| 0.1785 | 1650 | 0.0007 | - | - | - | - | |
|
| 0.1796 | 1660 | 0.0007 | - | - | - | - | |
|
| 0.1807 | 1670 | 0.0007 | - | - | - | - | |
|
| 0.1818 | 1680 | 0.0007 | - | - | - | - | |
|
| 0.1828 | 1690 | 0.0007 | - | - | - | - | |
|
| 0.1839 | 1700 | 0.0007 | - | - | - | - | |
|
| 0.1850 | 1710 | 0.0007 | - | - | - | - | |
|
| 0.1861 | 1720 | 0.0007 | - | - | - | - | |
|
| 0.1872 | 1730 | 0.0007 | - | - | - | - | |
|
| 0.1883 | 1740 | 0.0007 | - | - | - | - | |
|
| 0.1893 | 1750 | 0.0007 | - | - | - | - | |
|
| 0.1904 | 1760 | 0.0007 | - | - | - | - | |
|
| 0.1915 | 1770 | 0.0007 | - | - | - | - | |
|
| 0.1926 | 1780 | 0.0007 | - | - | - | - | |
|
| 0.1937 | 1790 | 0.0007 | - | - | - | - | |
|
| 0.1947 | 1800 | 0.0007 | - | - | - | - | |
|
| 0.1958 | 1810 | 0.0007 | - | - | - | - | |
|
| 0.1969 | 1820 | 0.0007 | - | - | - | - | |
|
| 0.1980 | 1830 | 0.0007 | - | - | - | - | |
|
| 0.1991 | 1840 | 0.0007 | - | - | - | - | |
|
| 0.2002 | 1850 | 0.0007 | - | - | - | - | |
|
| 0.2012 | 1860 | 0.0007 | - | - | - | - | |
|
| 0.2023 | 1870 | 0.0007 | - | - | - | - | |
|
| 0.2034 | 1880 | 0.0007 | - | - | - | - | |
|
| 0.2045 | 1890 | 0.0007 | - | - | - | - | |
|
| 0.2056 | 1900 | 0.0007 | - | - | - | - | |
|
| 0.2066 | 1910 | 0.0007 | - | - | - | - | |
|
| 0.2077 | 1920 | 0.0007 | - | - | - | - | |
|
| 0.2088 | 1930 | 0.0007 | - | - | - | - | |
|
| 0.2099 | 1940 | 0.0007 | - | - | - | - | |
|
| 0.2110 | 1950 | 0.0007 | - | - | - | - | |
|
| 0.2121 | 1960 | 0.0007 | - | - | - | - | |
|
| 0.2131 | 1970 | 0.0007 | - | - | - | - | |
|
| 0.2142 | 1980 | 0.0007 | - | - | - | - | |
|
| 0.2153 | 1990 | 0.0007 | - | - | - | - | |
|
| 0.2164 | 2000 | 0.0007 | 0.0007 | 0.4659 | 0.7861 | - | |
|
| 0.2175 | 2010 | 0.0007 | - | - | - | - | |
|
| 0.2185 | 2020 | 0.0007 | - | - | - | - | |
|
| 0.2196 | 2030 | 0.0007 | - | - | - | - | |
|
| 0.2207 | 2040 | 0.0007 | - | - | - | - | |
|
| 0.2218 | 2050 | 0.0007 | - | - | - | - | |
|
| 0.2229 | 2060 | 0.0007 | - | - | - | - | |
|
| 0.2240 | 2070 | 0.0007 | - | - | - | - | |
|
| 0.2250 | 2080 | 0.0007 | - | - | - | - | |
|
| 0.2261 | 2090 | 0.0007 | - | - | - | - | |
|
| 0.2272 | 2100 | 0.0007 | - | - | - | - | |
|
| 0.2283 | 2110 | 0.0007 | - | - | - | - | |
|
| 0.2294 | 2120 | 0.0007 | - | - | - | - | |
|
| 0.2304 | 2130 | 0.0007 | - | - | - | - | |
|
| 0.2315 | 2140 | 0.0007 | - | - | - | - | |
|
| 0.2326 | 2150 | 0.0007 | - | - | - | - | |
|
| 0.2337 | 2160 | 0.0007 | - | - | - | - | |
|
| 0.2348 | 2170 | 0.0007 | - | - | - | - | |
|
| 0.2359 | 2180 | 0.0007 | - | - | - | - | |
|
| 0.2369 | 2190 | 0.0007 | - | - | - | - | |
|
| 0.2380 | 2200 | 0.0007 | - | - | - | - | |
|
| 0.2391 | 2210 | 0.0007 | - | - | - | - | |
|
| 0.2402 | 2220 | 0.0007 | - | - | - | - | |
|
| 0.2413 | 2230 | 0.0007 | - | - | - | - | |
|
| 0.2423 | 2240 | 0.0007 | - | - | - | - | |
|
| 0.2434 | 2250 | 0.0007 | - | - | - | - | |
|
| 0.2445 | 2260 | 0.0007 | - | - | - | - | |
|
| 0.2456 | 2270 | 0.0007 | - | - | - | - | |
|
| 0.2467 | 2280 | 0.0007 | - | - | - | - | |
|
| 0.2478 | 2290 | 0.0007 | - | - | - | - | |
|
| 0.2488 | 2300 | 0.0007 | - | - | - | - | |
|
| 0.2499 | 2310 | 0.0007 | - | - | - | - | |
|
| 0.2510 | 2320 | 0.0007 | - | - | - | - | |
|
| 0.2521 | 2330 | 0.0007 | - | - | - | - | |
|
| 0.2532 | 2340 | 0.0007 | - | - | - | - | |
|
| 0.2542 | 2350 | 0.0007 | - | - | - | - | |
|
| 0.2553 | 2360 | 0.0007 | - | - | - | - | |
|
| 0.2564 | 2370 | 0.0007 | - | - | - | - | |
|
| 0.2575 | 2380 | 0.0007 | - | - | - | - | |
|
| 0.2586 | 2390 | 0.0007 | - | - | - | - | |
|
| 0.2597 | 2400 | 0.0007 | - | - | - | - | |
|
| 0.2607 | 2410 | 0.0007 | - | - | - | - | |
|
| 0.2618 | 2420 | 0.0007 | - | - | - | - | |
|
| 0.2629 | 2430 | 0.0007 | - | - | - | - | |
|
| 0.2640 | 2440 | 0.0007 | - | - | - | - | |
|
| 0.2651 | 2450 | 0.0007 | - | - | - | - | |
|
| 0.2661 | 2460 | 0.0007 | - | - | - | - | |
|
| 0.2672 | 2470 | 0.0007 | - | - | - | - | |
|
| 0.2683 | 2480 | 0.0007 | - | - | - | - | |
|
| 0.2694 | 2490 | 0.0007 | - | - | - | - | |
|
| 0.2705 | 2500 | 0.0007 | - | - | - | - | |
|
| 0.2716 | 2510 | 0.0007 | - | - | - | - | |
|
| 0.2726 | 2520 | 0.0007 | - | - | - | - | |
|
| 0.2737 | 2530 | 0.0007 | - | - | - | - | |
|
| 0.2748 | 2540 | 0.0007 | - | - | - | - | |
|
| 0.2759 | 2550 | 0.0007 | - | - | - | - | |
|
| 0.2770 | 2560 | 0.0007 | - | - | - | - | |
|
| 0.2780 | 2570 | 0.0007 | - | - | - | - | |
|
| 0.2791 | 2580 | 0.0007 | - | - | - | - | |
|
| 0.2802 | 2590 | 0.0007 | - | - | - | - | |
|
| 0.2813 | 2600 | 0.0007 | - | - | - | - | |
|
| 0.2824 | 2610 | 0.0007 | - | - | - | - | |
|
| 0.2835 | 2620 | 0.0007 | - | - | - | - | |
|
| 0.2845 | 2630 | 0.0007 | - | - | - | - | |
|
| 0.2856 | 2640 | 0.0007 | - | - | - | - | |
|
| 0.2867 | 2650 | 0.0007 | - | - | - | - | |
|
| 0.2878 | 2660 | 0.0007 | - | - | - | - | |
|
| 0.2889 | 2670 | 0.0007 | - | - | - | - | |
|
| 0.2899 | 2680 | 0.0007 | - | - | - | - | |
|
| 0.2910 | 2690 | 0.0007 | - | - | - | - | |
|
| 0.2921 | 2700 | 0.0007 | - | - | - | - | |
|
| 0.2932 | 2710 | 0.0007 | - | - | - | - | |
|
| 0.2943 | 2720 | 0.0007 | - | - | - | - | |
|
| 0.2954 | 2730 | 0.0007 | - | - | - | - | |
|
| 0.2964 | 2740 | 0.0007 | - | - | - | - | |
|
| 0.2975 | 2750 | 0.0007 | - | - | - | - | |
|
| 0.2986 | 2760 | 0.0007 | - | - | - | - | |
|
| 0.2997 | 2770 | 0.0007 | - | - | - | - | |
|
| 0.3008 | 2780 | 0.0007 | - | - | - | - | |
|
| 0.3019 | 2790 | 0.0007 | - | - | - | - | |
|
| 0.3029 | 2800 | 0.0007 | - | - | - | - | |
|
| 0.3040 | 2810 | 0.0007 | - | - | - | - | |
|
| 0.3051 | 2820 | 0.0007 | - | - | - | - | |
|
| 0.3062 | 2830 | 0.0007 | - | - | - | - | |
|
| 0.3073 | 2840 | 0.0007 | - | - | - | - | |
|
| 0.3083 | 2850 | 0.0007 | - | - | - | - | |
|
| 0.3094 | 2860 | 0.0007 | - | - | - | - | |
|
| 0.3105 | 2870 | 0.0007 | - | - | - | - | |
|
| 0.3116 | 2880 | 0.0007 | - | - | - | - | |
|
| 0.3127 | 2890 | 0.0007 | - | - | - | - | |
|
| 0.3138 | 2900 | 0.0007 | - | - | - | - | |
|
| 0.3148 | 2910 | 0.0007 | - | - | - | - | |
|
| 0.3159 | 2920 | 0.0007 | - | - | - | - | |
|
| 0.3170 | 2930 | 0.0007 | - | - | - | - | |
|
| 0.3181 | 2940 | 0.0007 | - | - | - | - | |
|
| 0.3192 | 2950 | 0.0007 | - | - | - | - | |
|
| 0.3202 | 2960 | 0.0007 | - | - | - | - | |
|
| 0.3213 | 2970 | 0.0007 | - | - | - | - | |
|
| 0.3224 | 2980 | 0.0007 | - | - | - | - | |
|
| 0.3235 | 2990 | 0.0007 | - | - | - | - | |
|
| 0.3246 | 3000 | 0.0007 | 0.0007 | 0.4743 | 0.7858 | - | |
|
| 0.3257 | 3010 | 0.0007 | - | - | - | - | |
|
| 0.3267 | 3020 | 0.0007 | - | - | - | - | |
|
| 0.3278 | 3030 | 0.0007 | - | - | - | - | |
|
| 0.3289 | 3040 | 0.0007 | - | - | - | - | |
|
| 0.3300 | 3050 | 0.0007 | - | - | - | - | |
|
| 0.3311 | 3060 | 0.0007 | - | - | - | - | |
|
| 0.3321 | 3070 | 0.0007 | - | - | - | - | |
|
| 0.3332 | 3080 | 0.0007 | - | - | - | - | |
|
| 0.3343 | 3090 | 0.0007 | - | - | - | - | |
|
| 0.3354 | 3100 | 0.0007 | - | - | - | - | |
|
| 0.3365 | 3110 | 0.0007 | - | - | - | - | |
|
| 0.3376 | 3120 | 0.0007 | - | - | - | - | |
|
| 0.3386 | 3130 | 0.0007 | - | - | - | - | |
|
| 0.3397 | 3140 | 0.0007 | - | - | - | - | |
|
| 0.3408 | 3150 | 0.0007 | - | - | - | - | |
|
| 0.3419 | 3160 | 0.0007 | - | - | - | - | |
|
| 0.3430 | 3170 | 0.0007 | - | - | - | - | |
|
| 0.3440 | 3180 | 0.0007 | - | - | - | - | |
|
| 0.3451 | 3190 | 0.0007 | - | - | - | - | |
|
| 0.3462 | 3200 | 0.0007 | - | - | - | - | |
|
| 0.3473 | 3210 | 0.0007 | - | - | - | - | |
|
| 0.3484 | 3220 | 0.0007 | - | - | - | - | |
|
| 0.3495 | 3230 | 0.0007 | - | - | - | - | |
|
| 0.3505 | 3240 | 0.0007 | - | - | - | - | |
|
| 0.3516 | 3250 | 0.0007 | - | - | - | - | |
|
| 0.3527 | 3260 | 0.0007 | - | - | - | - | |
|
| 0.3538 | 3270 | 0.0007 | - | - | - | - | |
|
| 0.3549 | 3280 | 0.0007 | - | - | - | - | |
|
| 0.3559 | 3290 | 0.0007 | - | - | - | - | |
|
| 0.3570 | 3300 | 0.0007 | - | - | - | - | |
|
| 0.3581 | 3310 | 0.0007 | - | - | - | - | |
|
| 0.3592 | 3320 | 0.0007 | - | - | - | - | |
|
| 0.3603 | 3330 | 0.0007 | - | - | - | - | |
|
| 0.3614 | 3340 | 0.0007 | - | - | - | - | |
|
| 0.3624 | 3350 | 0.0007 | - | - | - | - | |
|
| 0.3635 | 3360 | 0.0007 | - | - | - | - | |
|
| 0.3646 | 3370 | 0.0007 | - | - | - | - | |
|
| 0.3657 | 3380 | 0.0007 | - | - | - | - | |
|
| 0.3668 | 3390 | 0.0007 | - | - | - | - | |
|
| 0.3678 | 3400 | 0.0007 | - | - | - | - | |
|
| 0.3689 | 3410 | 0.0007 | - | - | - | - | |
|
| 0.3700 | 3420 | 0.0007 | - | - | - | - | |
|
| 0.3711 | 3430 | 0.0007 | - | - | - | - | |
|
| 0.3722 | 3440 | 0.0007 | - | - | - | - | |
|
| 0.3733 | 3450 | 0.0008 | - | - | - | - | |
|
| 0.3743 | 3460 | 0.0007 | - | - | - | - | |
|
| 0.3754 | 3470 | 0.0007 | - | - | - | - | |
|
| 0.3765 | 3480 | 0.0007 | - | - | - | - | |
|
| 0.3776 | 3490 | 0.0007 | - | - | - | - | |
|
| 0.3787 | 3500 | 0.0007 | - | - | - | - | |
|
| 0.3797 | 3510 | 0.0007 | - | - | - | - | |
|
| 0.3808 | 3520 | 0.0007 | - | - | - | - | |
|
| 0.3819 | 3530 | 0.0007 | - | - | - | - | |
|
| 0.3830 | 3540 | 0.0007 | - | - | - | - | |
|
| 0.3841 | 3550 | 0.0007 | - | - | - | - | |
|
| 0.3852 | 3560 | 0.0007 | - | - | - | - | |
|
| 0.3862 | 3570 | 0.0007 | - | - | - | - | |
|
| 0.3873 | 3580 | 0.0007 | - | - | - | - | |
|
| 0.3884 | 3590 | 0.0007 | - | - | - | - | |
|
| 0.3895 | 3600 | 0.0007 | - | - | - | - | |
|
| 0.3906 | 3610 | 0.0007 | - | - | - | - | |
|
| 0.3916 | 3620 | 0.0007 | - | - | - | - | |
|
| 0.3927 | 3630 | 0.0007 | - | - | - | - | |
|
| 0.3938 | 3640 | 0.0007 | - | - | - | - | |
|
| 0.3949 | 3650 | 0.0007 | - | - | - | - | |
|
| 0.3960 | 3660 | 0.0007 | - | - | - | - | |
|
| 0.3971 | 3670 | 0.0007 | - | - | - | - | |
|
| 0.3981 | 3680 | 0.0007 | - | - | - | - | |
|
| 0.3992 | 3690 | 0.0007 | - | - | - | - | |
|
| 0.4003 | 3700 | 0.0007 | - | - | - | - | |
|
| 0.4014 | 3710 | 0.0007 | - | - | - | - | |
|
| 0.4025 | 3720 | 0.0007 | - | - | - | - | |
|
| 0.4035 | 3730 | 0.0007 | - | - | - | - | |
|
| 0.4046 | 3740 | 0.0007 | - | - | - | - | |
|
| 0.4057 | 3750 | 0.0007 | - | - | - | - | |
|
| 0.4068 | 3760 | 0.0007 | - | - | - | - | |
|
| 0.4079 | 3770 | 0.0007 | - | - | - | - | |
|
| 0.4090 | 3780 | 0.0007 | - | - | - | - | |
|
| 0.4100 | 3790 | 0.0007 | - | - | - | - | |
|
| 0.4111 | 3800 | 0.0007 | - | - | - | - | |
|
| 0.4122 | 3810 | 0.0007 | - | - | - | - | |
|
| 0.4133 | 3820 | 0.0007 | - | - | - | - | |
|
| 0.4144 | 3830 | 0.0007 | - | - | - | - | |
|
| 0.4154 | 3840 | 0.0007 | - | - | - | - | |
|
| 0.4165 | 3850 | 0.0007 | - | - | - | - | |
|
| 0.4176 | 3860 | 0.0007 | - | - | - | - | |
|
| 0.4187 | 3870 | 0.0007 | - | - | - | - | |
|
| 0.4198 | 3880 | 0.0007 | - | - | - | - | |
|
| 0.4209 | 3890 | 0.0007 | - | - | - | - | |
|
| 0.4219 | 3900 | 0.0007 | - | - | - | - | |
|
| 0.4230 | 3910 | 0.0007 | - | - | - | - | |
|
| 0.4241 | 3920 | 0.0007 | - | - | - | - | |
|
| 0.4252 | 3930 | 0.0007 | - | - | - | - | |
|
| 0.4263 | 3940 | 0.0007 | - | - | - | - | |
|
| 0.4274 | 3950 | 0.0007 | - | - | - | - | |
|
| 0.4284 | 3960 | 0.0007 | - | - | - | - | |
|
| 0.4295 | 3970 | 0.0007 | - | - | - | - | |
|
| 0.4306 | 3980 | 0.0007 | - | - | - | - | |
|
| 0.4317 | 3990 | 0.0007 | - | - | - | - | |
|
| 0.4328 | 4000 | 0.0007 | 0.0007 | 0.4722 | 0.7869 | - | |
|
| 0.4338 | 4010 | 0.0007 | - | - | - | - | |
|
| 0.4349 | 4020 | 0.0007 | - | - | - | - | |
|
| 0.4360 | 4030 | 0.0007 | - | - | - | - | |
|
| 0.4371 | 4040 | 0.0007 | - | - | - | - | |
|
| 0.4382 | 4050 | 0.0007 | - | - | - | - | |
|
| 0.4393 | 4060 | 0.0007 | - | - | - | - | |
|
| 0.4403 | 4070 | 0.0007 | - | - | - | - | |
|
| 0.4414 | 4080 | 0.0007 | - | - | - | - | |
|
| 0.4425 | 4090 | 0.0007 | - | - | - | - | |
|
| 0.4436 | 4100 | 0.0007 | - | - | - | - | |
|
| 0.4447 | 4110 | 0.0007 | - | - | - | - | |
|
| 0.4457 | 4120 | 0.0007 | - | - | - | - | |
|
| 0.4468 | 4130 | 0.0007 | - | - | - | - | |
|
| 0.4479 | 4140 | 0.0007 | - | - | - | - | |
|
| 0.4490 | 4150 | 0.0007 | - | - | - | - | |
|
| 0.4501 | 4160 | 0.0007 | - | - | - | - | |
|
| 0.4512 | 4170 | 0.0007 | - | - | - | - | |
|
| 0.4522 | 4180 | 0.0007 | - | - | - | - | |
|
| 0.4533 | 4190 | 0.0007 | - | - | - | - | |
|
| 0.4544 | 4200 | 0.0007 | - | - | - | - | |
|
| 0.4555 | 4210 | 0.0007 | - | - | - | - | |
|
| 0.4566 | 4220 | 0.0007 | - | - | - | - | |
|
| 0.4576 | 4230 | 0.0007 | - | - | - | - | |
|
| 0.4587 | 4240 | 0.0007 | - | - | - | - | |
|
| 0.4598 | 4250 | 0.0007 | - | - | - | - | |
|
| 0.4609 | 4260 | 0.0007 | - | - | - | - | |
|
| 0.4620 | 4270 | 0.0007 | - | - | - | - | |
|
| 0.4631 | 4280 | 0.0007 | - | - | - | - | |
|
| 0.4641 | 4290 | 0.0007 | - | - | - | - | |
|
| 0.4652 | 4300 | 0.0007 | - | - | - | - | |
|
| 0.4663 | 4310 | 0.0007 | - | - | - | - | |
|
| 0.4674 | 4320 | 0.0007 | - | - | - | - | |
|
| 0.4685 | 4330 | 0.0007 | - | - | - | - | |
|
| 0.4695 | 4340 | 0.0007 | - | - | - | - | |
|
| 0.4706 | 4350 | 0.0007 | - | - | - | - | |
|
| 0.4717 | 4360 | 0.0007 | - | - | - | - | |
|
| 0.4728 | 4370 | 0.0007 | - | - | - | - | |
|
| 0.4739 | 4380 | 0.0007 | - | - | - | - | |
|
| 0.4750 | 4390 | 0.0007 | - | - | - | - | |
|
| 0.4760 | 4400 | 0.0007 | - | - | - | - | |
|
| 0.4771 | 4410 | 0.0007 | - | - | - | - | |
|
| 0.4782 | 4420 | 0.0007 | - | - | - | - | |
|
| 0.4793 | 4430 | 0.0007 | - | - | - | - | |
|
| 0.4804 | 4440 | 0.0007 | - | - | - | - | |
|
| 0.4814 | 4450 | 0.0007 | - | - | - | - | |
|
| 0.4825 | 4460 | 0.0007 | - | - | - | - | |
|
| 0.4836 | 4470 | 0.0007 | - | - | - | - | |
|
| 0.4847 | 4480 | 0.0007 | - | - | - | - | |
|
| 0.4858 | 4490 | 0.0007 | - | - | - | - | |
|
| 0.4869 | 4500 | 0.0007 | - | - | - | - | |
|
| 0.4879 | 4510 | 0.0007 | - | - | - | - | |
|
| 0.4890 | 4520 | 0.0007 | - | - | - | - | |
|
| 0.4901 | 4530 | 0.0007 | - | - | - | - | |
|
| 0.4912 | 4540 | 0.0007 | - | - | - | - | |
|
| 0.4923 | 4550 | 0.0007 | - | - | - | - | |
|
| 0.4933 | 4560 | 0.0007 | - | - | - | - | |
|
| 0.4944 | 4570 | 0.0007 | - | - | - | - | |
|
| 0.4955 | 4580 | 0.0007 | - | - | - | - | |
|
| 0.4966 | 4590 | 0.0007 | - | - | - | - | |
|
| 0.4977 | 4600 | 0.0007 | - | - | - | - | |
|
| 0.4988 | 4610 | 0.0007 | - | - | - | - | |
|
| 0.4998 | 4620 | 0.0007 | - | - | - | - | |
|
| 0.5009 | 4630 | 0.0007 | - | - | - | - | |
|
| 0.5020 | 4640 | 0.0007 | - | - | - | - | |
|
| 0.5031 | 4650 | 0.0007 | - | - | - | - | |
|
| 0.5042 | 4660 | 0.0007 | - | - | - | - | |
|
| 0.5052 | 4670 | 0.0007 | - | - | - | - | |
|
| 0.5063 | 4680 | 0.0007 | - | - | - | - | |
|
| 0.5074 | 4690 | 0.0007 | - | - | - | - | |
|
| 0.5085 | 4700 | 0.0007 | - | - | - | - | |
|
| 0.5096 | 4710 | 0.0007 | - | - | - | - | |
|
| 0.5107 | 4720 | 0.0007 | - | - | - | - | |
|
| 0.5117 | 4730 | 0.0007 | - | - | - | - | |
|
| 0.5128 | 4740 | 0.0007 | - | - | - | - | |
|
| 0.5139 | 4750 | 0.0007 | - | - | - | - | |
|
| 0.5150 | 4760 | 0.0007 | - | - | - | - | |
|
| 0.5161 | 4770 | 0.0007 | - | - | - | - | |
|
| 0.5171 | 4780 | 0.0007 | - | - | - | - | |
|
| 0.5182 | 4790 | 0.0007 | - | - | - | - | |
|
| 0.5193 | 4800 | 0.0007 | - | - | - | - | |
|
| 0.5204 | 4810 | 0.0007 | - | - | - | - | |
|
| 0.5215 | 4820 | 0.0007 | - | - | - | - | |
|
| 0.5226 | 4830 | 0.0007 | - | - | - | - | |
|
| 0.5236 | 4840 | 0.0007 | - | - | - | - | |
|
| 0.5247 | 4850 | 0.0007 | - | - | - | - | |
|
| 0.5258 | 4860 | 0.0007 | - | - | - | - | |
|
| 0.5269 | 4870 | 0.0007 | - | - | - | - | |
|
| 0.5280 | 4880 | 0.0007 | - | - | - | - | |
|
| 0.5290 | 4890 | 0.0007 | - | - | - | - | |
|
| 0.5301 | 4900 | 0.0007 | - | - | - | - | |
|
| 0.5312 | 4910 | 0.0007 | - | - | - | - | |
|
| 0.5323 | 4920 | 0.0007 | - | - | - | - | |
|
| 0.5334 | 4930 | 0.0007 | - | - | - | - | |
|
| 0.5345 | 4940 | 0.0007 | - | - | - | - | |
|
| 0.5355 | 4950 | 0.0007 | - | - | - | - | |
|
| 0.5366 | 4960 | 0.0007 | - | - | - | - | |
|
| 0.5377 | 4970 | 0.0007 | - | - | - | - | |
|
| 0.5388 | 4980 | 0.0007 | - | - | - | - | |
|
| 0.5399 | 4990 | 0.0007 | - | - | - | - | |
|
| 0.5409 | 5000 | 0.0007 | 0.0007 | 0.4719 | 0.7905 | - | |
|
| 0.5420 | 5010 | 0.0007 | - | - | - | - | |
|
| 0.5431 | 5020 | 0.0007 | - | - | - | - | |
|
| 0.5442 | 5030 | 0.0007 | - | - | - | - | |
|
| 0.5453 | 5040 | 0.0007 | - | - | - | - | |
|
| 0.5464 | 5050 | 0.0007 | - | - | - | - | |
|
| 0.5474 | 5060 | 0.0007 | - | - | - | - | |
|
| 0.5485 | 5070 | 0.0007 | - | - | - | - | |
|
| 0.5496 | 5080 | 0.0007 | - | - | - | - | |
|
| 0.5507 | 5090 | 0.0007 | - | - | - | - | |
|
| 0.5518 | 5100 | 0.0007 | - | - | - | - | |
|
| 0.5529 | 5110 | 0.0007 | - | - | - | - | |
|
| 0.5539 | 5120 | 0.0007 | - | - | - | - | |
|
| 0.5550 | 5130 | 0.0007 | - | - | - | - | |
|
| 0.5561 | 5140 | 0.0007 | - | - | - | - | |
|
| 0.5572 | 5150 | 0.0007 | - | - | - | - | |
|
| 0.5583 | 5160 | 0.0007 | - | - | - | - | |
|
| 0.5593 | 5170 | 0.0007 | - | - | - | - | |
|
| 0.5604 | 5180 | 0.0007 | - | - | - | - | |
|
| 0.5615 | 5190 | 0.0007 | - | - | - | - | |
|
| 0.5626 | 5200 | 0.0007 | - | - | - | - | |
|
| 0.5637 | 5210 | 0.0007 | - | - | - | - | |
|
| 0.5648 | 5220 | 0.0007 | - | - | - | - | |
|
| 0.5658 | 5230 | 0.0007 | - | - | - | - | |
|
| 0.5669 | 5240 | 0.0007 | - | - | - | - | |
|
| 0.5680 | 5250 | 0.0007 | - | - | - | - | |
|
| 0.5691 | 5260 | 0.0007 | - | - | - | - | |
|
| 0.5702 | 5270 | 0.0007 | - | - | - | - | |
|
| 0.5712 | 5280 | 0.0007 | - | - | - | - | |
|
| 0.5723 | 5290 | 0.0007 | - | - | - | - | |
|
| 0.5734 | 5300 | 0.0007 | - | - | - | - | |
|
| 0.5745 | 5310 | 0.0007 | - | - | - | - | |
|
| 0.5756 | 5320 | 0.0007 | - | - | - | - | |
|
| 0.5767 | 5330 | 0.0007 | - | - | - | - | |
|
| 0.5777 | 5340 | 0.0007 | - | - | - | - | |
|
| 0.5788 | 5350 | 0.0007 | - | - | - | - | |
|
| 0.5799 | 5360 | 0.0007 | - | - | - | - | |
|
| 0.5810 | 5370 | 0.0007 | - | - | - | - | |
|
| 0.5821 | 5380 | 0.0007 | - | - | - | - | |
|
| 0.5831 | 5390 | 0.0007 | - | - | - | - | |
|
| 0.5842 | 5400 | 0.0007 | - | - | - | - | |
|
| 0.5853 | 5410 | 0.0007 | - | - | - | - | |
|
| 0.5864 | 5420 | 0.0007 | - | - | - | - | |
|
| 0.5875 | 5430 | 0.0007 | - | - | - | - | |
|
| 0.5886 | 5440 | 0.0007 | - | - | - | - | |
|
| 0.5896 | 5450 | 0.0007 | - | - | - | - | |
|
| 0.5907 | 5460 | 0.0007 | - | - | - | - | |
|
| 0.5918 | 5470 | 0.0007 | - | - | - | - | |
|
| 0.5929 | 5480 | 0.0007 | - | - | - | - | |
|
| 0.5940 | 5490 | 0.0007 | - | - | - | - | |
|
| 0.5950 | 5500 | 0.0007 | - | - | - | - | |
|
| 0.5961 | 5510 | 0.0007 | - | - | - | - | |
|
| 0.5972 | 5520 | 0.0007 | - | - | - | - | |
|
| 0.5983 | 5530 | 0.0007 | - | - | - | - | |
|
| 0.5994 | 5540 | 0.0007 | - | - | - | - | |
|
| 0.6005 | 5550 | 0.0007 | - | - | - | - | |
|
| 0.6015 | 5560 | 0.0007 | - | - | - | - | |
|
| 0.6026 | 5570 | 0.0007 | - | - | - | - | |
|
| 0.6037 | 5580 | 0.0007 | - | - | - | - | |
|
| 0.6048 | 5590 | 0.0007 | - | - | - | - | |
|
| 0.6059 | 5600 | 0.0007 | - | - | - | - | |
|
| 0.6069 | 5610 | 0.0007 | - | - | - | - | |
|
| 0.6080 | 5620 | 0.0007 | - | - | - | - | |
|
| 0.6091 | 5630 | 0.0007 | - | - | - | - | |
|
| 0.6102 | 5640 | 0.0007 | - | - | - | - | |
|
| 0.6113 | 5650 | 0.0007 | - | - | - | - | |
|
| 0.6124 | 5660 | 0.0007 | - | - | - | - | |
|
| 0.6134 | 5670 | 0.0007 | - | - | - | - | |
|
| 0.6145 | 5680 | 0.0007 | - | - | - | - | |
|
| 0.6156 | 5690 | 0.0007 | - | - | - | - | |
|
| 0.6167 | 5700 | 0.0007 | - | - | - | - | |
|
| 0.6178 | 5710 | 0.0007 | - | - | - | - | |
|
| 0.6188 | 5720 | 0.0007 | - | - | - | - | |
|
| 0.6199 | 5730 | 0.0007 | - | - | - | - | |
|
| 0.6210 | 5740 | 0.0007 | - | - | - | - | |
|
| 0.6221 | 5750 | 0.0007 | - | - | - | - | |
|
| 0.6232 | 5760 | 0.0007 | - | - | - | - | |
|
| 0.6243 | 5770 | 0.0007 | - | - | - | - | |
|
| 0.6253 | 5780 | 0.0007 | - | - | - | - | |
|
| 0.6264 | 5790 | 0.0007 | - | - | - | - | |
|
| 0.6275 | 5800 | 0.0007 | - | - | - | - | |
|
| 0.6286 | 5810 | 0.0007 | - | - | - | - | |
|
| 0.6297 | 5820 | 0.0007 | - | - | - | - | |
|
| 0.6307 | 5830 | 0.0007 | - | - | - | - | |
|
| 0.6318 | 5840 | 0.0007 | - | - | - | - | |
|
| 0.6329 | 5850 | 0.0007 | - | - | - | - | |
|
| 0.6340 | 5860 | 0.0007 | - | - | - | - | |
|
| 0.6351 | 5870 | 0.0007 | - | - | - | - | |
|
| 0.6362 | 5880 | 0.0007 | - | - | - | - | |
|
| 0.6372 | 5890 | 0.0007 | - | - | - | - | |
|
| 0.6383 | 5900 | 0.0007 | - | - | - | - | |
|
| 0.6394 | 5910 | 0.0007 | - | - | - | - | |
|
| 0.6405 | 5920 | 0.0007 | - | - | - | - | |
|
| 0.6416 | 5930 | 0.0007 | - | - | - | - | |
|
| 0.6426 | 5940 | 0.0007 | - | - | - | - | |
|
| 0.6437 | 5950 | 0.0007 | - | - | - | - | |
|
| 0.6448 | 5960 | 0.0007 | - | - | - | - | |
|
| 0.6459 | 5970 | 0.0007 | - | - | - | - | |
|
| 0.6470 | 5980 | 0.0007 | - | - | - | - | |
|
| 0.6481 | 5990 | 0.0007 | - | - | - | - | |
|
| 0.6491 | 6000 | 0.0007 | 0.0006 | 0.4761 | 0.7885 | - | |
|
| 0.6502 | 6010 | 0.0007 | - | - | - | - | |
|
| 0.6513 | 6020 | 0.0007 | - | - | - | - | |
|
| 0.6524 | 6030 | 0.0007 | - | - | - | - | |
|
| 0.6535 | 6040 | 0.0007 | - | - | - | - | |
|
| 0.6545 | 6050 | 0.0007 | - | - | - | - | |
|
| 0.6556 | 6060 | 0.0007 | - | - | - | - | |
|
| 0.6567 | 6070 | 0.0007 | - | - | - | - | |
|
| 0.6578 | 6080 | 0.0007 | - | - | - | - | |
|
| 0.6589 | 6090 | 0.0007 | - | - | - | - | |
|
| 0.6600 | 6100 | 0.0007 | - | - | - | - | |
|
| 0.6610 | 6110 | 0.0007 | - | - | - | - | |
|
| 0.6621 | 6120 | 0.0007 | - | - | - | - | |
|
| 0.6632 | 6130 | 0.0007 | - | - | - | - | |
|
| 0.6643 | 6140 | 0.0007 | - | - | - | - | |
|
| 0.6654 | 6150 | 0.0007 | - | - | - | - | |
|
| 0.6665 | 6160 | 0.0007 | - | - | - | - | |
|
| 0.6675 | 6170 | 0.0007 | - | - | - | - | |
|
| 0.6686 | 6180 | 0.0007 | - | - | - | - | |
|
| 0.6697 | 6190 | 0.0007 | - | - | - | - | |
|
| 0.6708 | 6200 | 0.0007 | - | - | - | - | |
|
| 0.6719 | 6210 | 0.0007 | - | - | - | - | |
|
| 0.6729 | 6220 | 0.0007 | - | - | - | - | |
|
| 0.6740 | 6230 | 0.0007 | - | - | - | - | |
|
| 0.6751 | 6240 | 0.0007 | - | - | - | - | |
|
| 0.6762 | 6250 | 0.0007 | - | - | - | - | |
|
| 0.6773 | 6260 | 0.0007 | - | - | - | - | |
|
| 0.6784 | 6270 | 0.0007 | - | - | - | - | |
|
| 0.6794 | 6280 | 0.0007 | - | - | - | - | |
|
| 0.6805 | 6290 | 0.0007 | - | - | - | - | |
|
| 0.6816 | 6300 | 0.0007 | - | - | - | - | |
|
| 0.6827 | 6310 | 0.0007 | - | - | - | - | |
|
| 0.6838 | 6320 | 0.0007 | - | - | - | - | |
|
| 0.6848 | 6330 | 0.0007 | - | - | - | - | |
|
| 0.6859 | 6340 | 0.0007 | - | - | - | - | |
|
| 0.6870 | 6350 | 0.0007 | - | - | - | - | |
|
| 0.6881 | 6360 | 0.0007 | - | - | - | - | |
|
| 0.6892 | 6370 | 0.0007 | - | - | - | - | |
|
| 0.6903 | 6380 | 0.0007 | - | - | - | - | |
|
| 0.6913 | 6390 | 0.0007 | - | - | - | - | |
|
| 0.6924 | 6400 | 0.0007 | - | - | - | - | |
|
| 0.6935 | 6410 | 0.0007 | - | - | - | - | |
|
| 0.6946 | 6420 | 0.0007 | - | - | - | - | |
|
| 0.6957 | 6430 | 0.0007 | - | - | - | - | |
|
| 0.6967 | 6440 | 0.0007 | - | - | - | - | |
|
| 0.6978 | 6450 | 0.0007 | - | - | - | - | |
|
| 0.6989 | 6460 | 0.0007 | - | - | - | - | |
|
| 0.7000 | 6470 | 0.0007 | - | - | - | - | |
|
| 0.7011 | 6480 | 0.0007 | - | - | - | - | |
|
| 0.7022 | 6490 | 0.0007 | - | - | - | - | |
|
| 0.7032 | 6500 | 0.0007 | - | - | - | - | |
|
| 0.7043 | 6510 | 0.0007 | - | - | - | - | |
|
| 0.7054 | 6520 | 0.0007 | - | - | - | - | |
|
| 0.7065 | 6530 | 0.0007 | - | - | - | - | |
|
| 0.7076 | 6540 | 0.0007 | - | - | - | - | |
|
| 0.7086 | 6550 | 0.0007 | - | - | - | - | |
|
| 0.7097 | 6560 | 0.0007 | - | - | - | - | |
|
| 0.7108 | 6570 | 0.0007 | - | - | - | - | |
|
| 0.7119 | 6580 | 0.0007 | - | - | - | - | |
|
| 0.7130 | 6590 | 0.0007 | - | - | - | - | |
|
| 0.7141 | 6600 | 0.0007 | - | - | - | - | |
|
| 0.7151 | 6610 | 0.0007 | - | - | - | - | |
|
| 0.7162 | 6620 | 0.0007 | - | - | - | - | |
|
| 0.7173 | 6630 | 0.0007 | - | - | - | - | |
|
| 0.7184 | 6640 | 0.0007 | - | - | - | - | |
|
| 0.7195 | 6650 | 0.0007 | - | - | - | - | |
|
| 0.7205 | 6660 | 0.0007 | - | - | - | - | |
|
| 0.7216 | 6670 | 0.0007 | - | - | - | - | |
|
| 0.7227 | 6680 | 0.0007 | - | - | - | - | |
|
| 0.7238 | 6690 | 0.0007 | - | - | - | - | |
|
| 0.7249 | 6700 | 0.0007 | - | - | - | - | |
|
| 0.7260 | 6710 | 0.0007 | - | - | - | - | |
|
| 0.7270 | 6720 | 0.0007 | - | - | - | - | |
|
| 0.7281 | 6730 | 0.0007 | - | - | - | - | |
|
| 0.7292 | 6740 | 0.0007 | - | - | - | - | |
|
| 0.7303 | 6750 | 0.0007 | - | - | - | - | |
|
| 0.7314 | 6760 | 0.0007 | - | - | - | - | |
|
| 0.7324 | 6770 | 0.0007 | - | - | - | - | |
|
| 0.7335 | 6780 | 0.0007 | - | - | - | - | |
|
| 0.7346 | 6790 | 0.0007 | - | - | - | - | |
|
| 0.7357 | 6800 | 0.0007 | - | - | - | - | |
|
| 0.7368 | 6810 | 0.0007 | - | - | - | - | |
|
| 0.7379 | 6820 | 0.0007 | - | - | - | - | |
|
| 0.7389 | 6830 | 0.0007 | - | - | - | - | |
|
| 0.7400 | 6840 | 0.0007 | - | - | - | - | |
|
| 0.7411 | 6850 | 0.0007 | - | - | - | - | |
|
| 0.7422 | 6860 | 0.0007 | - | - | - | - | |
|
| 0.7433 | 6870 | 0.0007 | - | - | - | - | |
|
| 0.7443 | 6880 | 0.0007 | - | - | - | - | |
|
| 0.7454 | 6890 | 0.0007 | - | - | - | - | |
|
| 0.7465 | 6900 | 0.0007 | - | - | - | - | |
|
| 0.7476 | 6910 | 0.0007 | - | - | - | - | |
|
| 0.7487 | 6920 | 0.0007 | - | - | - | - | |
|
| 0.7498 | 6930 | 0.0007 | - | - | - | - | |
|
| 0.7508 | 6940 | 0.0007 | - | - | - | - | |
|
| 0.7519 | 6950 | 0.0007 | - | - | - | - | |
|
| 0.7530 | 6960 | 0.0007 | - | - | - | - | |
|
| 0.7541 | 6970 | 0.0007 | - | - | - | - | |
|
| 0.7552 | 6980 | 0.0007 | - | - | - | - | |
|
| 0.7562 | 6990 | 0.0007 | - | - | - | - | |
|
| 0.7573 | 7000 | 0.0007 | 0.0006 | 0.4788 | 0.7901 | - | |
|
| 0.7584 | 7010 | 0.0007 | - | - | - | - | |
|
| 0.7595 | 7020 | 0.0007 | - | - | - | - | |
|
| 0.7606 | 7030 | 0.0007 | - | - | - | - | |
|
| 0.7617 | 7040 | 0.0007 | - | - | - | - | |
|
| 0.7627 | 7050 | 0.0007 | - | - | - | - | |
|
| 0.7638 | 7060 | 0.0007 | - | - | - | - | |
|
| 0.7649 | 7070 | 0.0007 | - | - | - | - | |
|
| 0.7660 | 7080 | 0.0007 | - | - | - | - | |
|
| 0.7671 | 7090 | 0.0007 | - | - | - | - | |
|
| 0.7681 | 7100 | 0.0007 | - | - | - | - | |
|
| 0.7692 | 7110 | 0.0007 | - | - | - | - | |
|
| 0.7703 | 7120 | 0.0007 | - | - | - | - | |
|
| 0.7714 | 7130 | 0.0007 | - | - | - | - | |
|
| 0.7725 | 7140 | 0.0007 | - | - | - | - | |
|
| 0.7736 | 7150 | 0.0007 | - | - | - | - | |
|
| 0.7746 | 7160 | 0.0007 | - | - | - | - | |
|
| 0.7757 | 7170 | 0.0007 | - | - | - | - | |
|
| 0.7768 | 7180 | 0.0007 | - | - | - | - | |
|
| 0.7779 | 7190 | 0.0007 | - | - | - | - | |
|
| 0.7790 | 7200 | 0.0007 | - | - | - | - | |
|
| 0.7800 | 7210 | 0.0007 | - | - | - | - | |
|
| 0.7811 | 7220 | 0.0007 | - | - | - | - | |
|
| 0.7822 | 7230 | 0.0007 | - | - | - | - | |
|
| 0.7833 | 7240 | 0.0007 | - | - | - | - | |
|
| 0.7844 | 7250 | 0.0007 | - | - | - | - | |
|
| 0.7855 | 7260 | 0.0007 | - | - | - | - | |
|
| 0.7865 | 7270 | 0.0007 | - | - | - | - | |
|
| 0.7876 | 7280 | 0.0007 | - | - | - | - | |
|
| 0.7887 | 7290 | 0.0007 | - | - | - | - | |
|
| 0.7898 | 7300 | 0.0007 | - | - | - | - | |
|
| 0.7909 | 7310 | 0.0007 | - | - | - | - | |
|
| 0.7920 | 7320 | 0.0007 | - | - | - | - | |
|
| 0.7930 | 7330 | 0.0006 | - | - | - | - | |
|
| 0.7941 | 7340 | 0.0007 | - | - | - | - | |
|
| 0.7952 | 7350 | 0.0007 | - | - | - | - | |
|
| 0.7963 | 7360 | 0.0007 | - | - | - | - | |
|
| 0.7974 | 7370 | 0.0007 | - | - | - | - | |
|
| 0.7984 | 7380 | 0.0007 | - | - | - | - | |
|
| 0.7995 | 7390 | 0.0007 | - | - | - | - | |
|
| 0.8006 | 7400 | 0.0007 | - | - | - | - | |
|
| 0.8017 | 7410 | 0.0007 | - | - | - | - | |
|
| 0.8028 | 7420 | 0.0007 | - | - | - | - | |
|
| 0.8039 | 7430 | 0.0007 | - | - | - | - | |
|
| 0.8049 | 7440 | 0.0007 | - | - | - | - | |
|
| 0.8060 | 7450 | 0.0007 | - | - | - | - | |
|
| 0.8071 | 7460 | 0.0007 | - | - | - | - | |
|
| 0.8082 | 7470 | 0.0007 | - | - | - | - | |
|
| 0.8093 | 7480 | 0.0007 | - | - | - | - | |
|
| 0.8103 | 7490 | 0.0007 | - | - | - | - | |
|
| 0.8114 | 7500 | 0.0007 | - | - | - | - | |
|
| 0.8125 | 7510 | 0.0007 | - | - | - | - | |
|
| 0.8136 | 7520 | 0.0007 | - | - | - | - | |
|
| 0.8147 | 7530 | 0.0007 | - | - | - | - | |
|
| 0.8158 | 7540 | 0.0007 | - | - | - | - | |
|
| 0.8168 | 7550 | 0.0007 | - | - | - | - | |
|
| 0.8179 | 7560 | 0.0007 | - | - | - | - | |
|
| 0.8190 | 7570 | 0.0007 | - | - | - | - | |
|
| 0.8201 | 7580 | 0.0007 | - | - | - | - | |
|
| 0.8212 | 7590 | 0.0007 | - | - | - | - | |
|
| 0.8222 | 7600 | 0.0007 | - | - | - | - | |
|
| 0.8233 | 7610 | 0.0007 | - | - | - | - | |
|
| 0.8244 | 7620 | 0.0007 | - | - | - | - | |
|
| 0.8255 | 7630 | 0.0007 | - | - | - | - | |
|
| 0.8266 | 7640 | 0.0007 | - | - | - | - | |
|
| 0.8277 | 7650 | 0.0007 | - | - | - | - | |
|
| 0.8287 | 7660 | 0.0007 | - | - | - | - | |
|
| 0.8298 | 7670 | 0.0007 | - | - | - | - | |
|
| 0.8309 | 7680 | 0.0007 | - | - | - | - | |
|
| 0.8320 | 7690 | 0.0007 | - | - | - | - | |
|
| 0.8331 | 7700 | 0.0007 | - | - | - | - | |
|
| 0.8341 | 7710 | 0.0007 | - | - | - | - | |
|
| 0.8352 | 7720 | 0.0007 | - | - | - | - | |
|
| 0.8363 | 7730 | 0.0007 | - | - | - | - | |
|
| 0.8374 | 7740 | 0.0007 | - | - | - | - | |
|
| 0.8385 | 7750 | 0.0007 | - | - | - | - | |
|
| 0.8396 | 7760 | 0.0007 | - | - | - | - | |
|
| 0.8406 | 7770 | 0.0007 | - | - | - | - | |
|
| 0.8417 | 7780 | 0.0007 | - | - | - | - | |
|
| 0.8428 | 7790 | 0.0007 | - | - | - | - | |
|
| 0.8439 | 7800 | 0.0007 | - | - | - | - | |
|
| 0.8450 | 7810 | 0.0007 | - | - | - | - | |
|
| 0.8460 | 7820 | 0.0007 | - | - | - | - | |
|
| 0.8471 | 7830 | 0.0007 | - | - | - | - | |
|
| 0.8482 | 7840 | 0.0007 | - | - | - | - | |
|
| 0.8493 | 7850 | 0.0007 | - | - | - | - | |
|
| 0.8504 | 7860 | 0.0007 | - | - | - | - | |
|
| 0.8515 | 7870 | 0.0007 | - | - | - | - | |
|
| 0.8525 | 7880 | 0.0007 | - | - | - | - | |
|
| 0.8536 | 7890 | 0.0007 | - | - | - | - | |
|
| 0.8547 | 7900 | 0.0007 | - | - | - | - | |
|
| 0.8558 | 7910 | 0.0007 | - | - | - | - | |
|
| 0.8569 | 7920 | 0.0007 | - | - | - | - | |
|
| 0.8579 | 7930 | 0.0007 | - | - | - | - | |
|
| 0.8590 | 7940 | 0.0007 | - | - | - | - | |
|
| 0.8601 | 7950 | 0.0007 | - | - | - | - | |
|
| 0.8612 | 7960 | 0.0007 | - | - | - | - | |
|
| 0.8623 | 7970 | 0.0007 | - | - | - | - | |
|
| 0.8634 | 7980 | 0.0007 | - | - | - | - | |
|
| 0.8644 | 7990 | 0.0007 | - | - | - | - | |
|
| 0.8655 | 8000 | 0.0007 | 0.0006 | 0.4814 | 0.7918 | - | |
|
| 0.8666 | 8010 | 0.0007 | - | - | - | - | |
|
| 0.8677 | 8020 | 0.0007 | - | - | - | - | |
|
| 0.8688 | 8030 | 0.0007 | - | - | - | - | |
|
| 0.8698 | 8040 | 0.0007 | - | - | - | - | |
|
| 0.8709 | 8050 | 0.0007 | - | - | - | - | |
|
| 0.8720 | 8060 | 0.0007 | - | - | - | - | |
|
| 0.8731 | 8070 | 0.0007 | - | - | - | - | |
|
| 0.8742 | 8080 | 0.0007 | - | - | - | - | |
|
| 0.8753 | 8090 | 0.0007 | - | - | - | - | |
|
| 0.8763 | 8100 | 0.0007 | - | - | - | - | |
|
| 0.8774 | 8110 | 0.0007 | - | - | - | - | |
|
| 0.8785 | 8120 | 0.0007 | - | - | - | - | |
|
| 0.8796 | 8130 | 0.0007 | - | - | - | - | |
|
| 0.8807 | 8140 | 0.0007 | - | - | - | - | |
|
| 0.8817 | 8150 | 0.0007 | - | - | - | - | |
|
| 0.8828 | 8160 | 0.0007 | - | - | - | - | |
|
| 0.8839 | 8170 | 0.0007 | - | - | - | - | |
|
| 0.8850 | 8180 | 0.0007 | - | - | - | - | |
|
| 0.8861 | 8190 | 0.0007 | - | - | - | - | |
|
| 0.8872 | 8200 | 0.0007 | - | - | - | - | |
|
| 0.8882 | 8210 | 0.0007 | - | - | - | - | |
|
| 0.8893 | 8220 | 0.0007 | - | - | - | - | |
|
| 0.8904 | 8230 | 0.0007 | - | - | - | - | |
|
| 0.8915 | 8240 | 0.0007 | - | - | - | - | |
|
| 0.8926 | 8250 | 0.0007 | - | - | - | - | |
|
| 0.8936 | 8260 | 0.0007 | - | - | - | - | |
|
| 0.8947 | 8270 | 0.0007 | - | - | - | - | |
|
| 0.8958 | 8280 | 0.0007 | - | - | - | - | |
|
| 0.8969 | 8290 | 0.0007 | - | - | - | - | |
|
| 0.8980 | 8300 | 0.0007 | - | - | - | - | |
|
| 0.8991 | 8310 | 0.0007 | - | - | - | - | |
|
| 0.9001 | 8320 | 0.0007 | - | - | - | - | |
|
| 0.9012 | 8330 | 0.0007 | - | - | - | - | |
|
| 0.9023 | 8340 | 0.0007 | - | - | - | - | |
|
| 0.9034 | 8350 | 0.0007 | - | - | - | - | |
|
| 0.9045 | 8360 | 0.0007 | - | - | - | - | |
|
| 0.9056 | 8370 | 0.0007 | - | - | - | - | |
|
| 0.9066 | 8380 | 0.0007 | - | - | - | - | |
|
| 0.9077 | 8390 | 0.0007 | - | - | - | - | |
|
| 0.9088 | 8400 | 0.0007 | - | - | - | - | |
|
| 0.9099 | 8410 | 0.0007 | - | - | - | - | |
|
| 0.9110 | 8420 | 0.0007 | - | - | - | - | |
|
| 0.9120 | 8430 | 0.0007 | - | - | - | - | |
|
| 0.9131 | 8440 | 0.0007 | - | - | - | - | |
|
| 0.9142 | 8450 | 0.0007 | - | - | - | - | |
|
| 0.9153 | 8460 | 0.0007 | - | - | - | - | |
|
| 0.9164 | 8470 | 0.0007 | - | - | - | - | |
|
| 0.9175 | 8480 | 0.0006 | - | - | - | - | |
|
| 0.9185 | 8490 | 0.0007 | - | - | - | - | |
|
| 0.9196 | 8500 | 0.0007 | - | - | - | - | |
|
| 0.9207 | 8510 | 0.0007 | - | - | - | - | |
|
| 0.9218 | 8520 | 0.0007 | - | - | - | - | |
|
| 0.9229 | 8530 | 0.0007 | - | - | - | - | |
|
| 0.9239 | 8540 | 0.0007 | - | - | - | - | |
|
| 0.9250 | 8550 | 0.0007 | - | - | - | - |
| 0.9261 | 8560 | 0.0007 | - | - | - | - |
| 0.9272 | 8570 | 0.0007 | - | - | - | - |
| 0.9283 | 8580 | 0.0007 | - | - | - | - |
| 0.9294 | 8590 | 0.0007 | - | - | - | - |
| 0.9304 | 8600 | 0.0007 | - | - | - | - |
| 0.9315 | 8610 | 0.0007 | - | - | - | - |
| 0.9326 | 8620 | 0.0007 | - | - | - | - |
| 0.9337 | 8630 | 0.0007 | - | - | - | - |
| 0.9348 | 8640 | 0.0007 | - | - | - | - |
| 0.9358 | 8650 | 0.0007 | - | - | - | - |
| 0.9369 | 8660 | 0.0007 | - | - | - | - |
| 0.9380 | 8670 | 0.0007 | - | - | - | - |
| 0.9391 | 8680 | 0.0007 | - | - | - | - |
| 0.9402 | 8690 | 0.0007 | - | - | - | - |
| 0.9413 | 8700 | 0.0007 | - | - | - | - |
| 0.9423 | 8710 | 0.0007 | - | - | - | - |
| 0.9434 | 8720 | 0.0007 | - | - | - | - |
| 0.9445 | 8730 | 0.0007 | - | - | - | - |
| 0.9456 | 8740 | 0.0007 | - | - | - | - |
| 0.9467 | 8750 | 0.0007 | - | - | - | - |
| 0.9477 | 8760 | 0.0007 | - | - | - | - |
| 0.9488 | 8770 | 0.0007 | - | - | - | - |
| 0.9499 | 8780 | 0.0007 | - | - | - | - |
| 0.9510 | 8790 | 0.0007 | - | - | - | - |
| 0.9521 | 8800 | 0.0007 | - | - | - | - |
| 0.9532 | 8810 | 0.0007 | - | - | - | - |
| 0.9542 | 8820 | 0.0007 | - | - | - | - |
| 0.9553 | 8830 | 0.0007 | - | - | - | - |
| 0.9564 | 8840 | 0.0007 | - | - | - | - |
| 0.9575 | 8850 | 0.0007 | - | - | - | - |
| 0.9586 | 8860 | 0.0007 | - | - | - | - |
| 0.9596 | 8870 | 0.0007 | - | - | - | - |
| 0.9607 | 8880 | 0.0007 | - | - | - | - |
| 0.9618 | 8890 | 0.0007 | - | - | - | - |
| 0.9629 | 8900 | 0.0007 | - | - | - | - |
| 0.9640 | 8910 | 0.0007 | - | - | - | - |
| 0.9651 | 8920 | 0.0007 | - | - | - | - |
| 0.9661 | 8930 | 0.0007 | - | - | - | - |
| 0.9672 | 8940 | 0.0007 | - | - | - | - |
| 0.9683 | 8950 | 0.0007 | - | - | - | - |
| 0.9694 | 8960 | 0.0007 | - | - | - | - |
| 0.9705 | 8970 | 0.0007 | - | - | - | - |
| 0.9715 | 8980 | 0.0007 | - | - | - | - |
| 0.9726 | 8990 | 0.0007 | - | - | - | - |
| 0.9737 | 9000 | 0.0007 | 0.0006 | 0.4822 | 0.7917 | - |
| 0.9748 | 9010 | 0.0007 | - | - | - | - |
| 0.9759 | 9020 | 0.0007 | - | - | - | - |
| 0.9770 | 9030 | 0.0007 | - | - | - | - |
| 0.9780 | 9040 | 0.0007 | - | - | - | - |
| 0.9791 | 9050 | 0.0007 | - | - | - | - |
| 0.9802 | 9060 | 0.0007 | - | - | - | - |
| 0.9813 | 9070 | 0.0007 | - | - | - | - |
| 0.9824 | 9080 | 0.0007 | - | - | - | - |
| 0.9834 | 9090 | 0.0007 | - | - | - | - |
| 0.9845 | 9100 | 0.0007 | - | - | - | - |
| 0.9856 | 9110 | 0.0007 | - | - | - | - |
| 0.9867 | 9120 | 0.0007 | - | - | - | - |
| 0.9878 | 9130 | 0.0007 | - | - | - | - |
| 0.9889 | 9140 | 0.0007 | - | - | - | - |
| 0.9899 | 9150 | 0.0007 | - | - | - | - |
| 0.9910 | 9160 | 0.0007 | - | - | - | - |
| 0.9921 | 9170 | 0.0007 | - | - | - | - |
| 0.9932 | 9180 | 0.0007 | - | - | - | - |
| 0.9943 | 9190 | 0.0007 | - | - | - | - |
| 0.9953 | 9200 | 0.0007 | - | - | - | - |
| 0.9964 | 9210 | 0.0007 | - | - | - | - |
| 0.9975 | 9220 | 0.0007 | - | - | - | - |
| 0.9986 | 9230 | 0.0007 | - | - | - | - |
| 0.9997 | 9240 | 0.0007 | - | - | - | - |
| 1.0 | 9243 | - | - | 0.4820 | - | 0.7143 |

</details>

### Framework Versions

- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.42.3
- PyTorch: 2.1.1+cu121
- Accelerate: 0.32.1
- Datasets: 2.20.0
- Tokenizers: 0.19.1
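
The versions listed above can be pinned when reproducing this environment. The following is a minimal sketch; the exact PyTorch install command depends on your platform, and the CUDA 12.1 wheel index URL shown is the standard PyTorch one:

```shell
# Pin the library versions from the list above (assumes Python 3.10)
pip install sentence-transformers==3.0.1 transformers==4.42.3 \
    accelerate==0.32.1 datasets==2.20.0 tokenizers==0.19.1

# The 2.1.1+cu121 PyTorch build is served from the PyTorch wheel index
pip install torch==2.1.1 --index-url https://download.pytorch.org/whl/cu121
```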

## Citation

### BibTeX

#### Sentence Transformers

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MSELoss

```bibtex
@inproceedings{reimers-2020-multilingual-sentence-bert,
    title = "Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2020",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2004.09813",
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->