rttl committed on
Commit
32f36fd
1 Parent(s): 13799da

Update README.md

Files changed (1)
  1. README.md +20 -9
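The `+20 -9` stat above is a standard diffstat: 20 lines added, 9 removed. As a minimal sketch of how such a count is derived from unified-diff text (the helper name here is illustrative, not part of any Hugging Face tooling):

```python
# Sketch: computing a "+N -M" diffstat from unified-diff text by counting
# added/removed lines while skipping the "+++"/"---" file headers.

def diffstat(diff_text):
    added = removed = 0
    for line in diff_text.splitlines():
        if line.startswith("+++") or line.startswith("---"):
            continue  # file headers, not content changes
        if line.startswith("+"):
            added += 1
        elif line.startswith("-"):
            removed += 1
    return added, removed

example = """--- a/README.md
+++ b/README.md
@@ -1,2 +1,2 @@
-old line
+new line
 context
"""
print(diffstat(example))  # (1, 1)
```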
README.md CHANGED
@@ -16,14 +16,25 @@ model-index:
   metrics:
   - type: f1
     value: 0.9992
+---
+
+# DistilBERT base uncased finetuned SST-2
 
+## Table of Contents
+- [Model Details](#model-details)
+- [How to Get Started With the Model](#how-to-get-started-with-the-model)
+- [Uses](#uses)
+- [Risks, Limitations and Biases](#risks-limitations-and-biases)
+- [Training](#training)
 
-- Model Description: This model is a fine-tuned checkpoint of bert-large-uncased, fine-tuned on SST-2. This model reaches F1 of 99.92 on the dev set.
-- Developed by: rttl labs
-- Model Type: Text Classification
-- Language(s): English
-- License: Apache-2.0
-- Resources for more information:
-- The model was pre-trained with task-adaptive pre-training (TAPT), with an increased masking rate, no corruption strategy, and WWM, following this paper.
-- then fine-tuned on sst with subtrees
-- then fine-tuned on sst2
+## Model Details
+**Model Description:** This model is a fine-tune checkpoint of [DistilBERT-base-uncased](https://huggingface.co/distilbert-base-uncased), fine-tuned on SST-2.
+This model reaches an accuracy of 91.3 on the dev set (for comparison, Bert bert-base-uncased version reaches an accuracy of 92.7).
+- **Developed by:** Hugging Face
+- **Model Type:** Text Classification
+- **Language(s):** English
+- **License:** Apache-2.0
+- **Parent Model:** For more details about DistilBERT, we encourage users to check out [this model card](https://huggingface.co/distilbert-base-uncased).
+- **Resources for more information:**
+  - [Model Documentation](https://huggingface.co/docs/transformers/main/en/model_doc/distilbert#transformers.DistilBertForSequenceClassification)
+  - [DistilBERT paper](https://arxiv.org/abs/1910.01108)
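The `f1: 0.9992` entry in the metrics block records the binary F1 score on the SST-2 dev set. As a reminder of what that metric computes, a minimal self-contained sketch (this function is illustrative, not the evaluation code used for either model in the diff):

```python
# Minimal sketch of the binary F1 score reported in the metrics block above.
# SST-2 labels are 0 (negative) and 1 (positive).

def f1_score(y_true, y_pred):
    """Binary F1: harmonic mean of precision and recall for the positive class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Example: 3 of 4 positive predictions are correct, and 1 positive is missed,
# so precision = recall = 0.75 and F1 = 0.75.
print(f1_score([1, 1, 1, 1, 0, 0], [1, 1, 1, 0, 1, 0]))  # 0.75
```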