---
license: apache-2.0
language:
- en
model-index:
- name: rttl-ai/SentyBert
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      type: sst2
      name: sst2
      config: default
      split: validation
    metrics:
    - type: f1
      value: 0.9992
      name: F1 Macro
    - type: accuracy
      value: 0.9992
      name: Accuracy
datasets:
- sst2
- sst
---

# rttl-ai/SentyBert

## Model Details
**Model Description:** This model is a fine-tuned checkpoint of [bert-large-uncased](https://huggingface.co/bert-large-uncased), trained on SST-2.
It reaches an accuracy of 99.92% on the SST-2 validation (dev) set.
- **Developed by:** rttl-ai
- **Model Type:** Text Classification
- **Language(s):** English
- **License:** Apache-2.0
- **Resources for more information:**
  - The model was pre-trained with task-adaptive pre-training ([TAPT](https://arxiv.org/pdf/2004.10964.pdf)) using an increased masking rate, no corruption strategy, and whole-word masking (WWM), following [this paper](https://aclanthology.org/2023.eacl-main.217.pdf).
  - It was then fine-tuned on SST with subtrees,
  - and finally fine-tuned on SST-2.
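
## How to Use

A minimal usage sketch with the standard `transformers` text-classification pipeline. This is an assumed usage pattern based on the model type declared above; the example sentence is illustrative, and the exact label names returned depend on the checkpoint's configuration.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub.
classifier = pipeline("text-classification", model="rttl-ai/SentyBert")

# Run sentiment classification on a short example sentence.
result = classifier("A gripping, beautifully acted film.")
print(result)  # a list of {'label': ..., 'score': ...} dicts
```

The pipeline handles tokenization and softmax over the logits, returning the top predicted label with its probability.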