distilbert-base-uncased-distilled-clinc
This model is a fine-tuned version of distilbert-base-uncased on an unspecified dataset; the model name suggests the CLINC150 (clinc_oos) intent-classification dataset, but the card does not confirm this. It achieves the following results on the evaluation set:
- Loss: 0.2772
- Accuracy: 0.9535
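A minimal usage sketch with the transformers pipeline API; the checkpoint id is taken from this repository, while the sample utterance is illustrative only and the label names depend on the checkpoint's config:

```python
from transformers import pipeline

# Load the fine-tuned intent classifier from the Hub.
intent_classifier = pipeline(
    "text-classification",
    model="thom126f/distilbert-base-uncased-distilled-clinc",
)

# Illustrative CLINC-style query; output is a list of {label, score} dicts.
print(intent_classifier("How do I transfer money to my savings account?"))
```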
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a TrainingArguments sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 74
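A hedged sketch of how these settings map onto transformers.TrainingArguments (argument names per Transformers 4.44, listed under Framework versions below). The output directory and evaluation strategy are assumptions, not taken from this card; the Adam betas and epsilon above are the library defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-distilled-clinc",  # placeholder name
    learning_rate=2e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=48,
    seed=42,
    num_train_epochs=74,
    lr_scheduler_type="linear",
    eval_strategy="epoch",  # assumed: the table below reports per-epoch metrics
)
```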
Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---|---|---|---|---|
| 2.8866 | 1.0 | 318 | 2.0716 | 0.7316 |
| 1.5472 | 2.0 | 636 | 0.9602 | 0.8774 |
| 0.7257 | 3.0 | 954 | 0.5149 | 0.9174 |
| 0.3992 | 4.0 | 1272 | 0.3806 | 0.9377 |
| 0.2851 | 5.0 | 1590 | 0.3391 | 0.9429 |
| 0.2386 | 6.0 | 1908 | 0.3253 | 0.9442 |
| 0.2158 | 7.0 | 2226 | 0.3147 | 0.9455 |
| 0.2045 | 8.0 | 2544 | 0.3096 | 0.9477 |
| 0.1969 | 9.0 | 2862 | 0.3044 | 0.9468 |
| 0.1919 | 10.0 | 3180 | 0.3006 | 0.9458 |
| 0.1887 | 11.0 | 3498 | 0.3028 | 0.9461 |
| 0.1864 | 12.0 | 3816 | 0.2960 | 0.9494 |
| 0.1836 | 13.0 | 4134 | 0.2947 | 0.9455 |
| 0.1818 | 14.0 | 4452 | 0.2959 | 0.9484 |
| 0.1805 | 15.0 | 4770 | 0.2901 | 0.9506 |
| 0.1798 | 16.0 | 5088 | 0.2972 | 0.9461 |
| 0.1782 | 17.0 | 5406 | 0.2961 | 0.9461 |
| 0.1768 | 18.0 | 5724 | 0.2931 | 0.9500 |
| 0.1761 | 19.0 | 6042 | 0.2915 | 0.9474 |
| 0.1749 | 20.0 | 6360 | 0.2895 | 0.9481 |
| 0.1743 | 21.0 | 6678 | 0.2889 | 0.9484 |
| 0.1735 | 22.0 | 6996 | 0.2888 | 0.9487 |
| 0.1730 | 23.0 | 7314 | 0.2873 | 0.9503 |
| 0.1727 | 24.0 | 7632 | 0.2844 | 0.9510 |
| 0.1723 | 25.0 | 7950 | 0.2877 | 0.9477 |
| 0.1717 | 26.0 | 8268 | 0.2851 | 0.9503 |
| 0.1712 | 27.0 | 8586 | 0.2884 | 0.9503 |
| 0.1707 | 28.0 | 8904 | 0.2839 | 0.9519 |
| 0.1704 | 29.0 | 9222 | 0.2853 | 0.9490 |
| 0.1700 | 30.0 | 9540 | 0.2824 | 0.9516 |
| 0.1698 | 31.0 | 9858 | 0.2854 | 0.9474 |
| 0.1696 | 32.0 | 10176 | 0.2838 | 0.9487 |
| 0.1692 | 33.0 | 10494 | 0.2821 | 0.9506 |
| 0.1691 | 34.0 | 10812 | 0.2825 | 0.9516 |
| 0.1688 | 35.0 | 11130 | 0.2833 | 0.9506 |
| 0.1686 | 36.0 | 11448 | 0.2836 | 0.9503 |
| 0.1685 | 37.0 | 11766 | 0.2820 | 0.9506 |
| 0.1685 | 38.0 | 12084 | 0.2826 | 0.9506 |
| 0.1679 | 39.0 | 12402 | 0.2821 | 0.9519 |
| 0.1680 | 40.0 | 12720 | 0.2798 | 0.9513 |
| 0.1677 | 41.0 | 13038 | 0.2811 | 0.9519 |
| 0.1673 | 42.0 | 13356 | 0.2829 | 0.9510 |
| 0.1674 | 43.0 | 13674 | 0.2814 | 0.9526 |
| 0.1673 | 44.0 | 13992 | 0.2806 | 0.9523 |
| 0.1671 | 45.0 | 14310 | 0.2805 | 0.9523 |
| 0.1666 | 46.0 | 14628 | 0.2800 | 0.9513 |
| 0.1668 | 47.0 | 14946 | 0.2803 | 0.9523 |
| 0.1666 | 48.0 | 15264 | 0.2789 | 0.9519 |
| 0.1668 | 49.0 | 15582 | 0.2805 | 0.9510 |
| 0.1662 | 50.0 | 15900 | 0.2788 | 0.9529 |
| 0.1663 | 51.0 | 16218 | 0.2813 | 0.9516 |
| 0.1663 | 52.0 | 16536 | 0.2783 | 0.9516 |
| 0.1661 | 53.0 | 16854 | 0.2790 | 0.9513 |
| 0.1658 | 54.0 | 17172 | 0.2794 | 0.9526 |
| 0.1659 | 55.0 | 17490 | 0.2802 | 0.9532 |
| 0.1661 | 56.0 | 17808 | 0.2796 | 0.9526 |
| 0.1654 | 57.0 | 18126 | 0.2793 | 0.9516 |
| 0.1658 | 58.0 | 18444 | 0.2794 | 0.9523 |
| 0.1654 | 59.0 | 18762 | 0.2784 | 0.9510 |
| 0.1658 | 60.0 | 19080 | 0.2805 | 0.9526 |
| 0.1653 | 61.0 | 19398 | 0.2786 | 0.9523 |
| 0.1655 | 62.0 | 19716 | 0.2777 | 0.9513 |
| 0.1653 | 63.0 | 20034 | 0.2775 | 0.9532 |
| 0.1652 | 64.0 | 20352 | 0.2782 | 0.9519 |
| 0.1652 | 65.0 | 20670 | 0.2785 | 0.9526 |
| 0.1654 | 66.0 | 20988 | 0.2786 | 0.9523 |
| 0.1649 | 67.0 | 21306 | 0.2788 | 0.9526 |
| 0.1650 | 68.0 | 21624 | 0.2781 | 0.9523 |
| 0.1650 | 69.0 | 21942 | 0.2775 | 0.9519 |
| 0.1650 | 70.0 | 22260 | 0.2773 | 0.9523 |
| 0.1650 | 71.0 | 22578 | 0.2769 | 0.9526 |
| 0.1651 | 72.0 | 22896 | 0.2770 | 0.9532 |
| 0.1648 | 73.0 | 23214 | 0.2773 | 0.9535 |
| 0.1652 | 74.0 | 23532 | 0.2772 | 0.9535 |
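For reference, a sketch of re-computing the evaluation accuracy with the Trainer API. It assumes the evaluation set is the CLINC150 validation split (clinc_oos, "plus" configuration), which is inferred from the model name only and not confirmed by this card:

```python
import evaluate
import numpy as np
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer, Trainer

# Assumption: CLINC150 "plus" config, inferred from the model name.
dataset = load_dataset("clinc_oos", "plus")
checkpoint = "thom126f/distilbert-base-uncased-distilled-clinc"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

# Tokenize and rename the label column to the name the Trainer expects.
encoded = dataset["validation"].map(tokenize, batched=True)
encoded = encoded.rename_column("intent", "labels")

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return accuracy.compute(predictions=np.argmax(logits, axis=-1), references=labels)

# Default TrainingArguments are fine for evaluation only.
trainer = Trainer(model=model, tokenizer=tokenizer, compute_metrics=compute_metrics)
print(trainer.evaluate(encoded))
```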
Framework versions
- Transformers 4.44.2
- Pytorch 2.2.2+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
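These pins can be reproduced with, for example, `pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1 torch==2.2.2`; the `+cu121` suffix denotes the CUDA 12.1 build of PyTorch, installed from the matching wheel index.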