# distilgpt2-finetuned-ner
This model is a fine-tuned version of [distilbert/distilgpt2](https://huggingface.co/distilbert/distilgpt2) on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such per-label metrics can be computed follows the list):
- Loss: 1.4074
- 0 Precision: 0.9557
- 0 Recall: 0.8738
- 0 F1-score: 0.9129
- 1 Precision: 0.6128
- 1 Recall: 0.8971
- 1 F1-score: 0.7282
- 2 Precision: 0.4667
- 2 Recall: 0.6272
- 2 F1-score: 0.5352
- 3 Precision: 0.6606
- 3 Recall: 0.7742
- 3 F1-score: 0.7129
- Accuracy: 0.8577
- Macro avg Precision: 0.6739
- Macro avg Recall: 0.7931
- Macro avg F1-score: 0.7223
- Weighted avg Precision: 0.8827
- Weighted avg Recall: 0.8577
- Weighted avg F1-score: 0.8656
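
The per-label rows and the macro/weighted averages above follow the layout of scikit-learn's `classification_report`. The card does not say how they were computed; the function below is only a plausible sketch, assuming integer label ids with `-100` marking positions to ignore (padding and sub-word continuations).

```python
from sklearn.metrics import classification_report


def report_token_metrics(predictions, labels):
    """Print per-label precision/recall/F1 plus macro and weighted averages.

    `predictions` and `labels` are lists of lists of integer label ids
    (one inner list per sentence); positions labelled -100 (padding or
    sub-word continuations) are skipped. This is an assumed reconstruction,
    not the exact evaluation code used for this checkpoint.
    """
    y_true, y_pred = [], []
    for pred_seq, label_seq in zip(predictions, labels):
        for p, t in zip(pred_seq, label_seq):
            if t != -100:  # keep only positions that carry a real label
                y_true.append(t)
                y_pred.append(p)
    # digits=4 matches the four-decimal figures reported above
    print(classification_report(y_true, y_pred, digits=4))
```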
## Model description
More information needed
## Intended uses & limitations
More information needed
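
No usage example is given in the card. As a minimal sketch, assuming the checkpoint is hosted under the repo id `antoineedy/distilgpt2-finetuned-ner` (the id shown for this card) and exposes a token-classification head, it could be loaded with the `transformers` pipeline:

```python
from transformers import pipeline

# Hypothetical repo id taken from this card's Hub page; adjust if the
# checkpoint is hosted elsewhere.
ner = pipeline(
    "token-classification",
    model="antoineedy/distilgpt2-finetuned-ner",
    aggregation_strategy="simple",  # merge sub-word pieces into word-level spans
)

print(ner("Hugging Face is based in New York City."))
```

Because the base model is GPT-2-style, its tokenizer ships without a pad token; for batched inference you may need to set `tokenizer.pad_token = tokenizer.eos_token` first.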
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training; a matching `Trainer` configuration sketch follows the list:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60
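
The values above map directly onto `transformers.TrainingArguments`. The actual training script is not included in this card, so the following is only a minimal sketch under that assumption; `train_ds`, `eval_ds`, and `num_labels=4` (matching the four labels in the results table) are placeholders.

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)


def build_trainer(train_ds, eval_ds, num_labels=4):
    """Assemble a Trainer mirroring the hyperparameters listed above.

    `train_ds` / `eval_ds` are placeholders for tokenized datasets with
    `input_ids`, `attention_mask` and `labels` columns; the card does not
    name the actual dataset.
    """
    model_name = "distilbert/distilgpt2"
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 tokenizers have no pad token
    model = AutoModelForTokenClassification.from_pretrained(
        model_name, num_labels=num_labels
    )
    model.config.pad_token_id = tokenizer.pad_token_id

    args = TrainingArguments(
        output_dir="distilgpt2-finetuned-ner",
        learning_rate=2e-5,
        per_device_train_batch_size=16,
        per_device_eval_batch_size=16,
        seed=42,
        num_train_epochs=60,
        lr_scheduler_type="linear",
        # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the default optimizer
        evaluation_strategy="epoch",
    )
    return Trainer(
        model=model,
        args=args,
        train_dataset=train_ds,
        eval_dataset=eval_ds,
        tokenizer=tokenizer,
        data_collator=DataCollatorForTokenClassification(tokenizer),
    )
```

Calling `build_trainer(train_ds, eval_ds).train()` with a tokenized NER dataset would reproduce the schedule listed above; the per-epoch validation metrics in the next section suggest evaluation was run once per epoch.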
### Training results
Training Loss | Epoch | Step | Validation Loss | 0 Precision | 0 Recall | 0 F1-score | 1 Precision | 1 Recall | 1 F1-score | 2 Precision | 2 Recall | 2 F1-score | 3 Precision | 3 Recall | 3 F1-score | Accuracy | Macro avg Precision | Macro avg Recall | Macro avg F1-score | Weighted avg Precision | Weighted avg Recall | Weighted avg F1-score |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 67 | 0.7625 | 0.9807 | 0.6141 | 0.7553 | 0.4507 | 0.9242 | 0.6059 | 0.1833 | 0.7455 | 0.2942 | 0.3184 | 0.6280 | 0.4226 | 0.6479 | 0.4833 | 0.7279 | 0.5195 | 0.8510 | 0.6479 | 0.6977 |
No log | 2.0 | 134 | 0.5560 | 0.9828 | 0.6673 | 0.7949 | 0.4835 | 0.9278 | 0.6357 | 0.2298 | 0.7634 | 0.3532 | 0.3942 | 0.7892 | 0.5258 | 0.7032 | 0.5226 | 0.7870 | 0.5774 | 0.8631 | 0.7032 | 0.7419 |
No log | 3.0 | 201 | 0.5080 | 0.9905 | 0.6402 | 0.7778 | 0.4718 | 0.9531 | 0.6312 | 0.2308 | 0.8100 | 0.3593 | 0.3957 | 0.8409 | 0.5382 | 0.6896 | 0.5222 | 0.8110 | 0.5766 | 0.8684 | 0.6896 | 0.7291 |
No log | 4.0 | 268 | 0.4785 | 0.9840 | 0.7067 | 0.8226 | 0.4939 | 0.9549 | 0.6511 | 0.2580 | 0.8100 | 0.3913 | 0.4923 | 0.8215 | 0.6156 | 0.7413 | 0.5571 | 0.8233 | 0.6202 | 0.8734 | 0.7413 | 0.7736 |
No log | 5.0 | 335 | 0.4872 | 0.9877 | 0.6839 | 0.8082 | 0.5178 | 0.9440 | 0.6688 | 0.2810 | 0.7885 | 0.4143 | 0.3887 | 0.8860 | 0.5403 | 0.7260 | 0.5438 | 0.8256 | 0.6079 | 0.8719 | 0.7260 | 0.7592 |
No log | 6.0 | 402 | 0.4646 | 0.9873 | 0.7051 | 0.8227 | 0.5303 | 0.9477 | 0.6801 | 0.2845 | 0.8136 | 0.4215 | 0.4304 | 0.8839 | 0.5789 | 0.7441 | 0.5581 | 0.8376 | 0.6258 | 0.8758 | 0.7441 | 0.7748 |
No log | 7.0 | 469 | 0.4540 | 0.9866 | 0.7006 | 0.8193 | 0.5122 | 0.9495 | 0.6654 | 0.2659 | 0.8387 | 0.4038 | 0.4656 | 0.8581 | 0.6036 | 0.7398 | 0.5576 | 0.8367 | 0.6230 | 0.8755 | 0.7398 | 0.7719 |
0.6085 | 8.0 | 536 | 0.4723 | 0.9839 | 0.7255 | 0.8352 | 0.5019 | 0.9549 | 0.6580 | 0.2909 | 0.8029 | 0.4271 | 0.4913 | 0.8473 | 0.6219 | 0.7578 | 0.5670 | 0.8326 | 0.6355 | 0.8754 | 0.7578 | 0.7862 |
0.6085 | 9.0 | 603 | 0.4998 | 0.9777 | 0.7716 | 0.8625 | 0.5286 | 0.9495 | 0.6791 | 0.3279 | 0.7921 | 0.4638 | 0.5496 | 0.8215 | 0.6586 | 0.7916 | 0.5960 | 0.8337 | 0.6660 | 0.8787 | 0.7916 | 0.8141 |
0.6085 | 10.0 | 670 | 0.4960 | 0.9867 | 0.7176 | 0.8309 | 0.5358 | 0.9458 | 0.6841 | 0.2789 | 0.8136 | 0.4154 | 0.4576 | 0.8710 | 0.6 | 0.7529 | 0.5647 | 0.8370 | 0.6326 | 0.8776 | 0.7529 | 0.7829 |
0.6085 | 11.0 | 737 | 0.5292 | 0.9760 | 0.7870 | 0.8713 | 0.5425 | 0.9440 | 0.6891 | 0.3302 | 0.7563 | 0.4597 | 0.5712 | 0.8280 | 0.6760 | 0.8023 | 0.6050 | 0.8288 | 0.6740 | 0.8802 | 0.8023 | 0.8231 |
0.6085 | 12.0 | 804 | 0.5283 | 0.9778 | 0.7666 | 0.8594 | 0.5438 | 0.9296 | 0.6862 | 0.3135 | 0.7563 | 0.4433 | 0.5130 | 0.8495 | 0.6397 | 0.7864 | 0.5870 | 0.8255 | 0.6571 | 0.8768 | 0.7864 | 0.8100 |
0.6085 | 13.0 | 871 | 0.5510 | 0.9793 | 0.7678 | 0.8608 | 0.5409 | 0.9422 | 0.6873 | 0.3107 | 0.7706 | 0.4428 | 0.5307 | 0.8366 | 0.6494 | 0.7882 | 0.5904 | 0.8293 | 0.6601 | 0.8789 | 0.7882 | 0.8118 |
0.6085 | 14.0 | 938 | 0.5859 | 0.9751 | 0.7913 | 0.8737 | 0.5610 | 0.9296 | 0.6997 | 0.3303 | 0.7742 | 0.4630 | 0.5641 | 0.8237 | 0.6696 | 0.8050 | 0.6076 | 0.8297 | 0.6765 | 0.8806 | 0.8050 | 0.8255 |
0.2627 | 15.0 | 1005 | 0.6114 | 0.9770 | 0.7811 | 0.8681 | 0.5360 | 0.9404 | 0.6828 | 0.3392 | 0.7563 | 0.4684 | 0.5418 | 0.8366 | 0.6577 | 0.7979 | 0.5985 | 0.8286 | 0.6692 | 0.8787 | 0.7979 | 0.8190 |
0.2627 | 16.0 | 1072 | 0.6434 | 0.9761 | 0.7747 | 0.8638 | 0.5407 | 0.9350 | 0.6852 | 0.3518 | 0.7276 | 0.4743 | 0.4913 | 0.8516 | 0.6231 | 0.7923 | 0.5900 | 0.8222 | 0.6616 | 0.8752 | 0.7923 | 0.8135 |
0.2627 | 17.0 | 1139 | 0.6766 | 0.9729 | 0.7939 | 0.8743 | 0.5558 | 0.9350 | 0.6972 | 0.3577 | 0.7204 | 0.4780 | 0.5313 | 0.8387 | 0.6505 | 0.8063 | 0.6044 | 0.8220 | 0.6750 | 0.8772 | 0.8063 | 0.8251 |
0.2627 | 18.0 | 1206 | 0.7459 | 0.9672 | 0.8159 | 0.8851 | 0.5644 | 0.9260 | 0.7013 | 0.3752 | 0.6953 | 0.4874 | 0.5762 | 0.8215 | 0.6773 | 0.8206 | 0.6207 | 0.8147 | 0.6878 | 0.8774 | 0.8206 | 0.8364 |
0.2627 | 19.0 | 1273 | 0.7824 | 0.9682 | 0.8301 | 0.8938 | 0.5644 | 0.9332 | 0.7034 | 0.3808 | 0.7097 | 0.4956 | 0.6364 | 0.7978 | 0.7080 | 0.8314 | 0.6374 | 0.8177 | 0.7002 | 0.8829 | 0.8314 | 0.8461 |
0.2627 | 20.0 | 1340 | 0.7763 | 0.9667 | 0.8261 | 0.8909 | 0.5615 | 0.9314 | 0.7006 | 0.3845 | 0.6918 | 0.4942 | 0.6127 | 0.8065 | 0.6964 | 0.8280 | 0.6313 | 0.8139 | 0.6955 | 0.8799 | 0.8280 | 0.8427 |
0.2627 | 21.0 | 1407 | 0.7584 | 0.9694 | 0.8155 | 0.8858 | 0.5683 | 0.9242 | 0.7038 | 0.3736 | 0.7312 | 0.4945 | 0.5826 | 0.8194 | 0.6810 | 0.8215 | 0.6235 | 0.8225 | 0.6913 | 0.8800 | 0.8215 | 0.8378 |
0.2627 | 22.0 | 1474 | 0.8247 | 0.9684 | 0.8232 | 0.8899 | 0.5617 | 0.9278 | 0.6998 | 0.3886 | 0.7061 | 0.5013 | 0.5962 | 0.8129 | 0.6879 | 0.8264 | 0.6287 | 0.8175 | 0.6947 | 0.8802 | 0.8264 | 0.8415 |
0.1736 | 23.0 | 1541 | 0.7862 | 0.9692 | 0.8163 | 0.8862 | 0.5709 | 0.9224 | 0.7053 | 0.3704 | 0.7168 | 0.4884 | 0.5794 | 0.8237 | 0.6803 | 0.8217 | 0.6225 | 0.8198 | 0.6900 | 0.8797 | 0.8217 | 0.8379 |
0.1736 | 24.0 | 1608 | 0.9049 | 0.9655 | 0.8356 | 0.8959 | 0.5781 | 0.9224 | 0.7107 | 0.3987 | 0.6774 | 0.5020 | 0.6097 | 0.8129 | 0.6968 | 0.8346 | 0.6380 | 0.8121 | 0.7013 | 0.8808 | 0.8346 | 0.8479 |
0.1736 | 25.0 | 1675 | 0.9058 | 0.9677 | 0.8293 | 0.8932 | 0.5872 | 0.9242 | 0.7181 | 0.3962 | 0.6703 | 0.4980 | 0.5672 | 0.8258 | 0.6725 | 0.8303 | 0.6296 | 0.8124 | 0.6954 | 0.8801 | 0.8303 | 0.8444 |
0.1736 | 26.0 | 1742 | 0.9049 | 0.9667 | 0.8258 | 0.8907 | 0.5548 | 0.9314 | 0.6954 | 0.384 | 0.6882 | 0.4929 | 0.6215 | 0.8086 | 0.7028 | 0.8277 | 0.6317 | 0.8135 | 0.6955 | 0.8799 | 0.8277 | 0.8424 |
0.1736 | 27.0 | 1809 | 1.0230 | 0.9629 | 0.8477 | 0.9017 | 0.584 | 0.9224 | 0.7152 | 0.4114 | 0.6487 | 0.5035 | 0.6361 | 0.8043 | 0.7104 | 0.8423 | 0.6486 | 0.8058 | 0.7077 | 0.8818 | 0.8423 | 0.8539 |
0.1736 | 28.0 | 1876 | 0.9487 | 0.9654 | 0.8398 | 0.8982 | 0.5746 | 0.9242 | 0.7087 | 0.4043 | 0.6667 | 0.5034 | 0.6231 | 0.8108 | 0.7047 | 0.8374 | 0.6419 | 0.8104 | 0.7037 | 0.8817 | 0.8374 | 0.8502 |
0.1736 | 29.0 | 1943 | 0.9729 | 0.9661 | 0.8347 | 0.8956 | 0.5805 | 0.9242 | 0.7131 | 0.4021 | 0.6774 | 0.5047 | 0.5978 | 0.8151 | 0.6897 | 0.8341 | 0.6366 | 0.8128 | 0.7008 | 0.8808 | 0.8341 | 0.8474 |
0.1305 | 30.0 | 2010 | 1.0135 | 0.9636 | 0.8487 | 0.9025 | 0.5939 | 0.9188 | 0.7215 | 0.4102 | 0.6631 | 0.5068 | 0.6324 | 0.8065 | 0.7089 | 0.8436 | 0.6500 | 0.8093 | 0.7099 | 0.8828 | 0.8436 | 0.8552 |
0.1305 | 31.0 | 2077 | 1.0109 | 0.9618 | 0.8455 | 0.8999 | 0.5670 | 0.9242 | 0.7028 | 0.4184 | 0.6523 | 0.5098 | 0.6462 | 0.7935 | 0.7124 | 0.8401 | 0.6483 | 0.8039 | 0.7062 | 0.8804 | 0.8401 | 0.8519 |
0.1305 | 32.0 | 2144 | 1.0659 | 0.9627 | 0.8517 | 0.9038 | 0.5995 | 0.9134 | 0.7239 | 0.4201 | 0.6595 | 0.5132 | 0.6210 | 0.8 | 0.6992 | 0.8448 | 0.6508 | 0.8061 | 0.7100 | 0.8822 | 0.8448 | 0.8560 |
0.1305 | 33.0 | 2211 | 1.0779 | 0.9600 | 0.8554 | 0.9047 | 0.6002 | 0.9079 | 0.7227 | 0.4293 | 0.6416 | 0.5144 | 0.6229 | 0.7957 | 0.6988 | 0.8462 | 0.6531 | 0.8002 | 0.7101 | 0.8807 | 0.8462 | 0.8566 |
0.1305 | 34.0 | 2278 | 1.2438 | 0.9552 | 0.8732 | 0.9124 | 0.6053 | 0.9079 | 0.7264 | 0.4898 | 0.6022 | 0.5402 | 0.6505 | 0.7806 | 0.7097 | 0.8576 | 0.6752 | 0.7910 | 0.7222 | 0.8820 | 0.8576 | 0.8650 |
0.1305 | 35.0 | 2345 | 1.1222 | 0.9609 | 0.8602 | 0.9077 | 0.5955 | 0.9116 | 0.7204 | 0.4312 | 0.6631 | 0.5226 | 0.6588 | 0.7806 | 0.7146 | 0.8502 | 0.6616 | 0.8039 | 0.7163 | 0.8837 | 0.8502 | 0.8604 |
0.1305 | 36.0 | 2412 | 1.2303 | 0.9577 | 0.8687 | 0.9110 | 0.5946 | 0.9188 | 0.7220 | 0.4699 | 0.6165 | 0.5333 | 0.6648 | 0.7806 | 0.7181 | 0.8555 | 0.6718 | 0.7961 | 0.7211 | 0.8832 | 0.8555 | 0.8638 |
0.1305 | 37.0 | 2479 | 1.1387 | 0.9603 | 0.8507 | 0.9021 | 0.5886 | 0.9116 | 0.7153 | 0.4189 | 0.6667 | 0.5145 | 0.6353 | 0.7828 | 0.7013 | 0.8429 | 0.6508 | 0.8029 | 0.7083 | 0.8803 | 0.8429 | 0.8541 |
0.1036 | 38.0 | 2546 | 1.2204 | 0.9585 | 0.8623 | 0.9079 | 0.5988 | 0.9025 | 0.7199 | 0.4458 | 0.6344 | 0.5237 | 0.6387 | 0.7871 | 0.7052 | 0.8503 | 0.6605 | 0.7966 | 0.7142 | 0.8812 | 0.8503 | 0.8598 |
0.1036 | 39.0 | 2613 | 1.2219 | 0.9594 | 0.8594 | 0.9066 | 0.5950 | 0.9097 | 0.7195 | 0.4512 | 0.6129 | 0.5198 | 0.6210 | 0.8 | 0.6992 | 0.8486 | 0.6567 | 0.7955 | 0.7113 | 0.8805 | 0.8486 | 0.8581 |
0.1036 | 40.0 | 2680 | 1.2658 | 0.9567 | 0.8691 | 0.9108 | 0.5948 | 0.9061 | 0.7182 | 0.4534 | 0.6272 | 0.5263 | 0.6761 | 0.7720 | 0.7209 | 0.8546 | 0.6702 | 0.7936 | 0.7190 | 0.8825 | 0.8546 | 0.8632 |
0.1036 | 41.0 | 2747 | 1.1926 | 0.9607 | 0.8560 | 0.9053 | 0.6017 | 0.9079 | 0.7237 | 0.4180 | 0.6487 | 0.5084 | 0.6397 | 0.7978 | 0.7100 | 0.8472 | 0.6550 | 0.8026 | 0.7119 | 0.8821 | 0.8472 | 0.8578 |
0.1036 | 42.0 | 2814 | 1.2121 | 0.9590 | 0.8602 | 0.9069 | 0.5986 | 0.8989 | 0.7186 | 0.4414 | 0.6344 | 0.5206 | 0.6314 | 0.7957 | 0.7041 | 0.8489 | 0.6576 | 0.7973 | 0.7125 | 0.8809 | 0.8489 | 0.8587 |
0.1036 | 43.0 | 2881 | 1.2253 | 0.9597 | 0.8572 | 0.9056 | 0.5899 | 0.9061 | 0.7146 | 0.4490 | 0.6308 | 0.5246 | 0.6252 | 0.8 | 0.7019 | 0.8473 | 0.6559 | 0.7985 | 0.7117 | 0.8806 | 0.8473 | 0.8573 |
0.1036 | 44.0 | 2948 | 1.2930 | 0.9572 | 0.8705 | 0.9117 | 0.6098 | 0.8971 | 0.7261 | 0.4534 | 0.6272 | 0.5263 | 0.6595 | 0.7871 | 0.7176 | 0.8560 | 0.6699 | 0.7955 | 0.7204 | 0.8830 | 0.8560 | 0.8644 |
0.0887 | 45.0 | 3015 | 1.2525 | 0.9584 | 0.8612 | 0.9072 | 0.5969 | 0.9116 | 0.7214 | 0.4456 | 0.6308 | 0.5223 | 0.6386 | 0.7828 | 0.7034 | 0.8497 | 0.6599 | 0.7966 | 0.7136 | 0.8810 | 0.8497 | 0.8592 |
0.0887 | 46.0 | 3082 | 1.3334 | 0.9556 | 0.8720 | 0.9119 | 0.6014 | 0.9043 | 0.7224 | 0.4751 | 0.6165 | 0.5367 | 0.6606 | 0.7742 | 0.7129 | 0.8565 | 0.6732 | 0.7918 | 0.7210 | 0.8820 | 0.8565 | 0.8643 |
0.0887 | 47.0 | 3149 | 1.3047 | 0.9584 | 0.8667 | 0.9103 | 0.6031 | 0.9025 | 0.7231 | 0.4490 | 0.6308 | 0.5246 | 0.6542 | 0.7892 | 0.7154 | 0.8538 | 0.6662 | 0.7973 | 0.7183 | 0.8828 | 0.8538 | 0.8627 |
0.0887 | 48.0 | 3216 | 1.3094 | 0.9581 | 0.8677 | 0.9106 | 0.6022 | 0.9043 | 0.7229 | 0.4492 | 0.6344 | 0.5260 | 0.6594 | 0.7785 | 0.7140 | 0.8541 | 0.6672 | 0.7962 | 0.7184 | 0.8828 | 0.8541 | 0.8630 |
0.0887 | 49.0 | 3283 | 1.3583 | 0.9574 | 0.8712 | 0.9123 | 0.6121 | 0.8971 | 0.7277 | 0.4504 | 0.6344 | 0.5268 | 0.6624 | 0.7806 | 0.7167 | 0.8565 | 0.6706 | 0.7959 | 0.7209 | 0.8834 | 0.8565 | 0.8650 |
0.0887 | 50.0 | 3350 | 1.3429 | 0.9579 | 0.8643 | 0.9087 | 0.6105 | 0.9025 | 0.7283 | 0.4447 | 0.6344 | 0.5229 | 0.6330 | 0.7828 | 0.7000 | 0.8516 | 0.6615 | 0.7960 | 0.7150 | 0.8813 | 0.8516 | 0.8608 |
0.0887 | 51.0 | 3417 | 1.3133 | 0.9598 | 0.8633 | 0.9090 | 0.6043 | 0.9097 | 0.7262 | 0.4403 | 0.6344 | 0.5198 | 0.6421 | 0.7871 | 0.7072 | 0.8517 | 0.6616 | 0.7986 | 0.7156 | 0.8827 | 0.8517 | 0.8612 |
0.0887 | 52.0 | 3484 | 1.3570 | 0.9562 | 0.8722 | 0.9123 | 0.6116 | 0.8953 | 0.7267 | 0.4531 | 0.6237 | 0.5249 | 0.6600 | 0.7763 | 0.7134 | 0.8563 | 0.6702 | 0.7919 | 0.7193 | 0.8824 | 0.8563 | 0.8645 |
0.0787 | 53.0 | 3551 | 1.4055 | 0.9560 | 0.8768 | 0.9147 | 0.6091 | 0.8971 | 0.7255 | 0.4822 | 0.6308 | 0.5466 | 0.6716 | 0.7742 | 0.7193 | 0.8602 | 0.6797 | 0.7947 | 0.7265 | 0.8841 | 0.8602 | 0.8677 |
0.0787 | 54.0 | 3618 | 1.3776 | 0.9576 | 0.8718 | 0.9127 | 0.6148 | 0.8989 | 0.7302 | 0.4571 | 0.6308 | 0.5301 | 0.6565 | 0.7849 | 0.7150 | 0.8573 | 0.6715 | 0.7966 | 0.7220 | 0.8837 | 0.8573 | 0.8655 |
0.0787 | 55.0 | 3685 | 1.3579 | 0.9590 | 0.8655 | 0.9099 | 0.6065 | 0.9043 | 0.7261 | 0.4552 | 0.6380 | 0.5313 | 0.6376 | 0.7871 | 0.7045 | 0.8532 | 0.6646 | 0.7987 | 0.7180 | 0.8826 | 0.8532 | 0.8622 |
0.0787 | 56.0 | 3752 | 1.4072 | 0.9563 | 0.8746 | 0.9136 | 0.6151 | 0.8971 | 0.7298 | 0.4684 | 0.6380 | 0.5402 | 0.6661 | 0.7763 | 0.7170 | 0.8590 | 0.6765 | 0.7965 | 0.7252 | 0.8839 | 0.8590 | 0.8668 |
0.0787 | 57.0 | 3819 | 1.3831 | 0.9570 | 0.8718 | 0.9124 | 0.6141 | 0.8989 | 0.7297 | 0.4524 | 0.6308 | 0.5269 | 0.6606 | 0.7785 | 0.7147 | 0.8568 | 0.6710 | 0.7950 | 0.7209 | 0.8833 | 0.8568 | 0.8651 |
0.0787 | 58.0 | 3886 | 1.4017 | 0.9565 | 0.8734 | 0.9131 | 0.6159 | 0.8971 | 0.7303 | 0.4607 | 0.6308 | 0.5325 | 0.6588 | 0.7763 | 0.7127 | 0.8577 | 0.6730 | 0.7944 | 0.7222 | 0.8832 | 0.8577 | 0.8658 |
0.0787 | 59.0 | 3953 | 1.3982 | 0.9564 | 0.8728 | 0.9127 | 0.6133 | 0.8989 | 0.7291 | 0.4630 | 0.6272 | 0.5327 | 0.6582 | 0.7785 | 0.7133 | 0.8574 | 0.6727 | 0.7944 | 0.7220 | 0.8830 | 0.8574 | 0.8654 |
0.0739 | 60.0 | 4020 | 1.4074 | 0.9557 | 0.8738 | 0.9129 | 0.6128 | 0.8971 | 0.7282 | 0.4667 | 0.6272 | 0.5352 | 0.6606 | 0.7742 | 0.7129 | 0.8577 | 0.6739 | 0.7931 | 0.7223 | 0.8827 | 0.8577 | 0.8656 |
### Framework versions
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2