# layoutlm-funsd
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set:
- Loss: 0.8048
- Answer: {'precision': 0.7424412094064949, 'recall': 0.8195302843016069, 'f1': 0.7790834312573444, 'number': 809}
- Header: {'precision': 0.41304347826086957, 'recall': 0.4789915966386555, 'f1': 0.443579766536965, 'number': 119}
- Question: {'precision': 0.8048561151079137, 'recall': 0.8403755868544601, 'f1': 0.8222324299494717, 'number': 1065}
- Overall Precision: 0.7536
- Overall Recall: 0.8103
- Overall F1: 0.7809
- Overall Accuracy: 0.8256
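The usage sections of this card are still marked "More information needed". As a stop-gap, the snippet below is a minimal, hypothetical inference sketch for this checkpoint: the `words` and `boxes` are made-up stand-ins for OCR output (LayoutLM expects word boxes normalized to a 0-1000 page scale), and the printed label names depend on the `id2label` mapping stored in the checkpoint config.

```python
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("asharma06/layoutlm-funsd")
model = LayoutLMForTokenClassification.from_pretrained("asharma06/layoutlm-funsd")

# Hypothetical OCR output: words plus bounding boxes normalized to a 0-1000 scale.
words = ["Date:", "01/14/1999"]
boxes = [[57, 84, 120, 100], [130, 84, 230, 100]]

# LayoutLM (v1) takes token-level boxes, so expand each word box to its subword tokens.
token_boxes = []
for word, box in zip(words, boxes):
    token_boxes.extend([box] * len(tokenizer.tokenize(word)))
token_boxes = [[0, 0, 0, 0]] + token_boxes + [[1000, 1000, 1000, 1000]]  # [CLS] / [SEP]

encoding = tokenizer(" ".join(words), return_tensors="pt")
bbox = torch.tensor([token_boxes])

with torch.no_grad():
    outputs = model(
        input_ids=encoding["input_ids"],
        attention_mask=encoding["attention_mask"],
        token_type_ids=encoding["token_type_ids"],
        bbox=bbox,
    )

predictions = outputs.logits.argmax(-1).squeeze().tolist()
labels = [model.config.id2label[p] for p in predictions]
print(list(zip(tokenizer.convert_ids_to_tokens(encoding["input_ids"][0]), labels)))
```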
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
- mixed_precision_training: Native AMP
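The original training script is not included in this card; the snippet below is only a sketch of how the hyperparameters above map onto `transformers.TrainingArguments`. The listed Adam betas and epsilon are the library defaults, `fp16=True` corresponds to native AMP, and the `output_dir` value is illustrative.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above. Model and dataset wiring,
# which this card does not document, is omitted.
training_args = TrainingArguments(
    output_dir="layoutlm-funsd",   # illustrative output path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=25,
    fp16=True,                     # "Native AMP" mixed-precision training
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)
```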
### Training results
Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|---|
1.8389 | 1.0 | 10 | 1.6291 | {'precision': 0.01568627450980392, 'recall': 0.009888751545117428, 'f1': 0.012130401819560273, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.2798165137614679, 'recall': 0.11455399061032864, 'f1': 0.16255829447035308, 'number': 1065} | 0.1374 | 0.0652 | 0.0885 | 0.3319 |
1.4797 | 2.0 | 20 | 1.2835 | {'precision': 0.2250740375123396, 'recall': 0.28182941903584674, 'f1': 0.2502744237102085, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.3997214484679666, 'recall': 0.5389671361502347, 'f1': 0.45901639344262296, 'number': 1065} | 0.3275 | 0.4024 | 0.3611 | 0.5792 |
1.1281 | 3.0 | 30 | 0.9324 | {'precision': 0.47114375655823715, 'recall': 0.5550061804697157, 'f1': 0.5096481271282634, 'number': 809} | {'precision': 0.06060606060606061, 'recall': 0.01680672268907563, 'f1': 0.02631578947368421, 'number': 119} | {'precision': 0.5470149253731343, 'recall': 0.6882629107981221, 'f1': 0.6095634095634096, 'number': 1065} | 0.5090 | 0.5941 | 0.5483 | 0.7008 |
0.848 | 4.0 | 40 | 0.7620 | {'precision': 0.5925563173359452, 'recall': 0.7478368355995055, 'f1': 0.6612021857923498, 'number': 809} | {'precision': 0.17391304347826086, 'recall': 0.10084033613445378, 'f1': 0.12765957446808512, 'number': 119} | {'precision': 0.6578293289146645, 'recall': 0.7455399061032864, 'f1': 0.6989436619718309, 'number': 1065} | 0.6143 | 0.7080 | 0.6578 | 0.7624 |
0.6618 | 5.0 | 50 | 0.6889 | {'precision': 0.6424180327868853, 'recall': 0.7750309023485785, 'f1': 0.7025210084033613, 'number': 809} | {'precision': 0.3, 'recall': 0.226890756302521, 'f1': 0.25837320574162675, 'number': 119} | {'precision': 0.6919967663702506, 'recall': 0.8037558685446009, 'f1': 0.7437011294526499, 'number': 1065} | 0.6557 | 0.7577 | 0.7030 | 0.7901 |
0.5475 | 6.0 | 60 | 0.6690 | {'precision': 0.654158215010142, 'recall': 0.7972805933250927, 'f1': 0.7186629526462396, 'number': 809} | {'precision': 0.31868131868131866, 'recall': 0.24369747899159663, 'f1': 0.2761904761904762, 'number': 119} | {'precision': 0.7417102966841187, 'recall': 0.7981220657276995, 'f1': 0.7688828584350972, 'number': 1065} | 0.6856 | 0.7647 | 0.7230 | 0.7949 |
0.4641 | 7.0 | 70 | 0.6472 | {'precision': 0.6896551724137931, 'recall': 0.7911001236093943, 'f1': 0.7369027058146229, 'number': 809} | {'precision': 0.23529411764705882, 'recall': 0.23529411764705882, 'f1': 0.23529411764705882, 'number': 119} | {'precision': 0.7485131690739167, 'recall': 0.8272300469483568, 'f1': 0.7859054415700267, 'number': 1065} | 0.6965 | 0.7772 | 0.7346 | 0.8108 |
0.3968 | 8.0 | 80 | 0.6603 | {'precision': 0.7052518756698821, 'recall': 0.8133498145859085, 'f1': 0.7554535017221584, 'number': 809} | {'precision': 0.26277372262773724, 'recall': 0.3025210084033613, 'f1': 0.28125000000000006, 'number': 119} | {'precision': 0.7734513274336283, 'recall': 0.8206572769953052, 'f1': 0.7963553530751709, 'number': 1065} | 0.7127 | 0.7868 | 0.7479 | 0.8117 |
0.3377 | 9.0 | 90 | 0.6641 | {'precision': 0.7273730684326711, 'recall': 0.8145859085290482, 'f1': 0.7685131195335277, 'number': 809} | {'precision': 0.30612244897959184, 'recall': 0.37815126050420167, 'f1': 0.3383458646616541, 'number': 119} | {'precision': 0.7655838454784899, 'recall': 0.8187793427230047, 'f1': 0.7912885662431942, 'number': 1065} | 0.7190 | 0.7908 | 0.7532 | 0.8063 |
0.3159 | 10.0 | 100 | 0.6626 | {'precision': 0.7112299465240641, 'recall': 0.8220024721878862, 'f1': 0.7626146788990825, 'number': 809} | {'precision': 0.36666666666666664, 'recall': 0.2773109243697479, 'f1': 0.31578947368421056, 'number': 119} | {'precision': 0.7945945945945946, 'recall': 0.828169014084507, 'f1': 0.8110344827586206, 'number': 1065} | 0.7400 | 0.7928 | 0.7655 | 0.8252 |
0.2565 | 11.0 | 110 | 0.6831 | {'precision': 0.706951871657754, 'recall': 0.8170580964153276, 'f1': 0.7580275229357798, 'number': 809} | {'precision': 0.3418803418803419, 'recall': 0.33613445378151263, 'f1': 0.3389830508474576, 'number': 119} | {'precision': 0.7935656836461126, 'recall': 0.8338028169014085, 'f1': 0.8131868131868133, 'number': 1065} | 0.7319 | 0.7973 | 0.7632 | 0.8146 |
0.2326 | 12.0 | 120 | 0.7081 | {'precision': 0.7152103559870551, 'recall': 0.8195302843016069, 'f1': 0.7638248847926269, 'number': 809} | {'precision': 0.34375, 'recall': 0.3697478991596639, 'f1': 0.3562753036437247, 'number': 119} | {'precision': 0.7731601731601732, 'recall': 0.8384976525821596, 'f1': 0.8045045045045045, 'number': 1065} | 0.7240 | 0.8028 | 0.7614 | 0.8097 |
0.2064 | 13.0 | 130 | 0.7088 | {'precision': 0.7420454545454546, 'recall': 0.8071693448702101, 'f1': 0.773238602723505, 'number': 809} | {'precision': 0.375, 'recall': 0.37815126050420167, 'f1': 0.37656903765690375, 'number': 119} | {'precision': 0.7978628673196795, 'recall': 0.8413145539906103, 'f1': 0.8190127970749542, 'number': 1065} | 0.7508 | 0.7998 | 0.7745 | 0.8216 |
0.1807 | 14.0 | 140 | 0.7149 | {'precision': 0.7113289760348583, 'recall': 0.8071693448702101, 'f1': 0.7562246670526924, 'number': 809} | {'precision': 0.373134328358209, 'recall': 0.42016806722689076, 'f1': 0.3952569169960475, 'number': 119} | {'precision': 0.8001800180018002, 'recall': 0.8347417840375587, 'f1': 0.8170955882352942, 'number': 1065} | 0.7360 | 0.7988 | 0.7661 | 0.8186 |
0.1673 | 15.0 | 150 | 0.7429 | {'precision': 0.7461988304093568, 'recall': 0.788627935723115, 'f1': 0.766826923076923, 'number': 809} | {'precision': 0.4015151515151515, 'recall': 0.44537815126050423, 'f1': 0.4223107569721115, 'number': 119} | {'precision': 0.8001800180018002, 'recall': 0.8347417840375587, 'f1': 0.8170955882352942, 'number': 1065} | 0.7531 | 0.7928 | 0.7724 | 0.8213 |
0.158 | 16.0 | 160 | 0.7579 | {'precision': 0.7352614015572859, 'recall': 0.8170580964153276, 'f1': 0.7740046838407495, 'number': 809} | {'precision': 0.3673469387755102, 'recall': 0.453781512605042, 'f1': 0.406015037593985, 'number': 119} | {'precision': 0.790616854908775, 'recall': 0.8544600938967136, 'f1': 0.8212996389891697, 'number': 1065} | 0.7396 | 0.8154 | 0.7757 | 0.8166 |
0.1407 | 17.0 | 170 | 0.7595 | {'precision': 0.7474747474747475, 'recall': 0.823238566131026, 'f1': 0.783529411764706, 'number': 809} | {'precision': 0.424, 'recall': 0.44537815126050423, 'f1': 0.4344262295081967, 'number': 119} | {'precision': 0.8081081081081081, 'recall': 0.8422535211267606, 'f1': 0.8248275862068966, 'number': 1065} | 0.7601 | 0.8108 | 0.7847 | 0.8237 |
0.1277 | 18.0 | 180 | 0.7927 | {'precision': 0.7305986696230599, 'recall': 0.8145859085290482, 'f1': 0.7703097603740503, 'number': 809} | {'precision': 0.4140625, 'recall': 0.44537815126050423, 'f1': 0.42914979757085026, 'number': 119} | {'precision': 0.8114233907524931, 'recall': 0.8403755868544601, 'f1': 0.8256457564575646, 'number': 1065} | 0.7534 | 0.8063 | 0.7790 | 0.8137 |
0.1268 | 19.0 | 190 | 0.7819 | {'precision': 0.7361894024802705, 'recall': 0.8071693448702101, 'f1': 0.7700471698113207, 'number': 809} | {'precision': 0.4330708661417323, 'recall': 0.46218487394957986, 'f1': 0.4471544715447155, 'number': 119} | {'precision': 0.8028419182948491, 'recall': 0.8488262910798122, 'f1': 0.8251939753537197, 'number': 1065} | 0.7533 | 0.8088 | 0.7801 | 0.8216 |
0.1112 | 20.0 | 200 | 0.7880 | {'precision': 0.740782122905028, 'recall': 0.8195302843016069, 'f1': 0.7781690140845071, 'number': 809} | {'precision': 0.4195804195804196, 'recall': 0.5042016806722689, 'f1': 0.4580152671755725, 'number': 119} | {'precision': 0.8075880758807588, 'recall': 0.8394366197183099, 'f1': 0.8232044198895027, 'number': 1065} | 0.7538 | 0.8113 | 0.7815 | 0.8229 |
0.1096 | 21.0 | 210 | 0.7925 | {'precision': 0.7404494382022472, 'recall': 0.8145859085290482, 'f1': 0.7757504414361388, 'number': 809} | {'precision': 0.45454545454545453, 'recall': 0.42016806722689076, 'f1': 0.43668122270742354, 'number': 119} | {'precision': 0.815049864007253, 'recall': 0.844131455399061, 'f1': 0.8293357933579335, 'number': 1065} | 0.7646 | 0.8068 | 0.7852 | 0.8249 |
0.1158 | 22.0 | 220 | 0.8093 | {'precision': 0.7363128491620111, 'recall': 0.8145859085290482, 'f1': 0.7734741784037558, 'number': 809} | {'precision': 0.41333333333333333, 'recall': 0.5210084033613446, 'f1': 0.4609665427509294, 'number': 119} | {'precision': 0.8030438675022381, 'recall': 0.8422535211267606, 'f1': 0.8221814848762603, 'number': 1065} | 0.7484 | 0.8118 | 0.7788 | 0.8210 |
0.0985 | 23.0 | 230 | 0.8013 | {'precision': 0.7554535017221584, 'recall': 0.8133498145859085, 'f1': 0.7833333333333333, 'number': 809} | {'precision': 0.45689655172413796, 'recall': 0.44537815126050423, 'f1': 0.4510638297872341, 'number': 119} | {'precision': 0.8091809180918091, 'recall': 0.844131455399061, 'f1': 0.8262867647058824, 'number': 1065} | 0.7674 | 0.8078 | 0.7871 | 0.8279 |
0.0988 | 24.0 | 240 | 0.8040 | {'precision': 0.7385984427141268, 'recall': 0.8207663782447466, 'f1': 0.7775175644028104, 'number': 809} | {'precision': 0.4198473282442748, 'recall': 0.46218487394957986, 'f1': 0.43999999999999995, 'number': 119} | {'precision': 0.8016157989228008, 'recall': 0.8384976525821596, 'f1': 0.8196420376319412, 'number': 1065} | 0.7519 | 0.8088 | 0.7793 | 0.8255 |
0.1004 | 25.0 | 250 | 0.8048 | {'precision': 0.7424412094064949, 'recall': 0.8195302843016069, 'f1': 0.7790834312573444, 'number': 809} | {'precision': 0.41304347826086957, 'recall': 0.4789915966386555, 'f1': 0.443579766536965, 'number': 119} | {'precision': 0.8048561151079137, 'recall': 0.8403755868544601, 'f1': 0.8222324299494717, 'number': 1065} | 0.7536 | 0.8103 | 0.7809 | 0.8256 |
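The per-field dictionaries and the overall precision/recall/F1 reported above are the usual entity-level metrics for FUNSD-style token classification. The sketch below shows how such numbers are typically produced with `seqeval` on IOB2-tagged sequences; the two example sequences are hypothetical and not taken from the dataset.

```python
from seqeval.metrics import classification_report, f1_score

# Hypothetical IOB2-tagged label sequences, one list per document.
y_true = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "O", "B-HEADER"]]
y_pred = [["B-QUESTION", "I-QUESTION", "B-ANSWER", "B-HEADER", "B-HEADER"]]

# Entity-level precision/recall/F1 per label plus micro-averaged overall scores.
print(classification_report(y_true, y_pred))
print("overall F1:", f1_score(y_true, y_pred))
```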
### Framework versions
- Transformers 4.41.2
- Pytorch 2.1.2
- Datasets 2.19.2
- Tokenizers 0.19.1