
layoutlm-filtex

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on the FUNSD dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7280
  • Answer: precision 0.7160, recall 0.8072, F1 0.7589 (809 entities)
  • Header: precision 0.3307, recall 0.3529, F1 0.3415 (119 entities)
  • Question: precision 0.7805, recall 0.8347, F1 0.8067 (1065 entities)
  • Overall Precision: 0.7273
  • Overall Recall: 0.7948
  • Overall F1: 0.7595
  • Overall Accuracy: 0.8050
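
The per-entity F1 values above are simply the harmonic means of the listed precision and recall; a quick sketch to verify this from the reported numbers:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (the F1 reported above)."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Answer entity from the evaluation set: precision 0.7160..., recall 0.8071...
answer_f1 = f1_score(0.7160087719298246, 0.8071693448702101)  # ≈ 0.7589
```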

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
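
FUNSD is a form-understanding dataset of scanned documents, and LayoutLM consumes each word's bounding box normalized to a 0–1000 grid alongside the token ids. A minimal preprocessing sketch (the helper name is illustrative, not taken from the training script):

```python
def normalize_bbox(bbox, page_width, page_height):
    """Scale a pixel-space box (x0, y0, x1, y1) to LayoutLM's 0-1000 grid."""
    x0, y0, x1, y1 = bbox
    return [
        int(1000 * x0 / page_width),
        int(1000 * y0 / page_height),
        int(1000 * x1 / page_width),
        int(1000 * y1 / page_height),
    ]

# A word box at (50, 100, 150, 200) on a 500x1000 page:
print(normalize_bbox((50, 100, 150, 200), 500, 1000))  # [100, 100, 300, 200]
```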

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
  • mixed_precision_training: Native AMP
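
With a linear scheduler, the learning rate decays from 3e-05 toward zero over the run; the results table shows 150 optimizer steps in total (10 per epoch × 15 epochs). A sketch of the schedule, assuming the Transformers default of zero warmup steps (warmup is not listed in the card):

```python
BASE_LR = 3e-5       # learning_rate from the card
TOTAL_STEPS = 150    # 10 optimizer steps per epoch x 15 epochs

def linear_lr(step: int) -> float:
    """Learning rate after `step` optimizer updates under linear decay."""
    return BASE_LR * max(0.0, 1.0 - step / TOTAL_STEPS)

print(linear_lr(0))    # 3e-05 at the start of training
print(linear_lr(150))  # 0.0 at the end of training
```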

Training results

Per-entity columns report precision / recall / F1 (values rounded to four decimals); entity counts in the evaluation set are 809 (Answer), 119 (Header) and 1065 (Question).

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.8197 | 1.0 | 10 | 1.6109 | 0.0101 / 0.0087 / 0.0093 | 0.0000 / 0.0000 / 0.0000 | 0.1453 / 0.0873 / 0.1091 | 0.0750 | 0.0502 | 0.0601 | 0.3438 |
| 1.4759 | 2.0 | 20 | 1.2685 | 0.2072 / 0.2336 / 0.2196 | 0.0000 / 0.0000 / 0.0000 | 0.4585 / 0.5756 / 0.5104 | 0.3566 | 0.4024 | 0.3781 | 0.6132 |
| 1.1264 | 3.0 | 30 | 0.9432 | 0.4686 / 0.4981 / 0.4829 | 0.0000 / 0.0000 / 0.0000 | 0.6135 / 0.6930 / 0.6508 | 0.5459 | 0.5725 | 0.5589 | 0.7184 |
| 0.8604 | 4.0 | 40 | 0.7865 | 0.6155 / 0.7145 / 0.6613 | 0.1591 / 0.0588 / 0.0859 | 0.6785 / 0.7174 / 0.6974 | 0.6396 | 0.6769 | 0.6577 | 0.7536 |
| 0.6767 | 5.0 | 50 | 0.7196 | 0.6317 / 0.7293 / 0.6770 | 0.1948 / 0.1261 / 0.1531 | 0.6669 / 0.8028 / 0.7286 | 0.6367 | 0.7326 | 0.6813 | 0.7738 |
| 0.5702 | 6.0 | 60 | 0.6952 | 0.6390 / 0.7899 / 0.7065 | 0.2045 / 0.1513 / 0.1739 | 0.7218 / 0.7869 / 0.7529 | 0.6647 | 0.7501 | 0.7049 | 0.7819 |
| 0.4943 | 7.0 | 70 | 0.6772 | 0.6742 / 0.7775 / 0.7222 | 0.2562 / 0.2605 / 0.2583 | 0.7393 / 0.8094 / 0.7727 | 0.6856 | 0.7637 | 0.7225 | 0.7892 |
| 0.4410 | 8.0 | 80 | 0.6809 | 0.6515 / 0.7973 / 0.7171 | 0.2523 / 0.2353 / 0.2435 | 0.7425 / 0.8178 / 0.7784 | 0.6790 | 0.7747 | 0.7237 | 0.7882 |
| 0.3859 | 9.0 | 90 | 0.6842 | 0.6882 / 0.7886 / 0.7350 | 0.2869 / 0.2941 / 0.2905 | 0.7545 / 0.8254 / 0.7883 | 0.7010 | 0.7787 | 0.7378 | 0.7950 |
| 0.3813 | 10.0 | 100 | 0.6972 | 0.6872 / 0.8010 / 0.7397 | 0.3363 / 0.3193 / 0.3276 | 0.7652 / 0.8291 / 0.7959 | 0.7100 | 0.7873 | 0.7466 | 0.8040 |
| 0.3202 | 11.0 | 110 | 0.7074 | 0.7015 / 0.7960 / 0.7458 | 0.3254 / 0.3445 / 0.3347 | 0.7666 / 0.8357 / 0.7996 | 0.7143 | 0.7903 | 0.7504 | 0.7938 |
| 0.3009 | 12.0 | 120 | 0.7242 | 0.7027 / 0.8035 / 0.7497 | 0.3279 / 0.3361 / 0.3320 | 0.7851 / 0.8235 / 0.8038 | 0.7241 | 0.7863 | 0.7539 | 0.8031 |
| 0.2915 | 13.0 | 130 | 0.7256 | 0.7059 / 0.8072 / 0.7532 | 0.3306 / 0.3445 / 0.3374 | 0.7809 / 0.8366 / 0.8078 | 0.7237 | 0.7953 | 0.7578 | 0.8024 |
| 0.2644 | 14.0 | 140 | 0.7278 | 0.7129 / 0.8072 / 0.7571 | 0.3282 / 0.3613 / 0.3440 | 0.7841 / 0.8357 / 0.8091 | 0.7269 | 0.7958 | 0.7598 | 0.8034 |
| 0.2654 | 15.0 | 150 | 0.7280 | 0.7160 / 0.8072 / 0.7589 | 0.3307 / 0.3529 / 0.3415 | 0.7805 / 0.8347 / 0.8067 | 0.7273 | 0.7948 | 0.7595 | 0.8050 |

Framework versions

  • Transformers 4.42.4
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1
