---
license: mit
tags:
- generated_from_trainer
base_model: neuralmind/bert-base-portuguese-cased
model-index:
- name: bert_base_tcm_0.6
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bert_base_tcm_0.6

This model is a fine-tuned version of [neuralmind/bert-base-portuguese-cased](https://huggingface.co/neuralmind/bert-base-portuguese-cased) on an unknown dataset.
It achieves the following results on the evaluation set (a usage sketch follows the metrics list):
- Loss: 0.0193
- Criterio Julgamento Precision: 0.8875
- Criterio Julgamento Recall: 0.8659
- Criterio Julgamento F1: 0.8765
- Criterio Julgamento Number: 82
- Data Sessao Precision: 0.7571
- Data Sessao Recall: 0.9636
- Data Sessao F1: 0.848
- Data Sessao Number: 55
- Modalidade Licitacao Precision: 0.9394
- Modalidade Licitacao Recall: 0.9718
- Modalidade Licitacao F1: 0.9553
- Modalidade Licitacao Number: 319
- Numero Exercicio Precision: 0.9172
- Numero Exercicio Recall: 0.9688
- Numero Exercicio F1: 0.9422
- Numero Exercicio Number: 160
- Objeto Licitacao Precision: 0.4659
- Objeto Licitacao Recall: 0.7069
- Objeto Licitacao F1: 0.5616
- Objeto Licitacao Number: 58
- Valor Objeto Precision: 0.8333
- Valor Objeto Recall: 0.9211
- Valor Objeto F1: 0.875
- Valor Objeto Number: 38
- Overall Precision: 0.8537
- Overall Recall: 0.9340
- Overall F1: 0.8920
- Overall Accuracy: 0.9951
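
The per-entity metrics above (Criterio Julgamento, Data Sessao, Modalidade Licitacao, Numero Exercicio, Objeto Licitacao, Valor Objeto) suggest a token-classification (NER-style) model over Brazilian public-procurement text. The snippet below is a minimal usage sketch, not an official example: the repo id `user/bert_base_tcm_0.6` is a placeholder, and the `token-classification` head and span aggregation are assumptions inferred from the metrics.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "user/bert_base_tcm_0.6"  # placeholder repo id; replace with the actual one
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Aggregate sub-word pieces into whole entity spans before printing them.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

text = (
    "Pregão Presencial nº 12/2020, critério de julgamento menor preço, "
    "sessão realizada em 05/03/2020, no valor de R$ 150.000,00."
)
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```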

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10.0
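
The listed hyperparameters map onto `transformers.TrainingArguments` roughly as below. This is a hedged configuration sketch only: the dataset, label set (`num_labels`), data collator, and evaluation strategy are not documented in this card, so the corresponding values are placeholders or assumptions.

```python
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "neuralmind/bert-base-portuguese-cased"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels=13 is an assumption: 6 entity types in a BIO scheme plus the "O" tag.
model = AutoModelForTokenClassification.from_pretrained(base_model, num_labels=13)

training_args = TrainingArguments(
    output_dir="bert_base_tcm_0.6",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption: the results table reports one row per epoch
)

# The annotated corpus is not documented in this card, so the datasets are placeholders.
train_dataset = None
eval_dataset = None

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    tokenizer=tokenizer,
)
# trainer.train()  # would require real tokenized, label-aligned datasets
```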

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Criterio Julgamento Precision | Criterio Julgamento Recall | Criterio Julgamento F1 | Criterio Julgamento Number | Data Sessao Precision | Data Sessao Recall | Data Sessao F1 | Data Sessao Number | Modalidade Licitacao Precision | Modalidade Licitacao Recall | Modalidade Licitacao F1 | Modalidade Licitacao Number | Numero Exercicio Precision | Numero Exercicio Recall | Numero Exercicio F1 | Numero Exercicio Number | Objeto Licitacao Precision | Objeto Licitacao Recall | Objeto Licitacao F1 | Objeto Licitacao Number | Valor Objeto Precision | Valor Objeto Recall | Valor Objeto F1 | Valor Objeto Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:-----------------------------:|:--------------------------:|:----------------------:|:--------------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:------------------------------:|:---------------------------:|:-----------------------:|:---------------------------:|:--------------------------:|:-----------------------:|:-------------------:|:-----------------------:|:--------------------------:|:-----------------------:|:-------------------:|:-----------------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 0.0252        | 1.0   | 1963  | 0.0202          | 0.8022                        | 0.8902                     | 0.8439                 | 82                         | 0.7391                | 0.9273             | 0.8226         | 55                 | 0.9233                         | 0.9812                      | 0.9514                  | 319                         | 0.8966                     | 0.975                   | 0.9341              | 160                     | 0.4730                     | 0.6034                  | 0.5303              | 58                      | 0.7083                 | 0.8947              | 0.7907          | 38                  | 0.8327            | 0.9298         | 0.8786     | 0.9948           |
| 0.0191        | 2.0   | 3926  | 0.0226          | 0.8554                        | 0.8659                     | 0.8606                 | 82                         | 0.5641                | 0.4                | 0.4681         | 55                 | 0.9572                         | 0.9812                      | 0.9690                  | 319                         | 0.9273                     | 0.9563                  | 0.9415              | 160                     | 0.3770                     | 0.3966                  | 0.3866              | 58                      | 0.8571                 | 0.7895              | 0.8219          | 38                  | 0.8620            | 0.8596         | 0.8608     | 0.9951           |
| 0.0137        | 3.0   | 5889  | 0.0193          | 0.8875                        | 0.8659                     | 0.8765                 | 82                         | 0.7571                | 0.9636             | 0.848          | 55                 | 0.9394                         | 0.9718                      | 0.9553                  | 319                         | 0.9172                     | 0.9688                  | 0.9422              | 160                     | 0.4659                     | 0.7069                  | 0.5616              | 58                      | 0.8333                 | 0.9211              | 0.875           | 38                  | 0.8537            | 0.9340         | 0.8920     | 0.9951           |
| 0.0082        | 4.0   | 7852  | 0.0210          | 0.8780                        | 0.8780                     | 0.8780                 | 82                         | 0.7966                | 0.8545             | 0.8246         | 55                 | 0.9512                         | 0.9781                      | 0.9645                  | 319                         | 0.9023                     | 0.9812                  | 0.9401              | 160                     | 0.5385                     | 0.6034                  | 0.5691              | 58                      | 0.9                    | 0.9474              | 0.9231          | 38                  | 0.8810            | 0.9256         | 0.9027     | 0.9963           |
| 0.0048        | 5.0   | 9815  | 0.0222          | 0.8261                        | 0.9268                     | 0.8736                 | 82                         | 0.7969                | 0.9273             | 0.8571         | 55                 | 0.9512                         | 0.9781                      | 0.9645                  | 319                         | 0.9231                     | 0.975                   | 0.9483              | 160                     | 0.6515                     | 0.7414                  | 0.6935              | 58                      | 0.875                  | 0.9211              | 0.8974          | 38                  | 0.8867            | 0.9452         | 0.9150     | 0.9964           |
| 0.0044        | 6.0   | 11778 | 0.0262          | 0.8276                        | 0.8780                     | 0.8521                 | 82                         | 0.7681                | 0.9636             | 0.8548         | 55                 | 0.9541                         | 0.9781                      | 0.9659                  | 319                         | 0.9235                     | 0.9812                  | 0.9515              | 160                     | 0.5263                     | 0.6897                  | 0.5970              | 58                      | 0.9211                 | 0.9211              | 0.9211          | 38                  | 0.8722            | 0.9396         | 0.9047     | 0.9959           |
| 0.0042        | 7.0   | 13741 | 0.0246          | 0.8523                        | 0.9146                     | 0.8824                 | 82                         | 0.7656                | 0.8909             | 0.8235         | 55                 | 0.9509                         | 0.9718                      | 0.9612                  | 319                         | 0.9118                     | 0.9688                  | 0.9394              | 160                     | 0.5938                     | 0.6552                  | 0.6230              | 58                      | 0.8974                 | 0.9211              | 0.9091          | 38                  | 0.8815            | 0.9298         | 0.9050     | 0.9960           |
| 0.0013        | 8.0   | 15704 | 0.0294          | 0.8295                        | 0.8902                     | 0.8588                 | 82                         | 0.7391                | 0.9273             | 0.8226         | 55                 | 0.9543                         | 0.9812                      | 0.9675                  | 319                         | 0.9070                     | 0.975                   | 0.9398              | 160                     | 0.6094                     | 0.6724                  | 0.6393              | 58                      | 0.875                  | 0.9211              | 0.8974          | 38                  | 0.8765            | 0.9368         | 0.9056     | 0.9961           |
| 0.0019        | 9.0   | 17667 | 0.0303          | 0.8690                        | 0.8902                     | 0.8795                 | 82                         | 0.8305                | 0.8909             | 0.8596         | 55                 | 0.9538                         | 0.9718                      | 0.9627                  | 319                         | 0.9290                     | 0.9812                  | 0.9544              | 160                     | 0.6441                     | 0.6552                  | 0.6496              | 58                      | 0.9211                 | 0.9211              | 0.9211          | 38                  | 0.9019            | 0.9298         | 0.9156     | 0.9961           |
| 0.0007        | 10.0  | 19630 | 0.0295          | 0.8488                        | 0.8902                     | 0.8690                 | 82                         | 0.7903                | 0.8909             | 0.8376         | 55                 | 0.9571                         | 0.9781                      | 0.9674                  | 319                         | 0.9181                     | 0.9812                  | 0.9486              | 160                     | 0.6393                     | 0.6724                  | 0.6555              | 58                      | 0.9211                 | 0.9211              | 0.9211          | 38                  | 0.8938            | 0.9340         | 0.9135     | 0.9962           |


### Framework versions

- Transformers 4.20.0.dev0
- Pytorch 1.11.0+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1