---
license: apache-2.0
base_model: bedus-creation/eng-limbu-t5-base-all-001
tags:
- generated_from_keras_callback
model-index:
- name: bedus-creation/eng-limbu-t5-base-all-001
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# bedus-creation/eng-limbu-t5-base-all-001
This model is a fine-tuned version of [bedus-creation/eng-limbu-t5-base-all-001](https://huggingface.co/bedus-creation/eng-limbu-t5-base-all-001) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 4.0164
- Validation Loss: 4.2560
- Epoch: 20
## Model description
More information needed
## Intended uses & limitations
More information needed
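
The card does not document a usage recipe, but since the checkpoint is a TensorFlow T5 seq2seq model, it can presumably be loaded with the standard Transformers API. The snippet below is a minimal sketch, not an official example; the input sentence and the absence of a task prefix are assumptions.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "bedus-creation/eng-limbu-t5-base-all-001"

# Load the tokenizer and the TensorFlow checkpoint from the Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical English input; the expected prompt format is not documented.
inputs = tokenizer("Hello, how are you?", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```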
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
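
As a minimal sketch, the optimizer configuration listed above corresponds to the `AdamWeightDecay` optimizer shipped with Transformers for TensorFlow; the `compile`/`fit` step is an assumption about how the Keras callback workflow is typically wired up.

```python
from transformers import AdamWeightDecay

# Reconstruct the optimizer exactly as listed in the hyperparameters above.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)

# Typically attached to the Keras model before training, e.g.:
# model.compile(optimizer=optimizer)
# model.fit(train_dataset, validation_data=eval_dataset, epochs=21, callbacks=[...])
```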
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 7.0062 | 6.1115 | 0 |
| 6.0720 | 5.8817 | 1 |
| 5.8833 | 5.7515 | 2 |
| 5.7643 | 5.6312 | 3 |
| 5.6159 | 5.5281 | 4 |
| 5.5133 | 5.4337 | 5 |
| 5.4239 | 5.3227 | 6 |
| 5.3002 | 5.2327 | 7 |
| 5.1915 | 5.1267 | 8 |
| 5.1029 | 5.0370 | 9 |
| 4.9916 | 4.9413 | 10 |
| 4.8633 | 4.8633 | 11 |
| 4.7651 | 4.7806 | 12 |
| 4.6682 | 4.7019 | 13 |
| 4.5570 | 4.6346 | 14 |
| 4.4718 | 4.5772 | 15 |
| 4.3830 | 4.5084 | 16 |
| 4.2749 | 4.4127 | 17 |
| 4.1922 | 4.3616 | 18 |
| 4.1207 | 4.3160 | 19 |
| 4.0164 | 4.2560 | 20 |
### Framework versions
- Transformers 4.33.2
- TensorFlow 2.13.0
- Datasets 2.14.5
- Tokenizers 0.13.3