Description
This model is an adapted version of an mDeBERTa model already fine-tuned on SQuAD v2 (timpal0l/mdeberta-v3-base-squad2), further fine-tuned on a Greek COVID-19 question answering dataset (COVID-QA-el-small) to adapt it to the COVID-19 domain in the Greek language.
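The model can be used for extractive question answering in Greek with the Hugging Face transformers question-answering pipeline. Below is a minimal sketch; the question and context are illustrative examples only.

```python
# Minimal usage sketch with the transformers question-answering pipeline.
# The Greek question/context below are illustrative examples, not from the card.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="panosgriz/mdeberta-v3-base-squad2-covid-el-small",
)

result = qa(
    question="Πώς μεταδίδεται ο COVID-19;",  # "How is COVID-19 transmitted?"
    context=(
        "Ο COVID-19 μεταδίδεται κυρίως μέσω σταγονιδίων που παράγονται "
        "όταν ένα μολυσμένο άτομο βήχει, φταρνίζεται ή μιλά."
    ),
)
print(result["answer"], result["score"])
```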
Training Details
- Training Dataset: COVID-QA-el-small
- Batch Size: 8
- Number of Epochs: 3
- Learning Rate: 3e-05
- Gradient Accumulation Steps: 2
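
As a rough illustration, the hyperparameters above map onto transformers TrainingArguments as sketched below. This is a hypothetical reconstruction, not the author's actual training script; dataset loading and QA preprocessing are omitted, and the output directory name is assumed.

```python
# Hypothetical mapping of the listed hyperparameters onto
# transformers.TrainingArguments (the actual training script is not
# included in this card).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mdeberta-v3-base-squad2-covid-el-small",  # assumed output directory
    per_device_train_batch_size=8,   # Batch Size: 8
    num_train_epochs=3,              # Number of Epochs: 3
    learning_rate=3e-5,              # Learning Rate: 3e-05
    gradient_accumulation_steps=2,   # Gradient Accumulation Steps: 2
)
```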
Model tree for panosgriz/mdeberta-v3-base-squad2-covid-el-small
- Base model: microsoft/mdeberta-v3-base
- Finetuned from: timpal0l/mdeberta-v3-base-squad2