---
license: cc-by-sa-4.0
datasets:
  - Mitsua/wikidata-parallel-descriptions-en-ja
language:
  - ja
  - en
metrics:
  - bleu
  - chrf
library_name: transformers
pipeline_tag: translation
---

# ElanMT

This model is a pretrained checkpoint intended for fine-tuning on a large dataset. For general use cases, ElanMT-BT-en-ja is strongly recommended instead.

## Model Details

This is a translation model based on the Marian MT 6-layer encoder-decoder Transformer architecture with a SentencePiece tokenizer.

## Usage

See here.
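The usage link above points to external documentation. As a placeholder illustration only, here is a minimal sketch of loading the checkpoint with the `transformers` translation pipeline; the model id `Mitsua/elan-mt-base-en-ja` is inferred from this repository's title and is an assumption, not confirmed by the card itself.

```python
# Minimal sketch, not official usage instructions.
# Assumes the checkpoint is published on the Hugging Face Hub as
# "Mitsua/elan-mt-base-en-ja" (id inferred from this repository's title).
from transformers import pipeline

# MarianMT checkpoints are supported by the standard translation pipeline.
translator = pipeline("translation", model="Mitsua/elan-mt-base-en-ja")

result = translator("This is a cat.", max_length=128)
print(result[0]["translation_text"])
```

Note that, per the card above, this base checkpoint is meant for fine-tuning; ElanMT-BT-en-ja is recommended for direct translation use.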

## Training Data

See here.

## Training Procedure

See here.

## Evaluation

See here.

## Disclaimer

Translated results may be highly inaccurate, harmful, or biased. The model was developed to investigate the performance achievable with only a relatively small, licensed corpus, and is not suitable for use cases requiring high translation accuracy. Under Section 5 of the CC BY-SA 4.0 License, ELAN MITSUA Project / Abstract Engine is not responsible for any direct or indirect loss caused by use of the model.