---
license: cc-by-sa-4.0
datasets:
  - Mitsua/wikidata-parallel-descriptions-en-ja
language:
  - ja
  - en
metrics:
  - bleu
  - chrf
library_name: transformers
pipeline_tag: translation
---

# ElanMT

This model is a tiny variant of ElanMT-BT-ja-en. It was trained from scratch exclusively on openly licensed data and on Wikipedia data back-translated with ElanMT-base-en-ja.

## Model Details

This is a Japanese-to-English translation model based on the Marian MT architecture: a 4-layer encoder-decoder Transformer with a SentencePiece tokenizer.

## Usage

See here.
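The original usage link was lost in extraction; as a minimal sketch, the model can be loaded with the standard `transformers` seq2seq API (the repository id `Mitsua/elan-mt-tiny-ja-en` is assumed from the model card title, and the example sentence is illustrative):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumed repository id, taken from the model card title.
model_name = "Mitsua/elan-mt-tiny-ja-en"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Translate a Japanese sentence into English.
text = "こんにちは。お元気ですか。"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```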

## Training Data

See here.

## Training Procedure

See here.

## Evaluation

See here.

## Disclaimer

Translated output may be highly inaccurate, harmful, or biased. This model was developed to investigate the performance achievable with only a relatively small, openly licensed corpus, and it is not suitable for use cases that require high translation accuracy. Under Section 5 of the CC BY-SA 4.0 License, the ELAN MITSUA Project / Abstract Engine is not responsible for any direct or indirect loss caused by use of this model.