---
language:
  - ru
  - ru-RU
tags:
  - mbart
inference:
  parameters:
    no_repeat_ngram_size: 4
datasets:
  - samsum
widget:
  - text: >
      Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker? 

      Philipp: Sure you can use the new Hugging Face Deep Learning Container. 

      Jeff: ok.

      Jeff: and how can I get started? 

      Jeff: where can I find documentation? 

      Philipp: ok, ok you can find everything here.
      https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
model-index:
  - name: mbart_ruDialogSum
    results:
      - task:
          name: Abstractive Dialogue Summarization
          type: abstractive-text-summarization
        dataset:
          name: >-
            SAMSum Corpus: A Human-annotated Dialogue Dataset for Abstractive
            Summarization (translated to Russian)
          type: samsum
        metrics:
          - name: Validation ROUGE-1
            type: rouge-1
            value: 30
          - name: Validation ROUGE-L
            type: rouge-l
            value: 30
          - name: Test ROUGE-1
            type: rouge-1
            value: 31
          - name: Test ROUGE-L
            type: rouge-l
            value: 31
---
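
The front matter above configures the Hub inference widget to decode with `no_repeat_ngram_size: 4`. The snippet below is a minimal local-usage sketch, assuming the checkpoint is published on the Hub as `Kirili4ik/mbart_ruDialogSum` and using the generic `transformers` Auto classes; the beam-search and length settings are illustrative, not values taken from this card.

```python
# Minimal sketch (not part of the original card): summarize a Russian dialogue
# with the same no_repeat_ngram_size as the inference widget above.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "Kirili4ik/mbart_ruDialogSum"  # assumed Hub repo id for this model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

dialogue = (
    "Джефф: Могу ли я обучить модель 🤗 Transformers в Amazon SageMaker?\n"
    "Филипп: Конечно, используй новый Hugging Face Deep Learning Container.\n"
    "Джефф: Отлично, а где найти документацию?\n"
    "Филипп: Всё описано здесь: https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face"
)

inputs = tokenizer(dialogue, return_tensors="pt", truncation=True, max_length=600)
summary_ids = model.generate(
    **inputs,
    no_repeat_ngram_size=4,  # matches the widget's inference parameter
    num_beams=5,             # illustrative; not specified in the card
    max_length=60,           # illustrative; not specified in the card
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```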

## 📝 Description