---
license: apache-2.0
language:
- en
tags:
- text-generation
- text2text-generation
pipeline_tag: text2text-generation
widget:
- text: "Given the task dialog: Belief state [X_SEP] I'm looking for a affordable BBQ restaurant in Dallas for a large group of guest."
example_title: "Example1"
- text: "Given the task dialog: Dialogue action [X_SEP] I'm looking for a affordable BBQ restaurant in Dallas for a large group of guest."
example_title: "Example2"
- text: "Given the task dialog: System response [X_SEP] I'm looking for a affordable BBQ restaurant in Dallas for a large group of guest."
example_title: "Example3"
---
# MVP-task-dialog
The MVP-task-dialog model was proposed in [**MVP: Multi-task Supervised Pre-training for Natural Language Generation**](https://github.com/RUCAIBox/MVP/blob/main/paper.pdf) by Tianyi Tang, Junyi Li, Wayne Xin Zhao and Ji-Rong Wen.
Detailed information and usage instructions can be found at [https://github.com/RUCAIBox/MVP](https://github.com/RUCAIBox/MVP).
## Model Description
MVP-task-dialog is a prompt-based model, in which the backbone MVP model is further equipped with prompts pre-trained on labeled task-oriented system datasets. It is a variant (MVP+S) of our main [MVP](https://huggingface.co/RUCAIBox/mvp) model. It follows a Transformer encoder-decoder architecture with layer-wise prompts.
MVP-task-dialog is specially designed for task-oriented dialogue tasks, such as MultiWOZ.
## Example
```python
>>> from transformers import MvpTokenizer, MvpForConditionalGeneration
>>> tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")
>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp-task-dialog")
>>> inputs = tokenizer(
... "Given the task dialog: System response [X_SEP] I'm looking for a affordable BBQ restaurant in Dallas for a large group of guest.",
... return_tensors="pt",
... )
>>> generated_ids = model.generate(**inputs)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
['What date and time would you like to go?']
```
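As the widget examples above indicate, the same dialogue context can be paired with different task instructions (`Belief state`, `Dialogue action`, or `System response`) before the `[X_SEP]` separator to obtain the corresponding output. A minimal sketch of querying the other two sub-tasks in one batch (generations depend on the checkpoint and decoding settings, so no outputs are shown):
```python
>>> from transformers import MvpTokenizer, MvpForConditionalGeneration

>>> tokenizer = MvpTokenizer.from_pretrained("RUCAIBox/mvp")
>>> model = MvpForConditionalGeneration.from_pretrained("RUCAIBox/mvp-task-dialog")

>>> context = "I'm looking for a affordable BBQ restaurant in Dallas for a large group of guest."
>>> # Switch sub-tasks by changing the instruction before [X_SEP].
>>> prompts = [
...     f"Given the task dialog: Belief state [X_SEP] {context}",
...     f"Given the task dialog: Dialogue action [X_SEP] {context}",
... ]
>>> inputs = tokenizer(prompts, return_tensors="pt", padding=True)
>>> generated_ids = model.generate(**inputs)
>>> tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
```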
## Citation