---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- Problem solving
size_categories:
- 10M<n<100M
---
# Every Problem, Every Step, All in Focus: Learning to Solve Vision-Language Problems With Integrated Attention
A novel Solution Graph Attention Network (SGAN) that integrates both intra-step and inter-step attention mechanisms, enabling the progressive construction of solutions by refining the dependencies between relevant problem-solving steps.
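
The description above mentions two levels of attention: within a single solution step (intra-step) and across steps (inter-step). The sketch below is only a minimal illustration of that idea in PyTorch; the module, dimensions, and step-summarization choice are assumptions for illustration, not the authors' SGAN implementation.

```python
# Illustrative sketch (not the official SGAN code): intra-step attention mixes
# tokens within each solution step; inter-step attention refines dependencies
# between step-level summaries. All names and sizes are hypothetical.
import torch
import torch.nn as nn


class IntegratedAttentionBlock(nn.Module):
    def __init__(self, dim: int = 256, heads: int = 4):
        super().__init__()
        # Intra-step attention: operates over the tokens of one step at a time.
        self.intra_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Inter-step attention: operates over per-step summary vectors.
        self.inter_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, steps: torch.Tensor) -> torch.Tensor:
        # steps: (batch, num_steps, tokens_per_step, dim)
        b, s, t, d = steps.shape
        # Intra-step: attend within each step independently.
        x = steps.reshape(b * s, t, d)
        x = self.norm1(x + self.intra_attn(x, x, x)[0])
        # Summarize each step (mean over its tokens) for cross-step reasoning.
        step_repr = x.reshape(b, s, t, d).mean(dim=2)          # (batch, num_steps, dim)
        # Inter-step: refine dependencies between steps.
        step_repr = self.norm2(
            step_repr + self.inter_attn(step_repr, step_repr, step_repr)[0]
        )
        # Broadcast the refined step context back to each step's tokens.
        return x.reshape(b, s, t, d) + step_repr.unsqueeze(2)


if __name__ == "__main__":
    block = IntegratedAttentionBlock()
    demo = torch.randn(2, 5, 8, 256)   # 2 problems, 5 steps, 8 tokens per step
    print(block(demo).shape)           # torch.Size([2, 5, 8, 256])
```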
## Citation
If you use our code or data, please cite our paper:
```bibtex
@article{xianyu:2024:sgan,
  author  = {Xianyu Chen and Jinhui Yang and Shi Chen and Louis Wang and Ming Jiang and Qi Zhao},
  title   = {Every Problem, Every Step, All In Focus: Learning to Solve Vision-Language Problems with Integrated Attention},
  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence (IEEE TPAMI)},
  year    = {2024}
}
```