Update README.md
README.md CHANGED

@@ -1,3 +1,26 @@
----
-license: mit
----
+---
+license: mit
+task_categories:
+- text-generation
+language:
+- en
+tags:
+- Problem solving
+size_categories:
+- 10M<n<100M
+---
+
+# Every Problem, Every Step, All in Focus: Learning to Solve Vision-Language Problems With Integrated Attention
+A novel Solution Graph Attention Network (SGAN) approach that integrates intra-step and inter-step attention mechanisms, enabling progressive construction of solutions by refining the dependencies between relevant problem-solving steps.
+
+Citation
+------------------
+If you use our code or data, please cite our paper:
+```text
+@article{xianyu:2024:sgan,
+  author  = {Xianyu Chen and Jinhui Yang and Shi Chen and Louis Wang and Ming Jiang and Qi Zhao},
+  title   = {Every Problem, Every Step, All In Focus: Learning to Solve Vision-Language Problems with Integrated Attention},
+  journal = {IEEE Transactions on Pattern Analysis and Machine Intelligence (IEEE TPAMI)},
+  year    = {2024}
+}
+```
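
For readers who want a concrete picture of the "intra-step and inter-step attention" mentioned in the description, the following is a minimal illustrative sketch, not the paper's SGAN implementation: it simply applies standard multi-head self-attention within each step's tokens and then across pooled step embeddings. The class name, dimensions, mean pooling, and use of PyTorch's `nn.MultiheadAttention` are all assumptions made for the example.

```python
import torch
import torch.nn as nn


class TwoLevelStepAttention(nn.Module):
    """Illustrative sketch of two-level attention over solution steps.

    Intra-step: tokens within one step attend to each other.
    Inter-step: pooled step embeddings attend across steps.
    This is NOT the paper's SGAN; dimensions and pooling are assumptions.
    """

    def __init__(self, dim: int = 256, heads: int = 4):
        super().__init__()
        self.intra = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.inter = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, steps: torch.Tensor) -> torch.Tensor:
        # steps: (num_steps, tokens_per_step, dim) for one problem.
        refined, _ = self.intra(steps, steps, steps)          # within each step
        step_emb = refined.mean(dim=1).unsqueeze(0)           # (1, num_steps, dim)
        fused, _ = self.inter(step_emb, step_emb, step_emb)   # across steps
        return fused.squeeze(0)                               # (num_steps, dim)


if __name__ == "__main__":
    # Toy usage: 5 solution steps, 12 tokens each, 256-dim features.
    x = torch.randn(5, 12, 256)
    print(TwoLevelStepAttention()(x).shape)  # torch.Size([5, 256])
```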