---
license: mit
task_categories:
- text-classification
tags:
- AI
- ICLR
- ICLR2023
---
## ICLR 2023 International Conference on Learning Representations Accepted Paper Meta Info Dataset

This dataset was collected from the ICLR 2023 OpenReview website (https://openreview.net/group?id=ICLR.cc/2023/Conference#tab-accept-oral) as well as the DeepNLP paper arxiv page (http://www.deepnlp.org/content/paper/iclr2023). Researchers interested in analyzing ICLR 2023 accepted papers and potential trends can use the already cleaned-up JSON files, as in the loading sketch below.
Each row contains the meta information of one paper accepted at ICLR 2023. To explore more AI & robotics papers (NIPS/ICML/ICLR/IROS/ICRA/etc.) and AI equations, feel free to browse the Equation Search Engine (http://www.deepnlp.org/search/equation) as well as the AI Agent Search Engine (http://www.deepnlp.org/search/agent) to find deployed AI apps and agents in your domain.
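
For quick exploration, the records can be loaded with the Hugging Face `datasets` library's generic JSON loader. This is a minimal sketch, assuming the cleaned-up records ship as a local file named `iclr2023.json` (the file name is an assumption, not something this card specifies):

```
from datasets import load_dataset

# Load the cleaned-up JSON records with the generic "json" loader.
# "iclr2023.json" is a hypothetical file name; point data_files at
# the actual JSON file distributed with this dataset.
ds = load_dataset("json", data_files="iclr2023.json", split="train")

print(ds[0]["title"])  # title of the first paper record
print(len(ds))         # number of accepted-paper records
```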

### Meta Information of JSON File

```
{
    "title": "Encoding Recurrence into Transformers",
    "url": "https://openreview.net/forum?id=7YfHla7IxBJ",
    "detail_url": "https://openreview.net/forum?id=7YfHla7IxBJ",
    "authors": "Feiqing Huang,Kexin Lu,Yuxi CAI,Zhen Qin,Yanwen Fang,Guangjian Tian,Guodong Li",
    "tags": "ICLR 2023,Top 5%",
    "abstract": "This paper novelly breaks down with ignorable loss an RNN layer into a sequence of simple RNNs, each of which can be further rewritten into a lightweight positional encoding matrix of a self-attention, named the Recurrence Encoding Matrix (REM). Thus, recurrent dynamics introduced by the RNN layer can be encapsulated into the positional encodings of a multihead self-attention, and this makes it possible to seamlessly incorporate these recurrent dynamics into a Transformer, leading to a new module, Self-Attention with Recurrence (RSA). The proposed module can leverage the recurrent inductive bias of REMs to achieve a better sample efficiency than its corresponding baseline Transformer, while the self-attention is used to model the remaining non-recurrent signals. The relative proportions of these two components are controlled by a data-driven gated mechanism, and the effectiveness of RSA modules are demonstrated by four sequential learning tasks.",
    "pdf": "https://openreview.net/pdf/70636775789b51f219cb29634cc7c794cc86577b.pdf"
}
```
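
Note that `authors` and `tags` are stored as single comma-separated strings rather than lists, so a little post-processing is needed before analysis. A minimal standard-library sketch (the truncated sample record reuses fields from the example above):

```
import json

# One truncated record, copied from the example above.
raw = '''{"title": "Encoding Recurrence into Transformers",
"authors": "Feiqing Huang,Kexin Lu,Yuxi CAI,Zhen Qin,Yanwen Fang,Guangjian Tian,Guodong Li",
"tags": "ICLR 2023,Top 5%"}'''

record = json.loads(raw)

# "authors" and "tags" are comma-separated strings, not lists.
authors = [a.strip() for a in record["authors"].split(",")]
tags = [t.strip() for t in record["tags"].split(",")]

print(authors)            # ['Feiqing Huang', 'Kexin Lu', ...]
print("Top 5%" in tags)   # True: identifies the top-5% (oral) papers
```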

## Related

## AI Equation
[List of AI Equations and LaTeX](http://www.deepnlp.org/equation/category/ai) <br>
[List of Math Equations and LaTeX](http://www.deepnlp.org/equation/category/math) <br>
[List of Physics Equations and LaTeX](http://www.deepnlp.org/equation/category/physics) <br>
[List of Statistics Equations and LaTeX](http://www.deepnlp.org/equation/category/statistics) <br>
[List of Machine Learning Equations and LaTeX](http://www.deepnlp.org/equation/category/machine%20learning) <br>

## AI Agent Marketplace and Search
[AI Agent Marketplace and Search](http://www.deepnlp.org/search/agent) <br>
[Robot Search](http://www.deepnlp.org/search/robot) <br>
[Equation and Academic Search](http://www.deepnlp.org/search/equation) <br>
[AI & Robot Comprehensive Search](http://www.deepnlp.org/search) <br>
[AI & Robot Question](http://www.deepnlp.org/question) <br>
[AI & Robot Community](http://www.deepnlp.org/community) <br>
[AI Agent Marketplace Blog](http://www.deepnlp.org/blog/ai-agent-marketplace-and-search-portal-reviews-2025) <br>

## AI Agent Reviews
[AI Agent Marketplace Directory](http://www.deepnlp.org/store/ai-agent) <br>
[Microsoft AI Agents Reviews](http://www.deepnlp.org/store/pub/pub-microsoft-ai-agent) <br>
[Claude AI Agents Reviews](http://www.deepnlp.org/store/pub/pub-claude-ai-agent) <br>
[OpenAI AI Agents Reviews](http://www.deepnlp.org/store/pub/pub-openai-ai-agent) <br>
[Salesforce AI Agents Reviews](http://www.deepnlp.org/store/pub/pub-salesforce-ai-agent) <br>
[AI Agent Builder Reviews](http://www.deepnlp.org/store/ai-agent/ai-agent-builder) <br>