arXiv:2301.12353

On Enhancing Expressive Power via Compositions of Single Fixed-Size ReLU Network

Published on Jan 29, 2023
Authors: Shijun Zhang, Jianfeng Lu, Hongkai Zhao

Abstract

This paper explores the expressive power of deep neural networks through the framework of function compositions. We demonstrate that repeated compositions of a single fixed-size ReLU network exhibit surprising expressive power, despite the limited expressive capability of the individual network itself. Specifically, we prove by construction that $\mathcal{L}_2 \circ g^{\circ r} \circ \mathcal{L}_1$ can approximate $1$-Lipschitz continuous functions on $[0,1]^d$ with an error $\mathcal{O}(r^{-1/d})$, where $g$ is realized by a fixed-size ReLU network, $\mathcal{L}_1$ and $\mathcal{L}_2$ are two affine linear maps matching the dimensions, and $g^{\circ r}$ denotes the $r$-fold composition of $g$. Furthermore, we extend this result to generic continuous functions on $[0,1]^d$, with the approximation error characterized by the modulus of continuity. Our results reveal that a continuous-depth network generated via a dynamical system has immense approximation power even if its dynamics function is time-independent and realized by a fixed-size ReLU network.
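
The architecture $\mathcal{L}_2 \circ g^{\circ r} \circ \mathcal{L}_1$ is a weight-tied network: one fixed-size ReLU block $g$ applied $r$ times between two affine maps, so the depth grows with $r$ while the parameter count stays constant. Below is a minimal PyTorch sketch of that shape; the width, the two-layer form of `g`, and all names are illustrative assumptions, not the paper's explicit construction.

```python
import torch
import torch.nn as nn

class ComposedReLUNet(nn.Module):
    """Sketch of L2 ∘ g^{∘r} ∘ L1: a single fixed-size ReLU block g,
    reused r times between two affine maps L1 and L2."""

    def __init__(self, d_in: int, width: int, r: int):
        super().__init__()
        self.L1 = nn.Linear(d_in, width)   # affine map into g's dimension
        self.g = nn.Sequential(            # fixed-size ReLU network; weights shared across all r applications
            nn.Linear(width, width), nn.ReLU(),
            nn.Linear(width, width), nn.ReLU(),
        )
        self.L2 = nn.Linear(width, 1)      # affine map to the scalar output
        self.r = r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.L1(x)
        for _ in range(self.r):            # g^{∘r}: r-fold composition of the same g
            h = self.g(h)
        return self.L2(h)

# Usage: a target function on [0,1]^d would be fit by such a net; in the
# paper's construction the error decays like O(r^{-1/d}) as r grows.
net = ComposedReLUNet(d_in=3, width=16, r=8)
x = torch.rand(5, 3)   # five points in [0,1]^3
y = net(x)             # shape (5, 1)
```

Note the design choice this illustrates: increasing $r$ deepens the network without adding parameters, which is why the abstract frames the model as a time-independent dynamical system.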
