arxiv:2508.09925

Residual Reservoir Memory Networks

Published on Aug 13, 2025

Abstract

We introduce a novel class of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) paradigm, called Residual Reservoir Memory Networks (ResRMNs). A ResRMN combines a linear memory reservoir with a non-linear reservoir, where the latter is based on residual orthogonal connections along the temporal dimension for enhanced long-term propagation of the input. The resulting reservoir state dynamics are studied through the lens of linear stability analysis, and we investigate diverse configurations for the temporal residual connections. The proposed approach is empirically assessed on time-series and pixel-level 1-D classification tasks. Our experimental results highlight the advantages of the proposed approach over other conventional RC models.

AI-generated summary

Residual Reservoir Memory Networks (ResRMNs) enhance long-term memory in Recurrent Neural Networks (RNNs) using residual orthogonal connections, outperforming conventional Reservoir Computing models in time-series and pixel-level classification tasks.
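To make the two-reservoir design concrete, below is a minimal NumPy sketch of what such an architecture could look like. It is not the authors' code: the exact update equations, the class name ResRMNSketch, and all hyperparameters (alpha, beta, rho, input_scale) are assumptions for illustration. It assumes a linear memory branch driven by an orthogonal matrix and a non-linear branch whose previous state is carried forward through a residual orthogonal connection, with only the linear readout trained, per the RC paradigm.

```python
# Hypothetical sketch of a ResRMN-style model (not the paper's exact formulation).
# Assumed state updates:
#   linear memory branch:   m_t = O_m @ m_{t-1} + V_m @ u_t
#   non-linear branch with residual orthogonal temporal connection:
#       x_t = alpha * O @ x_{t-1} + beta * tanh(W @ x_{t-1} + V @ u_t + b)
# Only the readout is trained (ridge regression); reservoirs stay untrained.
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n):
    """Orthogonal matrix from the QR decomposition of a Gaussian matrix."""
    q, r = np.linalg.qr(rng.standard_normal((n, n)))
    return q * np.sign(np.diag(r))  # sign fix for a well-defined decomposition

class ResRMNSketch:
    def __init__(self, n_in, n_x=100, n_m=100, alpha=0.9, beta=0.5,
                 rho=0.9, input_scale=0.1):
        self.alpha, self.beta = alpha, beta
        self.O = random_orthogonal(n_x)      # residual orthogonal connection
        self.O_m = random_orthogonal(n_m)    # linear memory reservoir
        W = rng.standard_normal((n_x, n_x))
        W *= rho / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius
        self.W = W
        self.V = input_scale * rng.standard_normal((n_x, n_in))
        self.V_m = input_scale * rng.standard_normal((n_m, n_in))
        self.b = input_scale * rng.standard_normal(n_x)

    def run(self, U):
        """Collect concatenated [x_t; m_t] states for an input sequence U of shape (T, n_in)."""
        x = np.zeros(self.W.shape[0])
        m = np.zeros(self.O_m.shape[0])
        states = []
        for u in U:
            m = self.O_m @ m + self.V_m @ u
            x = self.alpha * (self.O @ x) + self.beta * np.tanh(
                self.W @ x + self.V @ u + self.b)
            states.append(np.concatenate([x, m]))
        return np.asarray(states)

def fit_readout(states, targets, lam=1e-6):
    """Ridge-regression readout: the only trained component in an RC model."""
    S = states
    return np.linalg.solve(S.T @ S + lam * np.eye(S.shape[1]), S.T @ targets)
```

Because the orthogonal matrix O preserves the norm of the previous state, the residual path can carry input information across many time steps without the exponential decay typical of contractive reservoirs, which is the intuition behind the improved long-term memory claimed in the abstract.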

