- Eigenvalues restricted by Lyapunov exponent of eigenstates We point out that the Lyapunov exponent of an eigenstate places restrictions on its eigenvalue. Consequently, non-Hermitian systems can exhibit real spectra even without any symmetry: a non-conservative Hamiltonian has a real spectrum as long as the Lyapunov exponents of its eigenstates inhibit the imaginary parts of the eigenvalues. Our findings open up a new route to study non-Hermitian physics. 2 authors · Jun 20, 2022
- Eigenvalues of the Hessian in Deep Learning: Singularity and Beyond We look at the eigenvalues of the Hessian of a loss function before and after training. The eigenvalue distribution is seen to be composed of two parts: the bulk, which is concentrated around zero, and the edges, which are scattered away from zero. We present empirical evidence that the bulk indicates how over-parametrized the system is, and that the edges depend on the input data. 3 authors · Nov 22, 2016
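The bulk-versus-edges split above already appears in the simplest over-parametrized setting. A minimal sketch (not the paper's experiments; a linear least-squares model whose Hessian is available in closed form, with sizes chosen for illustration): with more parameters than samples, the Hessian `2 XᵀX / n` has a bulk of exactly `n_params - n_samples` zero eigenvalues, while the nonzero edge eigenvalues are determined by the data matrix `X`.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_params = 10, 50  # over-parametrized: more weights than data points
X = rng.normal(size=(n_samples, n_params))

# For linear regression with MSE loss L(w) = ||Xw - y||^2 / n_samples,
# the Hessian is constant in w: H = 2 X^T X / n_samples.
H = 2 * X.T @ X / n_samples
eigs = np.linalg.eigvalsh(H)

bulk = int(np.sum(np.abs(eigs) < 1e-8))    # eigenvalues at zero
edges = int(np.sum(np.abs(eigs) >= 1e-8))  # data-dependent nonzero part
print(bulk, edges)  # 40 10: bulk size = n_params - n_samples, edges = rank(X)
```

Here the bulk size measures the degree of over-parametrization directly (the rank deficiency of `XᵀX`), and the edge eigenvalues change with the input data, mirroring the empirical picture the abstract describes for deep networks.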
- On Signs of eigenvalues of Modular forms satisfying Ramanujan Conjecture Let F in S_{k_1}(Gamma^{(2)}(N_1)) and G in S_{k_2}(Gamma^{(2)}(N_2)) be two Siegel cusp forms over the congruence subgroups Gamma^{(2)}(N_1) and Gamma^{(2)}(N_2) respectively. Assume that they are Hecke eigenforms in different eigenspaces and satisfy the Generalized Ramanujan Conjecture. Let lambda_F(p) denote the eigenvalue of F with respect to the Hecke operator T(p). In this article, we compute a lower bound for the density of the set of primes { p : lambda_F(p) lambda_G(p) < 0 }. 1 author · Dec 12, 2024
- Multiplicities of Eigenvalues of the Diffusion Operator with Random Jumps from the Boundary This paper deals with a non-self-adjoint differential operator which is associated with a diffusion process with random jumps from the boundary. Our main result is that the algebraic multiplicity of an eigenvalue is equal to its order as a zero of the characteristic function Delta(lambda). This can be used to determine the multiplicities of eigenvalues for concrete operators. 2 authors · Jan 31, 2018
- Unlocking State-Tracking in Linear RNNs Through Negative Eigenvalues Linear Recurrent Neural Networks (LRNNs) such as Mamba, RWKV, GLA, mLSTM, and DeltaNet have emerged as efficient alternatives to Transformers for long sequences. However, both Transformers and LRNNs struggle to perform state-tracking, which may impair performance in tasks such as code evaluation. In one forward pass, current architectures are unable to solve even parity, the simplest state-tracking task, which non-linear RNNs can handle effectively. Recently, Sarrof et al. (2024) demonstrated that the failure of LRNNs like Mamba to solve parity stems from restricting the value range of their diagonal state-transition matrices to [0, 1], and that incorporating negative values can resolve this issue. We extend this result to non-diagonal LRNNs such as DeltaNet. We prove that finite-precision LRNNs with state-transition matrices having only positive eigenvalues cannot solve parity, while non-triangular matrices are needed to count modulo 3. Notably, we also prove that LRNNs can learn any regular language when their state-transition matrices are products of matrices of the form identity minus a vector outer product, each with eigenvalues in the range [-1, 1]. Our experiments confirm that extending the eigenvalue range of Mamba and DeltaNet to include negative values not only enables them to solve parity but consistently improves their performance on state-tracking tasks. We also show that state-tracking-enabled LRNNs can be pretrained stably and efficiently at scale (1.3B parameters), achieving competitive performance on language modeling and showing promise on code and math tasks. 6 authors · Nov 19, 2024
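Why negative eigenvalues matter for parity can be seen in the smallest possible case. A minimal sketch (not the paper's architecture; a one-dimensional input-dependent linear recurrence, with `parity_lrnn` a name chosen here for illustration): the state update h_t = a_t · h_{t-1} tracks parity only if the transition value a_t can be negative, because a transition restricted to [0, 1] can never flip the sign of the state, matching the impossibility result above.

```python
def parity_lrnn(bits):
    """Track parity with a 1-dim linear recurrence h_t = a_t * h_{t-1}.

    The input-dependent transition is a_t = -1 for a 1-bit and +1 for a
    0-bit, so the final state is (-1)**sum(bits) and its sign encodes
    the parity. Both transition values lie in [-1, 1]; the negative one
    is what makes state-tracking possible.
    """
    h = 1.0
    for x in bits:
        a = -1.0 if x == 1 else 1.0  # 1x1 state-transition "matrix"; its eigenvalue is a
        h = a * h                    # purely linear state update
    return 0 if h > 0 else 1

print(parity_lrnn([1, 0, 1, 1]))  # 1: three ones, odd parity
print(parity_lrnn([1, 1, 0]))     # 0: two ones, even parity
```

This also illustrates the abstract's positive-eigenvalue claim: if a_t were confined to [0, 1], h would stay nonnegative for every input, so no sign-based readout could distinguish odd from even counts.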