Article

Information Entropy of Tight-Binding Random Networks with Losses and Gain: Scaling and Universality

Instituto de Física, Benemérita Universidad Autónoma de Puebla, Apartado Postal J-48, Puebla 72570, Mexico
* Author to whom correspondence should be addressed.
Entropy 2019, 21(1), 86; https://doi.org/10.3390/e21010086
Received: 12 October 2018 / Revised: 1 January 2019 / Accepted: 15 January 2019 / Published: 18 January 2019
(This article belongs to the Special Issue Complex Networks from Information Measures)
We study the localization properties of the eigenvectors, characterized by their information entropy, of tight-binding random networks with balanced losses and gain. The random network model, which is based on Erdős–Rényi (ER) graphs, is defined by three parameters: the network size N, the network connectivity α, and the losses-and-gain strength γ. Here, N and α are the standard parameters of ER graphs, while we introduce losses and gain by including complex self-loops on all vertices with the imaginary amplitude iγ and random balanced signs, thus breaking the Hermiticity of the corresponding adjacency matrices and inducing complex spectra. By the use of extensive numerical simulations, we define a scaling parameter ξ ≡ ξ(N, α, γ) that fixes the localization properties of the eigenvectors of our random network model, such that when ξ < 0.1 (ξ > 10) the eigenvectors are localized (extended), while the localization-to-delocalization transition occurs for 0.1 < ξ < 10. Moreover, to extend the applicability of our findings, we demonstrate that for fixed ξ, the spectral properties (characterized by the position of the eigenvalues on the complex plane) of our network model are also universal; i.e., they do not depend on the specific values of the network parameters.
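The model construction described above can be sketched numerically. The following is a minimal illustration, not the authors' code: it assumes a symmetric ER adjacency matrix with Gaussian random weights on existing edges (a common choice for tight-binding random networks), adds balanced complex self-loops ±iγ on the diagonal, and computes the Shannon (information) entropy of each right eigenvector. All function and variable names are hypothetical.

```python
import numpy as np

def eigenvector_entropies(N=100, alpha=0.5, gamma=1.0, seed=None):
    """Sketch of the tight-binding random network with balanced losses
    and gain: ER graph of size N, connectivity alpha, diagonal +/- i*gamma.
    Returns the (complex) eigenvalues and the information entropy of each
    right eigenvector."""
    rng = np.random.default_rng(seed)

    # Symmetric ER adjacency: each edge present with probability alpha,
    # existing edges carry Gaussian random weights (assumed here).
    mask = np.triu(rng.random((N, N)) < alpha, k=1)
    weights = rng.standard_normal((N, N))
    H = np.where(mask, weights, 0.0)
    H = (H + H.T).astype(complex)

    # Balanced losses and gain: half the vertices get +i*gamma,
    # the other half -i*gamma, in random order (trace stays ~0).
    signs = rng.permutation(np.r_[np.ones(N // 2), -np.ones(N - N // 2)])
    H += 1j * gamma * np.diag(signs)

    # Non-Hermitian matrix -> complex spectrum; take right eigenvectors.
    vals, vecs = np.linalg.eig(H)

    # Information (Shannon) entropy S = -sum_i p_i ln p_i with
    # p_i = |c_i|^2 per normalized eigenvector column.
    p = np.abs(vecs) ** 2
    p /= p.sum(axis=0)
    logp = np.log(p, where=p > 0, out=np.zeros_like(p))
    S = -(p * logp).sum(axis=0)
    return vals, S
```

For a localized eigenvector S → 0, while for a fully extended one S → ln N, which is the diagnostic the scaling parameter ξ organizes.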
Keywords: information entropy; Erdős–Rényi graphs; random matrix theory; scaling laws
MDPI and ACS Style

Martínez-Martínez, C.T.; Méndez-Bermúdez, J.A. Information Entropy of Tight-Binding Random Networks with Losses and Gain: Scaling and Universality. Entropy 2019, 21, 86. https://doi.org/10.3390/e21010086