
Entropy of network ensembles

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by Krakatow (talk | contribs) at 15:41, 21 September 2020 (wording, punctuation).

A set of networks that satisfies given structural characteristics can be treated as a network ensemble.[1] The entropy of a network ensemble measures the level of order or uncertainty of the ensemble; it was first introduced by Ginestra Bianconi in 2007.[2] In essence, the entropy is the logarithm of the number of graphs in the ensemble.[3] Entropy can also be defined for a single network: the basin entropy of a Boolean network is the logarithm of the number of its attractors.[4]
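As a toy illustration (the numbers below are illustrative, not taken from the sources): the ensemble of simple labeled graphs on N = 3 nodes with exactly L = 2 links contains 3 graphs, so its entropy is ln 3. A minimal sketch in Python:

```python
import math

# Toy illustration: the entropy of a network ensemble as the logarithm of
# the number of graphs in it. For simple labeled graphs on N nodes with
# exactly L links, there are C(N(N-1)/2, L) such graphs.
def ensemble_entropy(N, L):
    pairs = N * (N - 1) // 2       # number of possible links
    count = math.comb(pairs, L)    # number of graphs satisfying the constraint
    return math.log(count)

print(ensemble_entropy(3, 2))      # ln 3, since three such graphs exist
```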

Employing statistical mechanics approaches, the complexity, uncertainty, and randomness of networks can be described by network ensembles with different types of constraints.[5]

Gibbs and Shannon entropy

By analogy to statistical mechanics, microcanonical ensembles and canonical ensembles of networks are introduced for the implementation. A partition function Z of an ensemble can be defined as

    Z = \sum_{\{a_{ij}\}} \prod_k \delta\!\left(F_k(\{a_{ij}\}) - C_k\right) \exp\!\left(\sum_{i<j} h_{ij}\,\Theta(a_{ij}) + r_{ij}\,a_{ij}\right),

where F_k(\{a_{ij}\}) = C_k are the constraints, and a_{ij} (a_{ij} \geq 0) are the elements of the adjacency matrix, with a_{ij} > 0 if and only if there is a link between node i and node j. \Theta(x) is a step function, with \Theta(x) = 1 if x > 0 and \Theta(x) = 0 if x = 0. The auxiliary fields h_{ij} and r_{ij} have been introduced as an analogy to the bath in classical mechanics.

For simple undirected networks, the partition function can be simplified as[6]

    Z = \sum_{\{a_{ij}\}} \prod_k \delta\!\left(F_k(\{a_{ij}\}) - C_k\right) e^{\sum_{i<j} \Theta(a_{ij})\,h_{ij}},

where a_{ij} \in \{0, \alpha\}, \alpha is the index of the weight, and for a simple network \alpha = 1.

Microcanonical ensembles and canonical ensembles are demonstrated with simple undirected networks.

For a microcanonical ensemble, the Gibbs entropy \Sigma is defined by

    N\,\Sigma = \ln \mathcal{N},

where \mathcal{N} indicates the cardinality of the ensemble, i.e., the total number of networks in the ensemble.

The probability of having a link between nodes i and j, with weight a_{ij} = \alpha, is given by

    \pi_{ij}(\alpha) = \left.\frac{\partial \ln Z}{\partial h_{ij}}\right|_{h=0,\,r=0}.

For a canonical ensemble, the entropy is presented in the form of a Shannon entropy:

    S = -\sum_{i<j} \left[\pi_{ij} \ln \pi_{ij} + (1 - \pi_{ij}) \ln(1 - \pi_{ij})\right].
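As an illustrative sketch (the link-probability matrix below is hypothetical), the Shannon entropy of a canonical ensemble of simple undirected networks can be computed directly from the link probabilities \pi_{ij}:

```python
import numpy as np

def shannon_entropy(pi):
    """Shannon entropy of a canonical ensemble of simple undirected
    networks, given a symmetric matrix pi of link probabilities pi_ij."""
    p = pi[np.triu_indices_from(pi, k=1)]     # each pair i < j counted once
    p = np.clip(p, 1e-15, 1 - 1e-15)          # guard against log(0)
    return float(-np.sum(p * np.log(p) + (1 - p) * np.log(1 - p)))

# Hypothetical example: a G(N, p) ensemble, where every pair of the
# N nodes is linked independently with the same probability p.
N, p = 5, 0.1
pi = np.full((N, N), p)
S = shannon_entropy(pi)    # sum of N(N-1)/2 = 10 identical terms
```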

Relation between Gibbs and Shannon entropy

A network ensemble G(N, L) with a given number of nodes N and links L, and its conjugate-canonical ensemble G(N, p), are characterized as microcanonical and canonical ensembles, respectively; they have the Gibbs entropy \Sigma and the Shannon entropy S. The Gibbs entropy in the G(N, L) ensemble is given by[7]

    N\,\Sigma = \ln \binom{N(N-1)/2}{L}.

For the G(N, p) ensemble,

    p = \frac{2L}{N(N-1)}.

Inserting p into the Shannon entropy[6],

    S = -\frac{N(N-1)}{2}\left[p \ln p + (1 - p) \ln(1 - p)\right] \simeq N\,\Sigma,

where the last step follows from Stirling's approximation of the binomial coefficient.

The relation indicates that the Gibbs entropy \Sigma and the Shannon entropy per node S/N of random graphs are equal in the thermodynamic limit N \to \infty.

Von Neumann entropy

The von Neumann entropy is the extension of the classical Gibbs entropy to the quantum context. This entropy is constructed from a density matrix \rho, defined here through the Laplacian matrix L associated with the network,

    \rho = \frac{L}{\operatorname{Tr} L},

and the average von Neumann entropy of an ensemble is calculated as[8]

    \langle S_{\mathrm{vN}} \rangle = -\left\langle \operatorname{Tr}(\rho \ln \rho) \right\rangle.
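A minimal numerical sketch (the triangle graph is an arbitrary example), assuming the density matrix is the graph Laplacian normalized by its trace: the von Neumann entropy -Tr(\rho \ln \rho) is evaluated from the eigenvalues of \rho:

```python
import numpy as np

def von_neumann_entropy(A):
    """Von Neumann entropy of a network with adjacency matrix A, assuming
    the density matrix rho = L / Tr(L), with L = D - A the graph Laplacian."""
    Lap = np.diag(A.sum(axis=1)) - A
    rho = Lap / np.trace(Lap)
    lam = np.linalg.eigvalsh(rho)      # eigenvalues of rho (real, nonnegative)
    lam = lam[lam > 1e-12]             # 0 * ln 0 is taken to be 0
    return float(-np.sum(lam * np.log(lam)))

# Arbitrary example: a triangle on 3 nodes; rho has eigenvalues {0, 1/2, 1/2},
# so the entropy is ln 2.
A = np.array([[0., 1., 1.],
              [1., 0., 1.],
              [1., 1., 0.]])
S_vn = von_neumann_entropy(A)
```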

For a random network ensemble G(N, p), the relation between \langle S_{\mathrm{vN}} \rangle and S is nonmonotonic when the average connectivity p(N-1) is varied.

For canonical power-law network ensembles, the two entropies are linearly related.[6]

Networks with given expected degree sequences suggest that heterogeneity in the expected degree distribution implies an equivalence between the quantum and the classical description of networks, which correspond respectively to the von Neumann and the Shannon entropy.[9]

The von Neumann entropy can also be calculated for multilayer networks with a tensorial approach.[10]

References

  1. ^ Levin, E.; Tishby, N.; Solla, S.A. (October 1990). "A statistical approach to learning and generalization in layered neural networks". Proceedings of the IEEE. 78 (10): 1568–1574. doi:10.1109/5.58339. ISSN 1558-2256. S2CID 5254307.
  2. ^ Bianconi, Ginestra (10 December 2007). "The entropy of randomized network ensembles". EPL (Europhysics Letters). 81 (2): 28005. arXiv:0708.0153. doi:10.1209/0295-5075/81/28005. ISSN 0295-5075. S2CID 17269886.
  3. ^ Menichetti, Giulia; Remondini, Daniel (2014). "Entropy of a network ensemble: definitions and applications to genomic data". Theoretical Biology Forum. 107 (1–2): 77–87. ISSN 0035-6050. PMID 25936214.
  4. ^ Krawitz, Peter; Shmulevich, Ilya (27 September 2007). "Entropy of complex relevant components of Boolean networks". Physical Review E. 76 (3): 036115. arXiv:0708.1538. Bibcode:2007PhRvE..76c6115K. doi:10.1103/PhysRevE.76.036115. PMID 17930314. S2CID 6192682.
  5. ^ Bianconi, Ginestra (27 March 2009). "Entropy of network ensembles". Physical Review E. 79 (3): 036114. arXiv:0802.2888. Bibcode:2009PhRvE..79c6114B. doi:10.1103/PhysRevE.79.036114. PMID 19392025. S2CID 26082469.
  6. ^ a b c Anand, Kartik; Bianconi, Ginestra (13 October 2009). "Entropy measures for networks: Toward an information theory of complex topologies". Physical Review E. 80 (4): 045102. arXiv:0907.1514. Bibcode:2009PhRvE..80d5102A. doi:10.1103/PhysRevE.80.045102. PMID 19905379. S2CID 27419558.
  7. ^ Bogacz, Leszek; Burda, Zdzisław; Wacław, Bartłomiej (1 July 2006). "Homogeneous complex networks". Physica A: Statistical Mechanics and Its Applications. 366: 587–607. arXiv:cond-mat/0502124. Bibcode:2006PhyA..366..587B. doi:10.1016/j.physa.2005.10.024. ISSN 0378-4371. S2CID 119428248.
  8. ^ Du, Wenxue; Li, Xueliang; Li, Yiyang; Severini, Simone (30 December 2010). "A note on the von Neumann entropy of random graphs". Linear Algebra and Its Applications. 433 (11): 1722–1725. doi:10.1016/j.laa.2010.06.040. ISSN 0024-3795.
  9. ^ Anand, Kartik; Bianconi, Ginestra; Severini, Simone (18 March 2011). "Shannon and von Neumann entropy of random networks with heterogeneous expected degree". Physical Review E. 83 (3): 036109. arXiv:1011.1565. Bibcode:2011PhRvE..83c6109A. doi:10.1103/PhysRevE.83.036109. PMID 21517560. S2CID 1482301.
  10. ^ De Domenico, Manlio; Solé-Ribalta, Albert; Cozzo, Emanuele; Kivelä, Mikko; Moreno, Yamir; Porter, Mason A.; Gómez, Sergio; Arenas, Alex (4 December 2013). "Mathematical Formulation of Multilayer Networks". Physical Review X. 3 (4): 041022. arXiv:1307.4977. Bibcode:2013PhRvX...3d1022D. doi:10.1103/PhysRevX.3.041022. S2CID 16611157.