Entropy as Measure of Brain Networks' Complexity in Eyes Open and Closed Conditions.
Fabrizio Vecchio, Francesca Miraglia, Chiara Pappalettera, Alessandro Orticoni, Francesca Alù, Elda Judica, Maria Cotelli, Paolo Maria Rossini
Published in: Symmetry (2021)
Keyphrases
- information theory
- similarity measure
- network size
- Shannon entropy
- mutual information
- sufficient conditions
- entropy measure
- brain connectivity
- information theoretic
- information content
- human brain
- relative entropy
- complexity measures
- worst case
- heterogeneous networks
- Kullback–Leibler divergence
- social networks
- decision problems
- community structure
- data analysis
- controlled environment
- closed world
- quality of service
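For readers unfamiliar with the entropy-related keyphrases above, the sketch below shows how Shannon entropy quantifies the spread of an empirical distribution, which is the general idea behind using entropy as a complexity measure. It is only an illustrative example: the `shannon_entropy` helper, the bin count, and the synthetic "connectivity" values are assumptions for demonstration, not the analysis pipeline used in the paper.

```python
import numpy as np

def shannon_entropy(values, n_bins=16):
    """Shannon entropy (in bits) of the empirical distribution of `values`.

    `values` could be, for example, connectivity weights of a brain network;
    `n_bins` is an arbitrary illustrative choice, not taken from the paper.
    """
    counts, _ = np.histogram(values, bins=n_bins)
    p = counts / counts.sum()   # empirical bin probabilities
    p = p[p > 0]                # drop empty bins (0 * log 0 is treated as 0)
    return -np.sum(p * np.log2(p))

# Toy comparison: a narrow (low-complexity) vs. a broad (high-complexity)
# distribution of synthetic values.
rng = np.random.default_rng(0)
narrow = rng.normal(0.5, 0.01, size=1000)
broad = rng.uniform(0.0, 1.0, size=1000)
print(f"narrow distribution: {shannon_entropy(narrow):.2f} bits")
print(f"broad distribution:  {shannon_entropy(broad):.2f} bits")
```

The broad, near-uniform sample yields a higher entropy than the narrow one, which is the sense in which entropy indexes the "complexity" of a signal or network.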