📚 goent - Awesome Go Library for Science and Data Analysis


GO Implementation of Entropy Measures.

๐Ÿท๏ธ Science and Data Analysis
๐Ÿ“‚ Libraries for scientific computing and data analyzing.
โญ 34 stars

Detailed Description of goent

goent - GO Implementation of Entropy Measures


Measures for discrete state spaces

Averaged measures

  • Entropy (an illustrative sketch of two of these estimators follows this list)

    • Shannon
    • Maximum Likelihood with Bias Correction
    • Horvitz-Thompson
    • Chao-Shen
  • Conditional Entropy

  • Mutual Information

  • Conditional Mutual Information

  • Max Entropy Estimations:

    • Iterative Scaling
  • Information Decomposition (Bertschinger et al., 2014) for binary variables

  • Morphological Computation measures have been moved to gomi https://github.com/kzahedi/gomi
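
The averaged discrete measures are computed from empirical frequencies of integer-coded states. As a rough illustration of what two of the listed estimators do, the snippet below computes the plug-in maximum-likelihood Shannon entropy and the Chao-Shen coverage-adjusted entropy of a sample, in bits. This is a minimal sketch of the underlying formulas, not goent's API; the function names entropyML and entropyChaoShen are invented for this example.

```go
package main

import (
	"fmt"
	"math"
)

// counts tallies how often each discrete state occurs in the sample.
func counts(sample []int) map[int]int {
	c := make(map[int]int)
	for _, s := range sample {
		c[s]++
	}
	return c
}

// entropyML is the plug-in (maximum-likelihood) Shannon entropy in bits:
// H = -sum_i p_i log2 p_i with p_i = n_i / n.
func entropyML(sample []int) float64 {
	n := float64(len(sample))
	h := 0.0
	for _, ni := range counts(sample) {
		p := float64(ni) / n
		h -= p * math.Log2(p)
	}
	return h
}

// entropyChaoShen applies the Chao-Shen (2003) coverage adjustment:
// estimated coverage C = 1 - f1/n (f1 = number of singleton states),
// coverage-adjusted probabilities C*p_i, combined with a
// Horvitz-Thompson-style correction for unseen states.
func entropyChaoShen(sample []int) float64 {
	n := float64(len(sample))
	c := counts(sample)
	f1 := 0.0
	for _, ni := range c {
		if ni == 1 {
			f1++
		}
	}
	coverage := 1.0 - f1/n
	h := 0.0
	for _, ni := range c {
		pa := coverage * float64(ni) / n // coverage-adjusted probability
		if pa <= 0 {
			continue // degenerate case: every state was seen exactly once
		}
		h -= pa * math.Log2(pa) / (1.0 - math.Pow(1.0-pa, n))
	}
	return h
}

func main() {
	sample := []int{0, 0, 1, 1, 1, 2, 3, 3, 3, 3}
	fmt.Printf("plug-in:   %.4f bits\n", entropyML(sample))
	fmt.Printf("Chao-Shen: %.4f bits\n", entropyChaoShen(sample))
}
```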

State-dependent measures

  • Mutual Information (a pointwise sketch follows this list)

  • Conditional Mutual Information

  • Entropy (Shannon)

  • Morphological Computation measures have been moved to gomi https://github.com/kzahedi/gomi
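
The state-dependent variants return one value per observed state rather than a single average. Assuming "state-dependent" is read as the pointwise (local) contribution of each sample, the sketch below computes the pointwise mutual information log2 p(x,y)/(p(x)p(y)) for every sample pair; averaging these local values recovers the usual averaged mutual information. The function name is hypothetical and not goent's API.

```go
package main

import (
	"fmt"
	"math"
)

// pointwiseMI returns, for every sample index t, the local mutual
// information i(x_t; y_t) = log2( p(x_t, y_t) / (p(x_t) * p(y_t)) ),
// with all probabilities estimated from the sample itself.
func pointwiseMI(x, y []int) []float64 {
	n := float64(len(x))
	px := make(map[int]float64)
	py := make(map[int]float64)
	pxy := make(map[[2]int]float64)
	for t := range x {
		px[x[t]] += 1.0 / n
		py[y[t]] += 1.0 / n
		pxy[[2]int{x[t], y[t]}] += 1.0 / n
	}
	local := make([]float64, len(x))
	for t := range x {
		local[t] = math.Log2(pxy[[2]int{x[t], y[t]}] / (px[x[t]] * py[y[t]]))
	}
	return local
}

func main() {
	x := []int{0, 0, 1, 1, 0, 1, 0, 1}
	y := []int{0, 0, 1, 1, 1, 1, 0, 0}
	local := pointwiseMI(x, y)

	// Averaging the pointwise values gives the ordinary (averaged) MI.
	avg := 0.0
	for _, v := range local {
		avg += v
	}
	avg /= float64(len(local))
	fmt.Printf("pointwise:  %v\n", local)
	fmt.Printf("average MI: %.4f bits\n", avg)
}
```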

Measures for continuous state spaces

Averaged measures

  • Kraskov-Stoegbauer-Grassberger, Algorithm 1 (an illustrative sketch follows this list)

  • Kraskov-Stoegbauer-Grassberger, Algorithm 2

  • Frenzel-Pompe

  • Morphological Computation measures have been moved to gomi https://github.com/kzahedi/gomi
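
For continuous data, the Kraskov-Stoegbauer-Grassberger estimators replace histograms with k-nearest-neighbour statistics: for Algorithm 1, I(X;Y) ≈ ψ(k) + ψ(N) − ⟨ψ(n_x+1) + ψ(n_y+1)⟩, where n_x and n_y count marginal neighbours strictly inside the distance to the k-th joint-space neighbour (max-norm). The sketch below is a brute-force, scalar-only illustration of that formula, returning the estimate in nats; it is not goent's implementation, and a real implementation would use a tree-based neighbour search instead of the O(N²) loop.

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// digamma approximates ψ(x) for x > 0 via the recurrence
// ψ(x) = ψ(x+1) - 1/x and an asymptotic series for large arguments.
func digamma(x float64) float64 {
	r := 0.0
	for x < 6 {
		r -= 1 / x
		x++
	}
	inv := 1 / x
	inv2 := inv * inv
	return r + math.Log(x) - 0.5*inv - inv2*(1.0/12-inv2*(1.0/120-inv2/252))
}

// ksg1 estimates I(X;Y) in nats for scalar samples x, y using a brute-force
// version of Kraskov-Stoegbauer-Grassberger Algorithm 1 with k neighbours.
func ksg1(x, y []float64, k int) float64 {
	n := len(x)
	sum := 0.0
	for i := 0; i < n; i++ {
		// Distance from point i to every other point in the joint space (max-norm).
		d := make([]float64, 0, n-1)
		for j := 0; j < n; j++ {
			if j == i {
				continue
			}
			d = append(d, math.Max(math.Abs(x[j]-x[i]), math.Abs(y[j]-y[i])))
		}
		sort.Float64s(d)
		eps := d[k-1] // distance to the k-th nearest joint-space neighbour

		// Count marginal neighbours strictly inside eps.
		nx, ny := 0, 0
		for j := 0; j < n; j++ {
			if j == i {
				continue
			}
			if math.Abs(x[j]-x[i]) < eps {
				nx++
			}
			if math.Abs(y[j]-y[i]) < eps {
				ny++
			}
		}
		sum += digamma(float64(nx)+1) + digamma(float64(ny)+1)
	}
	return digamma(float64(k)) + digamma(float64(n)) - sum/float64(n)
}

func main() {
	// Toy data: y is a slightly perturbed copy of x, so the estimate
	// should come out clearly positive.
	var x, y []float64
	for i := 0; i < 200; i++ {
		v := math.Sin(float64(i) * 0.7)
		x = append(x, v)
		y = append(y, v+0.01*math.Cos(float64(i)*13.3))
	}
	fmt.Printf("KSG-1 estimate: %.3f nats\n", ksg1(x, y, 4))
}
```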

State-dependent measures

  • Kraskov-Stoegbauer-Grassberger, Algorithm 1

  • Kraskov-Stoegbauer-Grassberger, Algorithm 2

  • Frenzel-Pompe

  • Morphological Computation measures have been moved to gomi https://github.com/kzahedi/gomi

References:

  • T. M. Cover and J. A. Thomas. Elements of Information Theory, 2nd edition. Wiley, Hoboken, New Jersey, USA, 2006.
  • I. Csiszár. I-divergence geometry of probability distributions and minimization problems. Ann. Probab., 3(1):146–158, 1975.
  • A. Chao and T.-J. Shen. Nonparametric estimation of Shannon's index of diversity when there are unseen species in sample. Environmental and Ecological Statistics, 10(4):429–443, 2003.
  • S. Frenzel and B. Pompe. Partial mutual information for coupling analysis of multivariate time series. Phys. Rev. Lett., 99:204101, Nov 2007.
  • A. Kraskov, H. Stoegbauer, and P. Grassberger. Estimating mutual information. Phys. Rev. E, 69:066138, Jun 2004.
  • N. Bertschinger, J. Rauh, E. Olbrich, J. Jost, and N. Ay. Quantifying unique information. CoRR, 2013.