EntroPy is a Python 3 package for computing several entropy metrics of one-dimensional time series.
Develop mode
git clone https://github.com/raphaelvallat/entropy.git entropy/
cd entropy/
pip install -r requirements.txt
python setup.py develop
Dependencies
- numpy
- scipy
- scikit-learn
1. Permutation entropy
from entropy import perm_entropy
x = [4, 7, 9, 10, 6, 11, 3]
print(perm_entropy(x, order=3, normalize=True))
0.589
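For intuition, permutation entropy can be sketched in a few lines of NumPy: each window of `order` consecutive samples is reduced to its ordinal pattern, and the Shannon entropy of the pattern distribution is computed. The re-implementation below is purely illustrative (it is not EntroPy's own code), but it reproduces the value above on this toy series.

import numpy as np
from math import factorial

def perm_entropy_sketch(x, order=3, delay=1, normalize=True):
    # Illustrative re-implementation of permutation entropy.
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # Time-delay embedding: one row per window of `order` samples.
    windows = np.array([x[i:i + (order - 1) * delay + 1:delay] for i in range(n)])
    # Reduce each window to its ordinal pattern (the ranking of its values).
    patterns = windows.argsort(axis=1)
    # Relative frequency of each distinct pattern.
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))
    if normalize:
        pe /= np.log2(factorial(order))  # maximum entropy is log2(order!)
    return pe

print(perm_entropy_sketch([4, 7, 9, 10, 6, 11, 3]))  # 0.589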
2. Spectral entropy
from entropy import spectral_entropy
import numpy as np
np.random.seed(1234567)
x = np.random.rand(3000)
print(spectral_entropy(x, 100, method='welch', normalize=True))
0.994
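Conceptually, spectral entropy is the Shannon entropy of the normalized power spectral density (PSD); the second argument above (100) is the sampling frequency in Hz. A minimal sketch using SciPy's Welch estimator is shown below. The PSD parameters are left at SciPy's defaults, which is an assumption, so the result may differ slightly from spectral_entropy.

import numpy as np
from scipy.signal import welch

def spectral_entropy_sketch(x, sf, normalize=True):
    # Illustrative sketch: Shannon entropy of the normalized Welch PSD.
    freqs, psd = welch(x, fs=sf)      # default Welch parameters (assumption)
    psd_norm = psd / psd.sum()        # rescale the PSD so it sums to 1
    se = -np.sum(psd_norm * np.log2(psd_norm))
    if normalize:
        se /= np.log2(len(psd_norm))  # maximum entropy is log2(number of bins)
    return se

np.random.seed(1234567)
x = np.random.rand(3000)
print(spectral_entropy_sketch(x, sf=100))  # close to 1 for white noise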
3. Singular value decomposition (SVD) entropy
from entropy import svd_entropy
x = [4, 7, 9, 10, 6, 11, 3]
print(svd_entropy(x, order=3, delay=1, normalize=True))
0.421
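SVD entropy quantifies how the energy of the time-delay embedding matrix is spread across its singular values: a signal whose embedding is well described by a single component has low SVD entropy. The sketch below is illustrative only; in particular, the normalization constant used here (log2(order)) is an assumption, and a different convention changes the numerical value, so it may not match the example above exactly.

import numpy as np

def svd_entropy_sketch(x, order=3, delay=1, normalize=True):
    # Illustrative sketch of SVD entropy (not EntroPy's own code).
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # Time-delay embedding matrix: one row per window of `order` samples.
    mat = np.array([x[i:i + (order - 1) * delay + 1:delay] for i in range(n)])
    s = np.linalg.svd(mat, compute_uv=False)  # singular values
    s_norm = s / s.sum()
    svd_e = -np.sum(s_norm * np.log2(s_norm))
    if normalize:
        svd_e /= np.log2(order)  # one common normalization (assumption)
    return svd_e

print(svd_entropy_sketch([4, 7, 9, 10, 6, 11, 3], order=3, delay=1))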
4. Approximate entropy
from entropy import app_entropy
import numpy as np
np.random.seed(1234567)
x = np.random.rand(3000)
print(app_entropy(x, order=2, metric='chebyshev'))
2.075
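Approximate entropy measures how often patterns of length `order` that are close (within a tolerance r, typically 0.2 times the standard deviation of the signal) remain close when extended by one sample. The brute-force sketch below follows the textbook definition with a Chebyshev distance; it is O(N²) in time and memory, whereas app_entropy uses a much faster neighbour search, and small differences in convention mean the value may not match exactly.

import numpy as np
from scipy.spatial.distance import cdist

def app_entropy_sketch(x, order=2, r=None):
    # Brute-force approximate entropy, for illustration only (O(N^2)).
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # common default tolerance (assumption)

    def _phi(m):
        # Overlapping windows of length m.
        emb = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
        dist = cdist(emb, emb, metric='chebyshev')
        # Fraction of windows within r of each window (self-matches included).
        c = (dist <= r).mean(axis=1)
        return np.log(c).mean()

    return _phi(order) - _phi(order + 1)

np.random.seed(1234567)
x = np.random.rand(3000)
print(app_entropy_sketch(x, order=2))  # should be close to the value above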
5. Sample entropy
from entropy import sample_entropy
import numpy as np
np.random.seed(1234567)
x = np.random.rand(3000)
print(sample_entropy(x, order=2, metric='chebyshev'))
2.191
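Sample entropy is a variant of approximate entropy that excludes self-matches: it is the negative natural log of the ratio between the number of (order+1)-length and order-length template pairs that stay within the tolerance r. A brute-force sketch with a Chebyshev distance is given below; sample_entropy itself is considerably faster, and minor convention differences mean the value may not match exactly.

import numpy as np
from scipy.spatial.distance import cdist

def sample_entropy_sketch(x, order=2, r=None):
    # Brute-force sample entropy, for illustration only (O(N^2)).
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # common default tolerance (assumption)
    n_templates = len(x) - order  # same template count for both lengths

    def _count_pairs(m):
        emb = np.array([x[i:i + m] for i in range(n_templates)])
        dist = cdist(emb, emb, metric='chebyshev')
        # Pairs within tolerance r, excluding self-matches on the diagonal.
        return (dist <= r).sum() - n_templates

    return -np.log(_count_pairs(order + 1) / _count_pairs(order))

np.random.seed(1234567)
x = np.random.rand(3000)
print(sample_entropy_sketch(x, order=2))  # should be close to the value above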
Some benchmarks, computed on a standard laptop (Intel i7-7700HQ CPU @ 2.80 GHz, 8 GB of RAM):
from entropy import *
import numpy as np
np.random.seed(1234567)
x = np.random.rand(1000)
%timeit perm_entropy(x, order=3, delay=1)
%timeit spectral_entropy(x, 100, method='fft')
%timeit svd_entropy(x, order=3, delay=1)
%timeit app_entropy(x, order=2)
%timeit sample_entropy(x, order=2)
126 µs ± 3.8 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
137 µs ± 2.1 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
43 µs ± 462 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
4.86 ms ± 107 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
5 ms ± 277 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)
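The %timeit calls above assume an IPython or Jupyter session. Outside IPython, a rough equivalent using the standard-library timeit module looks like this (absolute numbers will of course vary with the machine):

import timeit

setup = ("import numpy as np; from entropy import perm_entropy; "
         "np.random.seed(1234567); x = np.random.rand(1000)")
n_runs = 1000
total = timeit.timeit("perm_entropy(x, order=3, delay=1)", setup=setup, number=n_runs)
print(f"{total / n_runs * 1e6:.1f} µs per call")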
EntroPy was created and is maintained by Raphael Vallat. Contributions are more than welcome, so feel free to contact me, open an issue, or submit a pull request!
To see the code or report a bug, please visit the GitHub repository.
Note that this program is provided with NO WARRANTY OF ANY KIND. Whenever possible, double-check the results against another implementation.
Several functions of EntroPy were borrowed from:
- pyEntropy: https://github.com/nikdon/pyEntropy
- MNE-features: https://github.com/mne-tools/mne-features