
PyNumDiff

Methods for numerical differentiation of noisy time series data, including multi-objective optimization routines for automated parameter selection.

Table of contents

Introduction

Structure

Getting Started

Installing via pip

pip install pynumdiff

Installing from source

To install this package, run python ./setup.py install from inside this directory.

Requirements

Python version: 2.7+

Minimal requirements: numpy, scipy, matplotlib

Certain methods require additional packages:

To run the notebooks and generate figures in notebooks/paper_figures you will also need:

Usage

  • Basic usage: you provide the parameters:

```python
x_hat, dxdt_hat = pynumdiff.sub_module.method(x, dt, params, options)
```

  • For example, a few favorites:

```python
x_hat, dxdt_hat = pynumdiff.linear_model.savgoldiff(x, dt, [3, 20, 25])
x_hat, dxdt_hat = pynumdiff.kalman_smooth.constant_acceleration(x, dt, [1e-1, 1e-2])
x_hat, dxdt_hat = pynumdiff.total_variation_regularization.jerk(x, dt, [10])
x_hat, dxdt_hat = pynumdiff.smooth_finite_difference.butterdiff(x, dt, [3, 0.07])
```
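The calls above require PyNumDiff itself; as a self-contained illustration of the same idea (smooth the noisy signal by fitting local polynomials, then differentiate the fit), here is a sketch using SciPy's Savitzky-Golay filter in place of PyNumDiff. The signal, noise level, and window settings are made-up example values, not PyNumDiff defaults.

```python
import numpy as np
from scipy.signal import savgol_filter

# Noisy samples of sin(t); the true derivative is cos(t)
dt = 0.01
t = np.arange(0, 4, dt)
rng = np.random.default_rng(0)
x = np.sin(t) + 0.05 * rng.standard_normal(t.size)

# Fit local cubic polynomials over a 25-point window, then evaluate
# the polynomial (deriv=0) and its first derivative (deriv=1)
x_hat = savgol_filter(x, window_length=25, polyorder=3)
dxdt_hat = savgol_filter(x, window_length=25, polyorder=3, deriv=1, delta=dt)

# Compare against raw finite differences of the noisy signal
rmse_smooth = np.sqrt(np.mean((dxdt_hat - np.cos(t)) ** 2))
rmse_raw = np.sqrt(np.mean((np.gradient(x, dt) - np.cos(t)) ** 2))
```

Differentiating the fitted polynomials rather than the raw samples keeps the noise from being amplified by the 1/dt factor in a finite difference, which is the same trade-off PyNumDiff's methods manage.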

  • Comprehensive examples: in Jupyter notebook form, see notebooks/1_basic_tutorial.ipynb

  • Advanced usage: automated parameter selection through multi-objective optimization. This approach minimizes a loss function that balances the faithfulness and smoothness of the derivative estimate, and relies on a single hyperparameter, gamma (tvgamma in the code). See the paper for more detail; a brief overview is given in the example notebooks linked below.

```python
params, val = pynumdiff.optimize.sub_module.method(x, dt, params=None,
                                                   tvgamma=tvgamma,   # hyperparameter
                                                   dxdt_truth=None,   # no ground truth data
                                                   options={})
print('Optimal parameters: ', params)
x_hat, dxdt_hat = pynumdiff.sub_module.method(x, dt, params, options={'smooth': True})
```
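In rough notation, the loss being minimized balances two terms: how closely the integral of the estimated derivative reproduces the data, and the total variation of the derivative itself. This is a schematic paraphrase of the paper's objective, not its exact form:

```latex
\mathcal{L}(\hat{x}') =
\underbrace{\mathrm{RMSE}\!\left(x_0 + \int_0^t \hat{x}'(\tau)\,d\tau,\; x(t)\right)}_{\text{faithfulness}}
+ \gamma\,\underbrace{\mathrm{TV}\!\left(\hat{x}'\right)}_{\text{smoothness}}
```

Larger gamma penalizes variation in the derivative more heavily, which is why larger tvgamma values produce smoother estimates.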

  • Important points:

  • Larger values of tvgamma produce smoother derivatives

  • The value of tvgamma is largely universal across methods, making it easy to compare method results

  • The optimization is not fast. If you have a lot of data, run it on a subset. It is also much faster with fast differentiation methods like savgoldiff and butterdiff, and probably too slow for sliding methods like sliding DMD and sliding LTI fit.

  • The following heuristic works well for choosing tvgamma, where cutoff_frequency is the highest frequency content of the signal in your data, and dt is the timestep.

tvgamma = np.exp( -1.6*np.log(cutoff_frequency) -0.71*np.log(dt) - 5.1 )
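For instance, with a made-up 10 Hz cutoff frequency and a 100 Hz sample rate (both example values, not recommendations), the heuristic gives a tvgamma of roughly 4e-3:

```python
import numpy as np

cutoff_frequency = 10.0  # highest frequency content of the signal, in Hz (example value)
dt = 0.01                # timestep in seconds, i.e. 100 Hz sampling (example value)

# Heuristic from above: a lower cutoff frequency implies a smoother
# underlying signal, and hence a larger tvgamma
tvgamma = np.exp(-1.6 * np.log(cutoff_frequency) - 0.71 * np.log(dt) - 5.1)
```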

Examples

Running the tests

Citation

If you use this package for your differentiation needs, please cite this paper.

F. van Breugel, J. Nathan Kutz and B. W. Brunton, "Numerical differentiation of noisy data: A unifying multi-objective optimization framework," in IEEE Access, doi: 10.1109/ACCESS.2020.3034077.

```bibtex
@ARTICLE{9241009,
  author={F. {van Breugel} and J. {Nathan Kutz} and B. W. {Brunton}},
  journal={IEEE Access},
  title={Numerical differentiation of noisy data: A unifying multi-objective optimization framework},
  year={2020},
  volume={},
  number={},
  pages={1-1},
  doi={10.1109/ACCESS.2020.3034077}}
```

Contributing

License

Acknowledgments
