Table of Contents:

      Numerical Methods for Derivative Estimation

 

The notes in this series discuss the problem of numerically estimating derivative values from a data set consisting of "tabulated" function values at equal intervals. The exploration starts at the beginning, with Newton's Interpolation Formula and "Central Differences." Unfortunately, the classic methods exhibit serious problems when used with anything but very precise and accurate data. Ideas are investigated for solving — or maybe this should be called avoiding — the noise and disturbance sensitivity problems that are typically encountered with digitized or measured data sets. This leads to two classes of estimators having practical merit — a class of elegant "maxflat" estimators with particularly small relative error for detecting very slow changes, and a class of "best fit" estimators having generally better performance overall, with some minor trade-offs in accuracy, noise rejection, and computational efficiency.

This is intended as practical information, not academic purity. Maybe that is a good thing, since the academically pure treatments continue to present the same old methods that don't work very well. Some of the information found here might appear new to you, but that does not imply that it is actually new. References are provided to prior sources when these are known.

Some mathematical background is required to understand the complete shallowness of this presentation. Fortunately, mathematical rigor is not the point. The point is general understanding, plus some practical results that you can grab and use immediately. While the material is open and free for you to use, this site is "copyleft," so there are some restrictions on how you republish modified versions of the materials you find here.

 



Section 1: The Central Differences Method

Reviewing the classic and omnipresent approach to numerical estimation of derivatives — necessary background for understanding why the results are awful so much of the time.
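
As a concrete preview, the classic estimator amounts to a single very short formula. The Python sketch below is illustrative only; the sequence name f and the step h are our choices, not the section's:

    # Classic three-point central difference, assuming equally spaced
    # samples f[i] = f(x0 + i*h):  f'(x_i) ~= (f[i+1] - f[i-1]) / (2*h)
    def central_difference(f, h):
        """Estimate f' at each interior point of the sample sequence f."""
        return [(f[i + 1] - f[i - 1]) / (2.0 * h) for i in range(1, len(f) - 1)]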

Section 2: Problems with 'Central Differences'

Discussing the principal problems of the classic derivative estimators — their serious hypersensitivity to random noise and other disturbances.
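
To get a feel for the scale of the problem, note that the central difference divides by 2h, so a small step magnifies even tiny sample noise. A minimal numeric illustration, with all values chosen by us for this page:

    import random

    h, eps = 0.01, 0.001                         # 0.1% noise on unit-sized samples
    f = [x * x for x in (0.99, 1.00, 1.01)]      # f(x) = x^2 sampled near x = 1
    noisy = [v + random.uniform(-eps, eps) for v in f]

    clean_est = (f[2] - f[0]) / (2 * h)          # exactly 2.0, the true f'(1)
    noisy_est = (noisy[2] - noisy[0]) / (2 * h)  # can be off by as much as 0.1 (5%)
    print(clean_est, noisy_est)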

Section 3: Frequency Analysis for Derivative Estimates

Analyzing what information is relevant to the derivative estimation process, so that irrelevant or harmful features of the data can be avoided.
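
For orientation, and hedged ahead of the section's own analysis: at frequency w (radians per sample, taking h = 1), the three-point central difference has gain sin(w), while a true differentiator has gain w, so the two agree only at low frequencies. A small numeric comparison:

    import numpy as np

    w = np.linspace(0.0, np.pi, 9)      # normalized frequency, DC to Nyquist
    actual = np.sin(w)                  # gain of the central difference (h = 1)
    ideal = w                           # gain of an exact differentiator
    for wk, a, b in zip(w, actual, ideal):
        print(f"w = {wk:4.2f}   estimator gain = {a:6.3f}   ideal gain = {b:6.3f}")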

Section 4: Better Derivative Estimates by Prefiltering

Examining the first important idea for obtaining better derivative estimates — cleaning up the input data stream before the derivative estimator is applied.
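
As a generic illustration of this two-stage idea (the series develops its own, better prefilters), a short moving average followed by a central difference might look like this:

    import numpy as np

    def smoothed_derivative(f, h, width=5):
        """Moving-average prefilter, then a central difference."""
        kernel = np.ones(width) / width
        smooth = np.convolve(f, kernel, mode="valid")   # cleaned-up data stream
        return (smooth[2:] - smooth[:-2]) / (2.0 * h)   # central difference of it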

Section 5: Better Polynomial-Based Derivative Estimators

Combining the noise rejection and the estimation accuracy of the two-stage process into a single filter, for elegance and efficiency — good, but maybe not the best possible.
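
One familiar single-pass formula of the polynomial-based kind, though not necessarily one of the estimators derived in the section, is the five-point central difference, exact for polynomials up to fourth degree:

    # Five-point central difference on equally spaced samples:
    #   f'(x_i) ~= (-f[i+2] + 8*f[i+1] - 8*f[i-1] + f[i-2]) / (12*h)
    def five_point_derivative(f, i, h):
        return (-f[i + 2] + 8 * f[i + 1] - 8 * f[i - 1] + f[i - 2]) / (12.0 * h)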

Section 6: Experimental Least Squares Derivative Estimator

Exploring an approach that fits a polynomial model to the function sequence using "best-fit" methods — some distinctly good features, but not quite fully adequate.
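
A hedged sketch of the best-fit idea, using a generic polynomial fit; the window size, polynomial order, and names are illustrative choices of ours:

    import numpy as np

    def lsq_derivative(f, h, order=2, halfwidth=3):
        """Least-squares polynomial fit over a window; slope at the center."""
        x = np.arange(-halfwidth, halfwidth + 1) * h   # 2*halfwidth+1 sample points
        coeffs = np.polyfit(x, f, order)               # least-squares fit
        slope = np.polyder(np.poly1d(coeffs))          # derivative of the fitted model
        return slope(0.0)                              # evaluated at the window center

    # Samples of f(x) = x^2 around x = 1 give a slope estimate near 2.
    xs = 1.0 + np.arange(-3, 4) * 0.1
    print(lsq_derivative(xs ** 2, 0.1))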

Section 7: Least Squares Derivative Estimator

Matching the derivative estimator's frequency response directly, bypassing the idea of fitting a polynomial model, leads to a range of effective derivative approximator formulas. This is the payoff. Is it something new and different? Unlikely, but who would know?
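
For a rough idea of what direct response matching can look like (the cutoff, filter length, and frequency grid below are our illustrative choices, not the section's formulas), one can pick antisymmetric filter weights by ordinary least squares so that the gain tracks w in a low-frequency band and drops toward zero above it:

    import numpy as np

    M, wc = 4, 1.0                          # half-length; passband edge (rad/sample)
    w = np.linspace(0.0, np.pi, 200)        # frequency grid, DC to Nyquist
    desired = np.where(w <= wc, w, 0.0)     # ideal gain in band, zero out of band

    # An estimator sum_k c[k]*(f[i+k] - f[i-k]) has gain 2*sum_k c[k]*sin(k*w).
    A = np.column_stack([2.0 * np.sin(k * w) for k in range(1, M + 1)])
    c, *_ = np.linalg.lstsq(A, desired, rcond=None)
    print("weights:", c)                    # scale the weighted sum by 1/h for units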

 


 

Contents of the "Numerical Estimation of Derivatives" section of the ridgerat-tech.us website, including this page, are licensed under a Creative Commons Attribution-ShareAlike 4.0 International License unless otherwise specifically noted. For complete information about the terms of this license, see http://creativecommons.org/licenses/by-sa/4.0/. The license allows usage and derivative works for individual or commercial purposes under certain restrictions. For permissions beyond the scope of this license, see the contact information page for this site.