* From: James Theiler <jt@lanl.gov>
To: John Lamb <J.D.Lamb@btinternet.com>
Cc: gsl-discuss@sources.redhat.com
Subject: Re: Collecting statistics for time dependent data?
Date: Thu, 9 Dec 2004 14:18:36 -0700 (MST)
On Thu, 9 Dec 2004, John Lamb wrote:
] Raimondo Giammanco wrote:
] > I was wondering if there is a way to compute "running" statistics with
] Yes you can do it, but there's nothing in GSL that does it and it's easy
] enough that you don't need GSL. Something like (untested)
] double update_mean( double* mean, int* n, double x ){
]   ++*n;
]   *mean = (1 - (double)1 / *n ) * *mean + x / *n;
] }
] will work, and you can derive a similar method for updating the variance
] using the usual textbook formula.
]
]   var[x] = (1/n) sum_i x_i^2 - mean(x)^2
] I don't know if there is a method that avoids the rounding errors. I
] don't know why so many textbooks repeat this formula without the
] slightest warning that it can go so badly wrong.
Stably updating the mean and variance is remarkably nontrivial. There was
a series of papers in Comm ACM that discussed the issue; the final one
(that I know of) refers back to the earlier ones, and it can be found
in D. H. D. West, "Updating mean and variance estimates: an improved
method," Comm ACM 22:9, 532 (1979) [* I see Luke Stras just sent this
reference! *]. I'll just copy out the pseudocode, since the paper is
old enough that it might not be easy to find. This, by the way, is
generalized for weighted data, so it assumes that you get a weight and
a data value (W_i and X_i) that you use to update the estimates XBAR
and the variance.
    TEMP = SUM + W_i   // typo: he meant SUMW
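Most of the copied pseudocode is missing above; only the line with the typo note survives. What follows is a C sketch reconstructed from the cited West (1979) algorithm itself, not from the elided text, so it should be checked against the paper. The struct and function names are invented for illustration; the `temp` line corresponds to the `TEMP = SUMW + W_i` step the typo note refers to:

```c
/* Sketch of West's (1979) weighted incremental mean/variance update,
 * reconstructed from the published algorithm.  sumw is the running sum
 * of weights, xbar the running weighted mean, and t accumulates the
 * weighted sum of squared deviations.  With all weights equal to 1
 * this reduces to the familiar Welford update. */
#include <stddef.h>

typedef struct {
    double sumw;   /* sum of weights seen so far              */
    double xbar;   /* current weighted mean (XBAR)            */
    double t;      /* weighted sum of squared deviations      */
    size_t n;      /* number of observations                  */
} running_stats;

void running_stats_init(running_stats *s)
{
    s->sumw = 0.0;
    s->xbar = 0.0;
    s->t = 0.0;
    s->n = 0;
}

void running_stats_add(running_stats *s, double w, double x)
{
    if (s->n == 0) {
        s->sumw = w;
        s->xbar = x;
        s->t = 0.0;
    } else {
        double q = x - s->xbar;
        double temp = s->sumw + w;   /* TEMP = SUMW + W_i (the typo line) */
        double r = q * w / temp;
        s->xbar += r;
        s->t += r * s->sumw * q;
        s->sumw = temp;
    }
    s->n++;
}

double running_stats_mean(const running_stats *s)
{
    return s->xbar;
}

/* Population variance; West also gives a sample-variance correction. */
double running_stats_variance(const running_stats *s)
{
    return s->t / s->sumw;
}
```

Because the update subtracts the current mean before squaring, it avoids the cancellation that ruins the textbook sum-of-squares formula.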
James Theiler                      Space and Remote Sensing Sciences
MS-B244, ISR-2, LANL               Los Alamos National Laboratory
Los Alamos, NM 87545               http://nis-www.lanl.gov/~jt
* Look at STARPAC ftp://ftp.ucar.edu/starpac/ and Statlib
http://lib.stat.cmu.edu/ for more ideas.
* Try using the Kahan summation formula to improve accuracy for the
NIST tests (see Brian for details; below is a sketch of the algorithm).
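The sketch the item refers to is elided here, so the following is the standard formulation of Kahan compensated summation, not necessarily the exact sketch intended:

```c
/* Kahan compensated summation: carries a running compensation term c
 * that recovers the low-order bits lost when each element is added to
 * the (much larger) running sum. */
#include <stddef.h>

double kahan_sum(const double *x, size_t n)
{
    double sum = 0.0, c = 0.0;
    for (size_t i = 0; i < n; i++) {
        double y = x[i] - c;   /* next term, corrected by the carry */
        double t = sum + y;    /* low bits of y are lost here...    */
        c = (t - sum) - y;     /* ...and recovered into c           */
        sum = t;
    }
    return sum;
}
```

The classic demonstration is summing 0.1 ten times: naive accumulation drifts away from 1.0, while the compensated sum stays correctly rounded.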
* Prevent incorrect use of unsorted data for quartile calculations
using a typedef for sorted data (?)
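One possible shape for the typedef idea above, sketched as a tagged wrapper type; all names here are hypothetical, not existing GSL API:

```c
/* Sketch of the "typedef for sorted data" idea: quantile routines
 * accept only a sorted_data wrapper, which can only be obtained from
 * make_sorted_data(), so the type system makes it hard to pass raw
 * unsorted data by accident.  All names are hypothetical. */
#include <stdlib.h>

typedef struct {
    const double *data;
    size_t n;
} sorted_data;   /* obtain only via make_sorted_data() */

static int cmp_double(const void *a, const void *b)
{
    double da = *(const double *)a, db = *(const double *)b;
    return (da > db) - (da < db);
}

/* Sorts in place and returns the tagged wrapper. */
sorted_data make_sorted_data(double *x, size_t n)
{
    qsort(x, n, sizeof(double), cmp_double);
    sorted_data s = { x, n };
    return s;
}

/* Quantile by linear interpolation between order statistics;
 * the signature accepts only sorted_data. */
double quantile(sorted_data s, double f)
{
    double index = f * (s.n - 1);
    size_t lhs = (size_t)index;
    double delta = index - lhs;
    if (lhs == s.n - 1)
        return s.data[lhs];
    return (1 - delta) * s.data[lhs] + delta * s.data[lhs + 1];
}
```

The wrapper costs nothing at runtime; it only moves the "is this sorted?" question from documentation into the type signature.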
* Rejection of outliers
* Time series: autocorrelation, cross-correlation, smoothing (moving
average), detrending, various econometric things. Integrated
quantities (area under the curve). Interpolation of noisy data/fitting
-- maybe add that to the existing interpolation stuff. What about
missing data and gaps?
There is a new GNU package called gretl which does econometrics.
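As one concrete instance of the time-series items above, a lag-k sample autocorrelation could look like the following (an illustrative sketch, not a proposed interface):

```c
/* Sample autocorrelation at lag k:
 *   r_k = sum_{i=0}^{n-k-1} (x_i - xbar)(x_{i+k} - xbar)
 *         / sum_{i=0}^{n-1} (x_i - xbar)^2
 * Requires k < n and a series that is not constant. */
#include <stddef.h>

double autocorrelation(const double *x, size_t n, size_t k)
{
    double mean = 0.0;
    for (size_t i = 0; i < n; i++)
        mean += x[i];
    mean /= n;

    double num = 0.0, den = 0.0;
    for (size_t i = 0; i < n; i++)
        den += (x[i] - mean) * (x[i] - mean);
    for (size_t i = 0; i + k < n; i++)
        num += (x[i] - mean) * (x[i + k] - mean);
    return num / den;
}
```

At lag 0 this is always 1; a perfectly alternating series gives a strongly negative lag-1 value.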
* Statistical tests (equal means, equal variance, etc).