Text and Explanations for Y2K Analyses, Kurtosis Method

We present here the explanatory text for Dean Radin's kurtosis-based analysis, to supplement the brief descriptions in the Y2K results pages. In the first email note, dated 24 Jan, Dean gives a step-by-step recipe. In the second note, dated 26 Jan, he describes a set of four graphs.

Epoch Analysis Using a Kurtosis Measure

Date: Mon, 24 Jan 2000 13:14:02 -0800
From: Dean Radin 
To: rdnelson 
Subject: Scrambled eggs are good!

Ok, I've done some musing and I think I see what's going on.  Confirmation of my
hunch is that the overall odds now, using a refined method, are 1.14E+11 to 1,
peaking 2 seconds before midnight.  I'll send gif pictures later.

The analytical steps are as follows:

1) You download 24 hrs of raw egg data starting from 12-31-99 10:00 AM.
2) You correct the two half-hour-off Indian eggs by shifting them forward a
half-hour.
3) You take the per-second variance (not avedev) across all raw egg values.
4) You create an epoch matrix 24 hrs wide by 3600 seconds deep, where second
1800 = midnight in each time zone.  I start from -13 GMT, the Fiji egg, to +10
GMT, to cover 24 hrs. 
5) For each of the 3600 epoch seconds, you create x = exp(-kurt(across all time zones)), where exp is exponential and kurt is kurtosis.
6) You create a sliding median window of 300 seconds against these 3600 x
values.
7) You use randomized permutation to evaluate the likelihood of seeing (a) a
minimum value of the sliding window graph as low as observed AND (b) a minimum
as close to midnight as observed.  Or you can calculate the odds against chance
using a z-score method based on deviations from the overall mean of the sliding
window graph.  Permutation analysis confirms that the z-score method gives
about the same results.
8) Using this method, the peak odds against chance are about 10^11 to 1, and this
peak occurs 2 seconds before midnight.
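
To make steps 3 through 6 concrete, here is a minimal Python sketch of that part of the recipe. It assumes the raw data are already loaded as a NumPy array with one row per second and one column per egg, and that the index of local midnight is known for each time zone; the function names are illustrative, and the use of scipy's excess kurtosis is an assumption rather than a detail taken from the original analysis.

    import numpy as np
    from scipy.stats import kurtosis

    def per_second_variance(raw):
        # Step 3: variance across all eggs for each second of raw data.
        # raw: array of shape (n_seconds, n_eggs).
        return raw.var(axis=1)

    def build_epoch_matrix(var_series, midnight_indices, half_width=1800):
        # Step 4: one column per time zone, 3600 rows, row 1800 = local midnight.
        cols = [var_series[m - half_width:m + half_width] for m in midnight_indices]
        return np.column_stack(cols)

    def epoch_kurtosis_curve(epoch_matrix, window=300):
        # Step 5: exp(-kurtosis) taken across time zones for each epoch second.
        x = np.exp(-kurtosis(epoch_matrix, axis=1))
        # Step 6: 5-minute (300-second) sliding median over the 3600 values.
        medians = np.array([np.median(x[i:i + window])
                            for i in range(x.size - window + 1)])
        return x, medians

The minimum of the resulting median series, and its offset from second 1800, are the two quantities tested in step 7.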

Now, what does this mean?

Negative kurtosis means that a distribution is flatter than normal, and a
positive kurtosis means that the distribution is skinnier than normal.  In
other words, positive kurt shows restricted variance.  I am calculating -kurt
for my graph, thus going down in this graph means going towards restricted
variance.  I use -kurt rather than +kurt because I take an exp transform to
smooth the data, and taking exp of positive values quickly blows up.
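
As a quick illustration of the sign convention and of why exp(-kurt) is the convenient form (this is not from the original analysis; the distributions and sample sizes are arbitrary):

    import numpy as np
    from scipy.stats import kurtosis

    rng = np.random.default_rng(0)
    flat = rng.uniform(-1, 1, 100_000)    # flatter than normal
    peaked = rng.laplace(0, 1, 100_000)   # skinnier (more peaked) than normal

    print(kurtosis(flat))    # about -1.2: negative kurtosis for a flat distribution
    print(kurtosis(peaked))  # about +3:   positive kurtosis for a peaked distribution

    # exp(-kurt) stays bounded as kurtosis grows, whereas exp(+kurt) blows up.
    print(np.exp(-kurtosis(peaked)), np.exp(+kurtosis(peaked)))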

So, the 5-minute median sliding window applied to this exp(-kurt) value
measures how kurtosis changes over the course of the epoch hour.  The graph of
this sliding window shows a sharp drop as we approach midnight, with minimum
value at -2 seconds.   This means that as midnight approaches, the distribution
of variances gets skinnier and skinnier, becoming the skinniest at midnight.

And what does that mean?  It means that as midnight approaches the values being
produced by all eggs, across all time zones, become more coherent, in the sense
of becoming more like each other, than either before or after midnight.  It
says nothing about the actual values produced by the eggs, only the
distribution of values.  A look at the actual raw values shows nothing
remarkable, so it's likely that we're seeing an ensemble effect affecting all
eggs rather than some individual eggs acting strangely.


The graph also shows a clear rise in exp(-kurt) about 15 minutes before
midnight, meaning the distribution of egg variances becomes LESS coherent
before it becomes MORE coherent.  Anticipation?  Conservation of ensemble
entropy?  Who knows?

I think it's clear then that (a) the direction of the results makes sense in
terms of the overall egg network showing more constrained behavior near
midnight, which I interpret as a sign of systemic coherence [that's why I
started looking for this sort of variance change in the first place], and (b)
that the odds against chance, even using this peculiar set of transforms, and
even with 10 to 15 analytical iterations required to find these maximum odds
values, are so far from a fluke that they are meaningful.  You can throw
Bonferroni at this all day long and it will still be far from chance.

There is a niggling worry, however, about how to interpret very low p-values
like this.  Is Bonferroni appropriate for exploratory data analysis?  I'd
estimate I've spent about 8 hours doing these analyses.  Is it reasonable to
expect that a motivated analyst can find p = 10^-11 in that time?  I spent much
more time analyzing my presentiment data, and with much higher motivation to
succeed, and I didn't find anything even remotely close to this.  So, maybe it
means something after all.

--------------------------------------------------------------------------------

Date: Wed, 26 Jan 2000 12:18:58 -0800
From: Dean Radin 
To: [email protected]
Subject: fried eggs

Attached are four figures.

Kurt1 shows the raw values of exp(-kurt(epoch across all time zones)).  Looks
awfully random in this view.
[Figure Kurt1: raw exp(-kurt)]

Kurt2 shows what you get with a 5-minute sliding median, along with
one-standard-error bars as determined by randomized permutation analysis.  Very
non-random now.  Mean permuted sliding median = 0.966, stdev = 0.031, thus from
the observed minimum value of 0.81 we get an estimate of z = -5.13 for the
minimum, which occurs 2 seconds before midnight.
[Figure Kurt2: median, 5-minute window]
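
For reference, the z estimate quoted above appears to be the observed minimum expressed in units of the permutation standard deviation; with the rounded values given here the arithmetic is roughly

    z = (observed minimum - mean permuted sliding median) / permutation stdev
      = (0.81 - 0.966) / 0.031
      ~ -5.0

which is consistent with the "z = -5" estimate mentioned under Kurt3; the -5.13 figure presumably comes from less heavily rounded values.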
 
Kurt3 shows a distribution of minimum values after random permutation of the
seconds, along with where the observed value was.  The distribution will need
many more permutations to become normal, but I think it's headed in that
direction, in which case the above estimate of z = -5 may be correct.
[Figure Kurt3: distribution of minimums after permutation]
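
A minimal sketch of the randomized permutation step (step 7 of the first note), under the assumption that each permutation simply shuffles the 3600 exp(-kurt) values in time before the sliding median is recomputed; the function names and the shuffling scheme are illustrative guesses, not the original code.

    import numpy as np

    def sliding_median(x, window=300):
        return np.array([np.median(x[i:i + window])
                         for i in range(x.size - window + 1)])

    def permutation_null(x, window=300, n_perms=500, seed=0):
        # Shuffle the epoch seconds, recompute the sliding median, and record
        # both its minimum value and where that minimum falls relative to the
        # window centred on midnight (second 1800 of the 3600-second epoch).
        rng = np.random.default_rng(seed)
        centre = (x.size - window) // 2
        min_vals, min_locs = [], []
        for _ in range(n_perms):
            m = sliding_median(rng.permutation(x), window)
            k = int(np.argmin(m))
            min_vals.append(m[k])
            min_locs.append(k - centre)   # offset from midnight, in seconds
        return np.array(min_vals), np.array(min_locs)

The distribution of min_vals corresponds to Kurt3, and the distribution of min_locs to Kurt4.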

Kurt4 shows a distribution of where the minimum occurred with respect to
midnight.  From this we see that where the min occurs is, as expected, uniformly
distributed, thus the p of getting a minimum as close to midnight as observed,
at -2 seconds, is 5/3300 (5 because the distances being tested are -2, -1, 0, 1,
and 2, and 3300 because that's how many median windows of 5 minutes wide there
are).  Thus p is about .002.  The permutation found no cases where the min
was within this range, but I only ran 500 permutations so far, because it takes
a few minutes per permutation (calculating 3300 sliding medians for each
permutation eats a lot of cycle time -- if someone knows how to calc a median
without doing a sort, please let me know!  I'm using the quicksort algorithm,
but still ...).
[Figure Kurt4: distribution of minimum time from midnight]
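
On the question of computing a running median without sorting every window from scratch: one standard trick (not from the original note) is to keep the current window as a sorted list and update it incrementally, removing the outgoing value and inserting the incoming one. The binary search is cheap; the list insertion still shifts elements, but it is far less work than a fresh quicksort of 300 values per step. A minimal Python sketch:

    import bisect
    import numpy as np

    def sliding_median_sorted(x, window=300):
        # Maintain the current window in sorted order; each step removes the
        # outgoing value and inserts the incoming one instead of re-sorting.
        x = np.asarray(x, dtype=float)
        buf = sorted(x[:window].tolist())
        mid = window // 2
        def med():
            return buf[mid] if window % 2 else (buf[mid - 1] + buf[mid]) / 2
        out = [med()]
        for i in range(window, x.size):
            buf.pop(bisect.bisect_left(buf, x[i - window]))  # drop oldest value
            bisect.insort(buf, x[i])                         # insert newest value
            out.append(med())
        return np.array(out)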


