Methods -- Terrorist Attack Analysis

Because of the enormity of the event, we have been doing a great deal of exploratory analysis of the data collected during the terrorist attacks and the surrounding period. To make clear what is actually being done to visualize the data, we present here some of the correspondence that includes specific descriptions of the data processing and calculations. A more general discussion of the procedures can be found in a statistics page detailing the primary measure used by the GCP, and a description of the technology for collecting the data can be found in our measurement page. A number of other questions are addressed in the FAQ and in the EGG Story.

The following is Dean Radin's description of the procedure used to create the graphs summarizing his analyses. Some of the parameters are varied for different perspectives: for example, instead of 1-minute compression, larger datasets may be compressed into 5-minute chunks, and the size of the sliding window may change from, say, 3 to 6 hours. The basic calculations remain constant, and variations are usually noted in the text describing the figures.

1. Create one z-square per egg, per second.

2. Sum all available z-squares per second, and keep track of the number of eggs summed (= df).

3. Turn the step 2 results into per-minute z-square sums and dfs by summing in groups of 60.

4. For a cumulative graph, keep adding up the per-minute zsums and dfs, and calculate the z equivalent as z = sqrt(2 * chi) - sqrt(2 * df - 1), where chi is the accumulated sum of z-squares. This z equivalent is very accurate for df > 100.

5. Then I calculate p and odds from the z score. A large positive z means roughly more negentropy; a large negative z means something like too much entropy.

6. For a sliding window, sum up 3 hours' worth of zsums and dfs, and calculate z as in step 4 (a code sketch of these steps follows this list).
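
To make the steps concrete, here is a minimal Python sketch of the procedure. It assumes the standard GCP trial design, in which each egg reports one 200-bit trial per second, so a trial has expected mean 100 and standard deviation sqrt(50); the array layout, the NaN convention for missing eggs, and all function names are illustrative assumptions, not Dean's actual code.

import numpy as np
from scipy.stats import norm

def chisq_to_z(chi, df):
    # Step 4's normal approximation: z = sqrt(2*chi) - sqrt(2*df - 1),
    # very accurate once df > 100.
    return np.sqrt(2.0 * chi) - np.sqrt(2.0 * df - 1.0)

def per_second_zsq(trials):
    # Steps 1-2. trials: (seconds x eggs) array of 200-bit trial sums,
    # with NaN where an egg did not report. Returns the per-second sum
    # of z-squares and the df (number of eggs contributing each second).
    z = (trials - 100.0) / np.sqrt(50.0)
    zsq = np.where(np.isnan(z), 0.0, z ** 2)
    df = np.sum(~np.isnan(trials), axis=1)
    return zsq.sum(axis=1), df

def compress(zsq, df, chunk=60):
    # Step 3. Sum per-second values into chunks (60 s = 1 minute;
    # chunk=300 gives the 5-minute compression used for larger datasets).
    n = (len(zsq) // chunk) * chunk
    return (zsq[:n].reshape(-1, chunk).sum(axis=1),
            df[:n].reshape(-1, chunk).sum(axis=1))

def cumulative_z(zsq_min, df_min):
    # Step 4. Running chi-square and df, converted to an equivalent z.
    return chisq_to_z(np.cumsum(zsq_min), np.cumsum(df_min))

def sliding_z(zsq_min, df_min, window=180):
    # Step 6. Windowed sums (180 one-minute chunks = 3 hours;
    # window=360 gives a 6-hour window), converted to z.
    kernel = np.ones(window)
    return chisq_to_z(np.convolve(zsq_min, kernel, mode="valid"),
                      np.convolve(df_min, kernel, mode="valid"))

def p_and_odds(z):
    # Step 5. One-tailed p and odds against chance from the z score.
    p = norm.sf(z)
    return p, 1.0 / p

# Demo on simulated null data: 6 hours of trials from 37 eggs.
rng = np.random.default_rng(0)
trials = rng.binomial(200, 0.5, size=(6 * 3600, 37)).astype(float)
zsq_s, df_s = per_second_zsq(trials)
zsq_m, df_m = compress(zsq_s, df_s, chunk=60)
z_cum = cumulative_z(zsq_m, df_m)
z_win = sliding_z(zsq_m, df_m, window=180)
print(z_cum[-1], p_and_odds(z_cum[-1]))

The chunk and window parameters correspond directly to the variations mentioned above: 5-minute compression for larger datasets, and 3- to 6-hour sliding windows.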

In his examination of the details of the timing of effects and the effect of location, Dean gives another summary of the basic calculations.

