I wrote an overview of how the accuracy of a Geiger counter works. Can you guys give it a sanity check? I'm pretty new to nuclear chemistry.
= Radioactive decay and stochastic probability
Every time you flip a coin there's a 50% chance it'll land heads up. Radioactive decay is similar; for instance, over any given 4.47-billion-year span, there's a 50% chance that a particular atom of uranium-238 will decay to thorium-234.
Although it's impossible to predict when a decay event will occur, it is possible to know the probability that it occurs within a given timescale. This kind of random behavior is called stochastic.
= Time passed ("by" = billion years), chance of a decay event having occurred for a U-238 atom
2.24 by 29.3%
4.47 by 50.0%
8.94 by 75.0%
13.4 by 87.5%
17.9 by 93.8%
22.4 by 96.9%
26.8 by 98.4%
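These probabilities follow from the exponential decay law, P = 1 − 2^(−t/T½). Here's a quick Python sketch of that calculation (variable names are my own):

```python
# Probability that a single U-238 atom has decayed after t billion years,
# from the exponential decay law: P(decayed by t) = 1 - 2**(-t / half_life)
HALF_LIFE = 4.47  # U-238 half-life, in billions of years

def decay_probability(t):
    return 1 - 2 ** (-t / HALF_LIFE)

for t in [2.24, 4.47, 8.94, 13.41]:
    print(f"{t:5.2f} by: {100 * decay_probability(t):.1f}%")
```

Note that the decay probability is not linear in time: at half a half-life (2.24 by) it is 1 − 2^(−1/2) ≈ 29.3%, not 25%.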
This matters because it affects the accuracy of a radioactivity measurement taken with a Geiger counter.
= Sample size (number of counts recorded), average percent deviation (average error of a single measurement)
10 25%
100 8.0%
1,000 2.5%
10,000 0.80%
100,000 0.25%
For instance, if you were to measure 100 decay events with a Geiger counter, and the "true" radioactivity of the region is 20.00 counts per minute (CPM), each individual measurement will be 8.0% off, on average (20.00 CPM ± 1.60 CPM).
If you were to take 10 measurements of 100 decay events each and combine them (1,000 decay events total), you would get a measurement that would be 2.5% off, on average.
The average percent deviation can be approximated by the following formula, where x is the sample size:
APD ≈ 80/√x (in percent)
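For what it's worth, the formula can be sanity-checked with a short Monte Carlo simulation (my own sketch, not your program; it assumes decay counts are Poisson-distributed, which for these sample sizes is well approximated by a normal distribution with σ = √x):

```python
import math
import random

def average_percent_deviation(expected_counts, trials=50_000):
    """Average absolute percent deviation of simulated Geiger measurements
    that each expect `expected_counts` decay events.  The Poisson count is
    approximated here by a normal distribution with sigma = sqrt(counts)."""
    sigma = math.sqrt(expected_counts)
    total_dev = 0.0
    for _ in range(trials):
        count = random.gauss(expected_counts, sigma)
        total_dev += abs(count - expected_counts) / expected_counts
    return 100 * total_dev / trials  # in percent

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} counts: {average_percent_deviation(n):.2f}%")
```

The constant 80 checks out theoretically, too: the mean absolute deviation of a normal distribution is √(2/π) ≈ 0.798 of its standard deviation, and the relative standard deviation of a Poisson count is 1/√x, giving APD ≈ 79.8/√x percent.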
Accounting for this error is important when making precision measurements, particularly when you need to decide whether a discrepancy in the data reflects a genuine difference in radioactivity or just stochastic randomness. See the hypothetical example below.
= Hypothetical example
Let's say you're scouting an area for radioactive minerals. At six different locations you take a 1-hour measurement with a Geiger counter:
= Site, # of clicks detected in 1 hr
Site A 987
Site B 1011
Site C 981
Site D 1031
Site E 971
Site F 1019
You recognize that Site D has the highest number of clicks detected, but is it enough to warrant further investigation?
Given that the average of the six sites is 1,000 counts per hour, the average percent deviation is 2.5%. Site D's reading is 3.1% above the average, only slightly outside the average percent deviation. From this you can conclude that Site D's reading isn't statistically significant, and the site isn't much more likely to host radioactive minerals than any of the other locations.
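Running the numbers (a small sketch using the site counts from the table and the 80/√x approximation given earlier):

```python
import math

# Counts detected in 1 hour at each site
counts = {"A": 987, "B": 1011, "C": 981, "D": 1031, "E": 971, "F": 1019}

mean = sum(counts.values()) / len(counts)  # average counts per hour
apd = 80 / math.sqrt(mean)                 # expected average scatter, in percent

for site, n in counts.items():
    deviation = 100 * (n - mean) / mean
    print(f"Site {site}: {deviation:+.1f}% (expected scatter ~{apd:.1f}%)")
```

Every site, including D, lands within roughly one expected scatter of the mean, which is what you'd see if all six locations had the same true radioactivity.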
The "average percent deviation" formula was estimated with a numerical simulation program I wrote. I'm not sure if it's actually correct but it appears to be.