Wouldn’t you think that the decay rates of isotopes found on Earth would remain fairly constant under controlled conditions? Statistically speaking, one should be able to make a pretty good prediction about a radioactive element’s decay rate at any point in the future, regardless of external influences. However, a group of researchers has found that the decay rates of the radioisotopes radium-226 (226Ra) and silicon-32 (32Si) vary periodically. This may not seem strange at first, but the measured fluctuation in decay rate has a period of approximately one year. Does this relate to the Earth’s position in its orbit? Does this mean radioactive decay rates are influenced by how far the element is from the Sun? Perhaps decay rates are not as predictable as we think…
Generally speaking, the decay rates of radioisotopes should remain pretty constant regardless of external forces or drivers. However, in the 1980s, scientists at Brookhaven National Laboratory in the US and at the Physikalisch-Technische Bundesanstalt in Germany found some strange and unexpected variations in the decay rates of silicon-32 and radium-226. No cause was found and no pattern appeared to exist. That was until Jere Jenkins and colleagues from Purdue University, Indiana, made a stunning discovery.
Before we go into what they have uncovered, we’ll briefly discuss radioactive decay:
Radioactive decay is the process in which an unstable atomic nucleus loses energy by emitting radiation in the form of particles or electromagnetic waves. This decay, or loss of energy, results in an atom of one type, called the parent nuclide transforming to an atom of a different type, called the daughter nuclide […] This is a random process on the atomic level, in that it is impossible to predict when a given atom will decay, but given a large number of similar atoms, the decay rate, on average, is predictable. The SI unit of radioactive decay (the phenomenon of natural and artificial radioactivity) is the Becquerel (Bq). – Wikipedia: radioactive decay
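The definition above can be illustrated with a small Monte Carlo sketch: each individual atom decays at an unpredictable moment, yet the aggregate count tracks the analytic prediction N(t) = N₀·e^(−λt) closely. The half-life and atom count here are arbitrary illustrative values, not data from the paper.

```python
import math
import random

random.seed(42)

half_life = 100.0                   # arbitrary time units, for illustration
lam = math.log(2) / half_life       # decay constant: lambda = ln(2) / t_half
n0 = 20_000                         # initial number of atoms
dt = 1.0                            # time step
p_decay = 1 - math.exp(-lam * dt)   # probability a given atom decays per step

remaining = n0
for _ in range(100):
    # Each surviving atom decays independently and randomly this step.
    decays = sum(1 for _ in range(remaining) if random.random() < p_decay)
    remaining -= decays

# Analytic prediction after 100 steps (one half-life): n0 / 2.
expected = n0 * math.exp(-lam * 100)
print(remaining, round(expected))
```

The simulated count typically lands within about 1% of the prediction, which is exactly the point the quoted definition makes: random at the single-atom level, predictable in bulk.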
In the case of 32Si, the isotope decays via beta-emission (i.e. the emission of high-energy electrons or positrons), while 226Ra decays via alpha-emission (i.e. the emission of high-energy helium nuclei, each containing two protons and two neutrons). Regardless of the type of emission, the radioactive decay rate should be predictable and certainly should not be influenced by an external driver, as the 2008 publication explains:
For 32Si and 226Ra, which decay by beta- and alpha-emission, respectively, fluctuations in the counting rates (in the absence of strong external electromagnetic fields) should thus be uncorrelated with any external time-dependent signal, as well as with each other. – Jenkins et al. 2008
With this in mind, the paper’s findings may seem pretty strange. Not only is there a periodic variation in the decay rates of the independent 32Si and 226Ra samples from the 1980s, but their variations appear to be correlated with each other. And the best bit is yet to come; guess how long the observed period is? One year.
As the Earth orbits the Sun, its orbital radius varies slightly. At closest approach (perihelion), the Earth is approximately 147,000,000 km (0.98 AU) from the Sun, and at aphelion it is about 152,000,000 km (1.02 AU) away. This means the Sun–Earth distance varies by approximately 5 million kilometres (about 0.03 AU, or 3%); could this be the reason behind the annual modulation in decay rates? It would certainly explain the annual periodicity, and it would also explain why all the independent samples are correlated. So what mechanism would be sensitive to this small variation in distance?
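The orbital numbers quoted above are easy to sanity-check. The distances are the approximate figures from the text; the AU conversion uses the standard value.

```python
# Sanity check of the orbital geometry quoted in the article (approximate values).
perihelion_km = 147_000_000   # closest approach to the Sun
aphelion_km = 152_000_000     # farthest point from the Sun
au_km = 149_597_870.7         # 1 astronomical unit in km

variation_km = aphelion_km - perihelion_km
variation_au = variation_km / au_km
variation_pct = variation_km / perihelion_km * 100

print(f"{variation_km:,} km ~ {variation_au:.3f} AU ({variation_pct:.1f}%)")
```

The distance swing works out to roughly 0.033 AU, or about 3% of the orbital radius, which is the small signal any proposed mechanism would have to be sensitive to.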
The Purdue group have a couple of ideas. Firstly, they refer to recent work by John Barrow and Douglas Shaw, which takes the standpoint that the fundamental constants of nature may be neither fundamental nor constant. In this vein, they suggest that the fine structure of space-time may alter with distance from the Sun, thereby varying the “fundamental constants” slightly. Perhaps the decay rates of radioactive isotopes are influenced by this variation too.
Another idea is that decaying particles may be affected by neutrino flux. As you move further away from the Sun, the neutrino flux decreases (following a 1/r² falloff), so perhaps the annual modulation in neutrino flux is to blame. There appears to be some observational evidence for the neutrino explanation too. During the December 13th 2006 solar flare, the Purdue team measured a variation in decay rate. Some research suggests neutrino emission is altered by flare activity, so it would be interesting to see whether there is a flare-neutrino-decay rate link.
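The size of the annual neutrino-flux modulation follows directly from the 1/r² falloff mentioned above. A minimal sketch, using the approximate perihelion and aphelion distances from earlier in the article:

```python
# Fractional change in solar neutrino flux over a year, assuming a simple
# inverse-square (1/r^2) falloff with distance. Distances are approximate.
perihelion_au = 0.983   # Earth-Sun distance at closest approach
aphelion_au = 1.017     # Earth-Sun distance at farthest point

# Flux scales as 1/r^2, so flux(near) / flux(far) = (r_far / r_near)^2.
flux_ratio = (aphelion_au / perihelion_au) ** 2
modulation = flux_ratio - 1

print(f"perihelion flux exceeds aphelion flux by ~{modulation:.1%}")
```

A ~3% swing in distance thus becomes a roughly 7% swing in neutrino flux, a modulation considerably larger than the decay-rate variation reported, so the mechanism is at least not ruled out on magnitude grounds.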
So the moral of this story is? The predictable nature of radioactive decay has just become a little more unpredictable…
Source: the physics arXiv blog
Publication: Jenkins et al. 2008
13 thoughts on “A Strange Connection: Could Nuclear Decay Rates be Influenced by Distance From the Sun?”
Damn you Ian, is nothing sacred any more? Oh how I long for the days when there were constants.
I just looked at this paper, and the graphs tell the story — the fluctuation is really small, only a couple of hundredths of a unit in either direction, but the way they line up with the Sun-Earth distance is certainly intriguing. It certainly looks significant and not happenstance.
I wonder… There are a couple of spacecraft set to be launched toward and away from the sun in the next few years… I’m wondering if there’s any way they might be able to squeeze a small experiment on them to look for this effect as their distance from the sun changes.
You’d only need to borrow a small voltage, a counter, and some of the bandwidth to send the data back. It’s not like they’ll be doing a whole lot during their multiple-year journeys, anyway…
It is interesting that the DAMA signal also appears to have a period of 1 year. I wonder if the behaviour of this signal correlates with neutrino flux and the change in decay rates.
Okay, time for the obvious and stupid question that should be triggered by the admission in the story itself:
“For 32Si and 226Ra, which decay by beta- and alpha-emission, respectively, fluctuations in the counting rates (in the absence of strong external electromagnetic fields) should thus be uncorrelated with any external time-dependent signal, as well as with each other.”
Put another way: “if there is a strong electromagnetic field decay rates may be inconstant” …
Does the fluctuation of the decay rates imply that there is a strong electromagnetic field revealed by the decay rate periodicities? I.e., is there an electric field set up between the sun and the Earth as charged bodies in space?
Could it be that as the “spark gap” (or whatever one calls the distance between charged bodies in interaction) shortens (perihelion) or lengthens (aphelion), the overall voltage drop between the objects rises and falls? Could the increase and decrease in said voltage drop / electric field account for the periodicities in decay rates?
If so, what are the implications? Might such an electric field also explain the 650,000 Amp current [flowing charged particles; electrons toward Earth and ions away from Earth] between the sun and the upper atmosphere of the Arctic?
It’s time to consider the electrostatics (stationary overall charges) of the Sun and of the Earth, as well as the electrodynamics (flowing charges) of currents in space.
I’m sure I’m jumping the gun in suggesting the above. But the latter conversation would still be an appropriate one for professional astrophysicists, plasma physicists and electrical engineers to hash out. The argument is far from settled.
Would the above account for the extremely minor fluctuations? I.e., would the change in potential between sun and Earth change by just little enough over the course of the orbit to nudge decay rates into the higher or lower emission modes seen in the results?
Could this be due to relativistic effects? Solar gravitational spacetime warping is a measurable effect when studying other temporal phenomena at different locations in the solar system…?
After reading this, and an earlier paper about changes in absolute velocity with the accelerated expansion, it’s had me scratching my head nearly as much as when the accelerated expansion itself was proposed.
It has me thinking of Cosmic Rays, quite a bit, wondering if relativistic effects might be involved in solving their mysteries, e.g., their long rates of travel and their sustaining of “fresh” ionization, not to mention their retention of such energies as cannot be explained by mechanisms in the present Galaxy, though the first light from Fermi confirms their origin, at least in line of sight, is prejudiced to the galactic disk.
Anyone care to guess how old Mercury really is, relatively speaking?
Of course the cynical answer might be that the phenomenon is so marginal that seasonal changes in the instruments due to humidity might produce the “effect”. And, of course, the suggestion of space-probe RTG effects has been checked against “Cassini” with no observable effect being apparent. Doesn’t mean it isn’t real, it’s just not obviously radioactivity that’s changing with time.
Has this got anything to do with the Earth entering a part of the Milky Way that it has never been in before? Where the gravity is so strong it has started moving our continental plates, which will end in ripping the Earth apart in 2012.
OK, I’m 9 years late to this discussion. Has the year “cycle” been displayed by any other isotopes since then? Also: I don’t see where this discussion has addressed the question, “Does the observed rate of decay increase or decrease with distance from the sun?” This seems important from the standpoint of influences by gravity, neutrinos, and space-time.