Originally Posted by edge
Basically, there are so many facts left out of the RATE report that it is impossible to address the issue. This was hardly what I'd call a scientific study.
Originally Posted by jeolmeun
Which facts might those be?
Did you read the responses to the rebuttal that are here:
"Helium Evidence for A Young World Remains Crystal-Clear" (http://www.trueorigin.org/helium01.asp)
"Helium Evidence for A Young World Overcomes Pressure" (http://www.trueorigin.org/helium02.asp)
How can someone contact D. Russell Humphreys?
There are some people with questions at: http://www.christiandiscussionforums.org/v/showthread.php?t=15092
"Yes, I read them. The problem is that I could find no information on the depth of the samples, the ambient temperature, the He content of other minerals, the geothermal gradient, uranium contents, lead compositions, comparison with other minerals or nearby samples, regional He fluxes, etc. If you have some kind of treatment of the data that is not so laden with YEC justification, it might help. There isn't much discussion of a closing temperature, and most of the data seems to come from above what I thought was the closing temperature for zircon.
As it is, we just have some diffusion data in a literal and figurative vacuum."
Paul of Eugene OR said
"This helium objection has been brought up before. If the zircons are porous enough to let helium out, they are porous enough to let helium in. Therefore all that is necessary to explain the presence of extra helium in zircons is to bathe the zircons in a helium source. It is a well known fact that helium is recovered from wells of natural gas and there could very well be places underground exposed to enough helium flow that the zircons are not able to lose their helium.
In other words, the case against all the evidence for an older earth is not proven after all.
And there are so many evidences that the age of the earth is beyond 6,000 years! For example, 400,000 annual layers of ice recovered from cores drilled in both Greenland and Antarctica. How can there be that many annual layers if the earth isn't that old?
Whether there are actually that many layers is a matter of interpretation of the evidence. The claim that they are annual layers depends on uniformitarian assumptions that are invalid in a flood model. The flood model gives a scenario for the ice age (based on warm oceans and cold air) that is a good match for the real data, whereas the ice age is a problem from a uniformitarian viewpoint. One implication of that model is that there would have been much greater snowfall in the Arctic than there now is; it is likely that layers from successive storms within a short period of time are being misinterpreted as annual layers.
For example, lead in zircons. Zircons, as they form by crystallization, exclude any lead that may be in the nearby environment, just as freezing water excludes the salt dissolved in it. But uranium atoms are not excluded . . . they fit in the crystal just fine. So when those atoms decay and leave a lead atom behind, it can't leave the crystal. It's stuck there. In this case, the lead gives a very reliable indication of how old the zircon crystal is. Guess what . . . we find crystals millions, even billions, of years old measured in this way.
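For readers who want to see the arithmetic behind this, the standard age equation for uranium-lead dating follows directly from exponential decay: t = ln(1 + Pb/U) / λ. Here is a minimal Python sketch; the decay constant is the well-known U-238 value, but the Pb/U ratio used is purely illustrative, not a measured sample:

```python
import math

# Half-life of U-238 is about 4.468 billion years (a well-established value).
HALF_LIFE_U238_YR = 4.468e9
LAMBDA_U238 = math.log(2) / HALF_LIFE_U238_YR  # decay constant per year

def u_pb_age(pb206_per_u238: float) -> float:
    """Age in years from a measured Pb-206/U-238 atom ratio, assuming all
    the Pb-206 is radiogenic and the crystal stayed closed since it formed."""
    return math.log(1.0 + pb206_per_u238) / LAMBDA_U238

# Illustrative ratio: 0.5 lead atoms per remaining uranium atom.
age = u_pb_age(0.5)
print(f"{age / 1e9:.2f} billion years")  # prints "2.61 billion years"
```

The point of the calculation is the one made above: because the lead cannot leave the crystal, the measured ratio alone fixes the age under the stated assumptions.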
One conclusion of the RATE project is that there was a period of accelerated nuclear decay. As a result, there have apparently been many millions of years' worth of decay (at current rates) in a short period of time. The lead does not diffuse out, but the helium can (though in fact it largely has not). It therefore seems to me quite in accordance with expectations that there should be that amount of lead. The problem for the long-age people is that the helium is also still there, when it should have diffused away. Hence their claims about the supposed inadequacy of the investigations of the amount of helium.
The case is made for accelerated decay in the past. This would of course simply blow up both the earth and the sun if it should happen, but we'll pass over that simple problem and instead point out that radioactive decay rates are directly observable from the past by observing them occur in distant galaxies. When a supernova explodes, its light persists for a couple of weeks due to radioactive decay of highly unstable elements created in the supernova explosion itself. This decay curve can be compared with the decay curves for the same elements here (spectroscopic analysis determines the elements), and they agree with the rates we observe on earth today. We're talking about making observations by light that left the originating galaxy millions and millions of years ago to finally arrive here."
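The decay curve mentioned here is a simple exponential, which is what makes the comparison between a distant supernova and a lab on earth possible at all. A minimal Python sketch using the well-known half-life of Co-56 (about 77 days), the isotope that powers the weeks-to-months tail of a supernova light curve:

```python
# Half-life of Co-56, which powers the tail of a supernova light curve.
# Roughly 77.2 days -- a well-established laboratory value.
HALF_LIFE_CO56_DAYS = 77.2

def remaining_fraction(days: float, half_life: float = HALF_LIFE_CO56_DAYS) -> float:
    """Fraction of the original Co-56 left after `days` days of decay."""
    return 0.5 ** (days / half_life)

# The luminosity of the tail tracks the decay, so after one half-life
# the radioactive power source has dropped by half, after two by three quarters.
print(remaining_fraction(77.2))               # prints 0.5
print(round(remaining_fraction(154.4), 2))    # prints 0.25
```

If decay rates had been different when the light was emitted, the observed slope of the light curve would not match this curve; the argument in the post is that it does match.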
How much decay and over what period? It is easy to make such blanket claims, backed up by such phrases as "of course" to make it seem unreasonable to dispute them. There is evidence of heat in the crust over radioactive areas that is unexpected if the decay has really taken place over the alleged long period, but is consistent with an accelerated decay pulse.
If the cause of the decay were local to the solar system, that would be irrelevant. For example, extra decay caused by increased radiation from the sun would be more strongly felt closer to the sun, by the inverse-square law. Even at the outer fringes of the solar system the effect would be minimal, let alone millions of light years away.

The very idea that increased radiation from the sun can cause extra decays of K-40 or U-235 runs counter to all established principles of nuclear physics. It could only have come from the creative mind of a YEC, uninhibited by actual facts.

This "response", unfortunately, doesn't actually say anything. It is well known that the radioactive materials in an atomic bomb (when it explodes) decay at a much higher rate than the "normal" one. The same thing happens in a nuclear reactor. Why? Because of an excess of neutrons. Which means that an excess of neutrons coming from the Sun should, according to all established principles of nuclear physics (which doesn't say much, btw), cause a higher rate of decay. This is rather obvious, so the above paragraph is only so much crap.
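The inverse-square point is easy to put numbers on. A small Python sketch (the distances are approximate and chosen only for illustration):

```python
# Inverse-square scaling: received flux falls off as 1/r^2, so any effect
# driven by solar radiation drops off rapidly with distance from the sun.
def relative_flux(r_ref: float, r: float) -> float:
    """Flux at distance r relative to the flux at distance r_ref (same units)."""
    return (r_ref / r) ** 2

AU_PER_LIGHT_YEAR = 63241.0  # approximate conversion

# Flux at roughly Pluto's distance (~39.5 AU) compared with Earth (1 AU):
print(relative_flux(1.0, 39.5))               # ~6.4e-4
# Flux one light-year away compared with Earth:
print(relative_flux(1.0, AU_PER_LIGHT_YEAR))  # ~2.5e-10
```

This is the quantitative content of the claim above: anything driven by the sun's output is already negligible at the edge of the solar system, never mind at the distance of another galaxy.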
Plus, I remember reading a speculation (from "official" scientists) that *neutrinos* might affect the rate of decay. Since neutrinos are one product of a supernova explosion, that means that any radiometric clock is absolutely useless for any significant amount of time.
Marcel said
I still think that Setterfield provides the most elegant solution to the evidence for accelerated decay rates.
Accelerated decay rates are the natural consequence of a much higher value for c.
Since the E in E=mc^2 is a constant, accelerated decay rates do not produce increased levels of radiation as measured in dynamic time. The same amount of radiation is released over the same amount of dynamic time (think orbital clocks, not atomic clocks), just in smaller, more frequent intervals.
This will still get you excess helium, but without excess neutrons and radiation.
This section of Setterfield's site answers most questions:
Such a major error taints the whole article and renders it untrustworthy.
Alternatively, it may have been caused by a neutrino pulse from a nearby (1,500 light-year) supernova. Again, this would have minimal effect at a distance of millions of light years.
"Correct; when pertinent surrounding information is conveniently omitted, some of these YEC stories start to make sense.
I'm also going to have to try to track down the evidence showing that Humphreys has a little problem with graphing data and making sure it fits his scenario. There's a classic case out there, but I don't think I have the direct links to it."