Showing posts with label Age of Earth. Show all posts

Thursday, 22 December 2011

Carbon Dating Fails


      Carbon Dating, like all radiometric dating, depends on measuring an unstable isotope (in this case, Carbon 14) against the stable carbon in a sample; as the Carbon 14 decays away over time (it turns into Nitrogen 14), the ratio changes. Like all radiometric dating, this method of trying to calculate the age of something from the decay rate of a particular isotope is overflowing with shaky assumptions and unknowable variables. (For more on these problematic assumptions, see my previous write-up, “Is Radiometric (Carbon) Dating Reliable?”.)

      Carbon dating can’t date things older than 50,000 to 100,000 years because, after that, there should be no detectable amount of Carbon 14 left in the sample. So palaeontologists don’t bother carbon dating anything they believe to be older than 50,000 years, which includes most fossils, such as dinosaur fossils, since dinosaurs are believed to have all died out millions of years ago (far more than 100,000 years).
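That ceiling follows directly from Carbon 14’s half-life of roughly 5,730 years: after a few dozen half-lives, effectively nothing is left to measure. A minimal sketch of the arithmetic (the half-life is the textbook value; the rest is simple exponent maths):

```python
# Fraction of a sample's original Carbon 14 remaining after a given
# time, assuming the textbook half-life of ~5,730 years.
HALF_LIFE_C14 = 5730  # years

def fraction_remaining(years, half_life=HALF_LIFE_C14):
    return 0.5 ** (years / half_life)

for age in (5_730, 50_000, 100_000):
    print(f"{age:>7} years: {fraction_remaining(age):.2e} of the C-14 remains")
```

At 50,000 years only about a quarter of a percent remains, and at 100,000 years only a few millionths, which is why labs treat anything past that range as unmeasurable.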

      Because scientists have already determined that fossils are very old based on the layer of strata they’re found in, they either don’t bother carbon dating fossils or they don’t trust the results when those results don’t match what they already believe to be the “likely age”. You could run a million carbon dating tests on a million different fossil samples and find Carbon 14 in all of them, yet scientists would consider it all “contaminated” and “unreliable”. Why? Because they already “know” (believe) that the fossils are millions of years old, not thousands, so even though the existence of Carbon 14 in the samples suggests they’re thousands of years old (not millions), scientists will ignore the tests because they’ve already decided the method can’t be trusted beyond 50,000 years. Convenient, eh?

      Imagine you’re a palaeontologist and you’ve just discovered a well-preserved fossil in a deep layer of strata. You automatically assume that the fossil is many millions of years old because of its position in the geological column (which you learned in school), and so instead of bothering to carbon date the fossil, you look at which layer of strata you found it in. When asked how old the fossil is, you confidently announce the age of the surrounding rocks as the age of the fossil. But you haven’t actually carbon dated the fossil itself OR the sedimentary rock it was found in! You’ve simply looked at your handy chart of the Geological Column and its matching Evolution of Life counterpart chart and pinned down a rough date based on what has already been established by other scientists. The fossil is of a certain type found in a certain rock layer, so it must be about THIS old (according to the charts). If someone were to take your fossil samples, send them for carbon dating, and get a result completely different from what you already believe to be the real age, you’d simply laugh and say, “The fossil was contaminated and can’t be trusted.”

      The truth is that carbon dating has returned “bad results” so often that scientists don’t consider it at all reliable for dating fossils unless they’re no older than about 10,000 years (50,000 years maximum). The fact that many supposedly millions-of-years-old fossils and samples have had measurable amounts of Carbon 14 in them, when they shouldn’t have any, is explained away as contamination. Yet we’re supposed to trust that contamination is very rare with regard to other forms of radiometric dating…

      Actually, when you really get down to it, these methods of dating rocks and fossils are simply being used to pin down what evolutionists already believe to be the estimated date of their samples and fossils. If the dates don’t match what they believe, then the sample is contaminated and the date given by the test is wrong. If it does match, then everyone agrees, and that’s how old the sample is, and everyone’s happy.

      But what happens if radiometric dating is completely unreliable as a way to date rocks and fossils? The system is, after all, based on many major assumptions that must be perfectly constant and exact for the whole thing to have any chance of being truly accurate… The truth is, if you toss radiometric dating out the window, you’re once again left with ONLY the assumptions and guesses of geologists and palaeontologists who base their entire system of age dating on the belief that the world is ancient and that all life evolved from a single-celled organism (which evolved from non-living molecules). In other words, when scientists tell you the age of a fossil or rock sample they weren’t there to see get buried or formed, they’re mostly making it up, or following the charts created by scientists before them who also made it up. And this all flows from their preconceived belief system: belief in an ancient earth and in all life evolving to what it is now over hundreds of millions of years. No real historical or proven dates were ever used.

Wednesday, 21 December 2011

Is Radiometric (Carbon) Dating Reliable?


      Radiometric dating (carbon dating and other similar methods) is considered by many to be the silver bullet in the issue of whether or not our planet is as old as most scientists believe it is. To them, radiometric dating is one of the biggest and best evidences for a very ancient planet. However, as you’re about to see, radiometric dating is nowhere near as certain as scientists would like you to believe. Considering that this is “one of the best evidences for a billions-of-years-old planet”, it had better be a pretty solid, irrefutable fact, right? Well, it isn’t anywhere close.

How Radiometric Dating Works
      Radiometric dating works by measuring the ratio between an unstable isotope of an atom and the stable atom that the unstable isotope eventually turns into. Scientists do this by observing how long it takes for a particular unstable atom to “decay” into a different, stable atom. Unstable atoms of a particular type decay at a measurable but slow rate, and by comparing the ratio of stable versus unstable isotopes of a particular kind, scientists believe they can calculate how long it has been since that rock (or material) formed. The “parent” isotope is the unstable isotope of an atom, and the “child” is what it turns into as it “decays” over time. By comparing the amounts of these two atoms within the material being tested, scientists believe they can calculate its age.
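The parent/child comparison described above can be sketched with the standard decay-law formula. This is only an illustration, not any particular lab’s procedure, and it bakes in exactly the assumptions discussed in the next sections (no child isotope at formation, constant decay rate); the half-life figure is roughly that of potassium-40:

```python
import math

def age_from_ratio(child_to_parent, half_life_years):
    """Age implied by a measured child-to-parent isotope ratio,
    ASSUMING zero child isotope at formation and a constant decay rate."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1 + child_to_parent) / decay_constant

# Illustrative numbers: with a half-life of 1.25 billion years,
# a child:parent ratio of 1:1 implies exactly one half-life has elapsed.
print(age_from_ratio(1.0, 1.25e9))  # ~1.25e9 years
```

Note that the measurement itself only yields today’s ratio; everything else in the formula is assumption.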
      But there are some very serious problems with this reasoning, because the entire process is based upon major assumptions that, if at all inaccurate, make the whole thing useless as a means of dating something.

The Problem:  Based On Uncertain Assumptions
      For radiometric dating to be reliable, a number of major assumptions must consistently hold true, or else the whole process of using it as a method to date materials completely falls apart.

Assumption 1: Original Content of the Rock
      The people running the tests have to assume how much of the parent and child isotopes existed in the rock when it first became solid. If there were actually more or fewer parent isotopes in the rock than the scientists assume, or more or fewer child isotopes than they assume, then their calculations of the age of the rock are going to come out wrong.
      Imagine you had a jar full of small coloured balls (some red, some blue) and every five minutes you removed one red ball and replaced it with a blue one. If you did this consistently until most of the red balls were replaced by blue ones, then calculated how much time it took for this to happen, you could come up with a pretty good estimate of how long the whole process took based on the number of red balls compared to blue ones. But if you tossed a random amount of red and blue balls into the jar at the beginning and didn’t bother to count them beforehand, your calculations would be completely inaccurate because you had no idea what the original ratio of blue to red balls was before you began the experiment.
      This is one of the problems scientists have when trying to date materials with radiometric dating. They can’t possibly know the exact contents and ratios of isotopes within a rock when it first formed unless they were there at its formation to measure them and compare them to now. Measuring hundreds of thousands of years later (or, as they believe, hundreds of millions or even billions of years later), they can’t know the original condition or contents of that rock when it was formed, so their entire basis for calculating the age of the rock from what they believe to have been its original contents is very flawed. They can’t be at all certain what the original ratio was when the rock first solidified.
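The point can be put in numbers. Using the same decay-law arithmetic with hypothetical figures (the 1.25-billion-year half-life is roughly potassium-40’s; the isotope amounts are invented for illustration), a wrong guess about how much child isotope was present at formation shifts the computed age enormously:

```python
import math

HALF_LIFE = 1.25e9  # years, roughly potassium-40 (illustrative)
DECAY_CONSTANT = math.log(2) / HALF_LIFE

def computed_age(parent_now, child_now, assumed_initial_child):
    # Only child isotope believed to come from decay counts toward the age.
    radiogenic = child_now - assumed_initial_child
    return math.log(1 + radiogenic / parent_now) / DECAY_CONSTANT

# Hypothetical measurement: 1.0 unit of parent, 0.6 units of child.
# If 0.5 of that child was actually present when the rock formed:
print(computed_age(1.0, 0.6, assumed_initial_child=0.5))  # ~1.7e8 years
# But assuming (wrongly) that the rock started with zero child isotope:
print(computed_age(1.0, 0.6, assumed_initial_child=0.0))  # ~8.5e8 years
```

Same measurements, two assumed starting points, and the computed ages differ by a factor of about five.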

Assumption 2: Constant Decay Rate
      It is believed that the rate of decay from a parent isotope to a child isotope is constant and does not change; however, this is proving to be a false assumption. In recent years it has been discovered that the decay rate of unstable isotopes is NOT as constant as scientists used to think. Because of this, scientists have had to adjust their calculations and dates a number of times over the past few decades. The adjustments in the calculation have not been very large, but they can mean a difference of hundreds of millions of years when all is said and done. It has also been found that the activity of our sun has an impact on decay rates, making them slightly faster or slower depending on how calm or chaotic our star is behaving. Though again the changes are very minor (mere fractions of a percent), this shows that the assumption that decay rates are constant and a reliable form of measurement is wrong.
      Scientists have been studying the decay rates of unstable isotopes for a little less than one hundred years. In that time they’ve had to make numerous adjustments to their calculations as they’ve found that the numbers they thought were accurate turned out to be wrong, or not as constant as they once thought. This is only about a hundred years’ worth of observation by modern science. Yet with that one hundred years of observation, and despite having had to change the values numerous times during that short period, scientists assume that their calculations are accurate enough to date rocks across hundreds of millions, even billions, of years. That means that in all that time, the decay rates of unstable atoms cannot ever have been different from what scientists calculate the decay rate to be today. Even in the last hundred years they’ve had to make many adjustments as they’ve realized their calculations were flawed, sometimes resulting in hundreds of millions of years being taken off the age of previously dated materials. To then assume that the decay rate has matched what we observe today throughout billions of years of history is extremely problematic and very likely an inaccurate assumption.
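Because the age formula divides by the decay constant, any revision to the decay rate rescales every date by the same proportion, so even a tiny adjustment translates into a large absolute shift on a billions-of-years date. A quick sketch (the half-life and measured ratio here are illustrative, not any published figures):

```python
import math

def age(child_to_parent, half_life_years):
    # Standard decay-law age, assuming a constant decay rate.
    return math.log(1 + child_to_parent) / (math.log(2) / half_life_years)

ratio = 1.0                            # hypothetical measured child:parent ratio
t_before = age(ratio, 1.25e9)          # with the accepted half-life value
t_after = age(ratio, 1.25e9 * 1.001)   # half-life revised by just 0.1%
print(t_after - t_before)              # ~1.25 million years of shift
```

A 0.1% revision is tiny in laboratory terms, yet on a 1.25-billion-year date it moves the answer by over a million years.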

Assumption 3: No Contamination
      Scientists believe that once a rock has turned solid (usually lava that has cooled and solidified), nothing can change the makeup of that rock. This assumption states that the rock has not been contaminated by an outside source since it became solid. Yet, laughably, when a rock is dated and comes up with a date very different from what scientists expect (based on the rock’s location in the rock layers), scientists quickly fall back on the assumption that the rock must have been contaminated. Think about that for a moment. The assumption is that a rock is very unlikely to be contaminated by its surrounding environment across hundreds of millions of years, so that we can accurately calculate the ratio of parent and child isotopes and thus date the rock. Yet when the date turns out different from what was expected, they assume (this is the second of two assumptions now) that the rock, as unlikely as they believe it is, must have been contaminated. Essentially this means, “Contamination is extremely rare and unlikely, but we blame bad dates on it anyway. We trust the dates we agree with, usually, and don’t trust the dates we disagree with.”

Big Problems With Bad Dates
      Many rock samples have been taken from lava flows that were observed by mankind and, when radiometrically dated, have given very inaccurate dates. This alone suggests that the assumptions radiometric dating is based upon are severely flawed. When a lava flow has been observed by eyewitnesses and dated as having happened in a particular year, and then radiometric dating says that the lava flow is actually hundreds of thousands (or even millions) of years older than it really is, something is very wrong with this dating method. Lava that was witnessed forming in 1986 at Mount Saint Helens was radiometrically dated in 1996 and returned a date of 350,000 years, when in fact it was only 10 years old. Lava flows with eyewitnesses from New Zealand, Hawaii, and other places on the planet over the last few hundred years have likewise been radiometrically dated and come up with dates very different from when they were observed to have actually formed. These are only a FEW examples of “bad radiometric dates”. There are also many instances of two samples from the same set being given completely different dates despite very clearly belonging to the exact same material (artefact).
      Coal and diamonds present another big problem for radiometric dating, because the assumed age of these deposits in the earth does not work at all with radiometric dating. Scientists then assume that radiometrically dating these materials is a waste of time. Essentially this is an example of “the dates don’t match what we believe, so we’re going to ignore the radiometric data in favour of what we already believe and assume to be true”.
      The problem arises because Carbon 14, an unstable isotope of carbon, has a decay rate that is quite fast compared to many other unstable isotopes. Because of this fast decay rate, no amount of Carbon 14 should be detectable in any sample older than about 250,000 years. Yet coal and diamonds HAVE detectable amounts of Carbon 14 within them. This should be impossible, because most coal deposits are believed to be hundreds of millions of years old, and diamonds are believed to be some of the oldest substances on the planet, billions of years old. They should have NO Carbon 14 in them at all, because it should all have decayed away a very long time ago, but it hasn’t. So scientists assume that the radiometric dates are wrong for these materials, for one reason or another, since they don’t match what they expect/assume to be true.
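The scale involved is easy to sketch. Starting from a deliberately enormous hypothetical amount (one mole, about 6×10²³ atoms, of pure Carbon 14) and applying a conventional coal-seam age of 300 million years:

```python
# How much Carbon 14 survives 300 million years (a conventional coal age),
# starting from a hypothetical mole of pure C-14? Half-life ~5,730 years.
HALF_LIFE = 5730          # years
AVOGADRO = 6.022e23       # atoms per mole

half_lives_elapsed = 300_000_000 / HALF_LIFE   # ~52,000 half-lives
atoms_left = AVOGADRO * 0.5 ** half_lives_elapsed
print(half_lives_elapsed)  # ~5.2e4
print(atoms_left)          # 0.0 -- the survival fraction underflows to zero
```

The survival fraction is smaller than 10⁻¹⁵⁰⁰⁰, so under conventional ages not a single atom should remain; that is why any detectable C-14 in these materials matters.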
      That these problematic examples of radiometric dating exist at all is a testament to the fact that the method and the assumptions it is based on are wrong. We can certainly know the current decay rate of unstable isotopes through modern observation, but we cannot adequately know that our assumptions about the factors involved in this dating method have held true for all of history. Errors such as the few (of many) listed above show that we can’t possibly know enough about the variables and factors that contribute to the current condition or contents of a sample sent for radiometric dating. There are far too many complete unknowns that can easily cloud or destroy the accuracy of radiometric dating as a reliable way to date materials.

Conclusion
      If ANY of the multiple assumptions that radiometric dating is based upon is at all flawed (inaccurate), then the dates calculated will be wrong. Because of the number of assumptions this process must “trust” on a very limited amount of available information, using radiometric dating to date material beyond a short span of time is extremely unreliable. Scientists essentially have to trust that they know what the original contents of the rock were when it first solidified, they have to trust that the decay rate of unstable isotopes within that rock has not changed across hundreds of millions of years, and they also have to trust that the rock has not been contaminated by outside sources in all of that time. These are massive assumptions that simply cannot be adequately verified! In fact, these assumptions have been proven wrong and required adjustment so many times in just the last hundred years that to still trust them as even “mostly accurate” is unbelievably optimistic. If even ONE of the many assumptions is slightly off (wrong) with regard to a sample being tested, then the calculated date is also going to be wrong.


(In a write-up I’m still working on, I’ll explore the origins of the very ancient age of the earth touted by modern science and show you that radiometric dating didn’t enter the picture until well after scientists had already decided that the planet was probably billions of years old. In other words, radiometric dating matched nicely with what they mostly already believed, so it became the “proof” that their assumptions must be true.)