William Roosa June 24, 2014 at 9:40 AM
Fact: Short wavelength solar radiation heats the earth.
Fact: The hot earth re-emits longer wavelength radiation.
Fact: CO2 absorbs some of this longer wavelength radiation
Fact: By absorbing this radiation, CO2 causes a thermodynamic imbalance in the energy in-out equation for the earth as a whole and causes the earth to warm. Aka the greenhouse effect.
All fine and dandy, and nobody is questioning that CO2 is a greenhouse gas.
Fact: The extinction distance for CO2 in the earth’s atmosphere is about 300 feet. That is to say, any thermal radiation emitted by the ground at the wavelengths where CO2 absorbs strongly will all be absorbed within the first 300-ish feet, no matter how intense the radiation is. Engineers discovered this back in the 60s when they were trying to develop laser communication through the atmosphere. CO2 and H2O completely absorbed the laser light at the wavelengths where they strongly absorb. The 300 feet is based on 1960 CO2 concentrations, BTW.
Fact: Increasing the concentration of CO2 only shortens the extinction distance; it does not trap more heat. ALL the heat that CO2 can trap is already being absorbed within the extinction distance.
Conclusion: Increasing the concentration of CO2 does not cause any more heat to be trapped and therefore cannot be the cause of global warming.
And to the point, man is not causing global warming, because CO2 can’t be the mechanism by which it is happening.
Response:
I want to say that this submission, like the previous submission on the AMO, is a real treat. It is based on science, is logical and well thought out, and the submitter did some homework. Well done.
So, let's look at it in detail.
The first four facts he states are all good. No problems there.
But then, some issues begin to arise.
Let's discuss extinction distance. This is the distance over which one thing (say, radiation) is totally absorbed by a second thing (the medium it passes through); once that distance is reached, adding more of the second thing will not change how much of the first gets through.
As an example, the ocean has an extinction distance for sunlight. Within about a kilometer of the surface, 100% of sunlight is absorbed by the ocean; that is the extinction distance for sunlight in the ocean (sorry, I don't know the exact value). Anything below that depth is in darkness, and adding more depth will not change that. If you are deeper than the extinction distance there will be no light, and that is equally true at 1.1 kilometers as it is at 5 kilometers.
So, the extinction distance for CO2 in the atmosphere is 300 feet (I don't know that value myself, but I'll take your word for it; I do know the extinction distances for these gases are dramatically less than the thickness of the atmosphere). What this means is that when heat is radiated from the surface of the planet as infrared (IR) radiation, it will be 100% absorbed within that distance (at the levels of CO2 present in 1960). So far, so good. We're in agreement.
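The attenuation behind the "extinction distance" idea can be sketched with the Beer-Lambert law. This is my own illustration, not anything from the submission: I treat the quoted extinction distance as the depth where transmission has dropped to about 1% of the surface intensity, and the 300-foot figure and the "doubling concentration halves the scale length" relation are assumptions for the sketch.

```python
import math

def transmission(depth_ft, scale_ft):
    """Beer-Lambert: fraction of surface IR still un-absorbed at a given depth,
    I(d)/I0 = exp(-d / L), where L is the attenuation scale length."""
    return math.exp(-depth_ft / scale_ft)

# Pick a scale length so ~1% of the IR survives at 300 ft
# (the submission's 1960-era extinction distance).
scale_1960 = 300.0 / math.log(100)   # roughly 65 ft

# Under Beer-Lambert, doubling the absorber concentration halves the scale length.
scale_2x = scale_1960 / 2.0

for label, s in [("1960 CO2", scale_1960), ("doubled CO2", scale_2x)]:
    print(f"{label}: transmission at 300 ft = {transmission(300, s):.6f}")
```

Either way, essentially nothing gets through 300 feet, which is exactly the "already saturated" observation the submission is built on, and which the rest of this response addresses.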
Now, though, is when we get into trouble, and this is where the extinction distance of light in the ocean differs from the extinction distance of IR in the atmosphere. When light is absorbed in the ocean it is turned into a different kind of energy - heat - and is not reemitted as visible light. When IR radiation is absorbed by CO2 molecules, it is reemitted as IR light.
In the ocean, the absorbed light becomes heat, which then spreads in pretty much random directions (let's keep it a simple environment, without currents and convection and gradients and such). That means the heat could go up, down or sideways.
When IR is absorbed by CO2, it is reemitted as IR and it can go up, down or sideways. This means each molecule of CO2 in the atmosphere becomes a new source of IR radiation and there is an extinction distance associated with that source of IR radiation (300 feet). So, if we increase the amount of CO2, we increase the number of points where the IR can be absorbed and then be reemitted.
Suppose we had a layer of CO2 that was 300 feet thick. All of the IR will be absorbed before it gets to the top (using 1960 CO2 levels). But, each molecule will then reemit the IR light. If a molecule is at the top of the CO2 layer, there is a high probability the IR will be emitted in a direction away from the layer and that photon of light will be free to go off into space.
Now, add another layer of CO2 on top of the first one (let's make it less than 300 feet thick). That photon of energy that was free to go off into space will now have a chance of being absorbed in the second layer. As the second layer gets thicker and thicker, that probability goes up. By the time it reaches 300 feet thick there is a 100% probability that photon of IR light will be absorbed and then reemitted in a random direction.
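The layer argument above can be sketched numerically. This is a toy model of my own construction, not from the original exchange: I assume each 300-foot layer is fully opaque to the IR band, and that an absorbed photon is re-emitted up or down with equal odds (ignoring sideways emission). We then track what fraction of surface-emitted photons make it out the top before being sent back to the ground.

```python
import random

random.seed(42)

def escape_fraction(n_layers, trials=100_000):
    """Fraction of surface-emitted IR photons that escape to space before
    returning to the ground, in a toy model where each opaque layer
    re-emits an absorbed photon upward or downward with probability 1/2."""
    escapes = 0
    for _ in range(trials):
        pos = 1  # photon absorbed by the first layer above the ground
        while 0 < pos <= n_layers:
            pos += 1 if random.random() < 0.5 else -1
        if pos > n_layers:   # made it past the top layer
            escapes += 1
    return escapes / trials

for n in (1, 2, 4):
    print(f"{n} layer(s): escape fraction ~ {escape_fraction(n):.3f}")
```

In this toy model the escape fraction works out to 1/(n+1) (the classic gambler's-ruin result): one layer lets about half the photons out, two layers about a third, and so on. More CO2 means more layers, more photons sent back down, and a warmer surface needed to balance the books.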
So, you see, adding CO2 to the atmosphere really does increase the amount of IR that is absorbed.
And, there is conclusive proof of this. If it worked the way you posited, 100% of the heat would be retained in the atmosphere and we would incinerate in short order. The very fact that we have not burst into flames shows that heat is able to escape the atmosphere after being absorbed by CO2 molecules.
By the way, here is a really good article detailing the history of the discovery of the CO2 greenhouse effect. It is from the American Institute of Physics.
So, this was a great submission and a lot of fun. But, it certainly does not show that man-made global warming is not real.