
“All scientists are sceptics.” ~Professor Bob Carter

“Climate is and always has been variable. The only constant about climate is change; it changes continually.” ~Professor Tim Patterson

“Perhaps the most frustrating aspect of the science of climate change is the lack of any real substance in attempts to justify the hypothesis.” ~Professor Stewart Franks

“A lie told often enough becomes the truth.” ~Vladimir Ilyich Lenin

Tuesday, September 16, 2014

Climate scare is the real threat to civilization.

Tom Harris and Bob Carter, in a piece for the NY Post, have addressed a series of films released by Leo DiCaprio about the so-called "climate crisis."

Forget Leo. He is just spruiking the hot air issued by the Alarmists.

What Tom and Bob have to say answers all the Alarmists.

They say:

  • It is the climate scare itself that is the real threat to civilization.
  • Science is never settled, but the current state of “climate change” science is quite clear: There is essentially zero evidence that carbon dioxide from human activities is causing catastrophic climate change.
  • And the Nongovernmental International Panel on Climate Change lists thousands of scientific papers that either debunk or cast serious doubt on the supposed “consensus” model.
  • Oregon-based physicist Gordon Fulks sums it up well: “CO2 is said to be responsible for global warming that is not occurring, for accelerated sea-level rise that is not occurring, for net glacial and sea ice melt that is not occurring . . . and for increasing extreme weather that is not occurring.”
And lists these facts:

  •  According to NASA satellites and all ground-based temperature measurements, global warming ceased in the late 1990s. This is despite CO2 levels having risen almost 10 percent since 1997. The post-1997 CO2 emissions represent an astonishing 30 percent of all human-related emissions since the Industrial Revolution began. That we’ve seen no warming contradicts all CO2-based climate models upon which global-warming concerns are founded.
  • Rates of sea-level rise remain small and are even slowing, over recent decades averaging about 1 millimeter per year as measured by tide gauges and 2 to 3 mm/year as inferred from “adjusted” satellite data. Again, this is far less than what the alarmists suggested.
  •  Satellites also show that a greater area of Antarctic sea ice exists now than any time since space-based measurements began in 1979. In other words, the ice caps aren’t melting.
  •  A 2012 IPCC report concluded that there has been no significant increase in either the frequency or intensity of extreme weather events in the modern era. The NIPCC 2013 report concluded the same. Yes, Hurricane Sandy was devastating — but it’s not part of any new trend.
Read more at the NY Post - HERE

Monday, September 15, 2014

Is the Australian Temperature Record Accurate?

Opinion: Anthony Cox

Dr Marohasy
The Australian Bureau of Meteorology [BOM] prepares the Australian temperature record. This record is called Australian Climate Observations Reference Network–Surface Air Temperature dataset (known as ACORN-SAT, or ACORN).

ACORN is prepared by the BOM from readings of temperature from sites around Australia. These readings are the raw data.

Recently a group of researchers, including Dr Jennifer Marohasy, has questioned the accuracy and reliability of ACORN because the final temperatures which form ACORN differ from the raw data. Dr Marohasy has found that the raw data have been adjusted, or homogenised, by the BOM so that warming appears where the raw data showed no warming or less warming.

Sometimes there is a valid reason for adjusting raw temperature data. In his 1996 thesis Simon Torok analysed the temperature sites around Australia and found a number of sites which required adjustments because there was a discontinuity in the data. A discontinuity could be either a gap in the data, where for some reason recording stopped, or a fluctuation in the data inconsistent with what could be expected from the climate.

Torok provided some amusing and typically Australian examples of discontinuities, such as cockatoos stealing or destroying the thermometers and their screens, and a suspicion that unusually hot records at a site were due to the site recorder inflating the temperature so his council-worker friends could have the day off with pay.

A common reason for an unusual temperature is a move in the position of the thermometer.

These potential reasons for adjusting the temperature are called metadata. It is crucial that adjustments have some metadata reason to justify them and, most importantly, that the adjustment does not increase or otherwise alter the trend in the raw data.

However it seems that adjustments at several sites have occurred without genuine evidence in the metadata about a discontinuity in the raw data.

In the analysis of the ACORN records compared with the raw data Dr Marohasy and the research group have found adjustments which increase or even create a warming temperature trend without any support from the metadata.

For instance, at Bourke, a long temperature record of over 100 years, going back to 1880, has effectively been truncated to 2000. The researchers discovered the raw data showed an Australian maximum temperature record of 51.7ºC on 3rd January 1909. This record is no longer used by the BOM.

Overall, the Bourke raw data show a cooling maximum temperature trend from 1880 of 1.7ºC per century. After adjustment, ACORN at Bourke shows a slight warming trend. This change of maximum temperature trend has a great effect on the whole of Australia’s temperature record. The BOM has offered no particular reason for these adjustments.
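The arithmetic of such a trend reversal is easy to demonstrate. The following toy calculation uses synthetic numbers, not the actual Bourke data, and shows how a single step adjustment applied to the early part of a record can flip a fitted linear trend from cooling to warming:

```python
# Toy illustration (synthetic data, NOT the actual Bourke record):
# a step adjustment applied to the early part of a series can flip
# the sign of a fitted linear trend.

def linear_trend(years, temps):
    """Ordinary least-squares slope, in degrees C per year."""
    n = len(years)
    mean_y = sum(years) / n
    mean_t = sum(temps) / n
    num = sum((y - mean_y) * (t - mean_t) for y, t in zip(years, temps))
    den = sum((y - mean_y) ** 2 for y in years)
    return num / den

years = list(range(1880, 2001))
# Synthetic "raw" series cooling at exactly 1.7 C per century.
raw = [20.0 - 0.017 * (y - 1880) for y in years]
# Hypothetical homogenisation: subtract 2.5 C from all readings
# before an invented 1940 breakpoint.
adjusted = [t - 2.5 if y < 1940 else t for y, t in zip(years, raw)]

print(round(linear_trend(years, raw) * 100, 2))       # cooling, C/century
print(round(linear_trend(years, adjusted) * 100, 2))  # now warming
```

Here a series cooling at 1.7ºC per century becomes a warming series once 2.5ºC is subtracted from all readings before the breakpoint; the size and date of the step are invented purely for illustration.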

In the Newcastle region the temperature site at Williamtown also shows a marked difference between raw minimum temperature data and the ACORN temperature after homogenisation by BOM.

Researcher Ken Stewart has shown that adjusting the minimum temperature data can also affect the overall temperature trend. Stewart’s research shows a consistent warming bias for the minimum temperature over all ACORN sites in Australia. This is uncontroversial, with BOM head scientist Dr Blair Trewin conceding:
“negative adjustments are somewhat more numerous for minimum temperatures, which is likely to result in ACORN-SAT minimum temperatures showing a stronger warming trend than the raw data do”.

In the case of Rutherglen, because the BOM has adjusted the minimum temperatures downwards progressively from 1973 back to 1913, the difference between the raw data and the adjusted data in 1913 is 1.8ºC. The BOM has said this was justified because the thermometer site at Rutherglen was moved.
However, retired natural resources scientist Dr Bill Johnston, who worked at Rutherglen, says that no evidence of a site move at Rutherglen has been provided by the BOM.

Government policy is based on climate data, including temperature records, which needs to be reliable. If well-credentialed amateurs have revealed possible faults with the temperature record, then there is an obligation on government departments such as the BOM, which advise and determine government policy, to consider those possible faults in an open and transparent manner.

Sunday, September 14, 2014

Connecting the Climate Dots

Certain Temperature and CO2 Gradients Were Required For IPCC >90% Human Caused Conclusion

Opinion: Dr. Tim Ball

Source: SPPI
The complex nature of the climate system makes connecting dots very difficult. 

However, every child knows the picture doesn’t emerge until you connect them. In climate science, connecting the dots is complicated by the dominance of individual climate-science specialists in government and on the Intergovernmental Panel on Climate Change (IPCC). They control and produce official science. It is a controlled and deliberate example of E. R. Beadle’s observation that,
 “Half the work that is done in the world is to make things appear what they are not.”   

Usually, the climate science dots don’t connect because of illogical assumptions and inadequate or manipulated data.  Inaccurate predictions are the manifestation of the problems. It is time to re-examine the larger picture, to look at apparently incongruent issues, such as the 2007 IPCC claim that >90 percent of the global warming since approximately 1950 is due to human CO2. It is illogical on its face. The claim only holds with the gradient they created for temperature and CO2 curves.

Most of the observed increase in global average temperatures since the mid-20th century is very likely (>90%) due to the observed increase in anthropogenic GHG concentrations. It is likely that there has been significant anthropogenic warming over the past 50 years averaged over each continent (except Antarctica).

Why would such a minute fraction of an enormous system suddenly have such a huge influence in the mid-1950s? The claim amounts to a very large change in a decade. Why would a fractional increase in atmospheric CO2 cause the change, when the temperature increase from doubling or tripling CO2 concentration is fractional? How can there be such certainty, when the annual human portion of atmospheric CO2, by the IPCC’s own figures, is within the error range of two natural sources, oceans and rotting vegetation? Besides, how are the figures so certain when Antarctica, a massive continent with enormous influence on global temperature, is omitted? Consider alone the change in total Earth albedo triggered by a poleward extension of the sea ice by a couple of degrees of latitude. Over a century ago, speculation suggested a 10° latitude expansion of that ice may have triggered the last Ice Age.

Adjusting the Historic Record

In an apparently disconnected dot, national weather agencies are ‘adjusting’ the historic temperature record. Why? In every case, the temperature record is adjusted to lower historic temperatures. The apparent answer is to achieve a specific result. It is part of a pattern the IPCC created when they chose to prove rather than disprove the hypothesis that human CO2 was causing global warming. Every move involved ‘proving’ that today is the warmest in history and that temperatures and CO2 levels have risen significantly since pre-industrial times.

Results of adjustments for New Zealand, illustrated in Figure 1, triggered a lawsuit in New Zealand.

Similar adjustments by the Australian Bureau of Meteorology (BOM) were given headlines with Jennifer Marohasy’s revelations. There is little or no valid reason for making these adjustments, as we learned from the BOM response.

“The BOM has ignored or circumvented all these, refusing to explain why individual stations were adjusted in detail.”

Why did the BOM refuse to provide answers? Why do all the changes follow the same pattern of creating an increase in the temperature gradient? The answer appears to be implied in this statement.

The IPCC has drawn attention to an apparent leveling-off of globally-averaged temperatures over the past 15 years or so. Measuring the duration of the hiatus has implications for determining if the underlying trend has changed, and for evaluating climate models.

Another clue is in Judith Curry’s observation,

The key challenge is this:  convincing attribution of ‘more than half’ of the recent warming to humans requires understanding natural variability and rejecting natural variability as a predominant explanation for the overall century scale warming and also the warming in the latter half of the 20th century.  Global climate models and tree ring based proxy reconstructions are not fit for this purpose.

Part of the IPCC claim results from the limited definition of climate change to human causes, created by the United Nations Framework Convention on Climate Change (UNFCCC). Most of the claim is a result of the database created to prove the hypothesis. The claim is only valid because of the gradients the IPCC and its authors established for temperature and CO2. The “>90 percent due to humans” claim is invalid without the increased gradient.

It Has Gone On From The Start

These manipulations of curves and gradients began with the need to show that pre-industrial temperatures and CO2 levels were lower than today. After that they needed to create a constantly increasing, significant upward trend of temperature and CO2. The troublesome Figure 7c in the 1990 IPCC Report, which showed a Medieval Warm Period (MWP) warmer than today, was dealt with by the infamous “hockey stick”. Using tree rings alone, their models showed declining temperatures in the 20th century. They hid the decline by unscientifically tacking on a modern temperature record produced by Phil Jones. This record, for which Jones subsequently lost the original data, has an error range of ±33%. Despite this they claimed the increase was beyond normal.

Those who adjusted the temperature curves came under increasing pressure with the advent of satellite measures around the year 2000. The satellite measurements were attacked almost immediately as unreliable, but are now considered a better measure because they cover more of the globe than surface stations. What appears to be deliberate inflation of temperatures, especially by HadCRUT and NASA GISS prior to 2000, was now challenged. A different approach was required; hence the shift to lowering the historic temperature record. If you don’t think such a coordinated strategy is possible, consider the planned series of mini-films prepared by the World Meteorological Organization (WMO) prior to the Climate Summit in New York on September 23. If you have the stomach for it, you can watch the series for yourself.

The first challenge was to establish a low pre-industrial level of CO2. It was more important than temperature because warming from the Little Ice Age (LIA) was an accepted climatological trend. Climatologists became aware of the selection of CO2 data to establish a low pre-industrial level with publication of Tom Wigley’s 1983 article “The pre-industrial carbon dioxide level” in Climatic Change.  Wigley established the low pre-industrial level at 270 ppm in the climate science community. It paralleled Callendar’s narrow selection of the same data. Both set the required low pre-industrial level.

Zbigniew Jaworowski told a US Senate Committee on Commerce, Science, and Transportation hearing:
“The basis of most of the IPCC conclusions on anthropogenic causes and on projections of climatic change is the assumption of low level of CO2 in the pre-industrial atmosphere. This assumption, based on glaciological studies, is false. The notion of low pre-industrial CO2 atmospheric level, based on such poor knowledge, became a widely accepted Holy Grail of climate warming models. The modelers ignored the evidence from direct measurements of CO2 in atmospheric air indicating that in 19th century its average concentration was 335 ppmv.”

Jaworowski’s research was subsequently confirmed by the work of Ernst-Georg Beck. An article in Energy and Environment examined the readings in great detail and validated their findings. In a devastating conclusion Beck states,

Modern greenhouse hypothesis is based on the work of G.S. Callendar and C.D. Keeling, following S. Arrhenius, as latterly popularized by the IPCC. Review of available literature raise the question if these authors have systematically discarded a large number of valid technical papers and older atmospheric CO2 determinations because they did not fit their hypothesis? Obviously they use only a few carefully selected values from the older literature, invariably choosing results that are consistent with the hypothesis of an induced rise of CO2 in air caused by the burning of fossil fuel.

So the pre-industrial level is actually some 65 ppm (335 – 270)  higher than the level used in IPCC computer models. No wonder they are consistently wrong.

Some argue that the 65 ppm higher level is wrong because it implies the pre-industrial ocean temperature was approximately the same as today’s. A warmer ocean releases more CO2, so there should be more CO2 in the atmosphere. This supposedly contradicts the argument that the world has warmed since the nadir of the Little Ice Age (LIA). It doesn’t; it contradicts the incorrect assumption that CO2 is the major cause of the warming since pre-industrial times; even the IPCC doesn’t make that claim. It also assumes the climate sensitivity, that is, how much temperature increase occurs with increasing CO2, is much greater than claimed. In fact, estimates of climate sensitivity have consistently been revised downward and are close to approximating zero. The real question is, how can it be positive if, as is the case in every single record of any duration for any time period, temperature increases before CO2?

The importance of Beck’s work is measured by the fierce and personal level of the attacks. As I wrote in my obituary,  

I was flattered when he asked me to review one of his early papers on the historic pattern of atmospheric CO2 and its relationship to global warming. I was struck by the precision, detail and perceptiveness of his work and urged its publication. I also warned him about the personal attacks and unscientific challenges he could expect. On 6 November 2009 he wrote to me, “In Germany the situation is comparable to the times of medieval inquisition.” Fortunately, he was not deterred. His friend Edgar Gartner explained Ernst’s contribution in his obituary. “Due to his immense specialized knowledge and his methodical severity Ernst very promptly noticed numerous inconsistencies in the statements of the Intergovernmental Panel on Climate Change IPCC. He considered the warming of the earth’s atmosphere as a result of a rise of the carbon dioxide content of the air of approximately 0.03 to 0.04 percent as impossible. And [he] doubted that the curve of the CO2 increase noted on the Hawaii volcano Mauna Loa since 1957/58 could be extrapolated [linearly] back to the 19th century.” (This is translated from German).

Attacks came, as expected, from AGW proponents, but some of the nastier and narrower attacks were from some professing to be skeptics. Skepticism is critical, but a growing trend among climate skeptics is to attack, with greater and usually unjustified vigor, those who question skeptical claims. Often they carve out a skeptical position and consider it their property, sacrosanct, something only they know and understand. They become as dogmatic as those they claim to challenge. There is no place for ego in science. As Mary McCarthy said, “In science, all facts, no matter how trivial or banal, enjoy democratic equality.”

Most of my research has involved historical sources and data. It also involved comparing historic records against modern data. It must all be considered and used with extreme care and awareness of the limitations. The 19th century CO2 data has many traits that make it a reasonably reliable source for approximating what was going on with CO2 during that century. Beck examined and detailed each record with what his friend described as his “immense specialized knowledge and methodical severity”. Here are some reasons for the validity of Beck’s work on the 19th century data as representative of atmospheric CO2 levels.

·      Mostly scientists produced the data, although they were then, like Darwin, called naturalists.
·      They were trying to determine the percentage of gases in the atmosphere following Priestley’s work on oxygen at the end of the 18th century. The first measures of CO2 began in 1812.
·      The objective was pure scientific discovery, with no thought to future concerns about CO2 as a so-called greenhouse gas. This is in direct contrast to the deliberately structured and manipulated instruments and analysis at Mauna Loa.
·      The sites and distribution are comparable to those for temperature for the 19th century and early 20th century.
·      My experience is that the work of historic record keepers is superior in dedication to detail and integrity of modern, especially government, keepers. Anthony Watts’ study of modern US weather stations underscores this.
·      The pattern of the plotted data is similar to other unmodified CO2 records. Figure 3 is very informative because it contrasts the much-modified artificially smooth ice core record with atmospheric levels of CO2 from stomata measures.
Figure 3 

·      There is an obsession with misrepresenting CO2 distribution in the atmosphere. It extends from the IPCC claim that it is evenly distributed to the elimination of variability in the layer of air near the ground. Extremes are removed with no justification, and so much data is eliminated that the actual data finally used bears little or no resemblance to the raw data or reality.

Those proving the AGW hypothesis had to produce a smooth, constantly increasing curve from three sources. They linked the ice core record to the 19th century data to the Mauna Loa measures. Ernst-Georg Beck put them together (Figure 4), showing how it could only be done with unjustified assumptions.

Figure 4

Variability is critical, but a 70-year smoothing average applied to the ice core record eliminated extreme readings and a great deal of information. It means the results are not meaningfully comparable to the short Mauna Loa record. It is made worse because that record is also smoothed, as readings vary up to 600 ppm in the course of a day, just like the 19th century data. Elimination of high readings prior to the smoothing makes the loss even greater. The radiative effect of greenhouse gases doesn’t work to an average; it is in effect all the time and throughout the entire atmospheric column.
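How much information a long smoothing window destroys can be shown with a toy series. The numbers below are synthetic, not the actual ice core or Mauna Loa data; the point is only that a roughly 70-point centred moving average almost completely flattens a series with large decadal swings:

```python
# Illustrative only: a long moving average suppresses variability.
# Synthetic series with large 10-year swings (NOT real CO2 data).
import math

years = list(range(300))
series = [300 + 40 * math.sin(2 * math.pi * y / 10) for y in years]

def moving_average(data, window):
    """Centred moving average; edge points without a full window are dropped."""
    half = window // 2
    return [sum(data[i - half:i + half + 1]) / window
            for i in range(half, len(data) - half)]

smooth = moving_average(series, 71)  # roughly 70-point window (odd, centred)

print(round(max(series) - min(series), 1))  # raw range: about 76
print(round(max(smooth) - min(smooth), 1))  # smoothed range: close to zero
```

A reader comparing the smoothed output to a short, unsmoothed record would see two series with entirely different variability, even though the underlying process is the same.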

The End Justifies The Means

The IPCC was set up to prove a very narrow hypothesis: that human-produced CO2 was primarily responsible for global warming. Their claim that >90 percent of warming from 1950 to the present is due to human CO2 is only valid with the gradients of the temperature and CO2 curves they created. The critical portion of this agenda was to show human CO2 causing temperature increase in a rapidly increasing, unnatural trend. Wherever the historic data did not fit, they adjusted it. When they had control of modern data, they adjusted it. Each time they were thwarted, such as with the advent of satellite temperature data, they introduced another ‘adjustment’. Only a few are capable of connecting all the dots; when done, the pattern of activities revealed is a grim picture of manipulating the slope and gradient of temperature and CO2 to prove the AGW hypothesis.

Friday, September 12, 2014

Bureau of Mythology ..er..(Fake) Meteorology outed: Cooking the Cooling.

Some dedicated scientists have exposed Australia's Bureau of Meteorology's flawed fake figures.

Dr Jennifer Marohasy has had a long running dialogue with David Jones of the Bureau of Meteorology, with many unanswered questions.

In January 2014, she questioned a statement that David Jones made in a radio interview:
“We know every place across Australia is getting hotter, and very similarly almost every place on this planet. So, you know, we know it is getting hotter and we know it will continue to get hotter. It’s a reality, and something we will be living with for the rest of this century.”
In March 2014, Jennifer wrote an open letter to the Minister for the Environment, Greg Hunt, questioning the reliability of the bureau's data, beginning with (link)
I am writing to you as the Minister for Environment, ultimately responsible for the activities of the Australian Bureau of Meteorology, to ask you consider the following seven issues pertaining to the activities of the Bureau.
and including this gem:
While the Australian taxpayer invested upward of $30 million dollars in just one supercomputer in March 2009 on the basis that this would make weather predictions more accurate, some individual forecasters, operating outside the mainstream climate-science community, and without any government support, are producing more reliable and accurate medium-term rainfall forecasts than the Bureau. 
and concluding: (bold added)
In arriving at theories that explain the natural world, the best scientists always use all the available data, not just the data that happens to fit a particular viewpoint. Furthermore, long historical data series are critical for statistical methods of rainfall forecasts, including the application of artificial neural networks that can currently provide more skillful forecasts than POAMA. That the Bureau persists with POAMA, while failing to disclose to the Australian public the absence of any measurable skill in its monthly and seasonal forecasts, should be of grave concern to the Australian parliament. 

‘Facts don’t cease to exist because they are ignored.’ 

In June, Jen wrote again to the Minister for the Environment (link), containing details of her ongoing research, with a letter concluding:
As an Australian scientist with a keen interest in public policy and temperature records, I ask you as the Minister ultimately responsible for the activities of the Australian Bureau of Meteorology, to consider how you might reconcile increasing atmospheric concentrations of carbon dioxide with a falling temperature trend, and what needs to be done if we are to adequately prepare as a nation for the possible onset of a period of sustained cooling.
Even if Minister Hunt has been taken in by the Global Warming Hoax, he seems derelict in his duties if he has not called the head of the Bureau into his office with a "Please Explain."

Later in the year she published a paper co-authored with John Abbott, Ken Stewart and David Stockwell titled

Modelling Australian and Global Temperatures: 
What’s Wrong? Bourke and Amberley as Case Studies (link-pdf)
This paper considers the records for Bourke and Amberley and the methodology employed by the Bureau in compiling the annual statistics from such temperature series. We will also consider how NASA’s Goddard Institute for Space Studies (GISS) homogenizes data from Amberley in the development of its annual global mean temperature. Homogenization refers to a process of changing the actual temperature records using mathematical algorithms.
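For readers unfamiliar with the term, the simplest form of homogenisation can be sketched in a few lines. The sketch below is illustrative only; it is not the Bureau's actual ACORN-SAT procedure, and the function name and numbers are hypothetical. The idea: when metadata records a station move in a given year, readings before that year are shifted by the mean difference between the periods either side of the move.

```python
# Minimal sketch of a breakpoint adjustment (hypothetical; NOT the
# actual ACORN-SAT procedure). Pre-break readings are shifted by the
# mean difference between the windows before and after the break.

def adjust_for_breakpoint(years, temps, break_year, window=10):
    """Shift readings before break_year by the before/after mean difference."""
    before = [t for y, t in zip(years, temps)
              if break_year - window <= y < break_year]
    after = [t for y, t in zip(years, temps)
             if break_year <= y < break_year + window]
    offset = sum(after) / len(after) - sum(before) / len(before)
    return [t + offset if y < break_year else t
            for y, t in zip(years, temps)]

# Hypothetical series with a 1.0 C step at 1950 (e.g. a site move).
years = list(range(1940, 1961))
temps = [15.0 if y < 1950 else 16.0 for y in years]
adjusted = adjust_for_breakpoint(years, temps, 1950)
print(adjusted[0])  # pre-break readings shifted up by 1.0 -> 16.0
```

The controversy described in this post concerns adjustments of this kind being applied, and trends thereby altered, without supporting metadata to justify the breakpoint.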
This blog has covered her address to the Sydney Institute with a blog titled

Taking the Warming out of Global Warming (Link)

The Environment Editor of the Australian, Graham Lloyd, picked up the story and exposed the fake data to the world in a series of articles, including

Where are we up to? As Jennifer has revealed on her blog today (link):
THE Australian Bureau of Meteorology now acknowledge that they change the temperatures at most, if not all, the weather stations that make-up the official station network from which national temperature trends are calculated. Indeed, earlier in the week, 28 pages of ‘adjustments’ were released online....(and Alarmists defenders take note)...Scrutinise the detail in this document of adjustments and not only is the rationale and methodology indefensible, but it contradicts information published in the official Station Catalogue which is meant to be the go-to document for understanding this official network known as ACORN-SAT (Australian Climate Observations Reference Network –Surface Air Temperature).
The post continues: (my bold added..perhaps the whole para should be emboldened)
That the Minister has not yet intervened, and that many within the Australian scientific community attempt to justify the practice of homogenisation that creates these ‘adjustments’ that changes cooling trends to warming trends at a whim, is reason for national shame. It all amounts to corruption of the scientific process on a grand scale, with significant economic implications. But not even a whisper about the scandal can be heard from the Australian national broadcaster or the many other typically righteous institutions and individuals that claim to be motivated by the truth.
Shame on you, cheating BoM scientists, shame on you Greg Hunt,  shame on you MSM reporters.
Read the whole post HERE. Jennifer concludes with:
In a democracy it is the role of government to oversee the correct function of institutions like the Bureau of Meteorology. Greg Hunt is the Minister ultimately responsible. So far he has been silent on the issue. This is in effect condoning what until recently would have been considered a totally unethical practice: changing received evidence to fit a preferred storyline. Its unacceptable, but will Minister Hunt do anything about it? Will the national broadcaster even report on it? What can you do about it?
- - - - - - - - - - - - - - 
And Paul Clark (see comments) adds: