Monday, December 21, 2009

The Labour Theory of Capitalism; or Rubes, Rednecks and Hicks: The Makers of the Modern World

Reading The Industrial Revolution, by Lee Wyatt, published this year. It reads like an advanced undergraduate text, but general histories of this momentous event are surprisingly hard to come by. 

My long-running thesis is that "industrialization" had little to do, at first, with inventions such as the steam engine. Instead, it was the division of labour which was key. Wyatt hews to the "consensus" that machinery was essential to the industrial revolution, but presents evidence that suggests otherwise. 

The classic example of early division of labour was found, of course, in Adam Smith's Wealth of Nations, from 1776, in which is described a pin factory where work is "divided into a number of branches, of which the greater part are likewise peculiar trades. One man draws out the wire, another straights it, a third cuts it, a fourth points it, a fifth grinds it at the top for receiving the head; to make the head requires two or three distinct operations; to put it on, is a peculiar business, to whiten the pins is another; it is even a trade by itself to put them into the paper; and the important business of making a pin is, in this manner, divided into about eighteen distinct operations, which, in some manufactories, are all performed by distinct hands, though in others the same man will sometimes perform two or three of them." (page 8 in pdf text at link above) 

What Smith described was essentially a process of manual labour — very tedious and even strenuous labour — that went largely or wholly unaided by water- or steam-power. The classic case of the division of labour, very familiar to modern society, is the McDonald's restaurant. Established as a single outlet in San Bernardino, California, it was the McDonald brothers themselves who, in the late 1940s, established the production-line approach to service (the title of a 1972 Harvard Business Review article by Theodore Levitt) that became characteristic of the later worldwide chain. They eliminated wait-staff (including all female employees, who were presumed to be magnets for amorous punks), radically simplified the menu (eliminating any dish that required the use of a fork and knife), and, of course, divided the responsibilities for cooking and cashiering among several more people than would normally be employed at a hamburger joint — staffing levels made affordable by the very low wages paid for the work. 

This is, I think, the "industrial revolution" in a nutshell. Like any instance of the division of labour, a McDonald's (or any fast-food) restaurant results in the de-skilling of work. McDonald's has long been the byword for low-paid, low-skill work (the "McJob") that doesn't require much talent or even brightness at all. Wages are set so meagrely precisely because "any idiot" can do a McJob. It works out from the employer's point of view, because employees who quit or grow insubordinate can be quickly replaced by the next idiot. The key point is that McDonald's has never employed actual powered machinery to achieve the "assembly-line" levels of productivity that made it the global success that it remains. Of course, the original McDonald's restaurant no doubt employed the most up-to-date appliances and other technology for fast-food production. In this respect, however, it was no different from hundreds, even thousands, of competitors at the time. Where it differed was in its organization of the manual labour of making hamburgers and french fries. The McDonald's division of labour rapidly increased hamburger-productivity, and with it, the profits from selling fast-food. Eventually, of course, it was this method which resulted in billions and billions in profits, from "serving millions and millions" all around the world. 

It was the same with the pin factory and similar efforts at the division of labour in industrializing Britain. It allowed — unaided in large part by machinery — for a workforce of ten to "make among them upwards of forty-eight thousand pins in a day," as Smith described it. He went on: "Each person, therefore, making a tenth part of forty-eight thousand pins, might be considered as making four thousand eight hundred pins in a day. But if they had all wrought separately and independently, and without any of them having been educated to this peculiar business, they certainly could not each of them have made twenty, perhaps not one pin in a day; that is, certainly, not the two hundred and fortieth, perhaps not the four thousand eight hundredth part of what they are at present capable of performing, in consequence of a proper division and combination of their different operations." (Wealth of Nations, page 9 of above pdf text) 
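Smith's arithmetic is easy to verify. Here is a minimal sketch in Python (the figures are Smith's own; the baselines of twenty pins and one pin a day are his conjectures for solitary, untrained work):

    # Smith's pin-factory figures, as quoted above.
    workers = 10
    shop_output = 48_000                # pins per day for the whole shop

    per_worker = shop_output / workers  # 4,800 pins per worker per day
    print(per_worker)

    # Productivity multiple versus a solitary, untrained pin-maker,
    # using Smith's two conjectured baselines:
    for solo in (20, 1):
        print(f"{per_worker / solo:.0f}x the output of {solo} pin(s) a day")
    # -> 240x  (Smith's "two hundred and fortieth part")
    # -> 4800x (his "four thousand eight hundredth part")

The point of the exercise is that the multiplier comes entirely from the organization of hands, not from any powered machinery.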

The steam engine and other engineered machinery came to be employed for productive purposes because of the division of labour, rather than the latter being a consequence of the former. The division of labour was made possible, in turn, by the widespread acceptance of wage-labour. This is the chief reason why Britain became the first industrialized country. 

There, far more than on the Continent, the feudal system had given way to enclosure, and landowners cleared their possessions of wastelands and peasantry to farm cash crops and raise livestock. The nobility converted themselves into agrarian capitalists (the words "firm" and "farm" in fact share a Latin root, firma), and the toiling masses were converted into wage-labourers. 

The rural proletariat of the early-modern period were doubtless no better off than the peasantry of the Middle Ages. However, the enclosure of farmlands vastly increased agricultural productivity, thereby causing a decrease in the price of basic staples. This is the reason, too, why wages in the agricultural sector remained so pitifully low. 

However, the capitalization of the agricultural industry was also the spur for innovation and improvement in farming techniques (such as those introduced in the early eighteenth century by the pioneer agronomist Jethro Tull). These innovations, in turn, boosted productivity all the more, thereby making food staples all the cheaper. This had the effect of boosting Britain's population considerably (an increase of thirty-three percent, to nine million, between 1700 and 1790), while higher productivity and a larger workforce continued to depress wages. 

According to Wyatt, already by 1700, the proportion of the workforce involved in the agricultural sector was considerably smaller than in the major European nations: "... in 1600 the average farmer in Great Britain had produced enough food to support his family and half an additional one. By 1800 that same farmer could feed his own family and one and one-half more. By the mid-19th century, Great Britain had the lowest proportion of its workforce in agriculture than any other country in the entire world." 
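Wyatt's figures imply a simple ceiling on how many hands the land could release to other work. A short sketch of that arithmetic (the inference is mine, not Wyatt's; the families-fed figures are his):

    # One farm family feeds itself plus some number of other families
    # (Wyatt's figures for 1600 and 1800). The reciprocal is the
    # minimum share of families that must remain in farming.
    for year, extra in ((1600, 0.5), (1800, 1.5)):
        families_fed = 1 + extra
        floor = 1 / families_fed
        print(f"{year}: one farm family feeds {families_fed} families "
              f"-> at least {floor:.0%} must farm")
    # 1600: at least 67% must farm
    # 1800: at least 40% must farm

By the mid-nineteenth century, as the next paragraph notes, the actual farm share had fallen to 22%, implying that productivity had continued to climb well past the 1800 figure.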

At that time, according to Wyatt, only 22% of the British workforce was involved in farming. It was the "surplus" non-farming population which supplied the workforce for the early manufactories, in textiles and other industries (such as pin-making) — not to mention the markets for the cheap (in price and quality) goods that were produced by this process. It is no coincidence that the factory system developed in the very areas (namely, the Midlands and the north of England) where enclosure was pursued most vigorously and successfully (Wyatt points out that not all, or even the vast majority of, attempts at enclosure were successfully carried off). It shows how industrialization, at least in its early stages, represents not the colonization of the hinterlands by the metropole but, rather, the reverse: factory-industry developed initially far from the centres of power, culture and influence, eventually drawing the metropolitan areas into its orbit. This is why, as Wyatt points out, the five largest cities after London in 1800 — Manchester, Liverpool, Birmingham, Leeds, and Sheffield — were small towns or mere villages in 1600. No coincidence, again, that all of these were major manufacturing centres at the turn of the nineteenth century. 

This pattern held, too, for the United States, which in 1800 could be considered one vast hinterland in relation to the economic might of its former colonial master, Great Britain. The American industrial revolution, though, took place largely away from the old centres of power — Boston, New York city, Philadelphia, Washington and so on — in backwater places such as Pittsburgh, Detroit, Chicago (which grew from a population of about 250 in 1833 to roughly three hundred thousand at the time of the great fire in 1871), Indianapolis, and Los Angeles. Manufacturing that did take place in the states that were the original Thirteen Colonies was concentrated away from the larger centres: rural New Jersey, Connecticut or upstate New York instead of New York city; Lowell, Massachusetts instead of Boston; Pittsburgh instead of Philadelphia. 

I believe the conventional historical understanding is mistaken, as well, in respect of the notion that machine-industrialization, in Britain as elsewhere, developed under "laissez-faire" or "invisible-hand" conditions. Manufacturing under division-of-labour conditions was, in eighteenth-century Britain, largely accomplished without government intervention; machine-industry was another matter. 

However, as Wyatt himself notes, until the Revolutionary/Napoleonic wars near the close of the eighteenth century, British industrialization was not characterized by the sort of heavy, steam- or coal-driven machinery that would typify factory work in the nineteenth and into the twentieth centuries: "In reality, until the 19th century the large factory was not the common sight in industrial districts, as most mills were essentially just more sophisticated workshops of the past." 

The latter form of factory industry arose in Britain, as elsewhere, due to deliberate government involvement in the economy — whether to fight wars or as a result of dirigiste economic policy (i.e., as in later nineteenth-century France, Germany, Japan and, much later, Soviet Russia). 

The idea that government subsidy and other forms of intervention were necessary for factory-industry to grow was one of the few areas of agreement between Thomas Jefferson, the third U.S. president and advocate of an agrarian-yeoman republic, and Alexander Hamilton, the Caribbean-born American revolutionary and later the first Secretary of the Treasury, who advocated an industrial policy.

They simply disagreed as to how desirable such intervention was. Hamilton pursued industrial development both in and out of government. 

As a private citizen, though, Hamilton acted not merely as a venture investor, but as a lobbyist to federal and state governments for the subsidies and trade restrictions which would help industry develop. In time, he and others entered into partnership with the New Jersey government, encouraging industrial activity in a remote area that eventually became the city of Paterson. Hamilton's efforts were, as it turns out, largely fruitless, and the United States remained an agrarian nation until another form of government intervention in the civilian economy — the American civil war — sparked a real machine-industrial revolution. 

The division of labour itself, while made practicable by the existence of wage-labour as the standard form of contract (to the exclusion of slavery as well), is a by-product of analytical consciousness, given such potency in modern times by the printing press, by optical technologies such as the telescope, and by abstract icons such as graphs, maps and, especially, clocks. 

The factory itself has been described as an extension of the clock, and even before the introduction of heavy machinery into the factory workplace, the division of labour was dependent upon the iron rule of the clock. The factory system's dependence upon rationality is behind the split between bourgeoisie and proletariat. Lord Bertrand Russell once remarked that all work consists either of rearranging matter at or near the surface of the earth, or of telling others to do so. The division of labour demands that a portion of the workforce carry out tasks which are repetitive, tedious, and even robotic in nature, requiring little in the way of skill or intelligence. 

But the variegated, particularized activities of the factory also demand highly cerebral and calculative oversight — that part of the workforce which is now referred to as "management." This is the bourgeoisie, a class that once consisted largely of the direct owners of capital, but which is now made up of the professional delegates of those in ownership. This basic split between worker and management is as inevitable under the division of labour as that between lord and peasant under feudalism. 

The distinction persists even where, as in the advanced capitalist countries, a unionized worker in heavy industry (such as an automobile plant) can expect to make as much as (or often much more than) many belonging to management. The class division in industrial society arises, as Marx said, from how the proletariat and the bourgeoisie approach work or labour. It persists even where ownership of capital, or of machine-engineering, is in the hands of the state. Marx and Engels argued that the abolition of capital would eliminate the division of labour. 

But productive wealth is based on this division. Rendering the factory system more "humane" eliminates the very productivity that is the whole point of the division of labour. In the Soviet Union, as in every other industrialized Communist country, the division of labour was not abolished, of course, and inevitably a managerial class emerged — the nomenklatura — which simply became the "new boss, the same as the old boss", or rather, much worse than the old boss. 

The abolition of command socialism has met with contrasting results in the two major Communist states of the twentieth century, China and Russia. Twenty years ago, the expectation might have been that Russia, which had already undergone full urbanization and industrialization, would before long become a Westernized, developed liberal democracy. 

China, on the other hand, was still very poor, with a vast population and a Communist leadership that, while promoting economic liberalism, was ready to shoot down its own young people in the heart of the capital, Beijing, rather than submit to political reform. Yet Chinese industrial growth zoomed far ahead not only of Russia and every other former Communist state, but also of the Japanese "superstate" and of every economy in the world except that of the United States, which in turn became the market for the export industries that sprang up in China during the 1990s and the new century. In the meantime, Russia went into near collapse. Not only its industrial base, but its birth rate and life expectancy went into free-fall during the '90s, the government unable to restore order or even to remain in office for very long, until the turn of the century, when the state was taken over by a former KGB lieutenant-colonel. 

The conventional explanation for this divergence is that in China, unlike Russia, the Communist party did not relinquish political control, abjuring political reform in favour of economic liberalization, while Russia did the opposite. 

In an article published by the Hoover Institution at Stanford University, Paul Gregory and Kate Zhou argue that the dissimilar paths taken by the two largest former Communist states have different sources. In sum, the authors state that in China, unlike Russia, traditions of single-family ownership of farms were very ancient, and endured in spite of the period of collectivization of agriculture. 

Gregory and Zhou submit that, when privatization of land holdings came to Russia, the workforce was reluctant to depart from the security of the farm collective. In China, by contrast, the authors argue that it was the peasantry which led the way to economic reform, setting up illegal private farming operations following the lifting of totalitarian oppression after the death of Mao Tse-tung in 1976, and the removal thereafter of his radical allies, the "Gang of Four" (including Mao's widow). 

In contrast to China, Russia had, by the time of Gorbachev's reforms, experienced more than three decades of (relative) stability and (relative) liberality following the death of Stalin. Zhou and Gregory write that when reforms came to China in the late '70s, "a large percentage of the population was recovering from the catastrophes of the Mao years. Rural dwellers, in particular, had witnessed the chaos of the Great Leap and had seen their parents and children die from starvation during the 1958–61 famine. They learned they had to take care of themselves." 

In the 1980s, as Mikhail Gorbachev was offering 50-year leases of land to a resisting rural workforce, Chinese peasants "began to quietly distribute the land, with each family delivering production for the state quota. Gorbachev called for decollectivization from above; China's farmers decollectivized spontaneously from below. They created their own `contract responsibility system,' initially at risk of severe punishment. There were no leaders; there were no face-to-face confrontations. ... As agricultural production soared, Deng Xiaoping and his party realized they could not resist and could take advantage of something that was working." 

That economic reform originated in the Chinese countryside is not in dispute. There, as in Britain and the United States in the past, industrialization originated in the rural regions, before spreading to the major centres. 

Gregory and Zhou observe that in Russia in the Gorbachev era, "the farm population had shrunk to a quarter of its former size; only older workers remained, working perfunctorily on state land or tending their private plots. They had long been converted into wage workers and received pensions and socialized medical care, albeit of a low quality. In China, rural dwellers accounted for 80 percent of the population; compared to Russian farmers they were young and vibrant. They lived without the social guarantees of Russian farmers. In China, only the young had not experienced private agriculture. Small private plots had existed in China for 2,000 years." 

When, in the 1980s, both Russia and China began to privatize their non-agricultural sectors, Russian entrepreneurs came largely from the city, but "China's first entrepreneurs hailed primarily from the countryside, and they got their start by marketing farm products in the cities. Private trade developed in China at the grassroots level, emerging from rural regions and prospering because it filled a vital need. The rural contract responsibility system created huge agricultural surpluses which had to be marketed outside the state system. Farm products had to be moved over long distances, either directly or through intermediaries — in violation of laws and without contracts that could be enforced in courts." 

Zhou and Gregory write, "China's early trader-entrepreneurs had to first overcome the problem of distance between producers and consumers. ... Throughout the early 1980s, farmers in north Jiangsu packed their bikes with chickens, ducks, and other fowl, crossed the Yangzi River, and shipped their products by rail to urban centers in the Yangzi basin. ... By 1983, the majority of consumers in major cities purchased their products in free markets rather than in government stores. Within one year (between 1979 and 1980), most state vegetable markets, except the highly subsidized Beijing and Shanghai markets, were out of business." 

By 2007, the authors note, the wealthiest Chinese citizen "was the daughter of a poor farmer from the southern province of Guangdong, whose family became wealthy after acquiring large tracts of land and distressed assets in the countryside, where there was no real estate business, in the early 1990s." Private firms, non-existent in 1978, numbered almost thirty million by 1997, with nearly one million corporate or joint ventures. Private capital accounted in that year for two-thirds of GDP, again up from nothing almost thirty years earlier. 

Gregory and Zhou state, "Private business originated in agriculture, spread to the cities, and then returned to the countryside as rural-based industry. Many large private manufacturing firms developed in predominantly agricultural provinces (Zhejiang, Shandong, Guangdong, Hunan, and Sichuan). China's largest agribusiness, the Hope Group, was founded by the Liu brothers, who left the city to found their company in a rural part of Sichuan province. Wang Guoduan, a rural entrepreneur from southern Guangdong province, built the largest refrigerator maker, Kelon Group; Huanyuan, China's largest air conditioner maker, is based in the agricultural province of Hunan. China's first automobile exports will likely come `from the agricultural hinterland of Anhui province...'." 

All this shows that the "capitalist" system is dependent upon the labour factor of production, above all. Communist economics could productively organize both land and capital quite well — often better than capitalist economies (witness the superiority of early Soviet space technology, or of the MiG jet-fighter over its Western counterparts). Economic history has shown that the free market, or "invisible hand", is loath to invest in complex or engineered machinery, i.e. productive capital, before its utility is proven by state investment in such machinery, whether for war-making or as official economic policy. 

Communist economies, in which the workforce notionally dictates the productive decisions, fail precisely in their inability to properly organize the labour factor of production. As mentioned, Communism did not and could not abolish the division of labour. But the organization of the workforce in this manner had dismal results, precisely because of the inability of command socialism to properly serve and service its vast capital infrastructure. A worker's wages could buy nothing beyond staples, and everyone was paid much the same regardless of how hard or how little they worked (rewards came through other means, such as acting as an informant on others). As factory and industrial work generally has little inherent reward, most people chose not to work beyond what was necessary.

Wednesday, December 9, 2009

Deniers, Fraudsters, Hoaxers and Sceptics

Reading Crowded with Genius, about Edinburgh in the 1700s, by James Buchan, the British novelist and historian, and grandson of Lord Tweedsmuir, also a novelist and one of the last British governors-general of Canada. In the late seventeenth and early eighteenth centuries, Scotland had been one of the world centres of high Calvinism. According to Buchan, this left a dreary pall over Scotland's preeminent city, especially prominent on the Lord's Day, when it appeared that the entire town had died of bubonic plague.


However, by 1719 the populace was already slacking, at least according to a pronouncement drafted that year by church elders, severely criticizing Edinburghers for the many sins committed on Sundays, including gathering in groups on the street for conversation, receiving visitors to their homes, leaving the city for the countryside, eating during daylight hours, attending ale- and milk-houses, and, worst of all apparently, sitting and staring out of windows.


Such zealotry seems, to contemporary sensibilities, contrary to a truly free society. On the other hand, radical moral asceticism is the paradoxical background to the development of an independent scientific tradition. Just a few decades after the Presbyters' brimstone tract of 1719, Edinburgh became known as the "Paris of the north" for its contribution to the Enlightenment in history, economics, and the sciences, not to mention literature (Dr. Johnson, who held little affection for Scotland, called Edinburgh "Britain's other eye"). It is similar to how New England became a centre of science and learning during the eighteenth and nineteenth centuries (even today, Boston has more schools of higher education per capita than any other major city in the U.S.), after the zealotry of its earlier Puritan period had died down. It may even be the reason for the "Islamic enlightenment", the rise of the sciences among largely non-Arab Muslims in the Middle East and southern Europe around the turn of the second Christian millennium.


But isn't such ascetic religion the enemy of "value-free" empiricism? On the contrary: the very destruction of mysticism and Gnosticism through radical asceticism lays the groundwork for the sort of reality-based perception necessary for scientific advance to take place. All this is to say that the line between science and religion is not very clear.


This is germane, of course, to the recent controversy over e-mails that were leaked from the Climate Research Unit at the University of East Anglia, in Norwich, U.K. The CRU is one of four institutes providing the official data compiled in reports issued by the United Nations agency overseeing global-warming treaties, reports which have concluded that the earth's climate is rapidly warming, and that human activity, namely the burning of fossil fuels, is causing the atmosphere to heat up — with catastrophic consequences for the planet.


I've rarely commented on the whole global-warming issue. I did so for the first time three years ago, occasioned by the fact that, for the first time in my life, December had come to Ottawa without a snowfall. Not only that, it was not particularly cold, either. But the next winter, and the one after that, saw snow come very early and stay on until March. This past summer, on the other hand, was intemperately cool and wet.


For years, advocates of measures to counter global warming had pointed to just such apparent shortenings of winter, as well as the heating up of summer, as conclusive proof that "global warming is real." Did the cooler-than-expected summers, and the much-colder-than-expected winters, in the northern hemisphere in 2007-08 affect the climate-apocalypse rhetoric at all? No. The advocates simply de-emphasized the term "global warming" and substituted "climate change", arguing that while the "greenhouse effect" would make the earth hotter on average, some areas might become much colder than before. Fair enough. Except that it became increasingly clear that the earth was not getting hotter at all: the global mean temperature had, since 1998, been decreasing.


I must say, I didn't become sceptical about global warming or climate change until a few years ago. After all, it is indeed established scientific fact that carbon dioxide in the atmosphere serves to trap heat near the earth's surface. Doesn't it make sense that a greenhouse effect could occur? I began to doubt global-warming theories not because of anything in particular I had read by climate-change "sceptics" (the polite term), but because of the response to these sceptics from global-warming scientists and their advocates. The rule was that, instead of countering the sceptics' arguments by reference to their own supposedly unassailable theories, they attacked the sceptics in turn as "shills of the hydrocarbon industry", and gave them the label "deniers", as though to associate them with deniers of the Nazi Holocaust, and to imply that the sceptics knew their arguments were wrong, yet argued in bad faith because they were being paid to do so by oil companies.
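For what it's worth, the basic physics can be checked with textbook arithmetic. A minimal sketch (a standard zero-dimensional energy-balance estimate, using conventional values; nothing here comes from the CRU or its critics):

    # Without an atmosphere, the earth's surface would warm only until
    # it radiated away exactly the sunlight it absorbs (Stefan-Boltzmann).
    sigma = 5.67e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)
    S = 1361.0         # solar constant at earth's orbit, W/m^2
    albedo = 0.3       # fraction of sunlight reflected back to space

    absorbed = S * (1 - albedo) / 4        # averaged over the whole sphere
    T_bare = (absorbed / sigma) ** 0.25
    print(round(T_bare))                   # ~255 K, i.e. about -18 C

The observed mean surface temperature is roughly 288 K (about +15 C); the 33-degree difference is the greenhouse effect of water vapour, carbon dioxide and other gases. That basic effect is not what is in dispute; the argument is over the magnitude of the effect of additional CO2.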


Basically, the sceptics' arguments were always ignored, and they were attacked personally, if not for having a self-serving agenda, then because they were (allegedly) unqualified. The mantra was that "the science is settled" about global warming, and that anyone who contests this alleged settlement is behaving in an "anti-science" manner.


Such personal attacks were my first indicator that something was very fishy about this whole thing. But it was the assertion that the "science is settled" that really got me steamed. I'm sorry: no scientific theory or proposition is ever settled. Indeed, the whole point of science is to state claims in a manner that renders them falsifiable. If global-warming theories cannot stand up to such scrutiny, then they are not theories at all (quite like the conspiracy fables or narratives propounded by Kennedy-assassination buffs, or 9/11 troofers). To simply assert and reassert that the "science is settled" is in itself an anti-scientific statement.


I've been similarly unimpressed with statements from global-warming researchers and politicos to the effect that human-made climate change represents the "consensus" of active researchers in the field, and thus that there is no need to consider the arguments of the sceptics. Again, every empirically-backed theory is the consensus of active researchers, up to the moment it is upset by a rival or "revolutionary" theory. In 1905, it was the consensus of physicists that the universe existed in the way described in the theories of Isaac Newton. Thereafter, the relativity theory of Albert Einstein became the consensus view as to the operation of the universe, as it remains today. One day, relativity theory too could well be supplanted, but it is precisely this that the climate-change researchers and their advocates deny with respect to their own specialty, a stance that is thereby contrary to science.


Scientists were, before 1905, at least partially mistaken in their view of the physical universe. It is easy to identify other instances of scientific consensus which were rather more baleful, even catastrophic, in their political and social implications. It was the consensus among biologists, from the late nineteenth century well into the 1930s, that human evolution occurred in the manner described by eugenic theory. Heeding the scientific consensus of the time, authorities put into place laws that sought to identify the "feeble-minded", sterilizing them to prevent their inherent stupidity from being multiplied through subsequent generations. Winston Churchill, as a rising young cabinet minister, was an active proponent of eugenic theory. The Social Democratic party of Sweden, upon coming to power in the 1930s, instituted a eugenics-based policy of sterilizing the mentally retarded (a policy which persisted into the 1970s). Tommy Douglas, the socialist politician and later premier of Saskatchewan, also wrote his doctoral thesis in the thirties in support of eugenics. The feminist Margaret Sanger was driven to form Planned Parenthood with the intent of preventing the biologically inferior from breeding too much. A bastardized version of eugenic theory was, of course, used to justify not only the sterilization but also the outright murder of the feeble-minded, along with six million others, under Nazi Germany.


After the Second World War, a different scientific consensus emerged, which treated homosexuality not as a sin but as a psychological perversion, one requiring treatment by drugs or hospitalization. It led the association representing American psychiatrists to include homosexuality as a mental illness in the first (1952) edition of its master diagnostic handbook.


Even aside from all this, though, references to "the consensus" are anti-science, and even anti-logical, in their appeal to authority. To say "the scientific consensus is that man-made global warming is real", or to make similar assertions like "ninety-one national science academies agree that climate change is a problem", is the same as saying, "To quote the Bible..." Incontrovertibly, the scientific consensus has been wrong in the past. There is therefore no reason to assume that ninety-nine or a thousand science academies are automatically correct if they hold something to be scientifically true. There is, too, the character of the consensus in regard to global warming. Climate science became the major field that it is just because governments and foundations have poured billions of dollars into it over the last thirty years or so — tens or even hundreds of billions by now. The whole bias of the field has been to "prove" that human-caused global warming is real. Is it any surprise, then, that the "consensus" among the many scientists trained in climate science should be that climate change is real?


This isn't at all to question the good faith of climate scientists, who believe sincerely in what their theories say. The scientists who accepted Newtonian physics (without ever re-enacting the experiments which led Newton to his conclusions) were acting in good faith, and I will say the same about the consensus view in the early twentieth century which held eugenics to be true. This is a courtesy that climate scientists themselves never extend to their critics, however. At best, the sceptical scientists are regarded as inexpert. A little less politely, as these things go, the "deniers" are deemed to be cranks. When this doesn't suffice, the old chestnut of conflict-of-interest is trundled out: "Scientist A has received funding from oil company B..." I've read a great many articles making these claims. Never has a global-warming hysteric been able to convincingly show that a sceptical scientist has been on the payroll of any oil company.


The "evidence" seems to stop at some vague "corporate" funding, or even merely funds received from "right-wing foundations." A couple of years ago, the Canadian magazine Walrus ran a piece on the nefarious connections of global-warming sceptics to... the tobacco industry.


There are several curious things about this tack. It is, once again, contrary to science to refuse to engage a theory simply because of how its theorist was financed. Science is, by definition, empirical. To be "scientific" is to test a theory against facts that could falsify it. It is anti-scientific to ignore a theory because the theorist is judged unreliable. Newtonian physics was not refuted because it was discovered that Isaac Newton believed in numerology; it was replaced by Einstein's physics because the latter had the better proofs. Similarly, whether a scientist's work is financed by the oil-wealth of Exxon Mobil or of Osama Bin Laden is a matter of indifference as to whether it is scientifically viable (and once again, there is no evidence at all that any climate-change sceptic is acting at the behest of hydrocarbon-burning firms).


This is quite aside from the fact that, in regard to funding and financing, all the big bucks are on the side of the global-warming hysterics. Compared to the very paltry sums given (usually indirectly) to climate-change sceptics, scientific advocates of human-caused climate change receive hundreds of millions and even billions in funding — very often from the very oil firms they themselves attack as being behind all the global-warming scepticism. And all of this leaves aside the fact that, by now, the "green" market (including, but not restricted to, trading in so-called "carbon credits") is probably just as lucrative as the entire oil industry.


Are we to understand, using their own logic, that climate-change hysterics are on the payroll of vested "green" interests? The oil companies themselves have done everything they can to remake themselves as "green" firms (British Petroleum, for example, rebranded itself as "Beyond Petroleum"), and far from railing against the Kyoto accord, have lobbied actively in favour of it. It is, of course, entirely logical that carbon-belching firms would favour Kyoto or any other international accord that artificially restricted the supply of oil, and thus made that commodity more pricey, and more profitable, than before.


In recent years, attempts to discredit sceptical experts on climate science have become a lot more vicious — and irrelevant — than what is described above. Is it coincidental that this new phase of smear and innuendo kicked in just as the weather stopped cooperating by being warm? In 2007, a group of impeccably-credentialed, not-associated-with-oil-companies experts on climate signed a letter to the U.N. agency which had just issued an alarmist report stating, contrary to its own actual findings, that there is no doubt whatsoever that human activity is causing global warming. A p.r. flack associated with the foundation of the former geneticist and current TV presenter David Suzuki (the public relations firm in question shares its Vancouver office space with the David Suzuki Foundation) touched on the usual hysteric talking-points (the signatories are "not experts", we're uncertain of their funding, and so on), but then hit on a new low: bigotry.


The flack claimed that, of the sixty who signed the document, "most were Americans." This is a line of argument that I don't think has been seriously employed by anyone since the end of World War II, at the latest: that someone's national origin is the key ingredient in their intellectual credibility. It says something as to how desperate the climate-change hysterics have become, that they have begun to employ logic familiar to early twentieth-century eugenics theorists in order to buttress their argument. It was such an embarrassment that, when it was pointed out that, in fact, fewer than a third of the sixty were actually American, the p.r. flack didn't reappear to defend himself: instead he had one of his flunkies write a letter to the editor which acknowledged the error (but not, naturally, the irrelevancy of the assertion itself) by stating that "Jim misspoke himself".


More recently, the web log of the Breakthrough Institute ran a series on a phenomenon it calls "Climate McCarthyism." The Breakthrough Institute advertises itself as a "small think-tank with big ideas", and says it is "committed to creating a new progressive politics, one that is large, aspirational, and asset-based. We believe that any effective politics must speak to core needs and values, not issues and interests, and we thus situate ourselves at the intersection of politics, policy, philosophy, and the social sciences." Hardly a manifesto of the New Right, and the climate researchers who publish at its blog are not in fact sceptical that human beings cause global warming.


They do differ from others, though, in arguing that climate change is a problem that will be overcome not through carbon-reducing schemes such as emissions-trading (or placing a tax on carbon use), or not through these alone, but through investment in new technology, and reinvestment in older, non-carbon-emitting technologies such as nuclear power.


Series authors Michael Shellenberger and Ted Nordhaus focus their ire upon Joe Romm, a member of a partisan Democratic-party think-tank in Washington and climate blogger at the web site of the New Republic. In part one, they detail how Romm went on a campaign against Keith Kloor, a former editor of Audubon magazine and no sceptic of climate change. Kloor had criticized Romm for feeding quotes to a climate researcher who had been quoted, inaccurately, in a recently published book exploring alternatives to the strict carbon-reduction plans of treaties such as Kyoto. E-mail exchanges between Romm and the scientist revealed that Romm had insisted the researcher be quoted as saying the authors "utterly misrepresented my work." In fact, the scientist had been given proofs of the book to read, and had (by his own admission) overlooked a relatively minor inaccuracy in the authors' characterization of his theories. Romm nevertheless went with the "utterly misrepresented" quote in his blog post attacking the book's authors. When criticized by Kloor, Romm responded with a post entitled "Meet Trash Journalist Keith Kloor" (which, according to Shellenberger and Nordhaus, Romm changed to "Meet Journalist Keith Kloor" following the publication of the first part of "Climate McCarthyism"). There, Romm conjured up some offence that he imagined Kloor had committed against Romm's small-time-journalist father, all the while never linking to Kloor's actual Internet posting on the matter. Shellenberger and Nordhaus remark on the irony of Romm characterizing Kloor's journalism as "trash", when he himself had said, in the e-mail exchanges with the climate scientist, that he was looking for material with which to "trash" the authors of the aforementioned book.


Meanwhile, Romm has gone after other climate scientists — not those who question man-made global warming, but those who simply believe that measures other than strict carbon rationing are necessary to stem climate change. Scientists who published an article in Nature magazine advocating rapid investment in technology by governments were branded by Romm as "global-warming delayers." Romm absurdly characterized them as part of some cabal to which George Bush the younger, Newt Gingrich and the Danish statistician Bjorn Lomborg (author of The Skeptical Environmentalist) also supposedly belong. The scientists are in fact Democratic partisans themselves, and open supporters of U.S. president Barack Obama.


In part two of "Climate McCarthyism", Shellenberger and Nordhaus show how these scientists were wrongly associated with the American Enterprise Institute, also a partisan think-tank, but to the right, just because their Nature analysis superficially resembles one published by the Institute some years ago. They write: "The character assassination, the bullying, the psychological projection — it all adds up to Climate McCarthyism, and Joe Romm is Climate McCarthyite-in-chief. Joe Romm's `Global Warming Deniers and Delayers' play the same role as Joe McCarthy's "Communists and Communist sympathizers." While Romm built a loyal liberal and environmentalist following for attacking right-wing `global warming deniers' — a designation meant to invoke `Holocaust denier' — he spends much of his time attacking well-meaning journalists, academics, and activists, who take the issue of global warming seriously, accept climate science, and support immediate action to address it."


The sort of yellow cyber-journalism that Joe Romm engages in is the rule throughout the hyper-partisan Internet. What makes it significant is his apparent influence on mainstream news-media columnists and commentators, who pick up on Romm's blogging without bothering to verify his attacks with even a cursory check of the work of those being attacked.


Shellenberger and Nordhaus observe, "Joe McCarthy, like Romm, was compulsive in projecting his own dark side onto others." But isn't this the case with global-warming hysterics as a community? Isn't it the global-warming alarmists who are enslaved to vested interests — that is, to the billions upon billions in taxpayer and corporate funds that support the effort to bring carbon emissions under a regime of global control? Isn't it they, too, who behave in an anti-scientific manner when they absolutely refuse to engage the sceptics' arguments, and instead attack them ad hominem for their alleged financing, their mental health, even their nationality? The climate-change sceptics I've read spend virtually no time going after the proponents of global-warming catastrophism personally, focussing instead on the actual theories at hand. These theories may be wrong. But this is science: the proposal of hypotheses, which must then stand or fall on the evidence. To attack the good faith of scientists who don't hew to the "consensus" is, to emphasize, not science at all, but the practice of religious zealots throughout the ages.


And, as the "hacked" e-mails from the East Anglia Climate Research Unit show, isn't it the global-warming scientific alarmists, not their critics, who are engaging in fraud and conspiracy, or at least pursuing research in bad faith? Indeed, the leaked exchanges reveal this in spades. One needn't bother with the extreme lack of professionalism encountered in the climate researchers' messages to one another, when referring to global-warming sceptics (or even those who, in Romm's terminology, wish to "delay" action against climate change). What is consequential is the scientists' frank admission of their efforts to jerry-rig temperature figures, so that warming appears more dramatic than it otherwise would, by means of (in the words of one e-mail) a certain statistical "trick".


The CRU climate scientists also worked assiduously to deny independent agencies and researchers access to their raw data, pressured research journals to reject submissions from scientists believed to be "delayers and deniers", and lobbied these same publications to "get rid of" board members and staff likewise identified as "deniers." Earlier this year, a Toronto statistician associated with the web site Climate Audit (not, to my knowledge, a climate sceptic at all) revealed that the East Anglia climate-research facility had "accidentally" destroyed the raw figures from which it had calculated the warming trend of the twentieth century. The leak revealed e-mail exchanges on this very subject, involving the Climate Research Unit head (now suspended from his job pending investigation of the leak), who said that, if the British Freedom of Information statute forced him to reveal this data to Climate Audit, he would instead have to destroy it. Under the FOI statute, it is illegal to destroy information subject to an access-to-information request; this is a frank admission of intent to commit a crime. The e-mails are damaging enough, but also leaked were data files containing the underlying code by which global-warming calculations were carried out. These proved to be such a mess that a CRU software engineer spent months or even years trying to make sense of them — but finally gave up trying.


This is simply scandalous. Naturally, the climate-change alarmists tried to minimize the whole thing. But even George Monbiot, a British leftist known as a global-warming extremist, admitted that the leak was a great blow to the atmosphere of hysteria that he and the many others like him have managed to create over the last few years. Nevertheless, one can judge how titanic the whole thing is by how assiduously the mainstream news media have ignored the controversy. An early story, posted on the "right-wing" Fox cable-news site, simply stated "Climate sceptics see `smoking gun' in researchers' leaked e-mails" (November 20, 2009). The controversy was covered in the New York Times, which nevertheless refused to publish the e-mails themselves, on the grounds that they were "stolen communications."


Nearly forty years ago, the editors of the New York Times, along with Ben Bradlee of the Washington Post, went to the Supreme Court for permission to publish other stolen communications: the top-secret analyses of the justification and strategy for U.S. involvement in Vietnam, which became known as the "Pentagon papers." The high court ruled that, while the purloining of the Pentagon papers was indeed illegal, the "public interest" imperative in making them public outweighed any national-security considerations. A precedent was established wherein third-party recipients of illegally obtained information were not bound by any contract or promise between the original parties to keep that information secret. With a major international conference about to convene in a matter of days in Copenhagen, with the aim of imposing restrictions on carbon even more severe than those under the Kyoto treaty, surely the public interest in learning about criminal fraud at a facility responsible for much of the research justifying these restrictions outweighs any alleged right to privacy among researchers who are, after all, recipients of millions in public funding. But, at least initially, such reasoning did not prevail among the editorial board of the New York Times. For its part, the Washington Post's first stories on the matter focussed almost entirely on the "rivalry" between researchers, as revealed in the leaked e-mails. Fraud? What fraud?


The leak has given succour to those in the "denier" community who are the counterparts of the McCarthyite Joe Romm among the alarmists: those who believe global warming is a "scam", a "fraud", even a "conspiracy". But, as I said, I don't question the good faith even of those scientists who are indeed implicated in fraud, any more than I question the good faith of Maynard Keynes, who, as head of the British Eugenics Society in the 1930s and '40s, pursued compulsory sterilization out of a genuine belief in biological quackery; any more than I question the good faith even of police and prosecutors who have been shown to have either withheld or planted evidence that convicted those later proven innocent. The latter genuinely believe (like Orson Welles' duplicitous police captain in Touch of Evil) that they've "framed no one who isn't guilty."