Saturday, May 23, 2015

The Quality of Movies These Days

Since the advent of the “blockbuster” film (usually dated to 1975, with the release of Steven Spielberg’s Jaws), it has often been lamented that “movies aren’t what they used to be.” 

Are recent Hollywood movies of lesser quality than movies of the “golden age”, as many believe? Or is this a bias resulting from the fact that, as a rule, movie-goers are exposed only to the very good or excellent films of the black-and-white era, while the average or poor films of contemporary times are not yet forgotten, and thus seem the rule rather than the exception? 

I think there is a great deal to be said for the latter view.


"I am mad north-north-west."
www.doctormacro.com

As a cinephile who loves black-and-white movies but has also seen the virtues of the ultramodern digital photoplay, my observation is that the standard Hollywood film of today is qualitatively different from what prevailed during the first three decades of the sound-film (I will admit that my exposure to silent pictures is very limited). 

Using the term “quality” in a neutral and academic sense, just how are Hollywood movies today different from those of the black-and-white era? 

Back then, a great number of movies (from southern California) were musicals or Westerns.

Very few of either genre are produced today, even though musicals at least, when presented onstage, remain very popular. In spite of the phenomenal success of Andrew Lloyd Webber’s Cats, or the more recent reinterpretation of The Wizard of Oz, Wicked, neither of these productions has yet been adapted into a cinematic musical (though a direct-to-video Cats was produced in 1998, apparently).  

Westerns are even rarer these days, though they remained popular right up until the 1960s, on the big screen and on the boob-tube as well. Bonanza, for example, ran from 1959 to 1973, while Gunsmoke held the record for the longest-running primetime American TV series, lasting from 1955 to ’75, until it was overtaken a few years ago by The Simpsons.

Conversely, genres which are commonplace today – such as the science-fiction or “action” film – were uncommon or unknown when most films were produced in black-and-white. Why is this the case? 

During the early decades of commercial sound-film, California film-studios sought to produce movies that would appeal to the largest possible audience. This meant, in practice, that film-production was biased toward the standard, melodramatic, pseudo-romantic style of narrative, adopted from the novel and stage-play. 

From 1930, when the Oscar for best picture went to the sound-film adaptation of All Quiet on the Western Front, to 1950, nearly all of the Best Pictures were based on novels, short-stories and stage-plays (the most notable exception being 1944, when the Bing Crosby musical-comedy Going My Way took the trophy). The majority of the best-picture nominees from this period were also based on books and the theatre, and this continued well into the 1960s. 

The photoplay form itself transcended the written word and the theatre, but especially during the first decade of the soundtrack, when filmmaking had to be sequestered upon enormous sound-stages (to prevent ambient sound — as well as the noise of the camera itself — from drowning out dialogue), stage-professionals were ideally suited to bringing the new medium to life. 

Thus, Hollywood movies of the golden age are different from those of today, with respect not only to their overall schemata and aesthetic, but also their themes and subject-matter. 

With the advance of movie technology, and the challenge of television to movies as the dominant audio-visual medium, movies have undergone a qualitative evolution. 

It isn’t that the content of the blockbuster is at all novel. 

To the contrary, most blockbusters find their themes, characters and even their plotlines in the “genre” and “B”-movies of the black-and-white era. This is key. 

Filmmakers have always produced the sort of “genre” pictures that are characteristic of the blockbuster. 

But whereas decades ago these films were invariably produced with a fraction of the budgets devoted to the big Hollywood melodramas or musicals, now science-fiction, horror, and “action” films are budgeted in the tens and hundreds of millions of dollars, while dramatic and melodramatic films are left to the “independents” (once the domain of drive-in schlock) or to “prestige” films produced by a major studio at the behest of an actor or director with a proven box-office track record. 

The change came about not because movie companies today are more interested in money, and less in art, than they were decades ago. Film companies have always been profit-hungry concerns. 

The difference is, of course, that it is only genre pictures – given enough effects and hoopla to become blockbusters – that attract an audience large enough to be spectacularly profitable. 

The change is reflected in the winners of the best-picture Oscars since the 1970s. Several winners have been based on novels (few upon stage-plays, however: Amadeus, 1984, and Driving Miss Daisy, 1989, are among the exceptions), but most were original screenplays. 

Even "prestige" contemporary motion pictures are biased away from the novel and the stage, toward material conceived originally for the screen. 

It is true, as well, that several of the novels that were the basis for best-picture winners during recent decades were themselves of the “genre” type (The Silence of the Lambs, the 1992 winner, was based on one novel in a thriller series featuring a female FBI agent, and even The Godfather was viewed as a pseudo-genre “page-turner” when originally published in 1969). 


Above: the Hollywood film industry.



And while the screen-adaptations of novels and plays during the golden age were biased toward the hoary classics, more recent Academy Award movies based on novels have more or less become the definitive versions of their stories, superseding their literary sources. 

How many know, for example, that the 1967 best-picture winner, In the Heat of the Night, was originally published in novel form? Certainly, the Godfather movies have superseded the original novel, and even Schindler’s Ark, the acclaimed novel that became the basis for the 1993 best-picture winner (directed by Spielberg), was republished as Schindler’s List, to conform to the movie’s title. 

Thus, in a real way, cinema has become the “literature” of the electronic age.

Sunday, May 17, 2015

No One Is a Luddite

Some time ago, I read Against the Machine, by Nicols Fox, about “the hidden Luddite tradition in literature, art and individual lives”, published in 2002. 

The author champions “neo-Luddism”, the allegedly anti-technological philosophy that enjoyed a vogue during the late twentieth century. I don’t think these ideas have had much purchase in recent years, perhaps because the Internet seemed to empower ordinary people to contribute and access information, to an extent unknown in the past. 

Still, revelations about the extent to which the U.S. National Security Agency (and associated and equivalent organizations in Canada, Britain and throughout the developed world) monitors Internet and cellular communications, should lend credence to the anti-technology school, given that this carries the potential for totalitarian control of the populace. 


Not even you, Kirkpatrick Sale.
But “Luddism” can never manage to be more than a fad, for two good reasons. First, no one is truly a Luddite. It is questionable to what extent even the actual Luddites, the textile workers of northern and midlands England who in 1811-16 smashed the stocking-frames and power-looms that they believed were taking away their livelihoods, were “Luddites” in the sense of being against modern engineered technology entirely. 

They were against, specifically, the mechanical devices that they saw as a challenge to traditional textile-making.  

Kirkpatrick Sale, whose Rebels Against the Future, from 1995, inspired Fox’s own book, does not present evidence that they were against all technology, as far as I can recall. I don’t even know if Sale argues that the Luddites had a coherent philosophy of any kind. 

The Luddites were, however, considered a terrific threat. Not only was the smashing of machinery made a death-penalty offence, but the British army was dispatched to the regions where Luddism was most widely practised, to restore order. 

Ultimately, the “neo-Luddism” of the last fin de siècle didn’t amount to anything, because the “movement” was only against the newest forms of engineered technology. In the 1990s, it was the personal computer that was seen as the biggest threat (in a piece of perhaps unintentional comic theatre, Sale, during a speech, smashed a computer to the floor). 

But, when these novel devices are assimilated by society, initial opposition to them seems reactionary and even silly. The fact is, it is impossible to be human and to be against technology, since the Homo primate evolved from an anatomical condition quite like that of present-day gorillas and chimpanzees into what we are today — because of technology. 

The first hominid mastery of a chemical reaction was the control of fire, which happened as long as a million years ago. Fire allowed the nascent human to settle over the entire world. Yet probably no entity has caused more destruction of life and property than fire. It has razed the greater part of almost every town and city that has ever existed, and in this way and others has extinguished probably millions of lives. But no one would suggest that people do without fire. 

I was reminded of this by a letter to the editor in Time magazine, sometime in the early 1990s, in response to a profile of American author Jeremy Rifkin. 

I’m not sure that the latter would consider himself a “Luddite”, but he sounded the same themes of techno-phobia familiar from Sale and the others. Back then, as I recall, Rifkin’s bugbear was the Human Genome Project, which was at the time making its way toward decoding human DNA. This was, he claimed, a form of knowledge that people just shouldn’t have, for it could lead to totalitarian control and manipulation: the creation of an underclass based on the deficiencies found in their genetic profiles. 

No one, I believe, would take such arguments seriously today; in the Time letter, the writer stated succinctly, “If Rifkin were alive centuries ago, he’d be campaigning against the invention of wooden matches.” The point being that matches make mass-destruction easier by rendering fire accessible to anyone. By that logic, what upsides could possibly compare with the danger? 

The second reason Luddism has no lasting influence is that the tiny number who actually do follow through, and attempt to live without modern technology at least, thereby remove themselves literally to the fringe of society. 

In her book, Fox describes the Kellams, a married couple who, sometime in the early 1950s (the chronology is somewhat confused), opted to vacate American society almost entirely (the husband had been an engineer working on materiel during the Second World War), setting up on an island off the coast of Maine, where they remained until 1986, when the man died of a stroke. His wife, unable to live on her own, departed for the mainland three years later. 

The Kellams had no electricity: no radio, no television, no electric lighting, not even a refrigerator. They kept perishables in their well, and had a limited diet of homegrown vegetables. Even then, of course, they couldn’t live entirely without modern technology. Well-digging itself is a wonderful technique, thousands of years old, certainly, but “technological” nevertheless. 


Aside from their garden, the couple ate tinned fish; canning is, of course, an entirely modern technology, invented around the turn of the nineteenth century and coming into widespread use in the decades that followed. For lighting during the long and dark winters of that part of the world, the couple used kerosene lamps, an even more modern technology than canning, and one which, though mostly obsolete in the developed world, is still used throughout vast areas of the underdeveloped south with no access to electricity.

A big part of the “Luddite” critique of technology is the damage done to the earth’s ecosystem through the extraction of resources to feed industrial processes. Perhaps no commodity was more immediately destructive to wildlife than lamp-oil.  

Before kerosene, which is refined from petroleum, lamp-oil was wrought from the blubber of millions, or tens of millions, of harpooned whales. Whaling ships were perhaps the first type of factory, wherein dead whales were processed into oil (and many other products) ready for sale as soon as the ships made landfall. And of course, the Kellams required fuel to keep warm during winter. This, in turn, was acquired by cutting down trees on the island for burning in a wood stove. The axe, at least, is a very old technology, going back thousands of years. But it is “technological”, nevertheless. 

Thus it is impossible for anyone to be “pre-technological”, unless they wish to live exactly as do the other apes. Mrs. Kellam, for example, apparently kept a diary during their years together. Significantly, Fox notes that these writings were “obviously meant for publication.” 

The “Luddite” lifestyle is called a “throwback” to the “pre-modern”, but the mentality that gives rise to it is resolutely modern. It goes back to what must be the progenitor of the modern anti-technology school, Henry David Thoreau. Famously, Thoreau (who lived from 1817 to 1862) “left civilization behind” to live in solitude at Walden Pond, near Concord, Massachusetts, on fourteen acres of land owned by his friend, the poet Ralph Waldo Emerson, between 1845 and ‘47. 



A book recounting this experience, called Walden, was published in 1854, but did not become well known until after his death. I have in the past derided the conceit that Thoreau was “getting back to nature” during this period of his life. 

His “wilderness” was, if not within walking distance of civilization, then quite near to it. If his project in self-sufficiency did not progress as he hoped it would, Thoreau could easily have stepped back into modernity — which he did in fact do, after a period of two years and slightly more than two months. 

If Thoreau had really wished to venture into the wilderness, he could have left behind his long-settled home state and gone west, as millions of his countrywomen and men (not to mention many immigrants to the United States) were doing at the time — often with dismal or even fatal consequences. 

But essential to the “Walden” philosophy, it seems to me, is to give up the downside of civilization, whilst not dispensing with its advantages. And, of course, Thoreau, like the Kellams, wanted the whole world to know all about how he had turned his back upon it. 

If he had really wanted to live in genuine wilderness, Thoreau would have been just one of many anonymous people who settled what are now the western American states. His specialness lay precisely in how he apparently, and temporarily, remained aloof from civilization, whilst not really being detached from it at all. It is, in essence, a very aristocratic, even misanthropic, mentality. 

It rejects “technology” (or really, novel technology) precisely because it provides access to lifestyles, goods and services that in past times were available only to elites. “Getting back to nature”, in the sense that Thoreau believed ideal, involved retreat from people to the solitude of life in the woods. 

However, very few people could afford such spacious territory, with resources sufficient to sustain them over a long period; Thoreau himself could not (the Walden land belonged, as mentioned, to Emerson). 

Self-sufficiency, as advocated by Luddites since Thoreau, is said to be the route to personal autonomy. However, the last period during which autarky played a significant role in any economy was the Middle Ages. Noble estates were, at least before the turn of the first millennium AD, fortified spaces in which lords had virtual ownership over the peasants who worked on them (the latter were called, back then, “serfs”, a word derived from the Latin for slave, “servus”). 

Very little trade took place between the estates, resulting in absolute poverty for everyone, even the “richest.” This way of life was nothing approaching “solitude”. On the contrary, life during medieval times was inevitably corporatist. My take is that the Yankee Thoreau, or anyone claiming to be his intellectual heir, would not have liked living this type of “simpler” life at all. 

Then there is the issue of “sustainability”, which long ago became an almost empty phrase within the environmental movement. Self-sufficiency, however, is not sustainable. 

If every individual, or even a relatively small kinship group, were to produce all of their needs without exchange with others, the result would be a net waste of resources. The more efficient and effective use of resources comes through trade, not autarky, under the principle of comparative advantage, as adumbrated by nineteenth-century political economists.  
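
The arithmetic behind that principle can be seen in a toy case. Below is a minimal numerical sketch (two producers, two goods, and all the labour figures invented purely for illustration), showing how specialization and exchange can yield more of both goods than self-sufficiency:

# A toy, Ricardo-style illustration of comparative advantage.
# All figures are invented for the example; entries give the hours of
# labour each producer needs to make one unit of each good.
hours = {"A": {"cloth": 1, "wine": 2},   # A is absolutely better at both goods,
         "B": {"cloth": 6, "wine": 3}}   # but B is *relatively* better at wine.

def output(allocation):
    """Total units of each good produced, given hours allocated per producer."""
    totals = {"cloth": 0.0, "wine": 0.0}
    for producer, alloc in allocation.items():
        for good, hrs in alloc.items():
            totals[good] += hrs / hours[producer][good]
    return totals

# Autarky: each producer has 100 hours and splits them evenly.
autarky = output({"A": {"cloth": 50, "wine": 50},
                  "B": {"cloth": 50, "wine": 50}})

# Trade: B specializes entirely in wine; A tilts toward cloth.
trade = output({"A": {"cloth": 80, "wine": 20},
                "B": {"cloth": 0, "wine": 100}})

print(autarky)  # {'cloth': 58.33, 'wine': 41.67} (approximately)
print(trade)    # {'cloth': 80.0, 'wine': 43.33}: more of BOTH goods

Even though producer A is better at making everything, total output of both goods rises once each party leans toward its comparative advantage; that is the sense in which self-sufficiency wastes resources.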
On a more practical level, would it really be desirable for each person, or each kinship group or whatever, to live upon their own patch of land, comparable to the fourteen acres that Thoreau occupied near Walden Pond? 

I doubt if there is enough arable land in the United States, let alone the world, for this to be viable. This is aside from the fact that, I would reckon, there are very few who would wish to live in such a way. 

The Thoreau ethic is, to restate, a thoroughly modern one, whatever its pretenses to reviving the past. Part of this is the self-centred character of the neo-Luddite worldview. 

Thoreau, of course, did not marry. It is unknown if he ever became romantically involved with anyone, woman or man. The Kellams, on their island off Maine, also did not have children. 

In Against the Machine, Fox provides no reason for this. Perhaps they could not. It seems more likely that they would not, and that to me makes sense. I would venture that, beyond compulsory schooling laws and so on, it would have been impossible to raise even a single child in the minimal-technology conditions in which they chose to live. It also strongly suggests that such a way of living is unsustainable, for it seemingly cannot accommodate a fundamental biological function. 

Another childless and sexually estranged man, Theodore John Kaczynski, took his “Luddite” philosophy to a maliciously pathological extreme. As the “Unabomber”, the Illinois-born Kaczynski sent explosive devices to a couple of dozen university professors and airline executives between 1978 and 1995. Fortunately, only three of his targets died in the resulting blasts. A former child prodigy in mathematics, Kaczynski had become, in 1967 at age 25, the youngest mathematics professor ever hired at the state university in Berkeley, California. He abruptly resigned two years later, and some time after that went to Montana to live in an isolated cabin — without electricity or running water — where he masterminded his bombing campaign. 

It is clear, however, that Kaczynski didn’t wish to turn his back on society entirely. Like Thoreau, he wanted the world at large to know the reasons for his solitude. After some of his bombings, he sent letters claiming responsibility for them (on behalf of a shadowy group). He wasn’t apprehended, though, until after he sent a 35,000-word essay, entitled Industrial Society and Its Future, to the New York Times and the Washington Post, to be published in its entirety as the condition for ceasing the serial bombing-murders. 

Therein, Kaczynski explained his anti-technological beliefs. But after the tract was published (by the Post, with the Times sharing the cost), his younger brother David recognized not only the philosophy, but also the quirks of grammar and style, as Ted’s. 

He tipped off the authorities, and Ted Kaczynski was arrested and charged with capital murder in the bombing deaths (a guilty plea allowed him to serve life without parole instead). The reaction of other neo-Luddites at the time was that, while they disagreed with his methods, they were in broad agreement with Kaczynski’s philosophy. 

It would be too much to say that all the “Luddites” of recent times were as alienated from intimate relationships, or family life, as were Kaczynski, Thoreau and the Maine couple. Yet, when I checked the back of the jacket of Against the Machine and realized that Nicols Fox is a woman, I was surprised: most of the so-called Luddites are men. This is in itself unusual in recent times, especially for a movement that is essentially an offshoot of the “progressive” or left wing of politics (though Fox states, truthfully, that neo-Luddism doesn’t fit easily on the left-right spectrum). I think that women know foremost that, as the saying goes, “the good ol’ days were terrible.” It is no revelation to say that until thirty or forty years ago, women were, at best, second-class citizens. Before that, men simply presumed that they were better than women. 

Until the mid-twentieth century, females were, in the words of a folksong, “Kept by the parents until they are wise / Slaves to their husbands the rest of their lives.” This ceased to be essentially true only a few decades ago, and only in Western societies. Even when women entered the workforce in larger numbers, they encountered harassment and objectification at the hands of their male superiors and colleagues. If this wasn’t (as alleged by feminists) the rule, then it was common enough. Women seem to understand intuitively (given their lack of representation among the neo-Luddites) that the age of “technology” (that is, of engineering) has coincided with their liberation from social restraints. The pre-mechanized era, on the other hand, was when the weight of “gender apartheid” was placed most heavily on women’s backs. 

This is evident not only in the most casual perusal of history texts. It is obvious also from looking at the developed Occident, as compared to the still-rustic, technologically backward places throughout the world. As societies move from the latter condition to the former, the status of women improves almost as a matter of course, it would seem. 

It is true that there is scepticism, even hostility, toward “man-made” engineering amongst academic-feminist scholars at least. A particularly common theme within this literature is (or was) the “medicalization of women’s bodies” as a new means of oppression. 


Fat is a Feminist Word.

Though common within academia, this view has, as far as I can see, little traction amongst women outside of it. Again, the latter pick up on what the former, so often blinded by devotion to the abstract (as has been the case with scholars for many centuries), cannot see. Namely, that it is the medico-scientific establishment (even when it was male-dominated) that has provided the tools for women’s liberation from biological constraints that men were mostly free of throughout history: the birth-control pill, artificial insemination, hygienic birthing procedures and (last but not least) pregnancy-termination techniques that have become so safe that the death-rate from abortion is far lower than that of many other invasive procedures (as is, indeed, the central point of the slogan “keep abortion safe and legal”). 

I haven’t ever read a feminist who professed Luddism, though. The feminist movement’s view of (engineered) technology seems to be that, under the aegis of feminist principles, it can be used more humanely than is the case under the “patriarchy.” 

Whatever the case, feminists are “progressive” in so far as they believe (if implicitly) that technology is not in itself a means of oppression (it is only so in the way it is used by the “male-stream” culture). 

But even patriarchal technology has advanced women’s liberation, at least from traditional roles and prejudices. The correlation of all-pervasive engineering with the rise in the status of women is too strong to be mere coincidence. Conversely, women’s rights are negligible in cultures where engineering is unknown. 

It isn’t only in the realm of medicine. Engineering has converted domestic activities and chores from the onerous to the merely mundane. It is true that “women still do most of the housework”, in spite of being employed full-time outside the home. 

Domestic appliances such as the electric and microwave oven, the refrigerator, the vacuum cleaner, and so on, so accelerated the productivity of household tasks that they became part-time in nature. They were in fact the necessary cultural ground for the separation of woman from the home. Years before the appearance of the “career woman”, the segment of bourgeois society known as “bored housewives” was assuming the public functions within their communities always held in the past by their husbands — home and school committees, street-safety councils, recreational bodies and charities, the sort of voluntary associations that middle-class men of the postwar era, too busy at work or staying inside watching TV, did not bother with. 

Feminist theory describes domestic technology as another means of slavery for womankind. But as with medical engineering, this simply cannot be: the very fact of women’s success in the modern work world refutes the theory. The character of work in economies where engineering becomes the norm seems to be biased in favour of the female rather than the male. Paradoxically, in an engineered economy, work requires more intensive personal interaction than ever before. 

The sociologist Daniel Bell wrote: 

In a pre-industrial world, life is a game against nature in which men wrest their living from the soil, the waters, or the forests, working usually in small groups, subject to the vicissitudes of nature. In an industrial society, work is a game against fabricated nature, in which men become dwarfed by machines as they turn out goods and things. But in a post-industrial world, work is primarily a "game between persons" (between bureaucrat and client, doctor and patient, teacher and student, or within research groups, office groups, service groups). Thus in the experience of work and the daily routine, nature is excluded, artifacts are excluded, and persons have to learn how to live with one another. In the history of human society this is a completely new state of affairs. 

From the perspective of the early 1970s, Bell noted that: 

Work in the industrial sector (e.g., the factory) has largely been men's work, from which women have been usually excluded. Work in the post-industrial sector (e.g., human services) provides expanded employment opportunities for women. For the first time, one can say that women have a secure base for economic independence. One sees this in the steadily rising curve of women's participation in the labor force, in the number of families (now 60 percent of the total) that have more than one regular wage earner, and in the rising incidence of divorce as women increasingly feel less dependent, economically, on men. (The Coming of Post-Industrial Society: A Venture in Social Forecasting, Basic Books, 1973; republished with new Foreword: Harper Colophon Books, 1976, pp. xvi-xvii.) 

According to “difference” feminists, women are by nature more virtuous than men, embracing a pacifist, participatory and nurturing ethic, as against the competitive, warrior male mentality. 

These activists, who were increasingly influential during the late twentieth century at least, go halfway to reviving the image of woman primarily as Mother, except that they abstract it into a general social theory, or etherealize it in neo-pagan cults such as “Wicca.” 

Putting aside consideration of the relative virtue of each sex, it could truthfully be said that women are the more social (and immorality is scarcely incompatible with sociality; indeed, no one can be unethical toward themselves). It is precisely because women tend to be more effective than men as social actors that they have become so dominant within the organizations that coordinate the use of engineered technology. 

Such administration is far more than people being “cogs in a machine.” There is an inevitable human dimension, precisely because the work is mostly detached from the engineered technology itself. Bourgeois women, at least, used their background in community voluntarism as an anchor into the private and public bureaucracies of the information age. 

Men, meanwhile, saw their traditional “game against fabricated nature”, characteristic of the heavy-industrial economy, go offshore or obsolete. This reality, obscured as it is by feminist theorizing, accounts for the lack of women within the ranks of the modern-day so-called Luddites. 

Even if, however, the “Luddite” vision is unsustainable and impossible, its philosophers are not incorrect to ascribe inherently negative consequences to the use of engineered — or any — technology. 

Engineering is so gargantuan and all-embracing in its consequences that its effects cannot be other than both good and bad. The awkwardness with which the anti-technology school fits into the conventional political spectrum shouldn’t obscure the fact that most “Luddites” started their political lives, at least, on the left, and generally identify with socialist goals (except insofar as they reject engineered technology). 

They are thus, as a rule, biased toward understanding engineered innovation as the work of corporations. In fact, technological development has been driven mainly by the state, in its effort to devise new ways to fight wars. This was so from the beginning of recorded history, and it remains the case in the present day. The role of private business in industrialism has been to exploit for profit those inventions, devices and processes that have already been proven in warfare, through investment by the state. Industrial corporations were, indeed, often involved in war production; it was government contracts that served to make them so large in the first place. 

The real “American way” is not really the invention of new technologies (for many products of “Yankee ingenuity” were imported from elsewhere), but their merchandising: converting luxury goods into staple items sold on a mass scale. The reality that modern engineering is mostly the result of war should be a central part of the anti-technology bill of indictment, given the overlap of “peacenik” philosophy with Luddism. Betraying their left-wing roots, however, the neo-Luddites are so beholden to the framework laid down by Marx and the Marxists (namely, that Capital is behind all the badness in the world) that they seem to overlook this elemental fact.

The neo-“Luddite” philosophers are also mistaken in believing that engineered technology is, as Fox puts it, an “unstoppable juggernaut.” It is a prejudice they share, in fact, with the pro-technology mainstream, which says “you can’t stop progress.” 

It has become fairly well known in recent years that many of what are considered Western inventions were in fact innovations of the Far East. This proves that merely having sophisticated engineering doesn’t make for what Jacques Ellul called the “technological society.” The history of Chinese technology shows that the continued refinement and improvement of engineering, which is what is generally understood by the term “progress”, is nothing close to inevitable. 


What time do you got?
Engineering flourished in Western society during modern times because social conditions made this possible. The most important factor, perhaps, was the political disunity of Western Christianity, going back long before the schism of the Reformation. Western Europe was unique amongst the civilizations of Eurasia in having, but for transient periods, no single dominant continental power. 

China, on the other hand, was ruled by a single monarch for the better part of two thousand years. Its periods of disunity were characterized by intense warfare, but for successionist, not secessionist, ends: when kingly power faltered, warlords fought to rule over all the others, not to carve out their own domains. 

Upon the Indian subcontinent, absolute rule by Hindu monarchs was replaced by similar government, from the sixteenth century, under the Mughal kings. In the middle and near east, meanwhile, prolonged rule by Arab potentates was gradually replaced, after the turn of the last millennium, by Turkic dominance, with the Ottoman empire lasting until the Great War. Even Eastern Christianity, so long dominated by the Byzantine and then Russian empires, was more unified politically than the West. 

European disunity was not, as in other societies, a cause of disorder and breakdown. The continent was warlike, with one European country at war with another (or others) for seventy-five percent of the time between AD 1500 and 2000, according to historian Niall Ferguson. 

But each of the half-dozen or so European powers was in itself orderly and legitimate (in the sense that populations were governed by law), and sought to dominate, but not (again, but for transient periods) actually to rule over, all the others. It was this intense rivalry that fuelled Western technological progress, as each power sought to gain advantage over its opponents through the subsidy of engineered technology. It was only subsequently that these processes were commercialized for a consumer marketplace. 

From the Middle Ages emerged a burgher or bourgeois class that had borrowed many of its tools, and its general lifestyle, from the monastery. The middle class was key to bringing technology to the marketplace. But nothing like an industrial revolution would have occurred without the intervention of the state in the economy, subsidizing technological research and innovation to fight wars. 

Thus Britain really underwent industrialization during the long Napoleonic wars, and the United States only after its civil war. Germany became the second-biggest industrial power by the close of the nineteenth century as an “army in possession of a state”, as someone described Prussian militarism. Japan similarly underwent industrial revolution as it rearmed, and the global technological revolution of the twentieth century came largely at the behest of two World Wars and the Cold War. 

For such conspicuous and long-memorialized events, these wars remain virtually the invisible background of modernity, such is the inattention paid to them as the motor of engineered progress. 

The bourgeoisie did not develop industrialism on its own initiative. Its primary concern, in modern times as throughout the history of civilization, is commerce: making a profit by “buying low and selling dear.” Only when engineered machinery was proven through its use in war was it converted to civilian purposes by private capital. Due to the serial-lineal conception of time, the literate bourgeois conceives of this refinement in production and products as somehow “inevitable”. Ignored is the true history of technological progress (or Improvement, as it was once called) as spasmodic, with innovation remaining stagnant until accelerated by armed conflict. 

The German experience verifies that it isn’t even necessary for the bourgeois to be the ruling class for a society to become thoroughly modernized. Western democracies’ lack of investment in military preparedness during peacetime apparently served to occlude what was obvious under the German empire: that militarism is key to industrial development. 

The bourgeoisie are essential, too, as the means for selling engineered technology, or its products, to the masses. That was also part of nineteenth-century Germany, especially after the tumult of 1848. The Junker aristocracy granted the middle classes limited political rights and broad commercial rights, calculating (correctly) that possession of the latter would cool the bourgeois ardour for the former.

We see that, in regard both to China and the Occident, progress in engineered technology is by no means "inevitable". It arose in both places because of the large-scale intervention of the state in society. And, in the case of China at least, when this governmental motive force was removed, engineering and science went into rapid decay.

Tuesday, May 12, 2015

The Violent Vegetarians

Not long ago, I read an article which applied the term “herbivore” to contemporary great-powers (such as the European Union or Japan) that eschew militarism for diplomacy. This was contrasted to the “carnivorous” actions of the United States or Putin’s Russia. 

It betrays an ignorance of animal ethology, because actual herbivores are hardly unaggressive. In fact, foraging species are amongst the most conspicuously belligerent of all.

Tusk, tusk

The name given to a person who dominates through violence or (more usually) intimidation — “bully” — is popularly connected with the colloquialism for the male bovine, the bull: an entirely herbivorous animal that is also perhaps the most aggressive of domesticated fauna. 

The term “eight-hundred-pound gorilla” has also been used metaphorically in the realm of geopolitics — mostly in reference to the behaviour of the United States during the Cold War and after. It even has its own Wikipedia entry, where it is described as “an American English expression for a person or organization so powerful that it can act without regard to the rights of others or the law.” 

The web-page also notes that the term is hyperbole, since gorillas weigh no more than six hundred pounds (the average weight being just 400 lb). 

Be that as it may, gorillas are perhaps the only primates that are not in any way carnivorous. They are, on the other hand, very territorial, ready to use their might to discourage other gorillas from encroachment. 

Since the male gorilla, at least, did not acquire its imposing physique as a result of predation, natural selection favoured those primates that could literally throw their weight around with others of their kind (and any other kind as well), in order to secure resources and reproductive opportunities. 

The robust anatomy characteristic not only of bulls and gorillas, but of other truculent beasts such as elephants, rhinoceroses, and hippopotami, follows from these species being herbivores. 

Predatory species are characteristically more lithe in constitution, as with canines, felines, and Homo sapiens, again as a result of the evolutionary need to move quickly to catch prey. Of course, the less-dominant herbivores require leaner frames in order to outrun predators, and the carnivorous ursine is typically of bulkier shape. 

Even so, intraspecific violence seems more common among herbivores than among carnivores, paradoxically because predators are more naturally equipped to kill. To avoid extinction, carnivores had to evolve inhibitions on the use of teeth, claws and other deadly organs against others of their own kind (since doing otherwise would reduce the size of the population available to reproduce). 

There was, on the other hand, little evolutionary need for herbivores to inhibit intraspecific aggression, simply because such activity was not so lethal among them. Foragers have evolved bodily strength and bulk so as to defend against carnivores, as well as against territorial intrusion from their own kind. 

But since the participants in herbivorous aggression are relatively matched in physical terms, fighting rarely ends in death (unless by “accident”). Carnivores’ need to discern kin from quarry in the pursuit of aggression paradoxically encourages sympathetic behaviour. 

Many predator species are solitary, but some of the most complex animal societies exist among carnivores. Orca pods, for example, can include up to one hundred individuals, some of the largest stable groupings of any marine mammal. 

Wolves of the sea.

On land, wolf-packs are characterized by a social complexity approaching that of primate species. This is why canines and human-primates, at least, have been able to coexist for such a long period (as far back as thirty-five thousand years, by some estimates). 

Human beings, too, are highly social but also very aggressive. Homo sapiens use violence not only against other species: they engage in intraspecific aggression on a scale not witnessed elsewhere in the animal kingdom. 

Unlike other predators, humans usually carry out violence by artificial means — spears, arrows, knives, and in recent times, firearms and bombs. I think it was Konrad Lorenz who pointed out that weaponry imposes psychological distance from victims, so that it becomes that much easier for an aggressor to inflict harm. 

However, traditional weapons such as knives or swords are more intimately murderous than guns or bombs. It has more to do with the fact that, whether edged or projectile in nature, weapons are artefacts. Because weapons did not evolve organically, humans could not evolve natural inhibitions on these deadly extensions of their own, usually harmless, faculties. The brakes that are placed on the aggressive use of weapons are sociocultural — as are, in fact, the liberties that are granted for the use of the same.

Friday, May 8, 2015

When Words Go Bad

A while ago I watched the cable “reality” program Come Dine With Me.

Originating in Britain, the show has four or five contestants hosting one another for dinner in turn. They rate each meal on a scale of one to ten, and the chef with the highest score gets some prize or other. 

What caught my attention, though, was when one of the contestants, a middle-aged woman, referred to the pet cat of that segment’s flamboyant host, wondering if “this is the only pussy you’ve ever had” (or something to this effect). 


Except that the double-entendre was bleeped out. 

Over the course of my lifetime (and I am scarcely very old), the vulgar sense of a word for the female pudendum has become its assumed primary definition: a term not to be used on TV or in polite conversation. 

When I was young, middle-aged and older folks could be heard saying “pussy” to refer only to an animal. 

At the same time, though, I remember teenagers and youths of the slightly more delinquent variety using it to refer to women’s sexuality in a way I didn’t quite understand. 

To say “pussy” as such satisfies the criterion of euphemism: it directs attention away from the object in question, so that no actual word for the female pudenda is used. 

But an ordinary cat is so vividly hirsute, and the animal’s association with the human female so lengthy, that the use of “pussy” to refer to genitalia was recognized immediately as vulgar, even (or especially) by those who used it. 

As a vulgarity, “pussy” must have long existed before I was born, its understanding as such remaining within the circles of male-chauvinist culture. 

In the 1950s, James Bond novelist Ian Fleming named one of his female characters Pussy Galore, and this was the name used in the movie adaptation a decade later. 

Sure it is.


But thereafter, the vulgarity became better known to the generality, and the use of “pussy” in its “literal” sense went into corresponding decline. 

Still, it was possible during the late twentieth century to hear that word uttered without irony or double-entendre. This did in fact occur during a visit by an elderly relative to the family home. This woman was perhaps a century old, but I think even my parents were shocked initially when, looking at our housecat, she asked, “Whose pussy is that?”

Sunday, May 3, 2015

We Are All Sven's Mother, Now

The issue of “free-range” parenting has come to the fore, after a couple in Maryland allowed their children, aged six and ten years, to go to the local park all by their lonesome selves.  



Someone apparently contacted the authorities about the anomaly of children that young being out without adult accompaniment. The pair were taken into custody, but the parents were not contacted about this until some hours later. Naturally, the parents became alarmed when their son and daughter essentially went missing during that period. After they were finally informed of the whereabouts of their kids, they were allowed to regain custody only after signing a “safety-plan” ensuring they wouldn’t again be so rash as to let their children out to play alone. 

When I was discussing this case, and the issue generally, with an older relative the other day, she described “free-range” parenting simply as “the way children played when you were a kid.” 

I chuckled, but this was both true and untrue. Unfortunately, my childhood overlapped the period during which children were progressively less free to “go out and play” than they were ever before. 

It led to the paradox wherein greater restrictions were placed on my liberty when I was older than had been the case a few years earlier. Whereas, at seven or eight, I was instructed each day during the summer to “go down to the park and find some friends”, by the time I was eleven or twelve, my parents wanted to know exactly where I was going, whom I was going with, and the phone numbers of the parents of the kids with whom I intended to pal around, so that they could check up on me if necessary. 

I remember thinking this odd, without understanding the reason why. 

It was, of course, fear: sometime around the turn of the 1980s, it was almost as though a switch was toggled, or a button was pushed, and parents were consumed with the paranoia that their children would end up (as the saying went) “on the side of a milk carton.” 

And, contrary to what most believed at the time, and probably continue to believe today, it wasn’t as though more stranger-abductions were suddenly occurring back then. 

It was a change in cultural attitudes – a very swift one, as these things usually are – and not an objective change in social conditions, which brought an end to the age-old practice of “free-range” parenting and inaugurated the age of the “helicopter” parent – those who hover over their children in a manner akin to that type of aircraft – a period in which North Americans, at least, still live. It is precisely my goal at this site to examine – to anatomize – cultural changes such as these. 

The transformation can be illustrated by an anecdote about a parent and her child from my schooldays. 



My elementary school was rather far away from the house in which I grew up. I had to walk more than a mile to and from the schoolhouse, though much of it was on a long street that gradually transformed from a busy commercial thoroughfare into a quiet residential road, before terminating at a cross-street bordering a park. 

My classmate lived near the end of this street, much closer to the school than I did. Nevertheless, each and every day after school, there would be this boy’s mother, on her bike, worriedly pedalling up the street toward the school, asking everyone on the way, “Have you seen my Sven? Where is Sven?” 

This was not the lad’s real name; nevertheless, the poor boy was an object of ridicule amongst the other kids in the class, not only for his parents being foreign (though I believe “Sven” was born in Canada), but also because his mother would come looking for him after the school-bell rang, as “though he’s in kindergarten.” 

This “ridiculous” mother was, as it turns out, a harbinger of the style of parenting that became commonplace just a few years later. For, when my own children were of grade-school age, I too was a helicopter parent. I had become, without even thinking about it, Sven’s mother. 

This, even though I knew intellectually that stranger-abductions are so extremely rare as to be statistically insignificant; even though I knew that, when children come to harm at the hands of another, the perpetrator is overwhelmingly someone known, usually intimately, to them (as in the case of this hapless child); even though I knew that I would have, as a child, chafed at the restrictions that I insisted upon for my own kids; nevertheless, the fear that something bad would happen to them was enough to make me believe, irrationally, that I had to hover over my children to prevent it. 


The question remains, however, as to just why this irrationality overtook me and most other parents in North America, and probably throughout the Western world. There were a number of reasons, I would suggest. 

There were, beginning in the 1980s, several highly-publicized cases of child-abduction. It wasn’t as though such incidents were unknown in the past; it was the attention given them by the news-media that made the difference. In the U.S., helicopter-parenting became the norm after the apparent abduction of Etan Patz, a six-year-old boy who, in May 1979, went missing in Manhattan while on his way to school. (The 1983 film Without a Trace seems to have been inspired by the case.)  

No trace of Etan was ever found, and he was declared dead in 2001. A suspect was charged a couple of years ago, but as of this writing, a jury in New York seems deadlocked in coming to a decision on the man’s guilt. 

In Canada, I think the paranoia about child-abductions really commenced with a couple of such cases in the Toronto area: that of nine-year-old Christine Jessop and, sometime later, that of Alison Parrott, aged 11.  

Again, however, such crimes are extremely rare. The last stranger abduction of a child in Canada occurred, I believe, in 2009, with the disappearance of Victoria Stafford, aged eight, from her school in Woodstock, Ontario. 

It should be noted, though, that Tori wasn’t a “free-range” kid. She was in fact taken from the grounds of her school, by a young woman who (in league with the man later convicted of the girl’s murder) enticed Tori to accompany her with the promise of a free puppy. 

I think the end of free-range kids came not because of any particular highly-publicized case, or series of cases, however. 

It was more so because of the advent of local television news. In 1982, Don Henley, having just recently left “the” Eagles, put out the solo hit Dirty Laundry.

I must admit that, when the song was new on the radio, I was a bit confused as to its excoriating criticism of local news. For, in my obscure corner of the world, the local newscast was a pretty innocuous affair, soberly delivered, mostly light in tone, and featuring stories that were really of no interest to anyone outside the region. 


There was, in those days, no way of seeing the local news of other, larger markets – such as Los Angeles or New York – without actually visiting those places. It wasn’t until years later, when these “U.S.-style” newscasts came here, that I understood the point Henley was making in being so critical of local news. 

Local news especially delivers the news in the fashion of a weather report. A constant refrain, in promotions for local newscasts, is “how the news affects you.” 

Indeed, the news is intended to affect the viewer, in so far as it is supposed to provoke a particular emotional reaction. The weather affects everyone, which is why the subject is often broached in conversation between distant acquaintances and outright strangers. 

News producers, and especially TV “journalists”, try to affect the universal resonance of the weather report. They do so by provoking basic emotions: sentimentality sometimes, but more usually, fear. As with the enemy in war, people pay attention to things they fear. 

This is why viewership of local TV news is so much higher than that of national news reports. 

Child-abductions are just one of the suite of modern fears that TV news producers use to get people to watch. Given that they are so uncommon, newscasts must rely on more banal dangers to make people afraid: fires, drugs, environmental toxins, street crime, and so on. 

In so far as free-range parenting is concerned, the Meitivs of Maryland may just be trying to break the spell of paranoia that descended on Western (or at least North American) society more than three decades ago on the issue of stranger-abduction. 

From what I can see, among “parenting experts” and lay parents alike, the response has been enthusiastic (or, more typically, tentative) support. Whether or not this will bring about a sea-change in cultural attitudes remains to be seen.