Wednesday, July 1, 2015

Down by the Cultural Borderlands

Greece has been much in the news lately, for the stance of its recently-elected government in rejecting the terms of a European Union bailout. 


Though it appears that the Greek government has since softened its stance, it should be no surprise that the Greeks stand alone amongst the other European nations.

For, ethno-linguistically, the Greeks are one of a kind in Europe. All the other European nations come in families: Italians, Spanish, Portuguese, Romanians and French are Latin kin; Dutch, Germans, English and Scandinavians all have Germanic origins; and the Slavs come in several varieties called Russian, Polish, Czech, Slovak, Bulgarian, and so on. 

But there are no nations related to Greece; what is more, the Greeks are known to everyone else not by the name they call themselves (Hellenes), but by the name the Latins gave them: “Graeci.” 

Yet, Hellenic civilization has bordered probably more cultures than even the Latinate (which was, in turn, greatly influenced by the Greeks). Perhaps only Chinese or Arabic culture has affected so many other discrete societies as the Hellenic has. 

Dating back to before Classical times, colonization from Hellas spread Greek culture throughout the Mediterranean. Courtesy of the conquests of Alexander the Great, Hellenic civilization spread throughout the Near East, all the way to India. 

Classicists have speculated as to how history might have progressed had Alexander lived to be elderly, or even middle-aged. Would Greek civilization have fused more thoroughly with the cultures of Persia and points further east, so that the rent between Eastern and Western civilization would afterward have been minimal or non-existent? 


Before I read Nicholas Ostler’s Empires of the Word: A Language History of the World a few years ago, I didn’t realize how long-lived were the dynasties that succeeded Alexander in the Near and Middle East, as well as in northern Africa. 

Many of them persisted, in fact, until they were conquered by Arab armies in the seventh century AD, in the wake of Rome’s collapse in the West. Unlike with the Romans in Europe, however, Greek civilization apparently remained an elite affair in western Asia, such that when the Hellenic dynasts were overthrown, little of their influence remained on the general society. 

Or could it be that the Greek influence was less extirpated than, almost literally, effaced? As is well known, Hellenic philosophy was preserved in Islamic lands, even as those writings were lost to the medieval Occident. Greek thought affected Islam as a systematic code, as well as Muslim scholarship generally. 

But Hellenic-style science and philosophy during the Islamic golden age was conducted in Arabic, and so the Greek language and alphabet themselves were marginalized as scholarly media there. They played a much more important role in the Byzantine empire, which was not conquered by Muslims until the fifteenth century. 

The Byzantine clerisy was, however, determined to refute the pagan free-thinking of the Classical age, which suffered neglect thereby. Scholarly Greek became, too, the rarefied and refined preserve of the Orthodox priesthood, largely incomprehensible to the average Greek-speaking subject of Byzantium (let alone the majority who did not speak the Hellenic tongue at all). 

The Orthodox Church was successful in converting the Slavic barbarians to the north (just as the Roman Church brought northern Europeans into Christianity). After the Turks overran Constantinople, Orthodoxy decamped to Muscovy, such that it became effectively a Slavic church. 

Adopting Old Church Slavonic as the “father tongue” of Russian Orthodoxy, the priesthood wrote it down in Cyrillic, a modification of the Greek alphabet. With the Greek-speaking Orthodox Church under the rule of Islam, Hellenic speech and script, the language of the founding works of Christianity, were marginalized throughout the rival sects of the Church. 

Western Christianity’s use of the Latin alphabet (a borrowing at second hand, via the Etruscans, from the Greeks) rendered the Hellenic language “all Greek” to most Europeans. 

By setting down the New Testament stories in Greek, the Jewish founders of Christianity intended to reach the educated Roman. It was only later, when Christianity became established, that the Gospels were translated into Latin for a popular audience; Latin ultimately became the father tongue of the medieval church. 

Thus it is that, in spite of its vast contribution to Western civilization, the Greek tongue and alphabet are nowadays completely parochial. 

I live directly upon a cultural borderland, between French- and English-speaking Canada. The interprovincial border, just a few hundred yards from my door on the Ottawa river, is the remnant of a skirmish line from the Seven Years’ War, the world war that occurred more than two and a half centuries ago. 

This was another permutation in the rivalry between English and French royalty going back centuries before that, to the conquest of 1066 and earlier. 

Overshadowed by later conflicts, the “French and Indian War”, as it was known in the Thirteen Colonies, was the background for the American rebellion for independence, which broke out just over a decade after the signing of the Treaty of Paris in 1763. 

The Colonists’ final break from the Crown came with the passage of the Quebec Act in 1774, which restored religious rights and seigneurial duties to the French-speaking population of Canada. 

This was viewed by the British Americans as a betrayal of the hard-fought gains of the Seven Years’ War. Many of the American revolutionaries were veterans of that conflict, and the War of Independence was in fact a minor skirmish compared to it: whereas only seven thousand Americans actually died between 1775 and 1783, the death toll of the Seven Years’ War may well have been a million and a half on all sides. 

Whatever the case, a side-effect of the American revolution was the swelling of the anglophone population north of the Great Lakes and St. Lawrence river with Loyalist or Tory refugees. 

These settlers eventually helped found the Dominion of Canada. But Confederation was not intended to be an equal partnership between the French and English. 

The English were happy to allow francophones to exist in their provincial redoubt. Decades before it turned to secession, Quebec nationalism strove for equal partnership with the English elite. 

But the English, viewing themselves as part of a global empire, instead sought anglo-supremacy. Quebec separatism, emerging during the 1960s, was directed toward erasing the anomaly of a cultural borderland that did not coincide with political sovereignty. 

The United States has been the only advanced, industrialized democracy to border a peasant-based economy of primary industry: Mexico. Unlike with Canada, however, until recently the cultural borders of the continental U.S., at least, were relatively coterminous with its political territory. 

Or rather, it has always been more significant for Canadians that the Anglo-American nation’s frontiers extend both far north and south of the 49th parallel. In recent decades, with legal and illicit immigration from Asia and especially Latin America, the U.S. has encountered a cultural schism not unlike that seen throughout the history of Canada: between an anglophone majority with broadly Protestant values, and a Catholic, Latin-language minority who themselves form a majority, or nearly so, within particular provinces or regions. 


After the conquest of the western states from Mexico in the war of 1846-48, U.S. anglos were determined that their values would remain dominant over the remnant Latin culture there (as were English Canadians with the French). 

But as the great prosperity of the U.S. economy was accompanied by the persistent lack of development in most of Spanish America, migratory pressure in the labour market was irresistible. The Mexican-U.S. border at the Rio Grande became in effect a legal fiction, simply because there was every economic incentive — on the part of Mexicans and other Latin Americans, as well as Americans themselves — to treat it as such. 

In many ways, Confederation (which came into being 148 years ago today) was an expression of anti-Americanism. But the process was greatly influenced by the U.S. example. In particular, the founding of the Dominion of Canada upon a constitution, which specified powers divided between different legislatures and levels of government, is directly copied from the American polity, since, famously, Britain has no written constitution. But now that the cultural borderlands of the continental U.S., at least, are no longer coterminous with its political borders, the Americans may have to look to Canada to guide them peaceably into the future.

Tuesday, June 30, 2015

The Bromancing of Humankind

A semi-sequel to this entry.

Over the last decade or so, the word “bromance” has entered the common vocabulary. A contraction of “brother” and “romance”, it refers to an intense but non-sexual friendship between two men. 


I learned just recently that academics have named it “homosociality”; I’ve long thought, nevertheless, that the simple fact of male friendship and camaraderie is key to human evolution. 

The male of Homo sapiens is distinct from other primates in not being asocial. Female chimps, gorillas, bonobos and so on are able to forge bonds with each other, even when they are not direct kin. 

There is little comparable friendship, however, between male primates, including those who are related by blood. 

From what I have read of mammal ethology generally, solitary behaviour amongst the males of this order of life is the rule rather than the exception. 

Somehow, the Homo line was able to extend the sympathetic drive inherent in the mammalian bond into a commanding instinct across the entire species. 

The instinctual status of human sympathy is attested to by how certain people manipulate it for their own benefit. Confidence artists, for example, often commence their scams by appearing to be infirm, or by regaling their mark with one “sob story” or another. 

Children, too, seek to avoid punishment for mischief by (as a friend put it years ago) “crying my way out of it.” Tears are a fascinating part of the human behavioural repertoire. Though expressions of sadness and distress are common amongst mammals at least, only anecdotal evidence exists of tears being shed by animals other than the human, even amongst close primate relatives such as chimpanzees and gorillas. During infancy and childhood, crying is as much a response to actual physical harm or injury as to hurt feelings, or more so. 


But with maturity, crying is almost always an emotional, as opposed to physical, reaction. It is perhaps the most conspicuous example of how physiologically integrated the human primate is into its social milieu. According to the abstract of a paper published by Dr. Oren Hasson of Tel Aviv university, “Multiple studies across cultures show that crying helps us bond with our families, loved ones and allies ... By blurring vision, tears reliably signal your vulnerability and that you love someone, a good evolutionary strategy to emotionally bind people closer to you.” According to Dr. Hasson, “Crying is a highly evolved behavior. Tears give clues and reliable information about submission, needs and social attachments between one another. ... My analysis suggests that by blurring vision, tears lower defences and reliably function as signals of submission, a cry for help, and even in a mutual display of attachment and as a group display of cohesion.” 

Most everyone can make themselves cry, but tears usually come involuntarily. This is why it has been viewed as shameful to cry, especially for men. Women frequently cry during arguments with their boyfriends or husbands, but female attestations of crying when alone demonstrate that they, too, are embarrassed by the “weakness” of tears. 

Since it is either not possible, or extremely rare, for animals to cry, tears of sadness must be a product especially of Homo evolution. The sympathetic bond in hominids must go back a very long way, for a special, involuntary faculty (tears) to have evolved to arouse compassion automatically in others. 

Selection pressures inherent in an environment of intense sociality caused an adaptive change in the biology of the Homo primate. Tear ducts have been an essential part of the eye ever since the sense organ evolved in multicellular lifeforms. 

Excessive secretion of tears occurs throughout the animal kingdom in response to contamination of the cornea (people often excuse emotional tears by saying that “smoke or something is in my eye”). The ancestors of humans developed the ability to tear up in response to social stimuli, in addition to natural ones such as smoke. 

Sobbing gives the physiognomy a striking resemblance to that of a sick person. Taking care of the ill and infirm is very ancient behaviour, too, as evidenced by Homo fossil specimens found without teeth, or with crippling diseases or injuries, that would have precluded their owners from taking care of themselves. 

As noted, con-artists often pretend illness as a means of gain, but such a ruse is not always carried out for greed alone. People feign sickness simply for the sympathy it evokes, which is telling in itself. And, though tears are typically involuntary, people can make themselves cry for manipulative purposes as well. Conversely, sadness can induce illness, either psychosomatically, or by suppressing immunity to viruses and disease. 

It is a vivid example of how culture affects the very physiology of the human primate. For the vast majority of the human race, sympathy is an instinct that is evoked by the appropriate stimuli, such as crying. 

Most health-scam victims are shocked and ashamed at their own gullibility. They didn’t, as the saying goes, “hear alarm bells going off,” because their cold reason was overcome by the drive to help others in need. In spite of the neologism, “bromance” is thus very primal to the Homo family. Human uniqueness began not with any particular technical skill, but with the turn toward sociality by the hominid male. 


Evidence of male involvement in Homo social groups goes back a long way, at least to the “first family”, the set of adult and child fossils discovered by Donald Johanson in Africa, which dates to more than three million years ago. 

It isn’t only that the hominid male was able to bond with a family group, a remarkable turnabout from normal primate behaviour. It was even more so that Australopithecus (or later hominid) males learned how to get along with each other, which is very exceptional for a mammalian species. 

The asocial behaviour of the male, predator and forager, is itself a result of selection pressures. The main evolutionary purpose of aggression, for male mammals at least, is to establish territory for reproductive and sustenance needs, ensuring that their genes, and not those of other males of their species, will be passed on. 

For hominid males, though, reproductive success was assured by sociality, not individualistic territoriality. Aggressiveness was retained as part of the Homo behavioural repertoire, except that it was now practised socially as well. 

In this, male aggression followed the normal mammalian pattern. Throughout the animal kingdom, mothers are driven to violence when their young are seen to be under threat. Bonding with the kinship group, Homo males used cooperative aggression to allay menace from others of their kind, and from different species as well (including other hominids). 

Brutality carried out as vengeance for injury visited upon kin or comrade, is a commonplace in human history. Such aggression is provoked even (or especially) when the initiating harm is symbolic, as much as physical. Sympathy exists dialectically with aggression throughout the mammalian order (as with the “mama bear protecting her cubs”). 

The involvement of males in the Homo social group established a territorial ground expansive enough for complex social and cultural behaviour to take shape. 

It came through the use of aggression as an expression of sympathy by the individual for the group. Long before humankind settled down in farms and cities, it had colonized all the continents but one. 

This would have been impossible without the intense cooperation witnessed particularly in the hunting tribe, a cultural artefact responsible wholly or in part for the extinction of most land-dwelling mega-fauna (and many other creatures medium and small). 

The skills and abilities honed by hunting were institutionalized in the warrior band, the explicit purpose of which is to attack humans, not animals. Martial elites have dominated the rest of humankind throughout recorded history. 

This was achieved pragmatically through the control of the means of destruction — weaponry. Just as important, though, was the “spiritual” part of the warrior ethos, which demands the surrender of mere egotistical concerns for the sake of the corps. 

The career soldier in any era lives to practice selfless commitment to his “brothers”, right up to laying down his life for them. It is the pan-cultural tradition that goes under the name of “honour”, and which often inspires respect, even friendship, between those who previously fought on opposite sides. 

Warriors lack respect for those who do not fight, hence the violence and brutality meted out to non-combatant populations during the conduct of war. The military has been the most prominent example of homosociality throughout history, but barracks’ life has apparently included plenty of active homosexuality as well. 

This was famously the case with the armies of the Classical world, and more recently life in the Royal Navy was said to consist (in words often attributed to Winston Churchill) of “rum, sodomy and the lash.” 

The warriors’ tolerance or encouragement of same-sex relations may well have inspired the monotheistic faiths to forbid homosexuality outright. Judaism, Christianity and Islam all seek to encourage sympathy in a principled and pacifist way anathema to the warriors’ bond, and to extend it toward all of humanity, regardless of sex, race or status. 

Practically, of course, each of the monotheistic faiths has had to fight holy wars. But the conspicuous example of Christian homosociality was the monastery, in which participants were supposed to abjure not only aggression and other sinful behaviours (such as sex), but also to give up family life and other traditional means of belonging. 

Interestingly, monasticism is not just discouraged, but forbidden outright in Islam, the same religion which uniquely and explicitly lays out the rules for just holy war, or jihad. 

It was hominid-male sociality which, sooner or later, displaced the female as the centre of the kinship group among human primates. It has been periodically fashionable to suggest that, under the hunter-gatherer economy, all human societies were matriarchal, and thus peaceful and cooperative in character. 

This idyll was despoiled only with the rise of farming and civilization and, with these, warrior elites. It is true that virtually all civilizations have been highly patriarchal; among ancient cultures, only the Etruscans are believed to have granted any status at all to women. Perhaps tellingly, though, Etruscan dominance of the Italian peninsula was short-lived, falling with relative speed to the avowedly patriarchal Latins. 

It isn’t entirely clear, however, that matriarchy prevailed prior to the rise of civilization. It certainly isn’t the case in the hunter-gatherer societies that survived into the historical era. Some are matrilineal, it is true, but others are strictly patriarchal. And regardless of family structure, in no hunter-gatherer society do the men behave peaceably. On the contrary, they are often more violent than those in agrarian or urbanized cultures. 

Whatever happened during prehistory, though, there is good reason why patriarchy became the dominant mode of kinship as human beings became sedentary and civilized. It is that, paradoxically, males are more expendable than women. If, hypothetically, a society were to lose most of its men, it could rebound in population quite easily, by having the survivors impregnate the women. But the converse is not true: no matter how many men a woman has relations with, she will become pregnant by only one of them. 

Thus it is that polygamy is extremely common throughout history — especially in war-like cultures — while polyandry (the marriage of more than one man to a single woman) is correspondingly rare. 

As progenitors of the race, women were so valuable that they came to be held as property by men, whose own lives were in turn rated much cheaper when they owned no women. The ancient and common practice was for a conquering army to kill the men and enslave the women, who could give birth to the next generation of slaves. Reducing women to the status of chattel allowed warrior men to act upon their sympathetic drive mostly in regard to their fellow soldiers. 

The intense homosociality necessary to create a domineering male elite is compromised by the mundane and tender concerns inherent in family life. There is, for men, an enduring tension between family life and the brotherhood of arms (or of anything else). 

Monotheistic religion attempted to direct the male away from a martial mentality, toward a marital one. The warrior ethos had been promulgated almost as an ideal type in the Spartan polis, counterpart and rival to Athens in Classical times. 


There, warriors were separated from their families physically as well as psychologically. With the onset of adolescence, boys were removed from the care of their mothers, sent to barracks for intense training and complete immersion into military life. 

Upon adulthood, marriages were arranged, after which men and women lived apart (but for periodic assignations in which the husband would ritually, or literally, rape his wife). Whilst Spartan women remained in the household to take care of the young, the men kept rigidly to themselves, training for the next combat, conquering enemies, and otherwise engaging in homosexuality to an extent that scandalized even the Athenians. 

The degree to which military success depends upon a homosocial esprit de corps has been revealed in recent decades, as women have been permitted to enter the armed services, even taking up positions in combat battalions. 

Not long ago, for example, a retired Canadian Supreme Court justice issued a report which found that sexual harassment was rife in the country’s co-ed military units. The American military, meanwhile, does not allow females to serve alongside men in combat at all, precisely because it would undermine unit cohesion, as apparently has happened in the Canadian and other armed forces that (unlike the U.S. forces) are remote from actual fighting.

Monday, June 29, 2015

The Ancient Words We Use Each Day: An Attempt

In order to describe the novel techniques and devices of modernity, it was necessary to employ the terminology of ancient (if not dead) languages. 

Thus, we have the telegraph (a construction from ancient Greek meaning, “writing at a distance”) and telephone (“sound at a distance”) and television (which combines the Greek “tele-” with the Latin, “to see”). 


Because "tele-thea" would sound too awkward...
www.pyroandballyhoo.com


Indeed, the words “technology” and “engineering” are derived from ancient sources. 

In this, however, there is a puzzle. Most words used for technology are Greek in origin, whereas the ancient language associated with Western Christianity was Latin. 

The latter survived as a father tongue, long after its native speakers had died out, because it was useful to communicate in script between clerics of different mother tongues. 

After it ceased to be the common language of scholarship, Latin evolved into the language of scientific taxonomy, again to overcome the problem of categorizing species consistently for researchers speaking different languages. 

Yet, when scientific principles were applied to practical ends, to produce engineered technology, inventors and investors in these novel, automated techniques, turned mainly to Greek, rather than Latin, to provide the nomenclature for them. 

This preference for Greek over Latin was witnessed not only in the applied arts. The academic fields that developed within the nineteenth-century university, such as psychology, sociology, anthropology, and so on, took on Hellenic monikers instead of Latin ones, in spite of the fact that Latin was the language of classification in the biological and chemical sciences (though both “biology” and “chemistry” are themselves terms derived from ancient Greek). 




This is probably the result of the revival of classical Hellenism as the university grew during the nineteenth century. In their battle against the ecclesiastical masters of the medieval university, the classicists preferred the old Greek language over the Latin, as a tongue untainted by Christian influences. 

In spite of the fact that the New Testament was written in Greek, western European scholars were far enough removed from Eastern Orthodoxy that their view of the language was not jaundiced by clerical influences. 

Although classical Greek never became, for the nineteenth-century university don, the language of international scholarly intercourse that Latin had been during the Middle Ages, it obviously exerted its influence upon the academic specialties that grew up (paradoxically) under the classicists’ influence. Even religious teaching became specialized as “theology” (another ancient Greek term).

Saturday, June 20, 2015

Money is the Root of ... Some Evil

I may begin an irregular series in which I attempt to debunk popular clichés.

For instance, the saying, “money is the root of all evil”, has been passed down through the centuries. 

It was inspired by the writings of the Apostle Paul, who actually stated that all wrongdoing can be traced to an excessive attachment to material wealth. Usually, this is translated as “the love of money is the root of all evil.” 



But really, even with this qualifier, such a sentiment cannot be upheld as even half-true. Could even half the evil in the world be attributed to a “love of money”? Certainly, there are many who believe so, and who say that all evil is attributable to the love of money. Indeed, even self-sacrifice is often accused of having greed as its ulterior motive. 

Certainly, many forms of criminality, such as theft, robbery, extortion, blackmail, and so on, have as their motive a lust for money. However, more serious - and truly evil - crimes have other causes besides. 

For instance, rape is not motivated by greed (though the word “rape” originally meant “theft”), even where the assault is accompanied by theft of money and materials. 

For that matter, most murders and deadly assaults are not committed in the name of greed, but because of jealousy, feud, rage or mental illness. The worst forms of evil, as recognized by criminal law the world over, most of the time do not have greed as their motive at all. 

It is often said that war is always, at root, about the lust for wealth. This is true, in its way. For it is lust for necessities, rather than luxuries, which drives states to go to war with each other. Civilized society depends upon the extraction of resources for its existence. 

As natural wealth dwindles and human riches grow, states are forced to range further afield to secure staple resources. Inevitably, each state encounters others with the same goal. Mutual need sometimes leads to peaceful trade, but more often to war. 

Recorded history is in large part the annals of states going to war to secure resources that they literally cannot do without. But kings and strongmen have waged war for reasons other than mere plunder. War is driven more by the necessity of neutralizing some presumed rival, whose equality of power and resources seems a threat in itself. Even where the explicit goal of war is wealth, the result is most often a loss of precious blood and coin for all involved. 

A lust for money does not motivate, but rather ameliorates, the lust for war. By expediting all forms of exchange, money encourages the production of what is plentiful, in trade for what is scarce. Since war interrupts trade and devalues money, cash provides an incentive for civil rather than martial efforts to acquire resources. 

In pacifist literature, war is often portrayed as the fault of profiteering bankers and bomb-makers. The high-interest loans that banks grant to governments to pay for war might more than compensate for the loss of civilian business from destitute populations, except that governments have always defaulted on, delayed, reneged upon, or renegotiated these loans in peacetime. 

During the Renaissance, several prominent banking houses went bankrupt after many princes defaulted on their loans, stalling the cash economy for centuries. 

In the twentieth century, lust for wealth seems to have been a very secondary motive in the pursuit of the World Wars, except for the need to acquire staple resources. Nazi Germany was determined to set up a rustic leisure-state of hundreds of millions of Aryans, supported by a slave class of Slavs and others the Nazis deemed racial mongrels. 

The Nazis’ eye was always to the east, to the mythical homeland of the Aryan race, and they saw Russia as the obvious target for conquest. They were hardly concerned with the material splendour of the Soviet Union, most of which had already been destroyed by the Bolsheviks. They instead coveted the oil, wheat and other untapped riches of the Rus. 

The Nazis intended to level the conquered Soviet Union, and replace it with the new Aryan super-state. There, the racially pure would live in a cashless, medieval utopia. 

Similarly, the first Great War was inspired not by the love of money, but by that primal concern of statesmen: to neutralize rivals of equal power to themselves. Germany emerged from its defeat of France in 1870 as the new global rival of the British. From then until 1914, the British and Germans engaged in an arms race, the former becoming allies with old rivals France and Russia to complete what the Germans saw as “encirclement.” When war finally came, all the participants seemed reluctant, and horrified at having to carry out what they nevertheless understood was inevitable. 

Each side consciously decided to sacrifice in order to stand up for an ideal: for the Austrians, to avenge the murder of the heir to the throne; for the Russians, to defend a fellow Slavic land from Germanic aggression; for the Germans, to defend their Austrian allies against Russian aggression, and to show the British that they hadn’t dominion over the world; for the French, to avenge the conquest of Alsace-Lorraine; for the British, to defend against “Prussian” authoritarianism in Europe and around the world. 


The diplomacy leading to the First World War has also been described as a case of several bald men fighting over a comb. In the end, all sides sacrificed far more in material wealth (not to mention the millions of lives lost) than anyone believed possible at the outset. Each side managed to do far more damage to itself than it ever inflicted on its enemies, and the victors were scarcely victorious. The great powers consciously liquidated their considerable reserves of wealth, all for the principle of the thing, and they didn’t succeed in making war obsolete, either. 

If war discloses just how evil the human animal can be to others of its kind, this malice has its seat in the idealistic or abstract faculty. War erupts as opponents threaten, or seem to threaten, each other’s access to the necessities of civilization. It is often precipitated when one side or the other is seen to violate some basic or sacred ideal or moral, and is undertaken usually in spite of its cost. 

Wars fought on principle, whether between states or between factions of the same state, have fomented more evil than other sorts of conflict. They not only visit torture and brutality upon combatants, they destroy material wealth all around. 

Often, the principle being fought over is religious in character. Religious civil war frequently has a “decadent” establishment pitted against a puritanical insurgency. When the latter prevails, yet more material wealth is destroyed in the name of the ascetic. 

In the twentieth century, highly-principled enemies of material wealth succeeded to power in many, very populous lands. The result was a literal decimation (the destruction of one in ten) of the affected populations, with millions more killed in inter-state wars. 

It is hard to describe either the Nazi or Soviet regime without reference to evil. The Nazis were satanic, for their deliberate genocide. But the “Aryan” German not involved with trade-unionism or communism was not subject to the sort of ongoing terror that affected the citizen-subjects of the Soviet empire. 

The latter was a regime of arbitrary arrest, detention, imprisonment and banishment, all to the end of cowing the people into cooperation with the latest five-year plan. 

Given all this, it would be more true to say that “the hatred of money is the root of much evil.” In contemporary times, at least, there has been a demonstrable link between the capacity to commit great evil and the willingness to provide humanitarian measures for the downtrodden. 

Those who have committed evil on a mass scale are praised by many exactly for their selflessness. Thus, while Ulyanov overthrew nascent Russian democracy and set into place the apparatus of terror that later consumed millions, Lenin is still praised by many for his frugal lifestyle and humble demeanour. When his successor condemned thousands in show trials during the 1930s, Walter Duranty, Moscow correspondent for the New York Times, observed that making an omelette required breaking eggs (Duranty won a Pulitzer Prize for his reporting on the Soviet Union). 

Later, after the war, Duranty wrote in the Nation that “purge” in Russia meant merely “to clean”, and thus the purges that Stalin was undertaking at that time were merely a colonic for the Soviet body politic. 

Those socialist apostates, the fascists, had their admirers at home and abroad as well. Before and during the war, and even years afterward, it was common to hear that while persecution and the Holocaust were surely bad things, "Hitler did put millions to work." Similarly, Mussolini was credited with "making the trains run on time."  

In the anti-colonial wars which followed World War II, the most brutal insurgents were also those who provided humanitarian assistance to the peasantry. The lack of civil liberty in Castro’s Cuba, similarly, is excused because of the regime’s alleged provision of the best medical care and free schooling. It implies no cynicism in the motives of the parties just noted, to suggest that humanitarianism and brutality are expressions of the same will to power and control. Terror and welfare instill dependency in populations: giving is implicitly or explicitly coupled with taking away.

Of course, money itself is scarcely problem-free. It is, in fact, a fetish, a contemporary version of the witch-doctor's talisman. It is perhaps for this reason, in turn, that some of the early Christians considered if not money itself, then the excessive love of it, as "evil": it was simply too pagan a thing to be innocent.

Monday, June 15, 2015

The "People's Princess", Not

In essence, television is entertainment made into a domestic appliance. 

As such, television “personalities” cannot be, as with entertainers in other media, unique and outrageous. Instead, they are characterized by blandness and inoffensiveness. Their role is simply to be. 

This is attested to by the fact that when performers in other media become TV “personalities”, their actual talent (singing, dancing, comedy, acting, etc.) becomes secondary, even forgotten entirely. 

When these personalities retire or otherwise disappear from the mass media, they are almost always forgotten in a matter of months. 

To analogize: a particular dish eaten at a restaurant may well be remembered months or even years later for its scrumptious unfamiliarity, while an eatery that serves only bland and conventional foods is a “greasy spoon”. Public entertainment, similarly, should lack blandness and convention. 

By moving theatre indoors, television made the impact of any individual entertainer no more significant or durable in viewers’ experience than the consumption of a piece of toasted bread with bottled jam. Which is to say, it may be enjoyed in the moment, but who remembers eating a delicious slice of toast even a day later? 



I think this explains the disappearance from consciousness of Diana Spencer, the Princess of Wales who died in a car wreck in Paris in 1997. 

This event brought shock and sadness throughout the globe. Hundreds of millions of people watched the funeral telecast, which was attended by celebrities and dignitaries from around the world. In the days before the service, crowds in the tens of thousands remained outside Buckingham Palace. There were rumblings of discontent amongst these “mourners”, directed at the Royal family generally, and at the news media especially. 

The latter allegedly contributed to the death of the Princess, as it was reported that Diana and her companions were attempting to flee the paparazzi when their vehicle crashed in a Paris tunnel. The grandiose claim that Diana’s tragic passing would hasten the end of the monarchy, became commonplace. Of course, nothing of the sort occurred. It turned out that, in fact, the paparazzi were not in "hot pursuit" of Diana’s car when it crashed. 

If the mass media did not kill her, then, it did indeed create the Princess of Wales. Which is to say, it created the “personality” that was so omnipresent on television, that the death of the real person behind it was enough to apparently stop the business of the world for days at a time. 

In the end, though, Diana was merely a personality. In this, she played the part of the bland and inoffensive perfectly. She became famous — at the age of nineteen — not for any talent or achievement. 

It was simply that Lady Diana won the hand of the heir to the British throne. Never truly beautiful, Diana Spencer was quite sightly nevertheless, and seemed an appropriate person to be queen. 

There is, in any case, a cultural analogy between television “personalities” and the British royal family as figureheads of state. The role of the latter is simply to be, to somehow “embody the nation” (in the Middle Ages, this was a literal thing). 

It is no wonder that in the television age, the “Windsors” became so celebrated, as the medium focuses on being over becoming as a matter of course. 

Again, Diana seemed insubstantial enough to fulfill the role chosen for her. Her initial shyness at being a public figure only enhanced her status as a creature of television. In spite of her marriage to Prince Charles ending (if I may) in a wreck, Diana remained a global celebrity until her death, because of her televisual qualities. It really seemed, when she did die, that Diana was the “people’s princess”, the “queen of our hearts”, as one newspaper headline put it. 

She was not, though. It turned out she was no more important to people than any other mass-produced consumer product, which is why interest in Diana dropped off precipitously in the years after her death. I’m certain that most of what has been written about her since then has consisted of conspiracy narratives relating to the circumstances of her death. 



The passing of Diana Spencer was the last major media event during which the Internet was irrelevant. In 1997, electronic mail was still the only truly revolutionary medium associated with Internet technology. The World Wide Web at that time largely consisted of amateur sites devoted to Star Trek, the American Civil War and other hobbyist and “fandom” subjects. 

It was, in essence, an enhanced “bulletin-board service”, like the local computer networks set up by geeks during the 1980s. The web had global reach, but the Internet even in ’97 was not truly global, given the number of people (probably a large majority at the time) who didn’t have access to it at all. 

Virtually the only e-commerce that took place then was the exchange of pornographic imagery. 

News organizations had not yet treated the web as anything more than an afterthought — if at all. The “paradigm-shifting” services characteristic of the more recent Internet, got going just after Diana’s death: the peer-to-peer file sharing service, Napster, started in 1999, for example. 

Her death was thus the last hurrah of “old media”, the few-to-many transmission of information processed by gatekeepers. 

Upon news of the accident, even before her death was confirmed, all the regular networks and cable-news services threw out their scheduled programming and devoted live, ongoing coverage not only to the accident itself and the funeral of Diana, but also to the many non-events that took place in the days between these two landmarks. 

Such live coverage is paradoxical, at least for commercial television services. Tragedy and its aftermath attract big audiences. Yet ongoing live coverage of political assassinations, accidental deaths, terrorist attacks, natural disasters, and the like is a money-losing proposition for network and cable TV. Not only is it costly in overtime paid to on-air talent and behind-the-scenes technicians. 

Commercial television services also forgo, during extensive live coverage, their only true source of revenue: advertising. TV networks don’t wish to “sully” the sombre attention paid to the tragic event in question with inappropriately jaunty or upbeat commercial messages. 

Similarly, advertisers are reluctant to have their own products associated with downbeat and dreadful subjects. Thus, live coverage is for commercial television a vast sea of red ink — yet everyone involved seems highly motivated to undertake this money-pit enterprise. It confirms the assertion of Father Ong, years ago, that the live event is television’s true métier. 

Originally, television broadcasts were all live. In essence, the comedy and dramatic programmes of the first decade and more of TV were simulcasts of stage plays performed live in front of an audience. Even today, the most highly rated regular television programmes are live events — that is, coverage of sports and other athletic contests. 

As with the stage actor, the actors on live TV — the various “anchors”, “analysts”, pundits, reporters and so on — come into their own when performing without a script, doing improv as it were. It is their chance to shine. Yet coverage of the death of the Princess of Wales became unintentional self-parody. Diana Spencer was, after all, merely a celebrity — someone famous (as Daniel Boorstin said) for being famous. 

She could no longer even stake a symbolic claim to political importance, as future queen consort. Again, though, TV-news services treated her passing in a manner befitting an important world stateswoman. There was a telling contrast between the coverage of Diana’s death and that of another world-famous individual, Mother Theresa, who died a few days after the Paris crash. An Albanian-born nun, Theresa had for many decades run an orphanage in the poorest parts of Calcutta, India. 


Her work has been, posthumously, subject to snarky commentary. But whatever the truth of Mother Theresa, she was noted for actual things she did in the real world. Her death was, however, completely overshadowed by the Diana marathon-coverage. And although everyone seemed united in grief for the passing of the Princess of Wales, the very disproportion in the coverage of her death, as compared to her actual accomplishments, inspired its own dissent in the form of humour. 

I said before that email was the truly revolutionary product of the Internet, circa 1997. It was through this medium that “Diana jokes” began to be passed back and forth. More than a month after the accident, these were documented in a story in the Globe and Mail (national edition of Oct. 7, 1997, page A12). 

Chris Defoe reported that, “So far, the mainstream comedy world has been largely silent on the subject, even as it has dominated the headlines and newscasts around the world. [Chat-show hosts Jay] Leno and [David] Letterman, those two touchstones of comedy consensus, seem to have avoided the subject completely. So have most other "official" comedy voices...” But, Defoe went on, “The first Diana joke appeared on the Internet — that hightech watercooler — within days of the crash, and over the past month more than 100 jokes have been posted, collected and circulated on and off the Net.” They included this one, “Prince Charles was out early the other day when a passerby said, "Morning," Charles said, "No, just walking the dog."” Another went, “What's the difference between a Mercedes and a BMW? Diana would never be caught dead in a BMW.” This sort of “gallows humour” response to hyped-up media coverage of tragic events, was precedent to the accidental death of Diana Spencer. 

More than a decade earlier, with the explosion of the space shuttle Challenger, jokes quickly began circulating around North America at least, presumably transmitted by long-distance telephone call. These included a new acronym for “NASA”, “Need Another Seven Astronauts”, and “What did Christa McAuliffe [the schoolteacher on the Challenger who was going to be the first civilian aboard a space shuttle] say to her husband before leaving for the flight?: `You feed the dog, I’ll feed the fish.’” (The spacecraft was launched at Cape Canaveral, and disintegrated seventy-three seconds later over the Atlantic Ocean.) The loss of the Challenger crew, like the passing of Diana and her companions, was indeed a tragedy. 

But the response by the mass media was completely over-the-top. As with a mourner at a funeral who screams and sobs uncontrollably for a distantly-related or barely-known departed, observers cannot help but roll their eyes and whisper snark into each other’s ears.