Tuesday, June 30, 2015

The Bromancing of Humankind

A semi-sequel to this entry.

Over the last decade or so, the word “bromance” has entered the common vocabulary. A contraction of “brother” and “romance”, it refers to an intense but non-sexual friendship between two men. 


Academics, I learned only recently, have named it “homosociality”. I’ve long thought, in any case, that the simple fact of male friendship and camaraderie is key to human evolution. 

The male of Homo sapiens is distinct from other primates in not being asocial. Female chimps, gorillas, bonobos and so on are able to forge bonds with each other, even when they are not direct kin. 

There is, however, no comparable friendship between male primates, even those related by blood. 

From what I have read about mammal ethology generally, solitary behaviour amongst the males of this class of life is the rule rather than the exception. 

Somehow, the Homo line was able to extend the sympathetic drive inherent in the mammalian bond into a commanding instinct across the entire species. 

The instinctual status of human sympathy is attested to by how certain people manipulate it for their own benefit. Confidence artists, for example, often commence their scams by appearing to be infirm, or by regaling their mark with one “sob-story” or another. 

Children, too, seek to avoid punishment for mischief by (as a friend put it years ago) “crying my way out of it.” Tears are a fascinating part of the human behavioural repertoire. Though expressions of sadness and distress are common amongst mammals at least, only anecdotal evidence exists of tears being shed by animals other than the human, even amongst close primate relatives such as chimpanzees and gorillas. During infancy and childhood, crying is as much a response to actual physical harm or injury as to hurt feelings, or more so. 


But with maturity, crying is almost always an emotional rather than a physical reaction. It is perhaps the most conspicuous example of how physiologically integrated the human primate is into its social milieu. According to the abstract of a paper published by Dr. Oren Hasson of Tel Aviv University, “Multiple studies across cultures show that crying helps us bond with our families, loved ones and allies ... By blurring vision, tears reliably signal your vulnerability and that you love someone, a good evolutionary strategy to emotionally bind people closer to you.” Hasson elaborates: “Crying is a highly evolved behavior. Tears give clues and reliable information about submission, needs and social attachments between one another. ... My analysis suggests that by blurring vision, tears lower defences and reliably function as signals of submission, a cry for help, and even in a mutual display of attachment and as a group display of cohesion.” 

Almost everyone can make themselves cry, but tears usually come involuntarily. This is why crying has been viewed as shameful, especially for men. Women frequently cry when arguing with their boyfriends or husbands, but female attestations of crying when alone demonstrate that they, too, are embarrassed by the “weakness” of tears. 

Since it is either not possible, or extremely rare, for animals to cry, tears of sadness must be a product especially of Homo evolution. The sympathetic bond in hominids must go back a very long way, for a special, involuntary faculty such as tears to have evolved to arouse compassion automatically in others. 

Selection pressures inherent in an environment of intense sociality caused an adaptive change in biology for the Homo primate. Tear ducts have been an essential part of the eye since that sense organ first evolved in multicellular lifeforms. 

Excessive secretion of tears occurs throughout the animal kingdom in response to contamination of the eye’s surface (people often excuse emotional tears by saying that “smoke or something is in my eye”). The ancestors of humans developed the ability to tear up in response to social, in addition to natural, stimuli (such as smoke). 

Sobbing gives the physiognomy a striking resemblance to that of a sick person. Taking care of the ill and infirm is very ancient behaviour, too, as evidenced by Homo fossil specimens found without teeth, or with crippling diseases or injuries that would have precluded them from caring for themselves. 

As noted, con-artists often pretend illness as a means of gain, but such a ruse is not always carried out for greed alone. People feign sickness simply for the sympathy it evokes, which is telling in itself. And though tears are typically involuntary, people can make themselves cry for manipulative purposes as well. Conversely, sadness can induce illness, either psychosomatically or by suppressing immunity to viruses and disease. 

It is a vivid example of how culture affects the very physiology of the human primate. For the vast majority of the human race, sympathy is an instinct that is evoked by the appropriate stimuli, such as crying. 

Most health-scam victims are shocked and ashamed at their own gullibility. They didn’t, as the saying goes, “hear alarm bells going off,” because their cold reason was overcome by the drive to help others in need. In spite of the neologism, “bromance” is thus very primal to the Homo family. Human uniqueness began not in any particular technical skill, but with the turn toward sociality by the hominid male. 


Evidence of male involvement in Homo social groups goes back a long way, at least to the “first family”, the set of adult and child fossils discovered by Donald Johanson’s team at Hadar, Ethiopia, which date to more than three million years ago. 

It isn’t only that the hominid male was able to bond with a family group, a remarkable turnabout from normal primate behaviour. Even more remarkable is that Australopithecus (or later hominid) males learned how to get along with each other, something very exceptional for a mammalian species. 

Whether as predator or forager, the asocial behaviour of the male is itself a result of selection pressures. The main evolutionary purpose of aggression for male mammals is to establish territory for reproductive and sustenance needs, ensuring that their genes, and not those of other males of their species, will be passed on. 

For hominid males, though, reproductive success was assured by sociality, not individualistic territoriality. Aggressiveness was retained as part of the Homo behavioural repertoire, except that it was now practised socially as well. 

In this, male aggression followed the normal mammalian pattern. Throughout the animal kingdom, mothers are driven to violence when their young are seen to be under threat. Bonding with the kinship group, Homo males used cooperative aggression to ward off menace from others of their kind, and from different species as well (including other hominids). 

Brutality carried out as vengeance for injury visited upon kin or comrade is a commonplace in human history. Such aggression is provoked even (or especially) when the initiating harm is symbolic rather than physical. Sympathy exists dialectically with aggression throughout the mammalian order (as with the “mama bear protecting her cubs”). 

The involvement of males in the Homo social group established a territorial ground expansive enough for complex social and cultural behaviour to take shape. 

It came through the use of aggression as an expression of sympathy by the individual for the group. Long before humankind settled down in farms and cities, it had colonized all the continents but one. 

This would have been impossible without the intense cooperation witnessed particularly in the hunting tribe, a cultural artefact responsible wholly or in part for the extinction of most land-dwelling mega-fauna (and many other creatures medium and small). 

The skills and abilities honed by hunting were institutionalized in the warrior band, the explicit purpose of which is to attack humans, not animals. Martial elites have dominated the rest of humankind throughout recorded history. 

This was achieved pragmatically through the control of the means of destruction — weaponry. Just as important, though, was the “spiritual” part of the warrior ethos, which demands the surrender of mere egotistical concerns for the sake of the corps. 

The career soldier in any era lives to practice selfless commitment to his “brothers”, right up to laying down his life for them. It is the pan-cultural tradition that goes under the name of “honour”, and which often inspires respect, even friendship, between those who previously fought on opposite sides. 

Warriors lack respect for those who do not fight, hence the violence and brutality meted out to non-combatant populations during the conduct of war. The military has been the most prominent example of homosociality throughout history, but barracks life has apparently included plenty of active homosexuality as well. 

This was famously the case with the armies of the Classical world, and more recently life in the Royal Navy was said to consist (in words often attributed to Winston Churchill) of “rum, sodomy and the lash.” 

The warrior tolerance or encouragement of same-sex relations may well have inspired the monotheistic faiths to forbid homosexuality outright. Judaism, Christianity and Islam all look to encourage sympathy in a principled and pacifist way anathema to the warriors’ bond, and toward all of humanity, regardless of sex, race or status. 

Practically, of course, each of the monotheistic faiths has had to fight holy war. But the conspicuous example of Christian homosociality was the monastery, in which participants were supposed to abstain not only from aggression and other sinful behaviours (such as having sex), but also to give up family life and other traditional means of belonging. 

Interestingly, monasticism is not just discouraged, but forbidden outright in Islam, the same religion which uniquely and explicitly lays out the rules for just holy war, or jihad. 

It was hominid-male sociality which, sooner or later, displaced the female as the centre of the kinship group among human primates. It has been periodically fashionable to suggest that during the time of the hunter-gatherer economy, all human societies were matriarchal, and thus peaceful and cooperative in character. 

This idyll, the argument goes, was despoiled only with the rise of farming and civilization, and with these, warrior elites. It is true that virtually all civilizations have been highly patriarchal; of ancient cultures, only the Etruscans are believed to have granted any status at all to women. Perhaps tellingly, though, Etruscan dominance of the Italian peninsula was short-lived, falling with relative speed to the avowedly patriarchal Latins. 

It isn’t entirely clear, however, that matriarchy prevailed prior to the rise of civilization. It certainly wasn’t the case with hunter-gatherer societies that survived into the historical era. Some are matrilineal, it is true, but others are strictly patriarchal. And regardless of family structure, in no hunter-gatherer society do the men behave peaceably. On the contrary, they are often more violent than those in agrarian or urbanized cultures. 

Whatever happened during prehistory, though, there is good reason why patriarchy became the dominant mode of kinship as human beings became sedentary and civilized. It is that, paradoxically, males are more expendable than women. If, hypothetically, a society were to lose most of its men, it could rebound in population quite easily, by having the survivors impregnate the rest of the women. But the converse is not true: no matter how many men a woman has relations with, she will become pregnant by only one of them. 

Thus it is that polygamy is extremely common throughout history — especially in war-like cultures — while polyandry (the marriage of more than one man to a single woman) is correspondingly rare. 

As progenitors of the race, women were so valuable that they came to be held as property by men, whose lives in turn were rated much cheaper when not in ownership of women. The ancient and common practice was for a conquering army to kill the men and enslave the women, who could give birth to the next generation of male slaves. Reducing women to the status of chattel allowed warrior men to act upon their sympathetic drive mostly in regard to their fellow soldiers. 

The intense homosociality necessary to create a domineering male elite is compromised by the mundane and tender concerns inherent in family life. There is an enduring tension, for men, between family life and the brotherhood of arms (or of anything else). 

Monotheistic religion attempted to direct the male away from a martial mentality, toward a marital one. The warrior ethos, by contrast, was promulgated almost as an ideal type in the Spartan polis, counterpart and rival to Athens in Classical times. 


There, warriors were separated from their families physically as well as psychologically. With the onset of adolescence, boys were removed from the care of their mothers, sent to barracks for intense training and complete immersion into military life. 

Upon adulthood, arranged marriages would occur, after which men and women lived apart (but for periodic assignations in which the husband would ritually, or literally, abduct his wife). Whilst Spartan women remained in the household to care for the young, the men kept rigidly to themselves, training for the next combat, conquering enemies, and otherwise engaging in homosexuality to an extent that scandalized even the Athenians. 

The degree to which military success depends upon a homosocial esprit de corps has been revealed in recent decades, as women have been permitted to enter the armed services, even taking up positions in combat battalions. 

Not long ago, for example, a retired Canadian Supreme Court justice issued a report which found that sexual harassment was rife in the country’s co-ed military units. The American military, meanwhile, has resisted allowing females to serve alongside men, or in combat at all, precisely because it would undermine unit cohesion, as apparently has happened with the Canadian and other armed forces that (unlike the U.S.) are remote from actual fighting.

Monday, June 29, 2015

The Ancient Words We Use Each Day: An Attempt

In order to describe the novel techniques and devices of modernity, it was necessary to employ the terminology of ancient (if not dead) languages. 

Thus, we have the telegraph (a construction from ancient Greek meaning, “writing at a distance”) and telephone (“sound at a distance”) and television (which combines the Greek “tele-” with the Latin, “to see”). 

Because "tele-thea" would sound too awkward...

Indeed, the words “technology” and “engineering” are derived from ancient sources. 

In this, however, there is a puzzle. Most words used for technology are Greek in origin, whereas the ancient language associated with Western Christianity was Latin. 

The latter survived as a father tongue long after its native speakers had died out, because it was useful for written communication between clerics of different mother tongues. 

After it faded as the common scholarly language, Latin became the language of scientific taxonomy, again to overcome the problem of categorizing species consistently for researchers speaking different languages. 

Yet, when scientific principles were applied to practical ends, to produce engineered technology, the inventors of and investors in these novel, automated techniques turned mainly to Greek, rather than Latin, to provide their nomenclature. 

This preference for the Greek over the Latin was witnessed not only in the applied arts. The academic fields that developed within the nineteenth-century university, such as psychology, sociology, anthropology, and so on, took on Hellenic monikers instead of Latin ones, in spite of the fact that Latin was the language of classification in the biological and chemical sciences (though both “biology” and “chemistry” are themselves terms derived from ancient Greek). 

Too dry: a sample of ancient Greek script.

This is probably the result of the revival of classical Hellenism as the university grew during the nineteenth century. In their battle against the ecclesiastical masters of the medieval university, the classicists preferred the old Greek language over the Latin, as a tongue untainted by Christian influences. 

In spite of the fact that the New Testament was written in Greek, western European scholars were far enough removed from Eastern Orthodoxy that their view of the language remained unjaundiced by clerical influences. 

Although classical Greek never became, for the nineteenth-century university don, the language of international scholarly intercourse that Latin had been during the Middle Ages, it obviously exerted its influence upon the academic specialties that grew up (paradoxically) under classicist influence. Even religious teachings became specialized as “theology” (another ancient Greek term).

Saturday, June 20, 2015

Money is the Root of ... Some Evil

I may begin an irregular series in which I attempt to debunk many cliches.

For instance, the saying, “money is the root of all evil”, has been passed down through the centuries. 

It was inspired by the writings of the Apostle Paul, who actually stated, in effect, “All wrongdoing can be traced to an excessive attachment to material wealth.” Usually, this is translated as “the love of money is the root of all evil.” 


But really, even with this qualifier, such a sentiment cannot be upheld as even half-true. Could even half the evil in the world be attributed to a “love of money”? Certainly, there are many who believe so, and say that all evil is attributable to a love of money. Indeed, self-sacrifice is often accused of having greed as its ulterior motive. 

Certainly, many forms of criminality, such as theft, robbery, extortion, blackmail, and so on, have as their motive a lust for money. However, more serious - and truly evil - crimes have other causes besides. 

For instance, rape is not motivated by greed (though the word “rape” originally meant “theft”), even where the assault is accompanied by theft of money and materials. 

For that matter, most murders and deadly assaults are not committed in the name of greed, but because of jealousy, feud, rage and mental illness. The worst forms of evil, as recognized by criminal law the world over, most of the time do not have greed as their motive at all. 

It is often said that war is always, at root, about the lust for wealth. This is true, in its way. For it is lust for necessities, rather than luxuries, which drives states to go to war with each other. Civilized society depends upon the extraction of resources for its existence. 

As natural wealth dwindles and human riches grow, states are forced to range further afield to secure staple resources. Inevitably, each state encounters others with the same goal. Mutual need sometimes leads to peaceful trade, but more often to war. 

Recorded history is in large part the annals of states going to war to secure resources that they literally cannot do without. Yet kings and strongmen have waged war for reasons other than mere plunder. War is driven more so by a necessity to neutralize some presumed rival, whose equality of power and resources seems a threat in itself. Even where the explicit goal of war is lust for wealth, it most often results in loss of precious blood and coin for all involved. 

A lust for money does not motivate, but rather ameliorates, the lust for war. By expediting all forms of exchange, money encourages production of what is plentiful, in trade for what is scarce. As war interrupts trade and devalues money, cash provides an incentive for civil rather than martial efforts to acquire resources. 

In pacifist literature, war is often portrayed as the fault of profiteering bankers and bomb-makers. The high-interest loans that banks grant to governments to pay for war are said to more than compensate for the loss of civilian business from destitute populations. Except that governments have always defaulted on, delayed, reneged upon, or renegotiated these loans in peacetime. 

During the Renaissance, several prominent banking houses went bankrupt after many princes defaulted on their loans, stalling the cash economy for centuries. 

In the twentieth century, lust for wealth seems to have been a very secondary motive in the pursuit of the World Wars, except for the need to acquire staple resources. Nazi Germany was determined to set up a rustic leisure-state of hundreds of millions of Aryans, supported by a slave-class of Slavs and others it deemed racial mongrels. 

The Nazis’ eye was always to the east, to the mythical homeland of the Aryan race, and they saw Russia as the obvious target for conquest. They were hardly concerned with the material splendour of the Soviet Union, most of which had already been destroyed by the Bolsheviks. They instead coveted the oil, wheat and other untapped riches of the Rus. 

The Nazis intended to level the conquered Soviet Union, and replace it with the new Aryan super-state. There, the racially pure would live in a cashless, medieval utopia. 

Similarly, the first Great War was inspired not by the love of money, but by that primal concern of statesmen: to neutralize rivals of equal power to themselves. Germany emerged from its defeat of France in 1870 as the new global rival of the British. From then until 1914, the Brits and Germans engaged in an arms race, the former becoming allies with old rivals France and Russia to complete what the Germans saw as “encirclement.” When war finally came, all the participants seemed reluctant and horrified at having to carry out what they nevertheless understood was inevitable. 

Each side consciously decided to sacrifice in order to stand up for an ideal — for the Austrians, to avenge the murder of the heir to the throne; for the Russians, to defend a fellow Slav land from Germanic aggression; for the Germans, to defend their Austrian allies against Russian aggression, and to show the British that they hadn’t dominion over the world; for the French, to avenge the conquest of Alsace-Lorraine; for the British, to defend against “Prussian” authoritarianism in Europe and around the world. 


The diplomacy leading to the First World War has also been described as a case of several bald men fighting over a comb. In the end, all sides sacrificed far more in material wealth (not to mention the millions of lives lost) than anyone believed possible at the outset. Each side managed to do far more damage to itself than it ever inflicted on its enemies, and the victors were scarcely victorious. The great powers consciously liquidated their considerable reserves of wealth, all for the principle of the thing, and they didn’t succeed in making war obsolete, either. 

If war discloses just how evil the human animal can be to others of its kind, this malice has its seat in the idealistic or abstract faculty. War erupts as opponents threaten, or seem to threaten, each other’s access to the necessities of civilization. It is often precipitated when one side or the other is seen to violate some basic or sacred ideal or moral, and is undertaken usually in spite of its cost. 

Wars fought on principle, whether between states or between factions of the same state, have fomented more evil than other sorts of conflict. They not only visit torture and brutality upon combatants, they destroy material wealth all around. 

Often, the principle being fought over is religious in character. Religious civil war frequently has a “decadent” establishment pitted against a puritanical insurgency. When the latter prevails, yet more material wealth is destroyed in the name of the ascetic. 

In the twentieth century, highly-principled enemies of material wealth succeeded to power in many, very populous lands. The result was a literal decimation (the destruction of one in ten) of the affected populations, with millions more killed in inter-state wars. 

It is hard to describe either the Nazi or Soviet regime without reference to evil. The Nazis were satanic, for their deliberate genocide. But the “Aryan” German not involved with trade-unionism or communism was not subject to the sort of ongoing terror that affected the citizen-subjects of the Soviet empire. 

The latter was a regime of arbitrary arrest, detention, imprisonment and banishment, all to the end of cowing the people into cooperation with the latest five-year plan. 

Given all this, it would be more true to say that “the hatred of money is the root of much evil.” In contemporary times, at least, there has been a demonstrable link between the capacity to commit great evil and the willingness to provide humanitarian measures for the downtrodden. 

Those who have committed evil on a mass scale are praised by many exactly for their selflessness. Thus, while Ulyanov overthrew nascent Russian democracy and set into place the apparatus of terror that later consumed millions, Lenin is still praised by many for his frugal lifestyle and humble demeanour. When his successor condemned thousands in show trials during the 1930s, Walter Duranty, Moscow correspondent for the New York Times, observed that making an omelette required broken eggs (Duranty won a Pulitzer Prize for his reporting on the Soviet Union). 

Later, after the war, Duranty wrote in the Nation magazine that “purge” in Russia meant merely “to clean”, and thus the purges that Stalin was undertaking at that time were merely a colonic for the Soviet body politic. 

Those socialist apostates, the fascists, had their admirers near and abroad, as well. Before and during the war, even years afterward, it was common to hear that while persecution and Holocaust are surely bad things, "Hitler did put millions to work." Similarly, Mussolini was credited with "making the trains run on time."  

In the anti-colonial wars which followed World War II, the most brutal insurgents were also those who provided humanitarian assistance to the peasantry. The lack of civil liberty in Castro’s Cuba, similarly, is excused because of the regime’s alleged provision of the best medical care and free schooling. It implies no cynicism about the motives of the parties just noted, to suggest that humanitarianism and brutality are expressions of the same will to power and control. Terror and welfare instill dependency in populations: giving is implicitly or explicitly coupled with taking away.

Of course, money itself is scarcely problem-free. It is, in fact, a fetish, a contemporary version of the witch-doctor's talisman. It is perhaps for this reason, in turn, that some of the early Christians considered if not money itself, then the excessive love of it, as "evil": it was simply too pagan a thing not to be suspect.

Monday, June 15, 2015

The "People's Princess", Not

In essence, television is entertainment made into a domestic appliance. 

As such, television “personalities” cannot be, as with entertainers in other media, unique and outrageous. Instead, they are characterized by blandness and inoffensiveness. Their role is simply to be. 

This is attested to by the fact that when performers in other media become TV “personalities”, their actual talent (singing, dancing, comedy, acting, etc.) becomes secondary, even forgotten entirely. 

When these personalities retire or otherwise disappear from the mass media, they are almost always forgotten in a matter of months. 

To analogize, a particular dish eaten at a restaurant may well be remembered months or even years later, for its scrumptious unfamiliarity. An eatery that serves only bland and conventional foods is a “greasy spoon”. Public entertainment, similarly, should lack blandness and convention. 

By moving theatre indoors, the impact of any individual television entertainer became no more significant or durable in viewers’ experience, than is the consumption of a piece of toasted bread with bottled jam. Which is to say, it may be enjoyed in the moment, but who remembers eating a delicious slice of toast after even a day? 


I think this explains the disappearance from consciousness of Diana Spencer, the Princess of Wales who died in a car wreck in Paris in 1997. 

This event brought shock and sadness throughout the globe. Hundreds of millions of people watched the funeral telecast, which was attended by celebrities and dignitaries from around the world. In the days before the service, crowds in the tens of thousands remained outside Buckingham Palace. There were rumblings of discontent amongst these “mourners”, directed at the Royal family generally, and at the news media especially. 

The latter allegedly contributed to the death of the Princess, as it was reported that Diana and her companions were attempting to flee the paparazzi when their vehicle crashed in a Paris tunnel. The grandiose claim that Diana’s tragic passing would hasten the end of the monarchy, became commonplace. Of course, nothing of the sort occurred. It turned out that, in fact, the paparazzi were not in "hot pursuit" of Diana’s car when it crashed. 

If the mass media did not kill her, then, it did indeed create the Princess of Wales. Which is to say, it created the “personality” that was so omnipresent on television, that the death of the real person behind it was enough to apparently stop the business of the world for days at a time. 

In the end, though, Diana was merely a personality. In this, she played the part of the bland and inoffensive perfectly. She became famous — at the age of nineteen — not for any talent or achievement. 

It was simply that Lady Diana won the hand of the heir to the British throne. Never truly beautiful, Diana Spencer was quite sightly nevertheless, and seemed an appropriate person to be queen. 

There is, in any case, a cultural analogy between television “personalities” and the British royal family as figureheads of state. The role of the latter is simply to be, to somehow “embody the nation” (in the Middle Ages, this was a literal thing). 

It is no wonder that in the television age, the “Windsors” became so celebrated, as the medium focuses on being over becoming as a matter of course. 

Again, Diana seemed insubstantial enough to fulfill the role chosen for her. Her initial shyness with being a public figure only enhanced her status as a creature of television. In spite of her marriage to Prince Charles ending (if I may) in a wreck, Diana remained a global celebrity until her death, because of her televisual qualities. It really seemed, when she did die, that Diana was the “people’s princess”, the “queen of our hearts”, as one newspaper headline put it. 

She was not, though. It turned out she was no more important to people than any other mass-produced consumer product. It is the reason why interest in Diana dropped off precipitously in the years after her death. I’m certain that most of what has been written about her, since then, has consisted of conspiracy narratives relating to the circumstances of her death. 

Ah, who cares?

The passing of Diana Spencer was the last major media event, during which the Internet was irrelevant. In 1997 still, electronic mail was the only truly revolutionary medium associated with Internet technology. The World Wide Web at that time, largely consisted of amateur sites devoted to Star Trek, the American civil war and other hobbyist and “fandom” subjects. 

It was, in essence, an enhanced version of the “bulletin-board services”, the local computer networks set up by geeks during the 1980s. It had global reach, but the Internet even in ’97 was not truly global, given the number of people (probably a large majority at the time) who didn’t have access to it at all. 

What little e-commerce took place then, too, was largely for the exchange of pornographic imagery. 

News organizations had not yet treated the web as anything more than an afterthought — if at all. The “paradigm-shifting” services characteristic of the more recent Internet got going just after Diana’s death: Napster, the peer-to-peer file-sharing service, started in 1999, for example. 

Her death was thus the last hurrah of “old media”, the few-to-many transmission of information processed by gatekeepers. 

Upon news of the accident, even before her death was confirmed, all regular networks and cable-news services threw out their scheduled programming and devoted live, ongoing coverage not only to the accident itself and the funeral of Diana, but also to the many non-events that took place in the days between these two landmarks. 

Such live coverage is paradoxical, at least for commercial television services. Tragedy and its aftermath attract big audiences. Yet ongoing live coverage of political assassinations, accidental deaths, terrorist attacks, natural disasters, and the like is a money-losing proposition for network and cable TV. Not only is it costly in overtime paid to on-air talent and behind-the-scenes technicians. 

Commercial television services also forgo, during extensive live coverage, their only true source of revenue: advertising. TV networks don’t wish to “sully” the sombre attention paid to the tragic event in question with inappropriately jaunty or upbeat commercial messages. 

Similarly, advertisers are reluctant to have their own products associated with downbeat and dreadful subjects. Thus, live coverage is for commercial television a vast sea of red ink — yet everyone involved seems highly motivated to undertake this money-pit enterprise. It confirms the assertion of Father Ong, years ago, that the live event is television’s true métier. 

Originally, television broadcasts were all live. In essence, the comedy and dramatic programmes of TV’s first decade and more were stage plays performed live in front of an audience. Even today, the most highly rated regular television programmes are live events — that is, coverage of sports and other athletic contests. 

As with the stage actor, the actors on live TV — the various “anchors”, “analysts”, pundits, reporters and so on — come into their own when performing without a script, doing improv as it were. It is their chance to shine. Yet coverage of the death of the Princess of Wales became unintentional self-parody. Diana Spencer was, after all, merely a celebrity — someone famous (as Daniel Boorstin said) for being famous. 

She could no longer even stake a symbolic claim to political importance as the future queen consort. Again, though, TV-news services treated her passing in a manner befitting an important world stateswoman. There was a telling contrast between the coverage of Diana’s death and that of another world-famous individual, Mother Teresa, who died less than a week after the Paris crash. An Albanian-born nun, Teresa had for many decades run an orphanage and homes for the dying in the poorest parts of Calcutta, India. 


Her work has been, posthumously, subject to snarky commentary. But whatever the truth of Mother Teresa, she was noted for actual things she did in the real world. Her death was, however, completely overshadowed by the marathon Diana coverage. And although everyone seemed united in grief at the passing of the Princess of Wales, the very disproportion between the coverage of her death and her actual accomplishments inspired its own dissent, in the form of humour. 

I said before that email was the truly revolutionary product of the Internet, circa 1997. It was through this medium that “Diana jokes” began to be passed back and forth. More than a month after the accident, these were documented in a story in the Globe and Mail (national edition of Oct. 7, 1997, page A12). 

Chris Defoe reported that, “So far, the mainstream comedy world has been largely silent on the subject, even as it has dominated the headlines and newscasts around the world. [Chat-show hosts Jay] Leno and [David] Letterman, those two touchstones of comedy consensus, seem to have avoided the subject completely. So have most other "official" comedy voices...” But, Defoe went on, “The first Diana joke appeared on the Internet — that hightech watercooler — within days of the crash, and over the past month more than 100 jokes have been posted, collected and circulated on and off the Net.” They included this one, “Prince Charles was out early the other day when a passerby said, "Morning," Charles said, "No, just walking the dog."” Another went, “What's the difference between a Mercedes and a BMW? Diana would never be caught dead in a BMW.” This sort of “gallows humour” response to hyped-up media coverage of tragic events had precedents prior to the accidental death of Diana Spencer. 

More than a decade earlier, with the explosion of the space shuttle Challenger, jokes quickly began circulating around North America at least, presumably transmitted by long-distance telephone call. These included the new acronym for “NASA”, “Need Another Seven Astronauts”, or “What did Christa McAuliffe [the schoolteacher on the Challenger who was going to be the first civilian aboard a space shuttle] say to her husband before leaving for the flight?: `You feed the dog, I’ll feed the fish.’” (The spacecraft was launched at Cape Canaveral, and disintegrated seventy-three seconds later over the Atlantic Ocean.) The loss of the Challenger crew and passenger, like the passing of Diana and her companions, was indeed a tragedy. 

But the response by the mass media was completely over the top. As with a mourner at a funeral who screams and sobs uncontrollably for a distantly related or barely known departed, observers cannot help but roll their eyes and whisper snark into each other’s ears.

Saturday, June 6, 2015

The Moon and Vietnam

Political sovereignty rests on control of territory. It’s that simple, really: without territory, there is no polity.

This control depends upon the monopolization of violence on the part of the state. But throughout history, each state has been faced with one or more other polities whose existence, too, depends upon control of territory. 

War, and interstate rivalry in general, occurs because the rulers of each state fear that their sovereignty will be compromised by their rivals’ acquisition of territory. 

During the Cold War, this competition between great powers for the control of “territory” was extended to outer space. 

The “space race” between the United States and the Soviet Union was spurred on by each side’s efforts to one-up the other in terms of astronautic achievement. 

Thus, when the Soviets launched Sputnik, the first artificial satellite, it sent panic through the population and government of the United States, because it seemed the “Russkies” were about to acquire an absolute sovereignty over America itself. The U.S. quickly adopted its own satellite program. 

When, a few years later, the Russians became the first nation to put a man in space, President John F. Kennedy felt obliged to commit the United States to placing a man on the moon – the closest terra firma (so to speak) to the earth itself – as a way of claiming sovereignty over it before the Russians could. 

Thus was born the world’s most expensive science project: the Apollo moonshot.

Under a blood-red moon.

The ramping up of the Apollo program coincided with another product of the rivalry with the Soviet Union: the Vietnam War.

As with the moonshot, the U.S. mobilized its army to fight in southeast Asia out of the simple worry that the Soviet Union, through its control of Vietnam as a satellite, would gain control of the entire region through a “domino effect.” 

What has long fascinated me about the Vietnam War is how vociferous the opposition to it was, in spite of the fact that, historically, the death rate for American troops was significantly lower than in the other major wars in which the U.S. was involved. 

The American conflict with the worst casualty rate was, of course, the Civil War. It took place over almost exactly four years, from April 12, 1861, to April 9, 1865: about 210 weeks. The traditionally cited death toll from this conflict was 600,000, but actuarial studies of census figures have more recently put the total at an estimated 761,000. 

If true, the death toll for this conflict was about 3,600 per week, or more than five hundred every day! This at a time when the total prewar U.S. population was about 31 million, meaning that more than one in fifty Americans died during the “war between the states.” 

Major twentieth-century wars, invariably not fought directly on American soil, were much less deadly. 

Even so, about 116,000 American soldiers died in World War I, in which U.S. participation was relatively brief: from April 1917 to November 11, 1918. The death toll was thus about 1,300 per week, or two hundred per day. Of course, this is only the total averaged out over the entire eighty-four weeks in which the U.S. was officially involved in hostilities. In reality, the first U.S. troops did not start arriving in Europe until the autumn of 1917, and the vast majority of them didn’t get there until the next spring. The bulk of American Great War casualties thus occurred in the seven or eight months prior to the armistice in November. The American death rate over those months could thus have rivalled that of the Civil War. 

The U.S. involvement in World War II lasted about 192 weeks, from December 7, 1941, to August 15, 1945. The total American dead stood at 405,000: about 2,100 per week, or three hundred per day. Again, these averages are misleading, since U.S. soldiers didn’t start actively fighting either the Japanese or the Germans until well into 1942, and the worst of it, involving ground combat, didn’t commence until the next year. 

Again, the daily or weekly death toll during 1943 to ‘45 (but especially after Allied troops landed at Normandy, France, on June 6, 1944) rivalled that of the American Civil War. 

The Korean war was far less deadly, in total American deaths, than either the world wars that preceded it or the Vietnam war later. About thirty-six thousand Americans lost their lives on the Korean peninsula. But the duration of the conflict was far shorter than even the most active phase of the Vietnam war: about one hundred sixty-four weeks, June 1950 to July 1953. That’s about 220 deaths per week, or roughly thirty-one per day. 

As for Vietnam, more than 58,000 Americans died in the conflict. It is hard to get a fix on just when U.S. involvement began and ended in southeast Asia, since the Americans never officially declared war on North Vietnam, though a treaty concluding U.S. involvement, at least, was signed in Paris in 1973. 

I have chosen to average the total American deaths over 1965 to 1972, the deadliest period of the Vietnam war. This is three hundred sixty-five weeks, putting the average weekly combat deaths at 159, or roughly twenty-three per day. Even if one narrows the sample to the American dead between 1966 and ’69 (forty-six thousand or so deaths over 208 weeks), the weekly toll is about 220, or roughly thirty-two per day: exactly the rate of the Korean war. 

And of course, this is in a country whose total population in 1965 was forty-two million higher than in 1950 (and fifty million higher by 1969, standing at just over 200 million by then). Thus, the total number of dead in Korea, as a percentage of the prewar U.S. population, was about a fiftieth of one percent, while in World War II it was about a third of one percent (and about 0.12% of Americans died in the Great War). 
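If nothing else, the arithmetic above is easy to check. The little Python sketch below simply re-divides the rounded totals cited in the text (deaths, and the approximate durations in weeks), so the output is only as good as those inputs:

```python
# Approximate U.S. war deaths and durations in weeks, as cited above.
wars = {
    "Civil War":    (761_000, 210),
    "World War I":  (116_000, 84),
    "World War II": (405_000, 192),
    "Korea":        (36_000, 164),
    "Vietnam":      (58_000, 365),  # deaths averaged over 1965-72 only
}

for name, (dead, weeks) in wars.items():
    per_week = dead / weeks
    print(f"{name:12s} ~{per_week:,.0f} per week, ~{per_week / 7:,.0f} per day")
```

This reproduces the weekly figures of roughly 3,600, 1,300, 2,100, 220 and 159 used above.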

That's one small step for a man...

Yet a closer look at the casualty rates from the Korean and Vietnam wars shows that it was in fact more dangerous to be an American soldier in Korea than in Vietnam (even during the period of peak mobilization in southeast Asia, 1965 to 1969). In Korea, about one million, seven hundred eighty-five thousand U.S. troops were deployed. 

The death toll of about thirty-six thousand American servicemen in the Korean theatre is twenty-two thousand less than the toll in Vietnam. But there, almost three and a half million U.S. soldiers served in the combat zone. The overall mortality rate for American troops in Korea was thus about two percent, or one in fifty, while in Vietnam it was about 1.7 percent, or roughly one in sixty of the soldiery. 
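Assuming the deployment and death figures just cited, the in-theatre comparison works out as follows (a sketch using the rounded totals above, not official casualty statistics):

```python
# Deaths divided by troops who served in-theatre, per the figures above.
korea_rate = 36_000 / 1_785_000    # about 2.0%, one in fifty
vietnam_rate = 58_000 / 3_400_000  # about 1.7%, one in sixty

print(f"Korea:   {korea_rate:.2%}")    # Korea:   2.02%
print(f"Vietnam: {vietnam_rate:.2%}")  # Vietnam: 1.71%
assert korea_rate > vietnam_rate       # Korea was the deadlier posting
```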

But in the late 1960s, a war with a death toll comparable to that of a previous (and similar) conflict less than two decades earlier, and far below the mortality rates of earlier major wars, was viewed by many as an unprecedented disaster. 

A change in social psychology was the cause, obviously. I think it has something to do with the saying (widely attributed to Stalin) that “one death is a tragedy; a million deaths are a statistic.” Paradoxically, when many more young men were falling in combat, as during the American Civil War and the world wars, their deaths became nearly impossible to personalize, except by their loved ones. Most families lost at least one member to war, and thus most were too wrapped up in their own grief to feel the tragedy of the death of someone else’s loved one. There is a paradoxical need for meaning, too: in situations of mass death, those left behind find solace in the belief that their loved ones did not die in vain. 

During Vietnam, however, combat death was uncommon enough that afflicted families could be the recipients of regret and sympathy from neighbours and relatives who did not similarly suffer such loss, probably because their own sons had received deferment from conscription through marriage, schooling or fraud. Thus, Vietnam-war military deaths, while numerous enough, were not so voluminous as to become (as with the previous wars in which the U.S. fought) mere statistics. 

There was another factor related to this — the state of medical practice and technology, which by the late 1960s was far advanced from what it had been even two decades earlier. These advances were spurred on, in part, by the experience of military surgeons in the really big wars. They meant, firstly, that only a marginal number of American military deaths in Vietnam resulted from anything other than combat injuries. Up to World War I, deaths from accident, misadventure, infighting and disease typically made up more than half of all military fatalities. 

World War II was the first war in which America fought where a majority of deaths were from actual combat. The disease factor, especially, was much reduced as a cause of mortality in Vietnam, even compared to World War II. This is in spite of the fact that the Vietnamese military theatre lay directly in a tropical zone, where communicable disease has always been more prevalent than in the temperate climes from which most American soldiers hailed. 

Tropical diseases, too, have always been especially harsh upon outsiders to the plague zone, which should have included American servicemen. Instead, improvements in sanitation and inoculation meant that very few U.S. soldiers succumbed to disease at all. 

There was, in 1968, a global influenza outbreak centred in east Asia. Though many soldiers came down with the flu (and, returning stateside, brought the epidemic home with them), and many people worldwide died, the toll included a negligible number of U.S. military personnel. 

Added to these lifesaving factors were more direct surgical interventions upon combat injuries, assuring that a large number of those who would have died in previous wars were able to live on. Unfortunately, many of these survivors were left with mutilating and disabling injuries that even the best medicine could not repair. Many other seriously injured veterans recovered without losing the use of their arms or legs (or a limb entirely), but were thereafter beset with long-term injuries to mental health, which the medical profession of the time was even less equipped to treat. 

Thus it was assured that a large number of Vietnam-war veterans were psychologically and physically broken upon their return home. For years, the very term “Vietnam vet” was almost a byword for pathology, as it seemed many crimes and infamies were committed by individuals of that description. 

An additional factor was the advent of television, and in particular broadcast-news coverage directly from the Vietnamese war theatre. During the Korean war, TV ownership increased from as little as one-quarter of the American public to more than one-half. Even so, TV news was just getting started over the period of 1950-53. National newscasts were typically no longer than fifteen minutes until the end of the 1950s. 

Very rarely in those days, too, were news reports accompanied by “visuals”, as videotape had yet to be invented, and celluloid film took too long to be edited, processed and transported to be available for the daily newscast. Until well into the 1960s, in fact, events were witnessed in moving-image form primarily through the weekly cinematic newsreel. 

There are in fact newsreels documenting the early stages of American involvement in Vietnam. Their super-patriotic, anti-Communist tone, scarcely different in presentation from that of Second World War-era newsreels, makes them seem strangely anachronistic today; perhaps they were received that way at the time, too. 

Technical innovations — not least the videotape and satellite communications — constituted a mini-revolution in newsgathering during the 1960s. The last U.S. newsreels were produced in 1967, by which time the evening national-newscast (extending to a half-hour in length in the early ‘60s) had become highly-rated and very profitable for TV networks. 

Most of the footage of Vietnam combat was shot on film, and not videotape (video-cameras were then too heavy and bulky to be practical for battlefield use). But this imagery was quickly transported by plane to Tokyo or Honolulu, transferred to video and then transmitted by satellite to headquarters in New York. The footage was thus only a day or so old when seen by the public — a vivid illustration of reports that came in that day’s morning newspapers. 

The unacknowledged Legislator of the United States for two decades.

Moreover, Vietnam-war footage was typically shot on colour film (in complete contrast even to late-era newsreels). Most homes still had black-and-white TV, however, and the goriest and bloodiest footage was edited out for broadcast. Neither was the reporting by network correspondents usually very critical of the conduct of the war by American forces — let alone of the reasoning behind U.S. involvement in the first place (at least prior to the offensive by North Vietnamese forces during the Tet lunar new year in January, 1968). 

But Vietnam has been called the “first television war”, or more pertinently, the “living-room war.” Far beyond the actual content of the news reports from Vietnam, the very fact of war-imagery being transmitted into the domestic surround each and every day, served to undermine support for U.S. military-involvement among the broader public. 

There was no better example of this than the coverage of the Tet offensive itself. Militarily, the result was a disaster for the government in Hanoi. The war historian Don Oberdorfer wrote in 1971 of the Vietcong: “Tens of thousands of the most dedicated and experienced fighters emerged from the jungles and forests of the countryside only to meet a deadly rain of fire and steel within the cities. The Vietcong lost the best of a generation of resistance fighters, and after Tet increasing numbers of North Vietnamese had to be sent south to fill the ranks. The war became increasingly a conventional battle and less an insurgency. Because the people of the cities did not rise up against the foreigners and puppets at Tet — indeed, they gave little support to the attack force — the communist claim to a moral and political authority in South Vietnam suffered a serious blow.” (Tet!, Doubleday & Co., 1971, pp. 329-30; cited in Peter Braestrup, Big Story: How the American Press and Television Reported and Interpreted the Crisis of Tet 1968 in Vietnam and Washington, Westview Press, 1977, p. xx.) 

The North Vietnamese weren’t able to stage further attacks upon the south until after U.S. military forces vacated the country several years later. In the age before televised news, it would have been reported as such: “Enemy routed after failed invasion.” But the actual footage of Vietcong and North Vietnamese regular troops pouring into south Vietnam — getting as far as the grounds of the U.S. embassy in Saigon — shown on TV soon after, had the same demoralizing effect on the home front as the sight of a particularly fierce and disciplined attacking force has on a much more numerous defending army. 

Pressured by the change in public opinion, the mighty United States military did not exactly turn and run from Vietnam. But the irony is that the Americans decided to abandon their south-Vietnamese allies just when they had the North on its heels for the first time in the course of the war. And, contrary to legend, it wasn’t youthful protest that pressured the U.S. government to reverse course in Vietnam. The student radicals were very conspicuous opponents of the war, certainly. But according to opinion polling at least, younger people were actually more in favour of fighting in Vietnam than were those in middle age or older. It was this latter group — the people who never missed Walter Cronkite or David Brinkley each evening — who were really responsible for the American “loss” of southeast Asia.