Friday, July 7, 2017

The Labour Market for Golden-Age Porn

Years ago, I watched an episode of the PBS-TV Frontline documentary programme called Death of a Porn Queen.


Colleen Applegate as Shauna Grant, 1963-1984



It profiled Colleen Applegate, a small-town Minnesota teenager who ran away to Los Angeles in 1982 and soon after became the porn actress known as “Callie Aimes” and then, more famously, as “Shauna Grant.” 

Blonde, rosy-cheeked and meaty, she was the fantasy girl for every dirty-old-man-next-door. She made 30 features and did numerous shorts and photo-shoots over just a year in the business. 

But in March 1984, Colleen put a rifle to her temple and pulled the trigger. She didn’t die right away, but hung on in intensive care long enough for her family to fly in from the mid-west and take their daughter off life-support. 


Colleen's grave, Farmington, Minnesota. findagrave.com




The entirely obvious response is to blame porn for Colleen Applegate’s suicide. Colleen was a troubled young woman before she got into the business, though: her original flight to California was fuelled by rumours of a previous suicide attempt in her home town. 

She also hadn’t actually made a porn movie in the year before her death (though she was about to re-enter porn following the incarceration of her boyfriend, a drug-dealer who had also reportedly severed their relationship shortly before she shot herself). 

Colleen’s depression may well have been exacerbated by the accumulated shame of having exposed herself sexually for the voyeuristic interests of many others. Other porn actresses have committed suicide, but many more have come and gone from the business without succumbing to such a fate. 

The question remains why so many attractive young women such as Colleen Applegate chose to expose themselves in such a fashion in the first place, especially during the “golden age” of Californian porn in the late 1970s and early ‘80s. 

Feminists would answer that they are coerced into doing this, either specifically by abusive boyfriends or by the “patriarchy” in general. 

They cite the case of Linda Boreman who, as “Linda Lovelace”, starred in the first pornographic movie to receive general release in cinemas, Deep Throat, in 1972. Boreman alleged that she only appeared in that film because of the violence, or threats thereof, perpetrated by her boyfriend.  

But try as they might, the pro-censorship feminists have never been able to establish that porn actresses generally have faced such pressures to perform sexually in front of a camera, and I’ve never particularly understood the argument that “patriarchy” is responsible for getting young women involved in porn. 

When Western society was most truly patriarchal, great restrictions were placed upon women’s sexuality. This was the case, for example, during the Victorian era, and the coincidence of near-absolute male dominance with stringent limitations upon female sexuality is practically universal. 

Yet, according to anti-porn feminists, modern “patriarchy” somehow demands that young women expose themselves sexually for the gratification of others. 

In fact, the female demographic cohort to which Colleen Applegate (born 1963) belonged was the first for which there was an unquestioned expectation of independent living well into adulthood, and for whom there was in turn a presumption (however reluctantly arrived at by the older generation) that chastity wouldn’t be observed until marriage. 

Or at least, this held for the middle-class family to which Colleen belonged. In any case, the insertion of such a large number of young women into the workforce coincided with recession in the long boom that had been touched off in the United States by the country’s entry into the Second World War in 1941. 


and people think we have it bad now... www.reed.edu





Inflation was unleashed as wage rises outstripped gains in productivity, interest rates reached double digits, the price of basic commodities (principally but not exclusively crude oil) increased steadily, and the “Keynesian” methods of pump-priming through deficit spending proved ineffective. 

Throughout the 1970s, all Western economies faced desultory economic growth, while at the same time the cost of living continued to increase, sometimes by double digits per year. 

The sheer number of young people, born in the 1950s and ‘60s when fertility rates remained high, entering the workforce during this period guaranteed that a large proportion of them remained unemployed, and many others would be stuck in low-paid, service-sector jobs (as foreign competition had also battered the American industrial sector). 

Young women in the workforce, especially, faced discrimination and relegation to the “pink-collar ghetto”: low-wage clerical and service work from which there was little chance of improvement or promotion. 

Most workplaces, whether in the private or (increasingly large) state sector, were managed by men who, coming from an earlier generation, would find it perfectly natural to refer to female subordinates, even middle-aged women, as “girls.” 

Some of these men, at least, behaved as a matter of course as though their younger female employees were happy to be harried, groped, even forced into sex. 

In these circumstances, working as a sexual exhibitionist may not, at first glance, have seemed all that bad. On the other hand, for Colleen and others like her to have become involved in pornography in the first place would also have involved overcoming a general revulsion and shame about exposing oneself to others in the act of having sex. 

It is considered such a private act that the words “bed” and “bedroom” are widely understood euphemisms for the sex act. But it wasn’t always thus. Turning to social historian Bill Bryson, we find that just a few centuries ago, people tended to be less modest than they later became. 

The equation of sex with something dirty was a byproduct of bourgeois culture. Porn, as I said, gained semi-legitimacy in the wake of the counterculture, which rejected middle-class values such as the “hangups” surrounding sex. 

The economic and social forces that propelled Colleen Applegate and many other pretty young American women into pornography during the “golden era” of the business in the late 1970s are also behind the sudden appearance of young women from the former Soviet bloc in the adult-video business since the turn of the century. 

During the Cold War, a joke was told that went something like this: “What do you call a pretty girl in Warsaw [or Prague, Budapest, Moscow or any other eastern-European capital]?” The answer: “A tourist.” 


"Tourists" all. www.theapricity.com




In the West back then, people were so naive as to construe the homeliness of Slavic women as a congenital quality, not a byproduct of the malnutrition and general scarcity of the Communist system. 

It was only a few years after the fall of command socialism that Westerners understood that female Slavs (now well-fed and infused with hope) were very often more beautiful than the women of the liberal democracies. 

Russians, Czechs, Poles and so on very often embody the traditional fair-haired and blue-eyed look of the Nordic ideal. After centuries of invasion from the far east, many also have the Asiatic physiognomy that Occidental men often find so alluring. 

While modern Slavic women have beauty, what they don't have is very much selling power in the labour marketplace. In this, they are in a position similar to that of young North American women of thirty and more years ago. In order to thrive, it seems, not very many, but enough, of these young women from eastern Europe have opted to sell their bodies, virtually.

Friday, April 14, 2017

The Shrine in a Time of Science

A story in the Ottawa Citizen this past weekend caught my attention for a number of reasons. 

It reported on the conviction of a young man for the 2013 car-crash death of his girlfriend, and as a father, I feel very deeply for any parent faced with the death of a child. 

Second, I was once a work acquaintance of the man, Marco Gauthier-Carriere, convicted of criminal negligence causing the death of Valerie Charbonneau. 

I will say only that while I didn’t know Marco that well, I did like him, and while I don't disagree with the verdict, I'm sure he had no intention that Valerie should die as a result of his actions - but negligence is the crime that deals with catastrophic sins of omission like what happened that night in November 2013.  



Valerie Charbonneau.


I never met Valerie, but I remember Marco speaking fondly about her at least once, and I'm certain that he was devastated that his own actions caused the death of a long-time girlfriend whom, I believe, he intended to marry. 

Third, it seems like an absurd length of time for this matter to reach trial. Three-and-a-half years is nearly one-sixth of the lifespan of both Marco and Valerie, who were both 21 at the time of the accident (they had been high-school sweethearts).  

For the family of the victim, not to mention that of the accused, to have to wait so long for a verdict seems almost like justice denied; the Supreme Court of Canada believes this too, which is why all levels of government are currently in a panic, trying to figure out this whole speedy-trial business after several accused murderers were released when their cases were stayed for taking too long to reach trial. 

Some time ago, I completed an essay speculating on the reasons why, over the last few decades, criminal cases have taken not only a long time to get to court, but also much longer to try than was the case in the past. I may post it one day. 

Another reason this accident stuck out in my mind is that, at one time, I travelled daily on the stretch of road in East Ottawa where Valerie Charbonneau met her end.  

It occurred just where Montreal Road heading east makes a fairly sudden turn before coming to the intersection at Blair Road. 

Marco apparently lost control at this curve, jumped the median, and crashed into a concrete light-standard on the opposite side of the road, which crushed the roof into the passenger cabin. 






As it happens, this was right outside the campus of the National Research Council of Canada.

Since not long after the accident, a roadside shrine dedicated to Valerie Charbonneau's memory has been placed on the chain-link fence separating the NRC from Montreal Road. 


I noticed, driving by one day, that the supports for the black metallic NRC sign do for a brief moment provide a frame around the shrine itself, which consists of flowers, trinkets, toys and even a tabernacle affixed to the chain-links, in which a photograph of the deceased young woman is placed. 


After seeing this numerous times passing by, a couple of years ago I stopped, got out and took cellphone pictures of the shrine. 




Shrine to Valerie Charbonneau at National Research Council, Ottawa

Image of Valerie in tabernacle


Teddies and trinkets left by friends and family of Valerie

I've been struck by the fact that the name of an institution devoted to scientific investigation is juxtaposed by happenstance with an artefact — the shrine — with deep roots in religiosity. 

Shrines are common to most religions, it would seem, from pagan to monotheistic, and probably preceded civilization by many eons. 

Indeed, shrines may have even been the locality around which settled life took form: the temple complex at Gobekli Tepe, in modern-day Turkey, was apparently started eleven-and-a-half thousand years ago, centuries before there is any evidence of farming in that region or anywhere else. 

As shrines endure, walls and roofs are built to protect the venerated objects from the elements — most temples have a shrine as their hearth. Temples that are built high enough then become citadels, around which a city might grow. 

Graveyards are a kind of amassing of shrines, although most graves are venerated only by the loved ones of those laid in them. In early civilized times, it was common for the deceased to be buried underneath the homes in which they once lived, and where the living apparently paid homage at shrines within the domus itself. 
Reconstruction of Catal Huyuk, the earliest town



In China, the practice of ancestor-worship was maintained long into civilized times, for although the departed were laid in necropolises, shrines to their memory were maintained in the family home itself. 

In pagan Europe, it was common to have out-of-doors shrines to particular gods and goddesses, and during the Middle Ages this tradition endured under Catholicism, in which the saints were worshiped at shrines where they lived or (sometimes) where they died. 

The roadside shrine, which became commonplace in the last years of the twentieth century, seems a combination of the European and Chinese traditions. 

They are almost always objects of care and veneration only for the loved ones of the deceased. But the left-behinds are also determined that the shrines should be blatantly public markers, and indeed, they are seen by perhaps thousands of people each day as they pass by in their cars. 

These roadside shrines may well be given more attention than the actual gravesites of the deceased they pay homage to. Complaints that such shrines constitute a driving hazard have led many jurisdictions to curb them outright, as the municipal government of Ottawa did recently. 

I’m unsure how this will affect the shrine to the young woman killed by her boyfriend’s careless driving, as the land on which it sits belongs to the federal government, and thus the city has no say over it. 


It is a coincidence, too, that she died on a roadway just adjacent to a facility devoted to scientific research. But the contrast of an artefact with deep, superstitious roots in the human psyche being placed at the frontier of an institution devoted to expunging this very thing from the human mind cannot be overlooked. 

Monday, October 3, 2016

The Linchpin of the Modern Economy

Advertising began as an industry something akin to land speculation. 

The first agencies would purchase blocks of space in newspapers and periodicals, and then generate profit by selling them piecemeal to businesses wishing to publicize their wares. 

Originally, companies would provide their own advertising message – just as people who purchased land in the old days would contract to build their own houses. 

This is why neighbourhoods originating before the middle of the twentieth century have residences that usually look quite different from one another. 

But just as latter-day land speculators build houses before they sell off real estate in parcels, ad agencies began to employ in-house writers, and then illustrators as well, to provide content for clients in the periodical space sold off piecemeal. 


www.urlnextdoor.com





Given how advertising removed the “place” from “market”, it is appropriate that the industry got its start by engaging, in microcosm, in the age-old practice of land speculation. 

From such relatively humble origins, advertising long ago grew into the linchpin of the modern information/marketing economy. 

Advertising is quite rightly identified with corporate capitalism: revenues for this sector are estimated to grow to 660 billion dollars (U.S.) in 2016, most of which is spent by private business.  But governments also spend heavily on advertising. 

The U.S. government ranks only as the fortieth-biggest advertising organization, but in the U.K., the central government is consistently among the top five advertisers. 

Public-service announcements are not the only type of advertising related directly to the political system. As a rule, such advertising is supposed to be non-partisan, but governments implicitly or sometimes explicitly use them to promote their own very partisan agendas. 

There are also the fortunes spent on campaign advertising throughout the world’s democracies – more than four billion dollars in the current U.S. presidential campaign alone – which are counted as private-sector advertising, resulting from transactions between political parties and ad agencies. 

These are avowedly and doggedly partisan in nature, of course, and campaign advertising is of consequence to politics out of all proportion to the actual funds spent on it. 

Often, campaign advertising through mass media is traced back to the 1960 presidential election which brought John F. Kennedy to the White House, or to the vote which brought Ronald Reagan to the presidency twenty years later. 

However, high-level politicians’ involvement with advertising agencies goes back decades before that, to ad pioneer Albert Lasker’s involvement in the U.S. presidential campaign of Warren G. Harding, which Harding won in a landslide, later appointing Lasker to be chairman of the U.S. Shipping Board. 

It exaggerates only a little to say that politicians treat their pollsters’ words as though they were divine. Policies are now calibrated mainly for the purpose of winning office next time around (again, with the help of the marketing industry). 

In order to gauge public opinion on behalf of political clientele, the marketer must be treated as a trusted aide, as important to an elected leader as a cabinet minister or legal adviser, or more so. We see once again that advertising and marketing executives have, at the very least, insider knowledge of another crucial domain of modern life — the political process. 

For many “Madmen” the opportunity to have the ear of powerful statesmen and to influence public policy has been more important than making money. 

In some cases at least, the two went together: Dalton Camp (1920-2002) was best-known as a newspaper columnist but, earlier in his career, was an advertising campaign advisor who helped Richard Hatfield become the longest-serving premier of New Brunswick during the 1970s and ‘80s. 

Camp did his campaign work pro bono, but in return, his advertising firm had a monopoly contract on all government advertising done by the province. 

Advertising is the application of fine art to the business of persuasion. This is by no means a novel thing. On the contrary, in past times, art was usually employed for this purpose. 

It is the modern practice of selling "handmade" visual products to an abstract marketplace which is a novelty. 

But as works designed and drawn for advertising purposes are not “done for their own sake”, they are considered mere “illustrations”, not real art. 

Nevertheless, many “real” artists have made a living through advertising work. There is a definite case to be made that, in terms of simple dexterity in technique, commercial artists and graphic designers, whose work is bought and sold in the hundreds or thousands of dollars, are more talented than most “conceptual” artists, whose work may sell in the tens of millions. 

Qualms about the payment of vast fortunes for what are, after all, inanimate objects, and about the wounded integrity of an artist who goes to work for amoral corporations, are secondary to the reality of advertising as a creative enterprise. 

Creativity became an essential part of the process, as it became evident that the mass audience responded all the more to pictorial and emotive content, rather than didactic and cajoling text (as was common before the “creatives” entered the picture). 

This occurred when magazines were still the dominant mass medium. So it is that a novel type of speculative enterprise evolved into the organizational crossroads for each sector of the engineered economy.

Former Madman.
www.siasat.com

 

The modern ad agency fulfills its original function as a broker of space for all the mass media: originally printed periodicals, then radio and television, and latterly, the Internet.

It thus has intimate knowledge of all the media processes, knowledge that professionals and personnel within each sector do not even possess in regard to the others. 

The agency acts on behalf of clientele — namely all the biggest private industries, representing the widest possible range of products. These agencies are thus exposed to the mechanics of many businesses, which in each case remain mostly obscure and half-understood by outsiders. 

In addition to the privileged knowledge held by admen and women, of the media business in particular and of industry and commerce generally, there is the industry’s involvement in politics. 

Marketing and advertising, long central to the conduct of political campaigning, more recently became essential to governance itself. The lucrative patronage received by ad agencies in exchange for getting politicians into office may even be less important to marketing professionals than the clout they acquire through their knowledge of public opinion. 

And, as advertisers provide the content for media campaigns political and commercial, they bring together these central facts of modernity with the creative class as well. It isn’t only graphic artists and illustrators whom ad agencies employ for persuasive purposes. According to author Mark Tungate, “advertising is a springboard for creative talent. The list of writers and film directors who have worked in advertising is long and illustrious: Salman Rushdie, Fay Weldon, Len Deighton, Peter Carey, Sir Alan Parker, Sir Ridley Scott, David Fincher, Spike Jonze, Michel Gondry... I could go on... and on. The French creative director Olivier Altmann, of the agency Publicis Conseil, once told me, 'Working in advertising is one of the few ways you can be creative and make money at the same time.'” (Adland: A Global History of Advertising, Kogan Page, p. 4).

A more direct link between “commercial” and “high” art, was seen in the activities of Charles Saatchi, the Iraqi-born British co-founder (with his brother Maurice) of the firm that bears their name (although they were both forced out in the 1990s). 

Saatchi & Saatchi was one of the first ad agencies with a global reach. Not coincidentally, it was also heavily involved in politics. It came up with the slogan used by the British Conservative party in the 1979 vote, “Labour Isn’t Working”, which helped bring Margaret Thatcher to power. 


Standing at the intersection of commerce and politics.
theconversation.com



Even before the agency’s founding in 1970, Charles Saatchi was an avid art collector. By the 1980s, he was supporting the work of the “Young British Artists”, such as Damien Hirst and Tracey Emin. 

Their conceptual work included a shark submerged in a glass tank filled with formaldehyde (overseen by Hirst), or a very untidy, unmade bed (of Emin’s). 


These and other works sold for millions or even tens of millions of pounds. Saatchi’s initial support for, and purchase of, these conceptual artists could be looked upon as a cagey investment, if nothing else. Investing just a few thousand pounds to acquire the works directly from the artists, Saatchi often sold them at auction for many multiples of the original purchase price.

Saturday, August 13, 2016

Reflections on the Works of the Late Alvin Toffler

Alvin Toffler, who died recently aged 87, was the first author whose ideas I took seriously. The Third Wave and Future Shock introduced me to themes and subjects that I still think about today. 



Toffler translated.

I long ago forgot most of what he had to say, but I think Toffler’s biggest insufficiency as a thinker was his tidy division of history into specific periods: in his case, the “first wave” being the agricultural economy, the second the industrial economy, and the third the information economy. 

This is, in fact, quite common in social-science, and inevitable in the study of history. Thus, in the Occidental tradition, the past is divided up into Classical, Medieval and Modern eras, though no one during medieval times considered themselves as being in the “middle” of anything. 

In a more religious era, history was divided up starkly between the time before the birth of Christ and after. Toffler himself had been a Marxist, and Karl Marx also followed a tripartite method by dividing history into Feudal, Capitalist and Communist eras. 

These categories are all artificial, of course, but they are a way of grappling with social phenomena by grouping them according to presumed temporal and spatial characteristics. Thereby, new information can be construed in terms of the mental groupings, thus saving the energy of having to consider and examine each thing discretely in turn. 


Past Alvin Toffler.

It isn’t as though sociological and historical categorization always departs from the facts on the ground: the Latin world of the fifth century was plainly different from the “dark ages” that followed, just as the fourteenth or fifteenth centuries were different from the nineteenth. But perhaps it was reconsideration of Toffler’s central argument, that the “third wave” information economy is really so dramatically different from the second-wave industrial economy, which led me to be wary of this kind of categorization entirely. 

Instead, I’ve developed an idea in which particular artificial forms are brought into existence, and which in turn contain potencies that act variously upon their users. 

In a book called simply Cities, author John Reader describes the first civilized society, in Mesopotamia, and the cuneiform tablets found in the thousands by archaeologists (and deciphered thanks to the Rosetta Stone), which more often than not document the quotidian acts and thoughts of Sumeria. Reader writes that they offer many “glimpses into the lives of ordinary people thousands of years ago, and cumulatively they evoke a sense of how little the fundamentals of life have changed. Our lives and theirs could have been interchangeable. Only time separates us; we could have functioned there, and they could have functioned here. That this might seem at all surprising is a consequence of the way history is told. Accounts of ancient cities and societies have for generations tended to concentrate on the higher rungs of society, implicitly stressing the presence and overwhelming importance of kings and conquerors. To a degree, this is inevitable, since temples, palaces and treasure-trove burials have always been the prime target of anyone digging up ancient remains. ... Nowadays far greater attention is paid to evidence from lower down the social ladder.” (Reader, Cities, William Heinemann, 2004, p. 33).


Rosetta Stone.

My theory has long been that modernity pushed each of the variables characteristic of "civilization" to an extreme. Yet all of these variables were present in ancient Mesopotamia, just as they were in classical Greece, ancient Rome, imperial China, pre-Columbian America, and anywhere else that could subsidize city life. 

Later, Reader describes how ancient Attica’s control of the grain trade financially supported the florescence of Athens during the fifth and fourth centuries BC: “As their populations grew, the city-states were obliged to look further and further afield for their grain supply. Those with limited wealth, or situated inland, had precious little chance to alleviate their plight and succumbed to more powerful states, while those on the coast, with access to timber and shipbuilding expertise, looked across the sea for alternative supplies, and the interacting agencies of need, ingenuity and initiative soon created a network of trade routes across the Aegean and beyond. This was a critical point in the development of the city as a functioning entity. It was beginning to reach out on a grander scale than ever before — though certainly not as a source of benevolent influence, simply to secure whatever it needed to survive. During the fifth century BC, Athens was bringing grain from the shores of the Black Sea, tapping into the western end of Russia’s great wheat lands. By the fourth century, the city controlled the grain trade of the entire eastern Mediterranean.” (Reader, pp. 54-55).

Later in Rome, in 123 BC, “laws were introduced establishing the basic right of every Roman citizen to a monthly ration of grain at a fixed price that undercut prevailing market prices. The intention here was to even out the price fluctuations resulting from variations in supply, but the taxes imposed to pay the subsidies needed to gain the suppliers’ support for the scheme made it a hot political issue for the next sixty years. A law extending the number of grain distribution recipients was passed in 62 BC, and when Clodius became tribune four years later he abolished payment for the grain ration altogether. From then on, supplying a monthly ration of grain — free — to every eligible citizen became the responsibility of Rome’s governing authority. ... Moreover, Clodius’ law appears to have extended beyond the mere distribution of grain to cover all matters concerning both public and private supplies, the grain fields, the contractors and the grain stores.” (Reader, p. 57).

Reading these passages reminded me of a book I read years ago called The History of Money, by Jack Weatherford. Given that civilization preceded the invention of money (by thousands of years for the original city-based cultures), Weatherford coined the term “tributary” economy to describe the trade before coinage. 

Civilizations in the New World didn’t have money at all before the arrival of Europeans, and Weatherford describes how the Aztec empire of Mexico “operated primarily on the basis of tribute, the markets functioned as subsidiary parts of the political structure, and many different standardized commodities served as forms of near-money. ... The vast bulk of goods that passed through the Aztec Empire moved primarily as tribute from the peripheral parts of the empire to its capital. In this regard, the Aztec Empire was like virtually all other empires in the era before the spread of money. Ancient Egypt, Peru, Persia, and China all functioned as tributary systems rather than market systems.” (Weatherford, The History of Money: From Sandstone to Cyberspace, Three Rivers Press, 1997, p. 19).

The Latins ascended to civilization following the invention of money, initially using the Greek currency but ultimately adopting their own standardized coinage. 

Although the Roman state (as republic and then empire) presided over a highly sophisticated market economy, its free distribution of grain harkened back to the tributary systems of the pre-money era. 

Amongst the Aztecs, Egyptians, Persians and so on, the expropriation of staple farm goods was not only to enrich the upper-class, though it certainly did that. Part of what was taken was redistributed in turn, to maintain the loyalty of the urban masses, as was the case in Rome. 

It demonstrates that apparently contrasting social behaviour can coexist within the same culture. One example is tributary relations in a monetary economy; another is hunting, which persisted in every society until very recently, millennia after farming had purportedly made it “obsolete.” 

Similarly, Roman history also shows how ancient is the rationalization of production, supposedly a hallmark of Toffler’s “second wave.” 

The historian Howard Saalman noted that “Industrialization, it should be said, is not synonymous with mechanization. It does imply an organized process of production and distribution of goods and services — and a genius for order and organization was the very basis of the Roman state. Everything from fun to funerals found its place in the legally and traditionally ordered scheme of things and — up to a point — everything worked well. If mechanization remained relatively limited it was because cheap and slave labor provided the required substitute, as in the American south of the early nineteenth century. No project the Romans undertook failed because of inadequate technology.” (Saalman, Medieval Cities, Braziller, 1968, p.12).

According to Toffler’s schemata, ancient Rome was a “first wave” society, in no essential way different from the very first farming cultures — which is a patent absurdity. 

As Saalman writes, “The Romans enjoyed their country villas and romanticized the rural idyll of Homeric times. But the land between their cities was rationalized by division into one-hundred-foot square units and farmed with an efficiency that can only be labeled `industrial agriculture.’ With the masses of population concentrated in large cities throughout the empire, less effective means of food production, of overseas and overland transportation, of agricultural products, or of less highly developed port facilities and storage terminals were practically unthinkable. With the satisfaction of basic needs and growing prosperity came a growing demand for manufactured products of all kinds. Roman craftsmen were prepared to make, and Roman merchants were ready to distribute, an impressive variety of products which gave life in urban apartments and rural homes a standard which compares favorably with later centuries. Architecture and engineering — utilizing stone, brick, and concrete masonry as well as metal, wood and glass — achieved a level of accomplishment by which all conceivable needs from those of frontier camps to those of town palaces could be and were met.” (Saalman, p. 12-13).


A "first wave" settlement: depiction of ancient Roman city that became Koln or Cologne, Germany.
historum.com

All of this was even more the case with China, which remained without a substantial urban population until relatively late in history. It developed into an organized, legitimized state relatively quickly, however, and after some centuries became the most technologically advanced of human societies. Chinese ingenuity during the Song dynasty, from the tenth century, far surpassed that of western Eurasia at the time. 

In China, sophisticated engineering and advanced technology were employed to ensure the cultivation and distribution of the staple rice crop. One of the definitive features of “first wave” cultures, Toffler argued, was “decentralization”: population and authority remained scattered, uncoordinated, semi-legitimate. 

With the “second wave”, this gave way to the centralization of people in cities, and of power in formal, legitimate government. In imperial China, the vast majority of the population remained in villages, but their actions were coordinated by a despotic monarchy and centralized bureaucracy. It was a system established two centuries before Christ and which persisted (with periodic breakdowns and strife) until AD 1911. 

There is simply no way to categorize China in terms of “first wave” or “second wave”, although perhaps Toffler intended these categories to apply only to the Occident. Even so, the scheme still makes no sense, because the Roman empire doesn’t fit into it either. Its population, too, was largely in the countryside, but it nevertheless depended on the centralized authority of the SPQR. 

When Rome fell, the civilization itself largely faded away, giving way to a truly “first wave” decentralization of population and delegitimization of authority. In The Third Wave and Future Shock, Toffler also describes how the new society’s conceptions of space and time would supersede the “linear”, “Newtonian” viewpoint characteristic of second-wave industrial societies. 

Toffler seems to have chosen the term “wave” (to describe what would later be better known as a “paradigm”) so as to convey the chaotic non-linearity of sociocultural change as wrought by technology, it being no more resistible than any other force of nature. But it turns out that Toffler’s own terminology was itself highly “linear”, in so far as he proposed that the socio-technological “waves” arrive sequentially and don’t seem to mix together, as was plainly the case with Rome and China.

Tuesday, July 19, 2016

The Triumph of Theatre

This past spring, after attending a touring-company revival of The Sound of Music at the National Arts Centre, I was reminded of thoughts I’ve had over the last few years as to how, paradoxically, in the age of the Internet the live performance is being revived as a key part of the entertainment business. 

It is a consequence, of course, of the ability of Internet users to pass back and forth digital copies of recorded performances: illegally, but it is nevertheless a fact of life. According to figures compiled at this site, sales of recordings (compact disc, cassette, vinyl record) have declined from a peak of nearly 20 billion U.S. dollars in 2000, to around six billion inflation-adjusted dollars in 2013.

Thus, in order to be assured of a living, musicians must play live, the only type of performance experience that cannot be adequately pirated or bootlegged.  

The movie-industry has also been affected by digital bootlegging, though not as dramatically as has recorded music.  

Nevertheless, much of the revenue derived from movies comes from sources other than actual cinema audiences: while sales of digital-video discs have declined dramatically from their peak in 2004, the slack has been taken up by video-on-demand and related services which bring the movie-going experience direct to the home.  

Recently, too, it was reported that Star Wars / Star Trek director J.J. Abrams, along with other filmmaking luminaries, was boosting a service that would bring newly-released movies directly to the home.

But, while the movie-theatre business itself remains moribund, there has been a very big revival in live theatre presentation, as well.

www.pinterest.com

Often, "new" musical theatre productions are simply reboots of unsuccessful or long-forgotten motion-pictures.

Last autumn, for example, the Arts Centre hosted the touring production of Newsies. Opening on Broadway in 2011, the show is based on an actual New York City newsboys’ strike at the very end of the nineteenth century, and has evidently been successful enough to be taken on tour across the continent. 

Newsies was, however, originally a movie released by the Disney company in the early 1990s — which was (according to Wikipedia) a complete flop. 

I’ve never seen this movie, nor yet the stage production itself, but this is a reversal of the original practice, wherein Broadway musicals would be made into movies; and often the movies adapted to the stage were not musicals themselves. 

Newsies was originally a musical, but other re-adaptations were not: such as The Producers, which was a bit of a sensation in the early twenty-first century (and based on a 1967 film starring Gene Wilder), or Hairspray, which premiered in 2002 and was based on a 1988 movie by John Waters (the musical adaptation of which was, in turn, remade into a film a few years later, starring John Travolta as an obese woman). 

Apparently, many stories are more popularly accessible when accompanied by song and dance, and when performed onstage. 

My thoughts in this direction may have been inspired by the book I read recently, No Applause, Just Throw Money, a history of vaudeville by Travis Stewart under the “Trav S.D.” pseudonym. 


www.pinterest.com
Stewart traces vaudeville’s origins to the rough and raunchy stage entertainments of nineteenth-century America. But by the late 1800s, savvy impresarios responded to then-fashionable pleas for moral hygiene. Figuring out that a good, clean stage-show, appropriate even for women and children, was far more lucrative than appealing to the smutty instincts of the crowd, theatre-men applied systematic industrial methods to live entertainment. 

As Stewart writes, “The revolution of the `double audience’ (appealing to women and children now as well as men) added enormously to the profitability of variety production. It had been achieved chiefly through a public relations coup (courtesy of Barnum and his vaudeville acolytes) the likes of which may never be seen again. But several other innovations not only added to, but multiplied the growth of the vaudeville industry, making its existence a foregone conclusion. Variety, after all, had been small potatoes. Its producers were small businessmen whose dreams didn’t extend any farther than their own saloon doors. Vaudeville, on the other hand, was big business. Following Adam Smith’s principles of division of labor and mass production, its producers would come to control the entertainment of the nation.” 

The difference between the old theatre-owners and the showmen of vaudeville, Stewart writes, “is essentially the same as the one between the proprietor of your local greasy spoon and Ray Kroc,” the latter being responsible for franchising the original McDonald’s restaurant into a global operation. (Stewart, No Applause, p. 84)

I’ve long viewed the McDonald’s restaurant as the epitome of the division of labour, which, even before engineered machinery, is the decisive factor of modern work. The technical resources of the McDonald brothers in 1950s California were no different from what countless other roadside proprietors had there and across North America. What made their restaurant so special was how they divided up the work needed to make burgers and fries in the most efficient way possible. 

Similarly, the original manufactories were not necessarily steam-driven: they instead employed unskilled workers, each doing a fraction of the total work, to achieve exorbitant productivity. As Stewart observes, “The nineteenth century saw the birth of mass production and distribution of nearly everything: furniture, clothing, tools, appliances. It was inevitable that the techniques and the philosophy of industrialism would come to the theater.” (Stewart, p. 85) 

Vaudeville performances were thus staged from eight o’clock in the morning until late in the evening, or even the early hours. Audiences could pay to enter and leave any time, and were treated to about ten separate acts, repeated over again until closing, according to “Trav.” 

There was a rough pattern to the performances: novices and less-polished openers would warm up the crowd for the headliners that took up the middle of the cycle. Then, less-talented acts further down the playbill would follow on until the final “chaser”, a performer so bad that he or she would literally chase the audience from the seats. The goal was, of course, to create room for more paying customers. Often, notes Stewart, one of the segments of the vaudeville show would be a short silent-film. 

The talking-pictures, though, were (along with the Depression) a cause of vaudeville’s demise. In the 1930s, film-production boomed even as live-theatre, and most other American industries, went into a prolonged slump. 

However, as the movies could now feature singing as well as dancing, the musical talent of vaudeville and Broadway largely decamped to Hollywood, creating the great age of the movie-musical from the ‘30s to the 1950s.

While vaudeville largely faded out, Broadway continued to originate musical-theatre during this period, with many of its productions successfully adapted to the wide screen. But as the west coast became the talent-centre for the musical, the stage was set in New York for a florescence of theatrical drama such as had not been seen (in the English-speaking world, at least) since Elizabethan times. 

During this period, plays written by Eugene O’Neill, Thornton Wilder, Tennessee Williams, Arthur Miller and Lillian Hellman met with great commercial and literary success, in spite of their often bleak and tragic content. 

They dealt in subject-matter considered too controversial and subversive to be a part of “golden-age” Hollywood cinema. Orson Welles gained a footing in show business through the theatre, though he is best remembered for his contributions to the art of the film. 

Significantly, however, very few of the classics of the “great age of American drama” — such as Our Town by Wilder, O’Neill’s Long Day’s Journey Into Night, Death of a Salesman by Miller, or The Children’s Hour by Hellman — were ever definitively adapted to film. 


Cast of recent staging of Death of a Salesman, with the late Philip Seymour Hoffman second from right (and Spiderman to his left).
entertainment.time.com

Williams’ Streetcar Named Desire is perhaps the exception; but it is remembered chiefly, it seems, for Marlon Brando’s undershirted repetition of the character’s name, “Stella”, rather than for anything else. 

Apparently, the twentieth-century American dramatists were communicating something onstage that couldn’t be reproduced effectively on film. There was, on the other hand, little financial risk for theatre-owners to stage these high-brow plays, because of the relatively low cost of production (certainly compared to the expense of making a motion picture). 

Edward Albee, a playwright whose career began at the tail-end of this dramaturgical flowering, was in fact the grandson of a vaudeville impresario.

Reaching his greatest success in the early ‘60s with Who’s Afraid of Virginia Woolf? (which was successfully adapted into an Oscar-nominated film starring Elizabeth Taylor and Richard Burton), Albee introduced strong language and nudity to the stage, though in relatively chaste form, compared to what came later. 

Perhaps the new era of the Broadway musical began with the premiere of Hair, in 1968. Featuring a group of hippies trying to evade U.S. military conscription, the play also had full-frontal nudity at its climax, a first for Broadway, if not for the American stage as a whole. 

Just a year later appeared Oh! Calcutta, a revue of sexuality-themed sketches that featured nudity throughout the show. 

In a real way, both Hair and Calcutta were reviving the smutty character of the pre-vaudeville stage when, as Stewart describes it, boozing and whoring went hand in hand with theatre-going: when taverns and bawdy houses were not located right inside the venue, they sat right next door. Calcutta’s incorporation of exhibitionism into the old-style revue made it seem instantly avant-garde. 


Broadway's Theatre Row, 1970s version.
gothamist.com

But soon the New York theatre district’s traditional home, around the intersection of Forty-Second street and Broadway, became far more ribald and dangerous than the worst of the nineteenth century saloons. 

Theatres that had staged dramas and musicals were often turned into pornographic cinemas, and the area around Times Square for a time recorded the highest number of offences for its land area of any place in the world. Broadway the street was saved by the aggressive policing of scofflaws (which in turn led to a reduction in more serious crimes), and by the revocation of slot-palace business-licences. 

Major retailers were encouraged to invest in the area, significantly including the Disney company (hence the implicitly derisive term “Disneyfication” to describe the contemporary Times Square district). 

Below the famous “sign” at the Square (now a series of giant flat-screen monitors), bleachers have been installed so that visitors can sit and view midtown Manhattan as a show in itself (and they in turn are on stage for those walking through the area). 

Perhaps because of its experience in staging technically sophisticated, interactive entertainment at amusement parks in California and Florida, Disney has also become involved in a big way in live-theatre production. 

Not all recent Broadway-musical successes have been based on old movies. In particular, there is Hamilton, which debuted in 2015 and features a fictionalized account of the life of American Founding Father Alexander Hamilton and his ultimately fateful rivalry with the Vice-President, Aaron Burr. Set to a largely hip-hop score, with Hispanic and African-American actors playing Hamilton, Thomas Jefferson, George Washington, and so on, it has been a smash with audiences, especially young people, and is strikingly original in concept and execution.  


Cast of Hamilton.
www.mic.com

The irony of all this is, of course, that while the cinema at one time supplanted live-theatre as the main source of public entertainment, now live-entertainment has become the growth industry even as the movie-exhibitor business shrinks in size. 

Aside from being transformed into all-around leisure spots (with the addition of fast-food outlets, video-arcade games, rooms for children’s parties), the switch-over to digital projection has allowed cineplexes to transmit live entertainment as well — sometimes sports events, but also awards shows, and even opera and other kinds of highbrow spectacle. 

Contemporary musical theatre’s revival, often adapting failed or forgotten movies into box-office successes, is part of the trend in which live performance has become the more lucrative part of the entertainment business, as the Internet and digital technology reduce the value of recorded entertainment to near zero. 

The successful live-staging of stories that were failures or forgotten in other media demonstrates that the form of experience really does matter, over and above narrative content. And also that live performance, which cannot be reproduced effectively in other, recorded media, is all the more precious to audiences.