Tradition, Genuine and Spurious

Here is a very thoughtful list from our colleague Bruno Postle. He rightly points out that before we can make a declaration on traditional architecture, we need to unpack that word “tradition” and understand what is and is not involved….

Cheers, Michael Mehaffy
Hi Michael

…My observation is that this word ‘tradition’ has so many negative associations, you really need to roll all the way back to the beginning and declare what a ‘true tradition’ is relative to a ‘false tradition’.

So you got me thinking: none of this is specifically about architecture; it could apply to cooking or be pinned on the wall of a maker-space:

1. In a living tradition, we are allowed to copy what works and meets human needs.
2. A copy is only a ‘fake’ when it is a fraud – passing off one thing as something else.
3. In a living tradition, copying isn’t duplication, we are obliged to remix, to fit, adapt and improve.
4. In a living tradition, we share knowledge in common and pass on what we have learned.
5. In a living tradition, we are delighted to be copied, but we like to be acknowledged.
6. A false tradition has none of these things, though it may seem to be old.
7. A false tradition is a marketing device concocted to have the illusion of antiquity.
8. A false tradition is usually camouflage for the indefensible.

#TradArchCHS

I’ll be tweet-casting next weekend’s proceedings from Charleston, even though some are complaining. Such a pity that they don’t realize that there are two parts to the story: the work that you do, and how you communicate the work that you do. In the old days, companies pre-cooked a perfect story, then launched it to their target audience. Today, the company isn’t the centerpiece of our work; the cause is. And a cause operates in the sunlight for everyone to see.  Hope they catch up someday.

THE CRISIS WITH TODAY’S MODERN ARCHITECTURE

Modern architecture has reached a crisis point. The original intention of the early modernists was to seek out an architecture that was non-traditional and free of historical precedent. The mistake made by the architecture schools was to abandon the teaching of classical architecture in favor of modernism alone, thus creating a multi-generational gap in the continuous transfer of architectural knowledge and training from one generation to the next. Just as the Second Amendment to the U.S. Constitution provides the right to bear arms, it also provides the right NOT to bear arms if one chooses. Similarly, to be truly free of historical precedent in architecture, one should also be free to engage historical precedent. One is not free if one is restricted from pursuing a design solution that may seem relevant. It was very unfortunate that this clear, logical way of thinking was not present when all the architecture history books and plaster casts were purged from the architecture schools. This anti-traditional architecture, with its lack of ornamentation and mouldings, can only take you so far.


Modern architecture has reached that point. Under the Form Follows Function philosophy, there was nothing left to be done that hadn’t been done before. With nowhere left to innovate, the Form Follows Function philosophy was replaced by a Form Can Be Anything philosophy. Now modern architecture has reached the point where any shape or form can be passed off as a great work of architecture, worthy of publication in any of the architecture trade journals. There are no rules, no guidelines, no principles; therefore anyone can do it. Anyone who has SketchUp on their computer and is proficient at it can do it. There is no longer anything that separates or distinguishes great modern architecture from mediocre or bad modern architecture. Of course the architect still has to make sure there are proper exits and that the building meets the latest building codes, but the last time I checked, architecture was more than passing city and fire marshal reviews.

It seems like the ultimate design freedom, but without rules or design principles there is no way to evaluate one form as better than another. There is nowhere to go; there is no way to improve. There is nothing to master. Once you’ve learned the computer skills for generating the weird shapes, you can’t get any better at it.

Classical architecture is a grammar. It can easily be taught, it can be learned, and it can be mastered. Unfortunately, there is a misconception that applying classical architectural principles to today’s architectural problems is regressive, a going backwards rather than forwards. In fact, it is necessary to move the tradition of classical architecture forward, adapting it where necessary to today’s technology and needs. This is what was done during all of the revival periods. Greek Revival architects didn’t design buildings as if they had been transported back to the fifth century BC; they designed mid-nineteenth-century buildings using motifs found in Ancient Greek architecture. They built with wood “balloon” frames, used double-hung windows with glass glazing, clad the exteriors with clapboard siding, and used ornamental cast-iron railings, all materials and techniques unknown to Ancient Greek architects.

Today’s modernist architects do not believe that classical architecture is the way to continue down the road. They continue to think that it is going backwards; meanwhile, they are all off the road, in a pasture, doing donuts and getting stuck in the mud.

How the World Takes Its Tradition

Outside the garden of architectural elitism, the world is comfortably reawakening to good, old-fashioned tradition. Smart, big companies, quick to spot trends, have organized their products and their delivery systems around these murmurs of old-fashionedness, giving folks what they want, exactly how they want it. Architects would be wise to wake up and smell the coffee.

One new-age behemoth of such systems is Starbucks. The blend of addictive ingredients (caffeine, sugar, fat), brilliant locations (usually on the right-hand side along main commuter arteries, with easy in-and-out), and a reasonably hip vibe has sent sales, prices, and the stock price soaring.

Oh, but here’s the kicker: it’s 1) Natural, 2) Organic, 3) Handmade, and they attempt to be 4) Just. The Local part (the last of my personal 5-point mantra), they provide by serving as a neighborhood social hub. (And no, not every ingredient they use is exactly natural, but it’s the positioning of the product for mass acceptance that matters here – and that the tradition-based market is large and well defined.)

Importantly, Starbucks deploys high technology to produce and promote their “New Natural” experience, but that technology (like barista-proof espresso machines, computerized payment systems, and social media networking) is decidedly not the product. They use technology behind the scenes to deliver a handcrafted experience out front. To me that’s brilliant, even explosive, and a model traditional architects can emulate.

Modernist architects are still stuck thinking the technology is the product. The world is telling us that’s not what they ordered. It turns out the future may not be so futuristic after all; the future may look far more like the past than what we’ve been led to believe.

Christopher Lasch

Robert Orr writes (on the Lean Urbanism listserv, later cross-posted to TradArch):

Twenty-one years ago, in 1994, on Valentine’s Day this coming weekend, America’s arguably most provocative historian succumbed to cancer in Pittsford, NY at the age of 61. He refused chemotherapy upon learning that it was unlikely to significantly prolong his life, observing that it would rob him of the energy he needed to continue writing and teaching. He wrote to one persistent specialist: “I despise the cowardly clinging to life, purely for the sake of life, that seems so deeply ingrained in the American temperament.”
I was first introduced to Christopher Lasch, not the person but his writings, 25 years earlier, in 1969, by one of the most significant mentors in my life, Jeremy Felt, in his year-long seminar on American Intellectual History. Lasch roomed with John Updike at Harvard, and was heavily influenced at Columbia by another intellectual examined in Felt’s seminar, Richard Hofstadter. Lasch is one of those rare intellectuals who attract both a conservative and a liberal following.
Published posthumously in 1994, Lasch’s book The Revolt of the Elites and the Betrayal of Democracy, in contraposition to José Ortega y Gasset’s 1930 The Revolt of the Masses, is written in direct language, with a broad, nonacademic audience clearly in mind. Though historically informed, it makes no pretense of being a work of historical scholarship, devoting most of its attention instead to the discontents of the late 20th century and the prospects for the future. Like some on this list, he renders a harsh judgment on the college-educated professional and managerial elites in America, whom he accuses of having abandoned the common life and subverted democracy. Never before in American history, Lasch argues, have the privileged classes been so “dangerously isolated” from the rest of the country. Lean.
It is this book that takes up the exact same arguments found in our posts, if more articulately rendered, and most irrefutably raises the Lean Urbanism Debate to the higher plane of no less than the survival of democracy; in it one might see the strongest substantiation for all the avenues explored over the past year on this list. Unintentionally Lasch paints a picture where the armature of democracy and community has been erased by the mobility desperately sought by the elites, and we have unintentionally painted a picture of how to replace that armature of democracy and community, through our focus on the “small” average bloke. Once the proper armature can be replaced, normative democratic human enterprise can find “susceptible” lodging and get right to work self-assembling the web of community, which is its wont, cemented by the “Irrational” foundations of stasis — faith, hope and love, AKA Fides, Spes et Caritas.
For sure our purpose is NOT a campaign to win over the populace to a lean way of thinking, as many globalists on this list might imagine, a futile and unproductive piece of brain damage if ever there was one. Rather we must see ourselves as the abandoned “physical laborers” of whom Lasch speaks, quietly inserting the lean armature, without the notice of the elites, in the most susceptible locations where democracy can re-take up arms.
The question left to be debated is the Lippmann/Lasch debate: Do we insert the lean armature as experts, without the distraction of political process, or do we engage democratic debate so that everyone knows why, and is part of why, it’s there?
Please read carefully Lasch’s compelling introduction (Chapter One, written in 1994) to The Revolt of the Elites:
 
Christopher Lasch – The Revolt of the Elites
Chapter One
_________
Introduction
The Democratic Malaise
Most of my recent work comes back in one way or another to the question of whether democracy has a future. I think a great many people are asking themselves the same question. Americans are much less sanguine about the future than they used to be, and with good reason. The decline of manufacturing and the consequent loss of jobs; the shrinkage of the middle class; the growing number of the poor; the rising crime rate; the flourishing traffic in drugs; the decay of the cities — the bad news goes on and on. No one has a plausible solution to these intractable problems, and most of what passes for political discussion doesn’t even address them. Fierce ideological battles are fought over peripheral issues. Elites, who define the issues, have lost touch with the people. (See chapter 2, “The Revolt of the Elites.”) The unreal, artificial character of our politics reflects their insulation from the common life, together with a secret conviction that the real problems are insoluble.
George Bush’s wonderment, when he saw for the first time an electronic scanning device at a supermarket checkout counter, revealed, as in a flash of lightning, the chasm that divides the privileged classes from the rest of the nation. There has always been a privileged class, even in America, but it has never been so dangerously isolated from its surroundings. In the nineteenth century wealthy families were typically settled, often for several generations, in a given locale. In a nation of wanderers their stability of residence provided a certain continuity. Old families were recognizable as such, especially in the older seaboard cities, only because, resisting the migratory habit, they put down roots. Their insistence on the sanctity of private property was qualified by the principle that property rights were neither absolute nor unconditional. Wealth was understood to carry civic obligations. Libraries, museums, parks, orchestras, universities, hospitals, and other civic amenities stood as so many monuments to upper-class munificence.
No doubt this generosity had a selfish side: It advertised the baronial status of the rich, attracted new industries, and helped to promote the home city against its rivals. Civic boosterism amounted to good business in an age of intense competition among cities, each aspiring to preeminence. What mattered, however, was that philanthropy implicated elites in the lives of their neighbors and in those of generations to come. The temptation to withdraw into an exclusive world of their own was countered by a lingering awareness, which in some circles survived even the riotous self-indulgence of the Gilded Age, that “all have derived benefits from their ancestors,” as Horace Mann put it in 1846, and that therefore, “all are bound, as by an oath, to transmit those benefits, even in an improved condition, to posterity.” Only an “isolated, solitary being, … having no relations to a community around him,” could subscribe to the “arrogant doctrine of absolute ownership,” according to Mann, who spoke not only for himself but for a considerable body of opinion in the older cities, in much of New England, and in New England’s cultural dependencies in the Old Northwest.
Thanks to the decline of old money and the old-money ethic of civic responsibility, local and regional loyalties are sadly attenuated today. The mobility of capital and the emergence of a global market contribute to the same effect. The new elites, which include not only corporate managers but all those professions that produce and manipulate information — the lifeblood of the global market — are far more cosmopolitan, or at least more restless and migratory, than their predecessors. Advancement in business and the professions, these days, requires a willingness to follow the siren call of opportunity wherever it leads. Those who stay at home forfeit the chance of upward mobility. Success has never been so closely associated with mobility, a concept that figured only marginally in the nineteenth-century definition of opportunity (chapter 3, “Opportunity in the Promised Land”). Its ascendancy in the twentieth century is itself an important indication of the erosion of the democratic ideal, which no longer envisions a rough equality of condition but merely the selective promotion of non-elites into the professional-managerial class.
Ambitious people understand, then, that a migratory way of life is the price of getting ahead. It is a price they gladly pay, since they associate the idea of home with intrusive relatives and neighbors, small-minded gossip, and hidebound conventions. The new elites are in revolt against “Middle America,” as they imagine it: a nation technologically backward, politically reactionary, repressive in its sexual morality, middlebrow in its tastes, smug and complacent, dull and dowdy. Those who covet membership in the new aristocracy of brains tend to congregate on the coasts, turning their back on the heartland and cultivating ties with the international market in fast-moving money, glamour, fashion, and popular culture. It is a question whether they think of themselves as Americans at all. Patriotism, certainly, does not rank very high in their hierarchy of virtues. “Multiculturalism,” on the other hand, suits them to perfection, conjuring up the agreeable image of a global bazaar in which exotic cuisines, exotic styles of dress, exotic music, exotic tribal customs can be savored indiscriminately, with no questions asked and no commitments required. The new elites are at home only in transit, en route to a high-level conference, to the grand opening of a new franchise, to an international film festival, or to an undiscovered resort. Theirs is essentially a tourist’s view of the world — not a perspective likely to encourage a passionate devotion to democracy.
In The True and Only Heaven, I tried to recover a tradition of democratic thought — call it populist, for lack of a better term — that has fallen into disuse. One reviewer surprised me by complaining that the book had nothing to say about democracy (a misunderstanding I have laid to rest, I trust, in chapter 4, “Does Democracy Deserve to Survive?”). That he could miss the point of the book in this way tells us something about the current cultural climate. It shows how confused we are about the meaning of democracy, how far we have strayed from the premises on which this country was founded. The word has come to serve simply as a description of the therapeutic state. When we speak of democracy today, we refer, more often than not, to the democratization of “self-esteem.” The current catchwords — diversity, compassion, empowerment, entitlement — express the wistful hope that deep divisions in American society can be bridged by goodwill and sanitized speech. We are called on to recognize that all minorities are entitled to respect not by virtue of their achievements but by virtue of their sufferings in the past. Compassionate attention, we are told, will somehow raise their opinion of themselves; banning racial epithets and other forms of hateful speech will do wonders for their morale. In our preoccupation with words, we have lost sight of the tough realities that cannot be softened simply by flattering people’s self-image. What does it profit the residents of the South Bronx to enforce speech codes at elite universities?
In the first half of the nineteenth century most people who gave any thought to the matter assumed that democracy had to rest on a broad distribution of property. They understood that extremes of wealth and poverty would be fatal to the democratic experiment. Their fear of the mob, sometimes misinterpreted as aristocratic disdain, rested on the observation that a degraded laboring class, at once servile and resentful, lacked the qualities of mind and character essential to democratic citizenship. Democratic habits, they thought — self-reliance, responsibility, initiative — were best acquired in the exercise of a trade or the management of a small holding of property. A “competence,” as they called it, referred both to property itself and to the intelligence and enterprise required by its management. It stood to reason, therefore, that democracy worked best when property was distributed as widely as possible among the citizens.
The point can be stated more broadly: Democracy works best when men and women do things for themselves, with the help of their friends and neighbors, instead of depending on the state. Not that democracy should be equated with rugged individualism. Self-reliance does not mean self-sufficiency. Self-governing communities, not individuals, are the basic units of democratic society, as I argue in chapters 5 (“Populism or Communitarianism?”), 6 (“Conversation and the Civic Arts”), and 7 (“Racial Politics in New York”). It is the decline of those communities, more than anything else, that calls the future of democracy into question. Suburban shopping malls are no substitute for neighborhoods. The same pattern of development has been repeated in one city after another, with the same discouraging results. The flight of population to the suburbs, followed by the flight of industry and jobs, has left our cities destitute. As the tax base shrivels, public services and civic amenities disappear. Attempts to revive the city by constructing convention centers and sports facilities designed to attract tourists merely heighten the contrast between wealth and poverty. The city becomes a bazaar, but the luxuries on display in its exclusive boutiques, hotels, and restaurants are beyond the reach of most of the residents. Some of those residents turn to crime as the only access to the glittering world seductively advertised as the American dream. Those with more modest aspirations, meanwhile, are squeezed out by high rents, gentrification, and misguided policies intended to break up ethnic neighborhoods that allegedly stand in the way of racial integration.
Populism, as I understand it, was never an exclusively agrarian ideology. It envisioned a nation not just of farmers but of artisans and tradesmen as well. Nor was it implacably opposed to urbanization. In the fifty years preceding World War I, the rapid growth of cities, the influx of immigrants, and the institutionalization of wage labor presented democracy with a formidable challenge, but urban reformers like Jane Addams, Frederic C. Howe, and Mary Parker Follett were confident that democratic institutions could be adapted to the new conditions of urban life. Howe caught the essence of the so-called progressive movement when he referred to the city as the “hope of democracy.” Urban neighborhoods, it appeared, re-created the conditions of small-town life with which democracy had been associated in the nineteenth century. The city fostered new forms of association in its own right, notably the labor union, together with a lively civic spirit.
The conflict between town and country, exploited by nativist demagogues who depicted the city as a sink of iniquity, was largely illusory. The best minds have always understood that town and country are complementary and that a healthy balance between them is an important precondition of the good society. It was only when the city became a megalopolis, after World War II, that this balance broke down. The very distinction between town and country became meaningless when the dominant form of settlement was no longer urban or rural, much less a synthesis of the two, but a sprawling, amorphous conglomeration without clearly identifiable boundaries, public space, or civic identity. Robert Fishman has argued persuasively that the new pattern can no longer be adequately described even as suburban since the suburb, formerly a residential annex of the city, has now taken over most of its functions. Cities retain a residual importance as the home of large law firms, advertising agencies, publishing companies, entertainment enterprises, and museums, but the middle-class neighborhoods that sustained a vigorous civic culture are rapidly disappearing. Mere remnants, our cities are increasingly polarized; upper-middle-class professionals, together with the service workers who cater to their needs, maintain a precarious hold on the high-rent districts and barricade themselves against the poverty and crime threatening to engulf them.
None of this bodes well for democracy, but the outlook becomes even darker if we consider the deterioration of public debate. Democracy requires a vigorous exchange of ideas and opinions. Ideas, like property, need to be distributed as widely as possible. Yet many of the “best people,” as they think of themselves, have always been skeptical about the capacity of ordinary citizens to grasp complex issues and to make critical judgments. Democratic debate, from their point of view, degenerates all too easily into a shouting match in which the voice of reason seldom makes itself heard. Horace Mann, wise in so many things, failed to see that political and religious controversy is educative in its own right and therefore tried to exclude divisive issues from the common schools (chapter 8, “The Common Schools”). His eagerness to avoid sectarian quarrels is understandable enough, but it left a legacy that may help to explain the bland, innocuous, mind-numbing quality of public education today.
American journalism has been shaped by somewhat similar reservations about the reasoning powers of ordinary men and women (chapter 9, “The Lost Art of Argument”). According to Walter Lippmann, one of the pioneers of modern journalism, the “omnicompetent citizen” was an anachronism in the age of specialization. In any case, most citizens, he thought, cared very little about the substance of public policy. The purpose of journalism was not to encourage public debate but to provide experts with the information on which to base intelligent decisions. Public opinion, Lippmann argued in opposition to John Dewey and other veterans of the progressive movement, was a weak reed. It was shaped more by emotion than by reasoned judgment. The very concept of a public was suspect. The public idealized by the progressives, a public capable of the intelligent direction of public affairs, was a “phantom.” It existed only in the minds of sentimental democrats. “The public interest in a problem,” Lippmann wrote, “is limited to this: that there shall be rules… The public is interested in law, not in the laws; in the method of law, not in the substance.” Substantive questions could safely be left to experts, whose access to scientific knowledge immunized them against the emotional “symbols” and “stereotypes” that dominated public debate.
Lippmann’s argument rested on a sharp distinction between opinion and science. Only the latter, he thought, could claim to be objective. Opinion, on the other hand, rested on vague impressions, prejudices, and wishful thinking. This cult of professionalism had a decisive influence on the development of modern journalism. Newspapers might have served as extensions of the town meeting. Instead they embraced a misguided ideal of objectivity and defined their goal as the circulation of reliable information — the kind of information, that is, that tends not to promote debate but to circumvent it. The most curious feature in all this, of course, is that although Americans are now drowning in information, thanks to newspapers and television and other media, surveys regularly report a steady decline in their knowledge of public affairs. In the “age of information” the American people are notoriously ill informed. The explanation of this seeming paradox is obvious, though seldom offered: Having been effectively excluded from public debate on the grounds of their incompetence, most Americans no longer have any use for the information inflicted on them in such large amounts. They have become almost as incompetent as their critics have always claimed — a reminder that it is debate itself, and debate alone, that gives rise to the desire for usable information. In the absence of democratic exchange, most people have no incentive to master the knowledge that would make them capable citizens.
The misleading distinction between knowledge and opinion reappears, in a somewhat different form, in the controversies that have recently convulsed the university (chapter 10, “Academic Pseudo-radicalism”). These controversies are bitter and inconclusive because both sides share the same unacknowledged premise: that knowledge has to rest on indisputable foundations if it is to carry any weight. One faction — identified with the left although its point of view bears little resemblance to the tradition it claims to defend — takes the position that the collapse of “foundationalism” makes it possible for the first time to see that knowledge is merely another name for power. The dominant groups — white Eurocentric males, in the usual formulation — impose their ideas, their canon, their self-serving readings of history on everybody else. Their power to suppress competing points of view allegedly enables them to claim for their own particularistic ideology the status of universal, transcendent truth. The critical demolition of foundationalism, according to the academic left, exposes the hollowness of these claims and enables disfranchised groups to contest the prevailing orthodoxy on the grounds that it serves only to keep women, homosexuals, and “people of color” in their place. Having discredited the dominant world view, minorities are in a position to replace it with one of their own or at least to secure equal time for black studies, feminist studies, gay studies, Chicano studies, and other “alternative” ideologies. Once knowledge is equated with ideology, it is no longer necessary to argue with opponents on intellectual grounds or to enter into their point of view. It is enough to dismiss them as Eurocentric, racist, sexist, homophobic — in other words, as politically suspect.
Conservative critics of the university, understandably uneasy with this sweeping dismissal of Western culture, can find no way of defending it except by appealing to the very premise the collapse of which invites the attack on the classics: that acknowledgment of certain axiomatic principles is the precondition of reliable knowledge. Unfortunately for their cause, it is impossible, at this late date, to resurrect the absolutes that once seemed to provide secure foundations on which to build dependable structures of thought. The quest for certainty, which became an obsessive theme in modern thought when Descartes tried to ground philosophy in indubitable propositions, was misguided to begin with. As John Dewey pointed out, it distracted attention from the real business of philosophy, the attempt to arrive at “concrete judgments … about ends and means in the regulation of practical behavior.” In their pursuit of the absolute and immutable, philosophers took a disparaging view of the time-bound and contingent. “Practical activity,” as Dewey put it, became in their eyes “intrinsically an inferior sort of thing.” In the world view of Western philosophy, knowing came to be split off from doing, theory from practice, the mind from the body.
The lingering influence of this tradition colors the conservative critique of the university. Foundationalism, conservatives argue, provides the only defense against moral and cultural relativism. Either knowledge rests on immutable foundations or men and women are free to think whatever they please. “Things fall apart; the center cannot hold; mere anarchy is loosed upon the world.” Conservatives never tire of quoting Yeats’s lines in order to show what happens when axiomatic principles lose their authority. The trouble in academia, however, derives not from the absence of secure foundations but from the belief (shared, it must be repeated, by both parties to this debate) that in their absence the only possible outcome is a skepticism so deep that it becomes indistinguishable from nihilism. That this is not, in fact, the only possible outcome would have been abundantly clear to Dewey, and the revival of pragmatism as an object of historical and philosophical study — one of the few bright spots in an otherwise dismal picture — holds out some hope of a way out of the academic impasse.
The quest for certainty has more than merely academic interest. It also enters into the heated controversy over the public role of religion. Here again both sides often turn out to share the same premise, in this case that religion provides a rock of security in an unpredictable universe. It is the collapse of the old certainties, according to critics of religion, that makes it impossible (impossible, at least, for those exposed to the corrosive influence of modernity) to take religion seriously. Defenders of religion tend to argue from the same premise. Without a set of unquestioned dogmas, they say, people lose their moral bearings. Good and evil become more or less indistinguishable; everything is permitted; old injunctions are defied with impunity.
Such arguments are advanced not only by evangelical preachers but occasionally by secular intellectuals troubled by the threat of moral anarchy (chapter 12, “Philip Rieff and the Religion of Culture”). With good reason, these intellectuals deplore the privatization of religion, the disappearance of religious issues from public discussion. Their case is weakened, however, by a couple of serious flaws. In the first place, it is impossible to revive religious belief simply because it serves a useful social purpose. Faith issues from the heart; it cannot be summoned up on demand. In any case, religion cannot be expected to provide a comprehensive, definitive code of conduct that settles every dispute and resolves every doubt. It is this very assumption, curiously enough, that leads to the privatization of religion. Those who want to keep religion out of public life argue that religious belief, in the nature of things, commits the believer to indisputable dogmas that lie beyond the reach of rational argument. They too, these skeptics, see religion as a body of ironclad dogmas the faithful are forbidden to question. The same qualities that make religion attractive to those who regret its decline — the security that it allegedly provides against doubt and confusion, the comfort adherents allegedly derive from an airtight system that leaves nothing unexplained — make it repulsive to the secular mind. Opponents of religion argue further that it necessarily fosters intolerance, since those who embrace it imagine themselves to be in possession of absolute, exclusive truths irreconcilable with other truth claims. Given the opportunity, they will invariably seek to make everyone else conform to their own ways. The cultured despisers of religion suspect that religious tolerance, in short, is a contradiction in terms — a fact seemingly borne out by the long history of religious warfare.
No doubt this disparaging view of religion, which has been with us for a long time, contains more than a little truth. Still, it misses the religious challenge to complacency, the heart and soul of faith (chapter 13, “The Soul of Man under Secularism”). Instead of discouraging moral inquiry, religious prompting can just as easily stimulate it by calling attention to the disjunction between verbal profession and practice, by insisting that a perfunctory observance of prescribed rituals is not enough to assure salvation, and by encouraging believers at every step to question their own motives. Far from putting doubts and anxieties to rest, religion often has the effect of intensifying them. It judges those who profess the faith more harshly than it judges unbelievers. It holds them up to a standard of conduct so demanding that many of them inevitably fall short. It has no patience with those who make excuses for themselves — an art in which Americans have come to excel. If it is ultimately forgiving of human weakness and folly, it is not because it ignores them or attributes them exclusively to unbelievers. For those who take religion seriously, belief is a burden, not a self-righteous claim to some privileged moral status. Self-righteousness, indeed, may well be more prevalent among skeptics than among believers. The spiritual discipline against self-righteousness is the very essence of religion.
Because a secular society does not grasp the need for such a discipline, it misunderstands the nature of religion: to console but, first of all, to challenge and confront. From a secular point of view, the overriding spiritual preoccupation is not self-righteousness but “self-esteem” (chapter 11, “The Abolition of Shame”). Most of our spiritual energy is devoted precisely to a campaign against shame and guilt, the object of which is to make people “feel good about themselves.” The churches themselves have enlisted in this therapeutic exercise, the chief beneficiaries of which, in theory at least, are the victimized minorities that have been systematically deprived of self-esteem by a vicious history of oppression. What these groups need, according to the prevailing consensus, is the spiritual consolation provided by the dogmatic assertion of their collective identity. They are encouraged to recover their ancestral heritage, to revive discarded rituals, and to celebrate a mythical past in the name of history. Whether or not this bracing account of their distinctive past actually meets accepted standards of historical interpretation is a secondary consideration; what matters is whether it contributes to the positive self-image that allegedly makes for “empowerment.” The same benefits misleadingly associated with religion — security, spiritual comfort, dogmatic relief from doubt — are thought to flow from a therapeutic politics of identity. In effect, identity politics has come to serve as a substitute for religion — or at least for the feeling of self-righteousness that is so commonly confused with religion.
These developments shed further light on the decline of democratic debate. “Diversity” — a slogan that looks attractive on the face of it — has come to mean the opposite of what it appears to mean. In practice, diversity turns out to legitimize a new dogmatism, in which rival minorities take shelter behind a set of beliefs impervious to rational discussion. The physical segregation of the population in self-enclosed, racially homogeneous enclaves has its counterpart in the balkanization of opinion. Each group tries to barricade itself behind its own dogmas. We have become a nation of minorities; only their official recognition as such is lacking to complete the process.[1] This parody of “community” — a term much in favor but not very clearly understood — carries with it the insidious assumption that all members of a given group can be expected to think alike. Opinion thus becomes a function of racial or ethnic identity, of gender or sexual preference. Self-selected minority “spokespersons” enforce this conformity by ostracizing those who stray from the party line — black people, for instance, who “think white.” How much longer can the spirit of free inquiry and open debate survive under these conditions?
Mickey Kaus, a New Republic editor, has advanced an interpretation of the democratic malaise, under the provocative and slightly misleading title The End of Equality, that has a great deal in common with the interpretation advanced in these pages.[2] According to Kaus, the most serious threat to democracy, in our time, comes not so much from the maldistribution of wealth as from the decay or abandonment of public institutions in which citizens meet as equals. Equality of income, he argues, is less important than the “more attainable” goal of social or civic equality. He reminds us that foreign observers used to marvel at the lack of snobbery, deference, and class feeling in America. There was “nothing oppressed or submissive” about the American worker, Werner Sombart wrote in 1906. “He carries his head high, walks with a lissom stride, and is as open and cheerful in his expression as any member of the middle class.” A few years later R. H. Tawney noted that America was “marked indeed by much economic inequality, but it is also marked by much social equality.” It is this culture of self-respect, according to Kaus, that we are in danger of losing.
The trouble with our society, from this point of view, is not just that the rich have too much money but that their money insulates them, much more than it used to, from the common life. The “routine acceptance of professionals as a class apart” strikes Kaus as an ominous development. So does their own “smug contempt for the demographically inferior.” Part of the trouble, I would add, is that we have lost our respect for honest manual labor. We think of “creative” work as a series of abstract mental operations performed in an office, preferably with the aid of computers, not as the production of food, shelter, and other necessities. The thinking classes are fatally removed from the physical side of life — hence their feeble attempt to compensate by embracing a strenuous regimen of gratuitous exercise. Their only relation to productive labor is that of consumers. They have no experience of making anything substantial or enduring. They live in a world of abstractions and images, a simulated world that consists of computerized models of reality — “hyperreality,” as it has been called — as distinguished from the palpable, immediate, physical reality inhabited by ordinary men and women. Their belief in the “social construction of reality” — the central dogma of postmodernist thought — reflects the experience of living in an artificial environment from which everything that resists human control (unavoidably, everything familiar and reassuring as well) has been rigorously excluded. Control has become their obsession. In their drive to insulate themselves against risk and contingency — against the unpredictable hazards that afflict human life — the thinking classes have seceded not just from the common world around them but from reality itself.
The culture wars that have convulsed America since the sixties are best understood as a form of class warfare, in which an enlightened elite (as it thinks of itself) seeks not so much to impose its values on the majority (a majority perceived as incorrigibly racist, sexist, provincial, and xenophobic), much less to persuade the majority by means of rational public debate, as to create parallel or “alternative” institutions in which it will no longer be necessary to confront the unenlightened at all.
According to Kaus, public policy should seek not to undo the effects of the market (which inevitably promotes inequality of income) but to limit its scope — “to restrict the sphere of life in which money matters.” Drawing on Michael Walzer’s Spheres of Justice, he argues that the goal of civic liberalism, as distinguished from “money liberalism,” is “to create a sphere of life in which money is devalued, to prevent those who have money from concluding they are superior.” Walzer is similarly concerned to limit the “extraction not only of wealth but of prestige and influence from the market,” as he puts it. He treats the problem of justice as a problem of boundaries and “boundary revision.” Money, even more than other good things like beauty, eloquence, and charm, has a tendency to “seep across boundaries” and to buy things that should not be for sale: exemption from military service; love and friendship; political office itself (thanks to the exorbitant cost of political campaigns). The principle of equality is best served, Walzer maintains, not by ensuring an equal distribution of income but by setting limits to the imperialism of the market, which “transforms every social good into a commodity.” “What is at issue,” he writes, “ … is the dominance of money outside its sphere.”[3]
There is much wisdom in these words, and those who value democracy would do well to heed them. But it is equally important to remember — what neither Walzer nor Kaus would deny in the last analysis — that economic inequality is intrinsically undesirable, even when confined to its proper sphere. Luxury is morally repugnant, and its incompatibility with democratic ideals, moreover, has been consistently recognized in the traditions that shape our political culture. The difficulty of limiting the influence of wealth suggests that wealth itself needs to be limited. When money talks, everybody else is condemned to listen. For that reason a democratic society cannot allow unlimited accumulation. Social and civic equality presuppose at least a rough approximation of economic equality. A “plurality of spheres,” as Walzer calls it, is eminently desirable, and we should do everything possible to enforce the boundaries among them. But we also need to remember that boundaries are permeable, especially where money is concerned, that a moral condemnation of great wealth must inform any defense of the free market, and that moral condemnation must be backed up with effective political action.
In the old days Americans agreed, at least in principle, that individuals cannot claim entitlement to wealth far in excess of their needs. The persistence of this belief, even though it is admittedly only an undercurrent in the celebration of wealth that now threatens to drown all competing values, offers some hope that all is not yet lost.

Footnotes.
[1] The vagueness of the concept makes it impossible for policy makers to agree on a list of designated minorities entitled to compensation for a history of oppression. Social scientists first began to speak of minorities, in the current sense of the term, during the New Deal era. The term referred to groups that had been “singled out… for differential and unequal treatment,” in the words of Louis Wirth. Whereas national minorities in Europe were generally denounced as aggressive and warlike, American minorities were seen as victims rather than predators. From the beginning, minority status thus gave those able to claim it a certain moral and political leverage. If “the mere fact of being generally hated … is what defines a minority group,” as Arnold and Caroline Rose explained, the moral advantage invariably lay with the minority (even if it made up a statistical majority of the population). Pressure to expand the category, with a consequent loss of precision, has proved irresistible. By the seventies it included not only various racial and ethnic groups but women (except when they were verbally distinguished by the meaningless formula “women and minorities”), homosexuals, and groups (e.g., the deaf) formerly treated by social scientists as “deviant.”
Justice Lewis Powell declared in the Bakke case (1978), the Supreme Court’s definitive but muddled statement in the matter of affirmative action, that “the United States had become a nation of minorities.” He admitted, however, that the term was hopelessly imprecise. Any group that could “lay claim to a history of prior discrimination” could assert its minority status and its entitlement to the newly created “rights” conferred by the courts in their expansive interpretation of affirmative action. It was equally clear, however, that “not all of these groups” could “receive preferential treatment,” for then “the only ‘majority’ left would be the new minority of white Anglo-Saxon Protestants.” How was it to be decided, in that case, exactly which groups were eligible for compensatory treatment? It would be hard to quarrel with Powell’s conclusion that “there is no principled basis of deciding which groups would merit ‘heightened judicial solicitude’ and which would not.”
In spite of its obvious imprecision, the minority concept has had enormous influence on social policy. Searching public debate could only strengthen popular opposition to affirmative action and the notion of minorities that undergirds it. In the absence of such a debate, government officials find themselves in the unenviable position of attempting to enforce policies unsupported by any semblance of a social consensus. Philip Gleason, in a thoughtful review of the minority concept, observes that “differential treatment … surely requires more explicit recognition and debate than it has so far received” — an understatement, if there ever was one.

[2] Kaus’s title is ambiguous because it is not altogether clear whether he proposes to abandon the struggle against inequality or whether the proper goal or object (“end”) of an egalitarian society, as he sees it, is a rich civic life accessible to all, not a leveling of incomes. The second reading turns out to be the right one. This does not, however, rule out the possibility that a measure of economic equality is one important means or precondition to the end of civic equality.

[3] Similar concerns were raised, much earlier, in the work of sociologists loosely affiliated with the progressive movement, especially in Charles Horton Cooley’s Social Process, published in 1907. The “pecuniary motive,” Cooley wrote, “excludes such vast provinces of life that we may well wonder at the extent of our trust in the market process.” As he saw it, “Pecuniary values fail to express the higher life of society.” The counterweight to the market was to be found in activities undertaken for their own sake and not for the sake of extrinsic rewards — in art, workmanship, and professionalism. “The pleasure of creative work and the sharing of this by those who appreciate the product, … unlike the pleasure of possessing things we win from others, … increases the more we share it, taking us out of the selfish atmosphere of every-day competition.”

The Vernacular Mind

The true Vernacular Mind does not work from treatises. Its practitioners are not academics or designers, but the townspeople building the town. Until very recently, that practice existed almost nowhere in the post-industrial world, because we purchase almost everything from specialists. It is therefore easy for some to say that the Vernacular Mind, and the living traditions it produces, are myths and never existed… because they have never seen them in operation.

Today, however, there are reasons for hope. The low point was in the 1950s, when we gave almost our entire lives over to the specialists. The 1960s and 70s saw glimmers of recovery in places like the Foxfire books. From then until now, we’ve been taking back larger swaths of our lives, from our health maintenance to the making of (to some degree) our homes. “You can do it; we can help” is both Home Depot’s tagline and a sign of these times. Will most people ever take back the entire construction process? That’s doubtful, and most people probably never did, once we moved past the primitive hut. In my own family’s history of self-built homes, they still bought things like windows and hardware. But I don’t think it’s necessary for everyone to build everything themselves to recover the Vernacular Mind. We simply need to be generalists again, knowing the entire process even if we don’t practice the entire process.

The Default Setting of the Renaissance

This post is based on a recent exchange that centered on a typically provocative post (quoted below) by Andres Duany, who began with an image of an unidentified proposal by Leon Krier. Duany’s position is that classical architecture doesn’t have to be based in forms established during the Renaissance; that alternate ancient sources like the Erechtheion are available to classical architects, as is the inventiveness of modern masters like Wright and Aalto. This is the basis of his violent polemic against the current classical academy, which he maintains is limiting students’ awareness of these kinds of “heterodox” models.

Andres: “It isn’t just that Leon is exceedingly talented. It is also that he doesn’t BEGIN by falling into the default setting of the Renaissance. That is the main problem we have at the moment. And if you can’t see this, take off them Renaissance blinders. You would be amazed with what is at hand, when you design.”

          Actually, this transit station or market building seems to have made a beginning by falling right into the arms of the Renaissance. In fact, it was conceived as a typical triple-arched Renaissance loggia, albeit with piers between the arches that are unnecessarily heavy for a loggia, resembling instead the piers at churches like Alberti’s Sant’Andrea in Mantua.
          Krier then puts that underlying form through a Chomskian transformation, as explained by Leonard Bernstein in a useful series of lectures entitled “The Unanswered Question” that one Tradarch blogger brought to our attention earlier. He does so across the board, and not only by making the loggia much heavier than is called for by its use and position in the city, which raises questions about the propriety of government intervention in architecture. He makes this move principally by reversing the curves of the arches to create a dissonant, metaphorically rich, structurally ambivalent series of cyma reversas, reminding the viewer of 18th-century furniture imitating animal forms. Then he shelves out sections of the entablature for similar reasons.
          Ironically, Krier is careful not to open up too wide a span for the capacity of the stone architrave to bridge above the inverted arches. The form of the piers, incorporating two levels of smaller, blind arches, themselves deliberately underscaled for effect, and two belt courses, in this case of exaggerated size, appears to be based on the Arch of Titus in Rome, but only as moderated by Alberti’s serene façade at Sant’Andrea.
          He then subjects the entire cornice to the same process, compressing it into a single gesture but without losing the essence of its typically Renaissance architectural syntax. The roof gets a similar treatment, rendered insubstantial with glazing. At the same time, the classical plinth is expanded to provide seating, just like the seating built into the bases of numerous Renaissance palazzi in Florence. The building is pushed uncomfortably close to a neighboring structure. It might be inferred that Krier designed this scene, like everything else in this Serlian tragic stage set, to point out the essential orderliness of the city by creating a limited degree of discord.
          Finally, the viewer, always on the lookout for narratives to explain juxtapositions, assumes that the streetscape is the result of the transactions of countless citizens over an extended period of time. Instead it becomes apparent that the whole civilization was invented by one man, pushing here and pulling there to make it look like a collaboration. This is what’s wrong with Poundbury.
          In short, Krier, and by extension, his princely patron, are impatient with the natural processes of town and city formation, which seem to have stopped working anyway. They substitute picturesque planning, which gives an unfounded aura of timelessness to what is a very up-to-date estate development. This is where Krier does step outside the reflexive reliance on Renaissance planning traditions. With Camillo Sitte, he aspires to a medieval artfulness, placing everything just right, but rarely orthogonally or axially. A Renaissance patron would have insisted on unifying each intervention at the largest appropriate scale, making unitary street facades and squares to the greatest degree possible.
          In the case of this market or bus stop, Krier then goes beyond Sitte by using building form and placement to create not just the kind of poetic ambiguity that Sitte and Bernstein found enriches our experience of beauty, but deliberately to induce an unease that is characteristic of much modern art, music, and architecture.