by James R. Smith

What we call inflation is actually the theft of trillions of dollars from poor, working class people, small businesses, professionals and the moderately well-to-do.

The big bite began with the CARES Act of 2020, and has continued with the high interest rates of ensuing years. The stealing continued, as of this writing in June 2024, in every sector of the economy, from the basic necessities of food and housing (including home purchases and rents) to medical care, transportation, clothing, higher education, insurance, and much more.

Inflation comes from the practice of Monetarism, the ideology of capitalist neoliberalism, and the intransigence of the Federal Reserve Board (the Fed).

The Fed is made up of hide-bound and very rich conservative bankers and academics who do what they are told. And what they are told is to increase the wealth of the rich, and the poor be damned.

The Federal Reserve System Board of Governors includes the Chair, Jerome H. Powell, whose multiple terms run from 2018 to 2028. Previously, Powell was a partner at The Carlyle Group, a private equity firm that manages $426 billion in assets. Carlyle developed a reputation for acquiring businesses related to the defense industry. After 9/11 (the attack on the World Trade Center and the Pentagon), it became known that Carlyle had a close relationship with the bin Laden family. Shafiq bin Laden, Osama’s half brother, had been a guest of honor at a Carlyle Group conference held in Washington on the very day of the disaster, Sept. 11, 2001. In addition, the bin Laden family had previously been an investor in Carlyle funds while Powell was a partner in the firm. After Carlyle, Powell worked in George W. Bush’s Treasury Department.

Other Federal Reserve members include:

Vice Chair Philip N. Jefferson, an academic;

Vice Chair for Supervision Michael S. Barr is an academic and lawyer who worked in Bill Clinton’s Treasury Department.

Michelle W. Bowman is a Kansas banker who also worked for Kansas Republican Sen. Bob Dole and in federal agencies including FEMA and Homeland Security.

Lisa D. Cook is an academic, whose main federal job was as deputy director for Africa Research at the Center for International Development.

Adriana D. Kugler’s most significant position seems to be as U.S. Executive Director at the World Bank. She held that office when she was appointed to the Fed.

Christopher J. Waller is an academic and the only board member who held lower positions at the Fed, before being placed on the Board.

The seven member board includes two Blacks, three women, and three white guys.

It might be a mistake to assume that the Federal Reserve Board makes economic policy. Mostly, they are told how to vote by those of higher rank in the Deep State (Wall Street, “Intelligence Community,” Military, and assorted wealthy people). During the past few years, the Fed has become a convenient vehicle for the expansion of the money supply. It does this by setting a liberal loan policy for major corporations. These loans create more money in circulation, causing everyone’s dollars to become less valuable. But since the titans of society rake in a vast amount of “new money,” they win, even if each dollar is less valuable. If more dollars are in circulation, then the cost of goods and services goes up, hence: inflation.

Origins of the Fed, and What it Became

The Federal Reserve was created by the Federal Reserve Act of 1913, during the administration of President Woodrow Wilson. Today, Wilson is mainly known for his incessant racism and for jailing Socialist presidential candidate Eugene Debs.

Its purpose was to address bank crises, which at that time caused havoc among the rich and often meant that millions of workers lost their jobs. The Fed should have been abolished when it proved incapable of solving the Great Depression that began in 1929, but it is still with us.

Other factors that have led us to this sad economic state, where workers’ incomes continue to fall as compared with society’s total wealth, include Monetarism and Neoliberalism.

Monetarism is an economic philosophy of the rich that was founded by University of Chicago economist Milton Friedman. It theorizes that controlling the supply of money is the best way to control the economy. In practice, we get abominations like high interest rates, no social security or other welfare, and a belief that the economy will create full employment without further tinkering.

Friedman was an advisor to both US President Ronald Reagan and UK Prime Minister Margaret Thatcher. Both leaders were the most conservative of their times in their respective countries.

Friedman also supported Chilean fascist dictator Augusto Pinochet. A former Army general, Pinochet staged a bloody coup against the elected President Salvador Allende, which resulted in Allende’s death and the torture and death of thousands of Chileans. Despite this, Friedman kept on friendly terms with the dictator and encouraged his colleagues at the University of Chicago to go to Chile and work for the oligarch.

Neoliberalism is more limited in scope. It is focused on squeezing the most profit out of every business. It came into vogue among financial capitalists around the late 1970s. Before that, “industrial capitalists” ruled the roost. As a whole, they were not as greedy as the Wall Street Crowd that supplanted them. Industrial capitalists have been portrayed in many old movies as friendly old gentlemen with hearts of gold. They were not that.

I was active in newspaper unions at the time of this big change. When I began negotiating labor contracts, most newspaper owners would expect an annual profit of between five and 10 percent. Not so with the neoliberals. They believed they should be entitled to 30 percent profit, and then 40 percent. The sky was the limit. It took drastic action to make their dreams come true. They had to get rid of many of their long-time employees, such as reporters, editors, typographers, press operators, ad salespeople, and so forth. In many cases, they sold their downtown headquarters (often quite beautiful) and headed for a warehouse in the suburbs. The price of a newspaper skyrocketed, and regular raises were replaced by “merit” raises and bonuses. Union-busting companies were hired to get rid of all guaranteed provisions in the labor agreement.

The same thing happened in nearly every industry. In Los Angeles, all four tire plants shut down and moved to the Deep South, leaving their union-paid workers behind. At least three steel plants and three auto plants followed the rubber plants’ lead. Other companies in the garment, furniture, and service industries likewise shut down.

The result was a great deal of poverty, drugs, and crime in South Central LA and Compton, as thousands of Black and Latinx workers searched for a way to survive.

As the years went by, the Wall Street moguls found they could make bigger profits by leaving the country entirely. Southeast Asia and Mexico vied to provide the lowest wage. The income gap between rich and poor turned into an ever-growing chasm.

Then came the Pandemic. This provided an awesome opportunity for trillions, yes, trillions, of dollars to be printed and used for loans for the big corporations. It was funny money, not backed by a damn thing. In earlier years, people might have fallen for it, but in the 2020s the game was up. Less and less income was available to working people. Of course, there were stories about how ordinary people suddenly became rich on the stock market or by inventing a miracle device, but that happens mostly in people’s dreams.

Inflation is here to stay if the Fed has anything to say about it. Just a few days ago, the Fed refused to lower the interest rate even though inflation has slowed to a crawl.

By manipulating the interest rate, the Fed can take even more money out of your wallet and deposit it in a corporate bank account. So is there no recourse against Monetarism, Neoliberalism, Modern Monetary Theory, or whatever you want to call it?

An Alternative to Never-Ending Interest Rate Hikes

Once upon a time, there was an alternative. It involved direct manipulation of the whole economy. It wasn’t meant to increase the wealth of our leisure class, but to make sure that working people got their fair share. Who could make that happen? It was none other than President Franklin D. Roosevelt. This president really cared about the plight of the poor and workers, in contrast to the bunch we have today who love to hang out with the Deep Staters.

Here are his own words (edited to stay on subject). This was a radio address (Fireside Chat) he gave on Sept. 7, 1942, when confronted with WWII, perhaps the fiercest inflation maker of all time.
President Franklin D. Roosevelt

On Inflation and Progress of the War
Labor Day Radio Address of the President

September 7, 1942

MY FRIENDS:

Today I sent a message to the Congress, pointing out the overwhelming urgency of the serious domestic economic crisis with which we are threatened. Some call it “inflation,” which is a vague sort of term, and others call it a “rise in the cost of living,” which is much more easily understood by most families.

That phrase, “the cost of living,” means essentially what a dollar can buy.

From January 1, 1941, to May of this year, nearly a year and a half, the cost of living went up about 15%. And at that point last May we undertook to freeze the cost of living. But we could not do a complete job of it, because the Congressional authority at the time exempted a large part of farm products used for food and for making clothing, although several weeks before, I had asked the Congress for legislation to stabilize all farm prices.

At that time I had told the Congress that there were seven elements in our national economy, all of which had to be controlled; and that if any one essential element remained exempt, the cost of living could not be held down.

This act of favoritism for one particular group (farmers) in the community increased the cost of food to everybody — not only to the workers in the city or in the munitions plants, and their families, but also to the families of the farmers themselves.

Since last May, ceilings have been set on nearly all commodities, rents (and) services. Installment buying, for example, has been effectively stabilized and controlled.

Wages in certain key industries have been stabilized on the basis of the present cost of living.

But it is obvious to all of us (however) that if the cost of food continues to go up, as it is doing at present, the wage earner, particularly in the lower brackets, will have a right to an increase in his wages. I think that would be essential justice and a practical necessity.

Our experience with the control of other prices during the past few months has brought out one important fact — the rising cost of living can be controlled, providing that all elements making up the cost of living are controlled at the same time. I think that also is an essential justice and a practical necessity.

We know that parity prices for farm products not now controlled will not put up the cost of living more than a very small amount; but we also know that if we must go up to an average of 116% of parity for food and other farm products — which is necessary at present under the Emergency Price Control Act before we can control all farm prices — the cost of living will get well out of hand. We are face to face with this danger today. Let us meet it and remove it.

I realize that it may seem out of proportion to you to be (worrying about) over-stressing these economic problems at a time like this, when we are all deeply concerned about the news from far distant fields of battle. But I give you the solemn assurance that failure to solve this problem here at home — and to solve it now — will make more difficult the winning of this war.

If the vicious spiral of inflation ever gets under way, the whole economic system will stagger. Prices and wages will go up so rapidly that the entire production program will be endangered. The cost of the war, paid by taxpayers, will jump beyond all present calculations. It will mean an uncontrollable rise in prices and in wages, which can result in raising the overall cost of living as high as another 20% soon. That would mean that the purchasing power of every dollar that you have in your pay envelope, or in the bank, or included in your insurance policy or your pension, would be reduced to about eighty cents worth. I need not tell you that this would have a demoralizing effect on our people, soldiers and civilians alike.

In my Message to Congress today, I have (told the Congress) said that this must be done quickly. If we wait for two or three or four or six months it may well be too late.

I have told the Congress that the Administration can not hold the actual cost of food and clothing down to the present level beyond October first.

Therefore, I have asked the Congress to pass legislation under which the President would be specifically authorized to stabilize the cost of living, including the price of all farm commodities. The purpose should be to hold farm prices at parity, or at levels of a recent date, whichever is higher. The purpose should also be to keep wages at a point stabilized with today’s cost of living. Both must be regulated at the same time; and neither one of them can or should be regulated without the other.

At the same time that farm prices are stabilized, I will stabilize wages.

That is plain justice — and plain common sense.

And so I have asked the Congress to take this action by the first of October. We must now act with the dispatch, which the stern necessities of war require.

I have told the Congress that inaction on their part by that date will leave me with an inescapable responsibility, a responsibility to the people of this country to see to it that the war effort is no longer imperiled by the threat of economic chaos.

As I said in my Message to the Congress:

In the event that the Congress should fail to act, and act adequately, I shall accept the responsibility, and I will act.

The President has the powers, under the Constitution and under Congressional Acts, to take measures necessary to avert a disaster which would interfere with the winning of the war.

There may be those who will say that, if the situation is as grave as I have stated it to be, I should use my powers and act now. I can only say that I have approached this problem from every angle, and that I have decided that the course of conduct which I am following in this case is consistent with my sense of responsibility as President in time of war, and with my deep and unalterable devotion to the processes of democracy.

The responsibilities of the President in wartime to protect the Nation are very grave. This total war, with our fighting fronts all over the world, makes the use of the executive power far more essential than in any previous war.

If we were invaded, the people of this country would expect the President to use any and all means to repel the invader.

Now the Revolution and the War between the States were fought on our own soil, but today this war will be won or lost on other continents and in remote seas. I cannot tell what powers may have to be exercised in order to win this war.

The American people can be sure that I will use my powers with a full sense of responsibility to the Constitution and to my country. The American people can also be sure that I shall not hesitate to use every power vested in me to accomplish the defeat of our enemies in any part of the world where our own safety demands such defeat.

And when the war is won, the powers under which I act will automatically revert to the people of the United States — to the people to whom (they) those powers belong.

I think I know the American farmers. I know (that) they are as wholehearted in their patriotism as any other group. They have suffered from the constant fluctuations of farm prices — occasionally too high, more often too low. Nobody knows better than farmers the disastrous effects of wartime inflationary booms, and post-war deflationary panics.

So I have also suggested today that the Congress make our agricultural economy more stable. I have recommended that in addition to putting ceilings on all farm products now, we also place a definite floor under those prices for a period beginning now, continuing through the war, and for as long as necessary after the war. In this way we will be able to avoid the collapse of farm prices (which) that happened after the last war. The farmers must be assured of a fair minimum price during the readjustment period which will follow the great, excessive world food demands (which) that now prevail.

We must have some floor under farm prices, as we must have under wages, if we are to avoid the dangers of a post-war inflation on the one hand, or the catastrophe of a crash in farm prices and wages on the other.

Today I have also advised the Congress of the importance of speeding up the passage of the tax bill. The Federal Treasury is losing millions of dollars (a) each and every day because the bill has not yet been passed. Taxation is the only practical way of preventing the incomes and profits of individuals and corporations from getting too high.

I have told the Congress once more that all net individual incomes, after payment of all taxes, should be limited effectively by further taxation to a maximum net income of $25,000 a year. And it is equally important that corporate profits should not exceed a reasonable amount in any case.

The nation must have more money to run the War. People must stop spending for luxuries. Our country needs a far greater share of our incomes.

For this is a global war, and it will cost this nation nearly one hundred billion dollars in 1943.

I have devoted so much space to Roosevelt’s address because everyone should have an opportunity to see how far our political life has fallen. Today, no president would speak the way Roosevelt spoke. By that, I mean that the candidates would not dare antagonize their donors and benefactors in favor of working people. We need to join together to bring back those days when a president really did work for us, and not for the rich. Economics is easy to understand; the point is to change it.

by fredX

Why change? It affects everything: your job, your relationships—everything you are a part of. Why would you want the turmoil? The world changes all the time; there’s turmoil all around us.

Turmoil exists when there isn’t resolution, when the changes around us aren’t right. If you don’t make your own changes, but let things around you drive your life, life loses a lot of its preciousness: that big chunk of confidence you get when you feel you’ve accomplished something, the confidence that drives you forward. Your sense of worth. You may not change because you’re old; maybe it’s the battles, maybe it’s the energy that the struggles have taken out of you, maybe it’s that your body can’t override pain or keep absorbing life’s beatings. Maybe mind-over-matter isn’t much of a possibility anymore. Maybe you just think, “I can’t.” No matter how old you are, one day you might get that surge of strength you felt when you decided to undertake a new partnership or project. And in that moment—your age seems irrelevant.

Irrelevant because you believe, irrelevant because you decided, irrelevant because you know, in that one moment, that if you change how you view the present as you are living it now, big things, real things are going to happen in your life, and you will be on a new plateau—maybe even a mountain, or maybe only a valley. But you will be somewhere new, at the beginning of something that matters and offers promise because—in your heart of hearts—you know it’s going to be a better place if you go there.

Progress comes in different sizes. What is the size of your progress?
What if you said your progress was saving a failing relationship?
What if you said it was being accepted as an artist?
What if you said it was your next dog? Or another child? Or no child?
What if any of these things would turn your world around and you knew it?

You need a sense of mission. And so do I. So do we all. Not a sense of self importance, a sense of power, or sense of control. Not safety, not security, not omniscience. None of those things, because those things are reflections of glory rooted in ego. No, what we need is a sense of mission coming from a sudden realization that something we are about to undertake is going to be meaningful and valuable to us as well as those around us–and will last.

The present doles out to us, on a moment-to-moment basis, leverage, harmony, and synergy. All that we need to do is recognize these moments, open all those doors, automatically, no matter how seemingly familiar—be wide-eyed enough to see the new moments—and then run with them. Not with a sense of opportunism but with a sense of mission. Opportunism is fleeting; missions endure. If we embrace a mission, suddenly we find we have allies, friends we never knew we had, like-minded spirits who are already on the same planet and will join with us for the same reason: what the mission means to them.

The human race gets things done when it gathers human support. When something already has support, things start falling into place. When that happens, we don’t have to work as hard. When it “wants to happen,” of its own accord–when we didn’t cold-bloodedly go out to enlist cohorts of supporters who would be our leverage to bend the world to our own wills—the world becomes a better place to be.

 

by John Zerzan

 

Theodor Adorno opens his magnum opus Negative Dialectics (1966) with “Philosophy, which once seemed obsolete, lives on because the moment to realize it was missed.” Thus, he goes on to say, philosophy is “obliged ruthlessly to criticize itself.”

Not only has it misfired, but its failure and irrelevance have only worsened over the years, with scant exceptions. Its tedious abstraction means the standard philosophy college survey course easily wins the “most boring” prize.

Mac Cumhaill and Wiseman’s Metaphysical Animals (2022) tells us about post-World War II women philosophers at Oxford, and points out that the “greats” of European philosophy have all been men, and nearly all of them bachelors. They isolated themselves from women and children, and so largely from life, love, and loss.

The separation of philosophy from the larger culture goes along with an increasing separation of philosophers from other philosophers. In 1962 Yehoshua Bar-Hillel judged that “communication between philosophers has been deteriorating during the last decades.”

Originally the word meant “love of wisdom,” referring to knowledge in general. Very early on, the senses were ruled out in favor of the cerebral alone. In the past century or two specialization has set in with a vengeance, producing mostly abstruse intellectual puzzles of interest only to professionals.

Meanwhile a new malaise of civilization is the zeitgeist, with news outlets fueling a fast-spreading catastrophic outlook. End times prevail, with new depths of psychological suffering. Technology has triumphed, leaving us feeling lonely and abandoned.

The current period of widespread indifference to politics has seen the end of “every emancipatory adventure,” according to Élisabeth Roudinesco in 2005.

I’d like to back up here, to the 19th century, to take a look at how philosophy’s idols (some of them, anyway) helped bring us to today. Deflate them a bit, bring them down to earth, starting with Schopenhauer and Nietzsche.

Arthur Schopenhauer (1788-1860) proclaimed a “life is suffering” doctrine, close to the outlook of Buddhism. In The World as Will and Representation (1818) he saw irrational, unrelenting will as bedeviling the individual, denying personal fulfillment. He was one of a handful of philosophers who addressed suffering, but only in terms of a quasi-religious gesture, quite removed from historical or social reality.

Referred to as the Great Pessimist, he became far less pessimistic when he acquired prestige quite late in life, as a kind of overnight sensation. His was perhaps not as serious a philosophy as he projected, but Schopenhauer had a major influence on Friedrich Nietzsche.

According to Nietzsche, a philosopher should be “a terrible explosive from which nothing is safe.” To many, he was a radical iconoclast, the anti-Christ, his Zarathustra alter ego proclaiming the ubermensch/overman hurling prophetic thunderbolts. Gilles Deleuze judged that “modern philosophy has largely lived off Nietzsche, but not perhaps in the way he would have wished.”

In fact, the image of a peripatetic wild man misses the mark entirely. He wandered, having abandoned his philology professorship early on, but his outlook was conventional at base, at least until he approached a psychotic break at the age of 45.

He rejected antisemitism and German nationalism, boosting Felix Mendelssohn as his favorite composer, and aspiring to the status of “good European.” He would resemble a modern liberal, but for the fact that he was wholly against democracy.

Nietzsche did not deliver a systematic philosophy. His outlook was anti-metaphysical and, beginning with The Birth of Tragedy (1872), more of an aesthetic one. His contrast of Dionysian and Apollonian impulses in the arts was meant to revitalize Western civilization, not to combat it. His attention to the dynamics of individual ethics qualifies Nietzsche as more of a psychologist than a philosopher. He advocated a more instinctual approach to life, decrying the tame and civilized status of the herd. But at times he undercut this anti-domestication sentiment by urging the subject to transcend his animal nature.

For Nietzsche freedom is the will to power; however, he quite clearly rejected the idea of power as power over others. He meant self-mastery; again, the psychologist, the ethicist. Robert Solomon’s essay, “A More Severe Morality: Nietzsche’s Affirmative Ethics” explores this ably.

His crowning conception was eternal recurrence or amor fati, love of fate: the unreserved embrace and recurrence of all that was and is without change. Conservative conformism on a philosophic level, as I see it.

Nietzsche died in 1900, after a decade of madness following his 1889 breakdown in Turin. Another unsystematic thinker, Henri Bergson, was beginning to emerge into prominence in the years before World War I. He was by far the best-known philosopher of the interwar years. Since World War II, unlike Nietzsche, he has been largely forgotten.

Bergson stressed experience and intuition over mediation and abstraction; he wrote of duration as lived time. His vitalist outlook fit the pre-World War I zeitgeist of energy and challenge. Nikos Kazantzakis was Bergson’s student; his Zorba the Greek expresses some of that spirit. Bergson’s major works include Time and Free Will and the very popular Creative Evolution.

Coinciding with the heyday of Bergsonism was the arrival of the “linguistic turn,” the major philosophical development of the 20th century. This movement is variously called analytic philosophy, logical positivism, and the Vienna Circle, the last a reference to its proponents. For these philosophers, all philosophical questions are questions of language.

Outspoken philosopher Marjorie Grene trashed this as a move to the “more and more trivial, more and more divorced from anything…you can tell when philosophers start talking. There’s no connection with reality.” Bruce Wilshire put it this way: “Analytic philosophy tends powerfully to put us at a remove from everything, even from our own selves, selves turned ghostly.”

With the likes of Gottlob Frege, Bertrand Russell and Ludwig Wittgenstein, the modern analytic shift was based on logic, specifically mathematical logic, the most formal kind. Thus meaning somehow resides in what is ever more formally abstract, aiming at an utmost precision of language, a purely formal analysis.

Looking for the austere logical skeleton within language is immeasurably remote from actual reality and its challenges. Language is not bodily. It is the missing person. Analytic philosophy is thin and desiccated to the highest degree. Wittgenstein, considered the most brilliant of the analytic practitioners, ended up seeing the folly of the search for the inner meaning of language. His early Tractatus Logico-Philosophicus pursued the party line, but he reversed course with Philosophical Investigations (published posthumously in 1953). He came to the conclusion that the secret of language is a false quest, that the many uses of ordinary language (“language games,” as he called them) work just fine.

Edmund Husserl founded phenomenology at the beginning of the 20th century. He tried to establish a rigorous analysis of consciousness in order to get directly “to the things themselves,” free of preconceptions. Although he failed to reach this goal, Husserl’s brand of theory of knowledge was an influence in many spheres of thought throughout the century. His aim––an unmediated, beyond-the-conceptual connection with “things themselves”––beckons, but cannot be obtained through abstraction. (Which is, after all, the hallmark of philosophy.)

Husserl was the mentor of Martin Heidegger, who as the rector of the University of Freiburg banned him from the campus when the Nazis came to power in 1933. (Husserl was retired but had continued to use the university library for his research.)

To many in the philosophy playpen, Heidegger was the most influential thinker of the 20th century. He focused on what it means to be: Dasein, or being in the world, with his declarative Being and Time (1927). He asserted that it is possible “to think being” in separation from beings, a Heideggerian version of existentialism.

Against our “fallenness,” our “forgetfulness of Being” (we’ve been on the wrong path since Plato), Heidegger situates thinking on a deep ontological level. But it’s so much bunkum that links up to nothing. He even attests that all human inquiry is circular. His rhetoric sounds profound and goes nowhere. Adorno’s The Jargon of Authenticity nails it: jargon, not authenticity.

His one actual insight he quickly rendered inconsequential. He saw that technology devours and deforms everything, even thought itself. So he recommended a new Be-ing––but one that would leave technology alone! In 1955 he said, “We can use technical devices and yet with proper use we also keep ourselves free of them.” (!!) This quietism is a total copout, and I think he knew better. “Only a God could save us,” he declared in a 1966 Der Spiegel interview.

There are those who find Heidegger’s thought eminently separable from his identification with Nazism, which his partisans treat as a merely deplorable ethical lapse. Recent scholarship has made this position even less defensible. Beginning in 2014, his Black Notebooks, covering entries from 1931 to 1969, have been published. Notebooks can consist of undeveloped ephemera, but Heidegger considered the Notebooks his crowning achievement, containing his definitive judgments. As convincingly documented in Richard Wolin’s Heidegger in Ruins (2022), the Notebooks display the depth of his antisemitism, his commitment to Nazi ideology, and their connection to his overall philosophy. It’s an extremely ugly and irredeemable collection, reflecting almost forty years of bile and bigotry.

Gilbert Ryle, along with A.J. Ayer, Rudolf Carnap, and of course Wittgenstein, was a mainstay of the logical positivist/analytic school. Ryle summed up Heidegger in earthy terms: “He was a shit from the heels up, and a shit from the heels up can’t do good philosophy.”

Martin Heidegger claimed throughout his career that he was a phenomenologist, no matter how distant his thought was from “the things themselves.” His empty ontology was in no way bodily, for instance.

Maurice Merleau-Ponty, on the other hand, was indeed a phenomenologist, the century’s most prominent. Certainly influenced by Heidegger, he placed his emphasis on perception, consciousness, embodiment. He was a co-editor, with Jean-Paul Sartre and Simone de Beauvoir, of Les Temps Modernes. This collaboration, and further philosophical exploration, were cut short by Merleau-Ponty’s death at age 53.

Sartre was an acclaimed dramatist (e.g. No Exit) and novelist; Nausea was one of the major novels of the 20th century. His Being and Nothingness (1943), proposing that human being is a nothingness that must constitute itself, is a long introduction to a philosophy of freedom. The Nazi occupation forced the question of freedom on Sartre; it was the ground from which existentialism emerged in the postwar West, especially in France.

Sartre’s partner Simone de Beauvoir was strongly influenced by Engels’ The Origin of the Family, Private Property and the State. She saw in women a “new proletariat,” and wrote early feminist classics, notably The Second Sex (1949).

Sartre was a famous partisan of the Left, opposing French colonialism in Algeria and French and American imperialism in Vietnam. In The Critique of Dialectical Reason, he tried to reconcile existentialism and marxism. Despite his cornerstone Rousseauvian emphasis on existential freedom, he joined the French Communist party and praised the regimes of Stalin and Mao Tse-Tung.

Theodor Adorno, Max Horkheimer and Herbert Marcuse were co-members of the radical Frankfurt School. Adorno and Horkheimer’s Dialectic of Enlightenment (1947) borrowed a leaf from Freud’s Civilization and Its Discontents, as to the ever-greater instinctual renunciations at the heart of civilization. They give a different spin to the scene in Homer’s Odyssey in which Odysseus and his crew are tempted by the Sirens: “Come ashore and party with us!” His response is to have himself tied to the mast, his crew’s ears blocked with wax. To Homer the Sirens threatened death. To Adorno they represented eros and freedom, an interruption to the voyage to repression/civilization.

Marcuse’s Eros and Civilization (1955) tried (unsuccessfully, I think) to rescue civilization via a middle-ground perspective. Against Freud’s conclusion that repression is the very nature of civilization, he argued that if we could remove excess or “surplus” repression from civilization all would be fine. Marcuse’s One-Dimensional Man (1964) despaired of the possibility that people could revolt. With what he termed repressive desublimation, the subject has become too deeply enslaved. “The Sixties” began to explode globally within months of the publication of One-Dimensional Man. Teaching in California, Marcuse joined the movement. His star pupil, Angela Davis, turned out to be a marxist-leninist, sadly enough. Somewhat like the case of Jurgen Habermas, Adorno’s graduate student, a committed proponent of civilization and enlightenment; not a failed, fatal experiment, Habermas claims, but one in need of completion or fulfillment. (!!)

Neo-Freudian Jacques Lacan had–– and still has, in the case of leftist Slavoj Zizek––an influence on philosophy. His most memorable line reminds us of the continuing dominance of the “linguistic turn”: “The Unconscious is structured like a language.”

Postmodernism enters the picture in the 1960s with figures like Jacques Derrida, who also enlisted under the “linguistic turn” banner, and who famously proclaimed, “There is nothing outside the text.” Nothing inside it either, when one applies his deconstruction approach that undermines stable meaning, ultimately reducing text to incoherence when it is shaken or stirred enough. Derrida renounced categories like transparency, presence, origin. Now that Artificial Intelligence can produce texts and other symbolic products, where does that leave deconstruction?

Jean-François Lyotard was another postmodernist. Like Derrida, he opposed metanarrative, the desire to grasp an overview or the whole. “Let us wage war on totality,” he urged, for the will to totalize is a totalitarian impulse. This was aimed at marxism, but extended to create its own anti-totality totality, a generalized dictum that rules out understanding.

Among those in the postmodern dark in France I must award the prize for most removed from reality to Jean Baudrillard, beloved by the art school-type crowd in the ’80s and ’90s. In his early work, such as The Mirror of Production (1975), he cogently analyzed marxism as embracing productionism as fervently as does capitalism. But he soon declared that reality is no longer moored to reference points; under the sign of simulation modernity has become hyperreal; all is simulation. In The Gulf War Did Not Take Place (1995) Baudrillard took this view to a new level, asserting that images of the U.S. war on Iraq were more real than any actual “war.” It has been said that from being a big science fiction aficionado he graduated to writing sci-fi himself. One quote he did get right: “We live in a world where there is more and more information and less and less meaning.”

In many ways postmodernism was a debilitating impulse, a surrender in thinking. Alan Sokal, a physics professor at New York University, published “Transgressing the Boundaries: Towards a Transformative Hermeneutics of Quantum Gravity” in a 1996 issue of Social Text, a “cultural studies” journal. Sokal purported to apply a postmodern approach to particle physics; the article was a complete hoax, as he admitted. A parody, employing trendy pomo rhetoric, without substance. But leading postmodernists fought back, defending the indefensible, just as there were those who defended Baudrillard’s insistence that the Gulf War didn’t really happen.

Strongly influenced by his contemporaries Foucault and Sartre, Gilles Deleuze focused on how philosophy comes about. He first received acclaim in academic circles for his Nietzsche and Philosophy (1962). Deleuze became involved in what is called the philosophy of sense; in The Logic of Sense (1969) he resisted philosophy’s “linguistic turn.” It was at this point that he came to fully embrace his materialist and naturalistic leanings. In those years he began his collaboration with activist psychiatrist Felix Guattari. Deleuze’s major work, Difference and Repetition, mirrors, however abstractly, the difference he sought, and the repetition and inertia that blocked liberation from the State and the Communist party.

Deleuze and Guattari’s Anti-Oedipus (1972) and A Thousand Plateaus (1980) are the two volumes of their Capitalism and Schizophrenia. Foucault’s take was that these works examine Western society’s “innate herd instinct” and question what is normalcy. But their complicated and obscure “body without organs” concept seems to me unnecessarily central to either capitalism or schizophrenia. And their reliance on the “rhizome” metaphor, describing a lattice-like surface as a means of development, is clearly related to the postmodern rejection of depth and origins.

The Left in Europe declined markedly in the 1970s. Its last significant theorist was the structural marxist Louis Althusser, who strangled his wife in 1980 and was declared insane. His For Marx (1962) and Reading Capital (with co-authors, 1965) are forgotten today, even as embarrassing communists like Alain Badiou and Slavoj Zizek soldier on.

Guy Debord’s Society of the Spectacle (1967) introduced an important concept as to the nature of modern society. Ignored by the mainstream until the 1990s as too radical, Debord, in turn, ignored the centrality of technology.

There were no important postmodern thinkers after the 1990s. In America, philosophy has mainly slumbered in the shallows of pragmatism, from William James down through John Dewey and more recently, Richard Rorty. Its thin reformism barely veils its utter conformism.

Another branch of contemporary philosophy includes ethicists such as Emmanuel Levinas, Martha Nussbaum, and Peter Singer. I think their overall failure stems from ignoring social institutions, while claiming to decide how to approach questions of right and wrong.

A third sector is philosophy of mind, still haunted by machine metaphors despite the overwhelming negative reality of technology. Thomas Nagel’s What Is It Like to Be a Bat? (1974) is a banal if well-known offering from this field.

Philosophy: for all that life has been up against, hardly a success story.

 

by Stephen Slater

Tourism, like the society that gives rise to it, has a remarkable inescapability.  Although it is undeniable that tourism would not exist without tourists, it is equally undeniable, and in fact more revealing, that people become tourists because of tourism.  It is the context which bestows meaning on – and in part begets – the individual phenomena rooted in it.  Tourism is a web of institutions, images and attitudes that, since the end of the nineteenth century, is always already there before the tourist sets out.  The modern guidebook, guided tour and travel bureau all have their prototypes in the early years of the nineteenth century.  The “tourist flood” first witnessed in Switzerland and Italy is something that took shape in the following decades.  Since then, tourism has become not simply one form of travel alongside others but rather, with exceptions such as the business trip and visiting relatives and friends, the form of travel itself, its incarnation in the modern world.

A definition of the term “tourist” which would seek to distinguish it from “traveler” in the period since the closing decades of the nineteenth century would have, at best, mere polemical value for those who wish to see themselves as travelers rather than as tourists.  The consulate officials who issue visas have understood this better than most: tourists are those who enter a country for reasons other than business or study or visiting relatives, for example.  To travel for the purpose of seeing and enjoying is, to a greater or lesser extent, to make use of the institutions, be enticed by the images and embody the attitudes, of tourism.

This piece of writing aims to be a description of the consciousness of the tourist.  It is here that these images and attitudes are rooted, here that these institutions reveal their power.  Other approaches are of course possible: the history and sociology of tourism, its economic, political and ecological effects, the prospects of a less obtrusive tourism, etc.  These are important topics, and have already been investigated to varying degrees.  Yet the subjective dimension has for the most part remained unexplored, with the exception of at least one of the texts assembled in Roland Barthes’ Mythologies (1957), Hans Magnus Enzensberger’s “Eine Theorie des Tourismus” (1958, in his Einzelheiten, 1962) and Dean MacCannell’s The Tourist (1976).

Fundamental to tourism is the notion of the sight to be seen, or the scenic.  The designation of certain areas, highways and views as being “scenic,” or certain buildings and monuments as being “of interest,” is by no means one which can be accepted or set aside according to preference.  Once brought into being, the scenic has a force of its own.  At most, individual designations or collections of such designations (in guidebooks, for example) can be rejected and alternative ones put in their place; but the scenic as such, as configuration of experience in advance, remains.  It is the organizing principle of this experience.

A city, for example, is approached by means of a particular selection of points of interest.  In the course of visiting these locations, some points of interest or scenic routes are added, others omitted.  A process of abstraction takes place: selected features of this city are highlighted as being Worth Seeing, and the rest of it becomes simply – The Rest.  It is not a matter of abstraction as such, which is a necessary component of experience.  The point is that this abstraction is also a process of reduction.  The city, divided up into what is scenic and what is not, is perceived by those who approach it this way as simply the sum total of all the scenic locations plus the local conditions governing transportation from one scenic location to the next.  What occurs, then, is a reduction of the city as a whole to a fragmentary constellation of enclaves of the scenic.

The search for the scenic has, within the last century, been powerfully reinforced by a technological development that performs this process of abstraction and reduction mechanically (and now, electronically).  Although the camera itself was invented in the early nineteenth century, it was not until 1923 that a portable, inexpensive camera was made available to the amateur photographer (after the Kodak camera in the late 19th century).  What this meant for the pastime of sightseeing was a shift from merely looking at sights to the activity of collecting them.

In her book On Photography (1977), Susan Sontag provides penetrating analysis of the psychic significance of photography.  “A way of certifying experience, taking photographs is also a way of refusing it – by limiting experience to a search for the photogenic, by converting experience into an image, a souvenir. Travel becomes a strategy for accumulating photographs.”  The search for the scenic thus brings into relief the common characteristic inherent in tourism and photography, these twin developments of the industrial age: both are permeated with an essentially accumulative attitude, not to things per se, but to experience itself.

This attitude is reflected in the touristic conception of “knowledge of the world,” which is measured by the number of cities, parks, beaches of countries one has seen, or at least passed through – and, of course, how many photos one has.  “Most tourists feel compelled to put the camera between themselves and whatever is remarkable that they encounter.  Unsure of other responses, they take a picture.  This gives shape to experience: stop, take a photograph, and move on” (Sontag).  These frozen reminders of what we have lived or seen (or – merely photographed) present the continuity of lived experience as an episodic chain of memorable moments, one following the other in a homogeneous series, with the implication that the more photos we have taken, the more memorable experiences we have had.

The increasingly available and nearly inescapable photographs of buildings, monuments, cities, beaches and landscapes have flooded the imagination to such an extent that, had tourism not existed, photography would surely have invented it.  That this is not simply idle speculation is attested to by the overwhelming success of modern visual advertising, which creates the demand for products and services well before they are actually on the market.  It is not, then, solely the taking of photographs by tourists, but also the nearly unavoidable presence of them which has contributed to the insidious power of the scenic.  In Understanding Media: The Extensions of Man (1964), Marshall McLuhan observed that the photograph has reversed the purpose of travel, since it is no longer the strange and unfamiliar which is sought out, but rather that which has already been seen.  “Thus the world itself becomes a sort of museum of objects that have been encountered in some other medium.”  Travel is then a matter of corroboration of previously seen images, a project of confirmation in which the goal of the tourist’s undertaking is to “…check his reactions to something with which he has long been familiar, and take his own pictures of the same.”  To say of a place, “It wasn’t what I expected,” is to say that it didn’t look like its photograph.  If it is remembered that the Journey had been regarded among the educated and privileged in eighteenth- and early nineteenth-century Europe as the epitome of transformative experience (the Grand Tour), it will not be difficult to appreciate the significance of the change which has taken place.  (In fact, the Middle High German word ervarn (modern German: erfahren) meant both “to travel (through)” and “to experience,” i.e. “to go through.”)

The search for the scenic is one of the many manifestations of experience as spectacle so typical of modern urban life.  Rather than enumerating the criteria which would determine what should be classified as spectacle and what should not, it is more illuminating to characterize the sort of consciousness for which the phenomenon of the spectacle has come about.  The desire to merely see, to look at, regardless of whether any sort of adequate comprehension or receptivity can in this way be achieved, is the hallmark of a consciousness for which an avalanche of visual impressions is an everyday occurrence.  Today, this may seem to be due to the city as such, film, video and the internet, yet it has its roots in particular technological and commercial developments in the nineteenth century: along with photography, the steam locomotive (1830) and the emergence of visual advertising as an enterprise in its own right.

As a result of both the speed of modern means of transportation and the frequency of their use, a sort of dazed staring on the part of the passengers sets in, and an attitude of simply “watching it all go by” is generated, without any involvement with what goes on outside the windows of the vehicle.  In the century between the introduction of the steam locomotive and that of television, this tendency toward passive visual experience received strong support not only from other modes of transport but also from advertising, photo-journalism and film.  Although television was certainly very influential in this regard, it should be understood as strengthening a tendency which had already crystallized long before the electronic age began.

After the initial shock of the speed wore off, the “motion picture” of the visual flow presented by the windows of the railway car became the main feature of passenger railway travel.  Due to the nature of this sort of transportation, it was impossible to do more than simply look, unless one looked away; the physical constraints of such conveyance provided no opportunity for any further involvement with what was perceived for a brief moment.  Soon, an unquestioned “obviousness” of this visual experience was the unforeseen result.  (So much, however, cannot be said of most advertising, sensation-oriented photo-journalism or the more mind-numbing examples of Hollywood movies, television and the internet.  An unquestioned passivity is intended here: submission to that which is not meant to be inquired into.)

The spectacle as a way of experiencing is a uniquely modern turn of mind which tourism so remarkably exemplifies: the “museumization” of the world – not just of monuments, buildings and cities, but also of human beings and cultures.  This way of seeing the world as on display, simply there to be stared at whether purposely so arranged or not, is part of a worldview that has been generated by the department store, the supermarket, the show window, the illustrated magazine and indeed the museum itself. (Dean MacCannell has a very interesting chapter on “staged authenticity,” originally published in 1973, in his 1976 book The Tourist.)  As a corollary of the museum’s intended function of preserving the art and artifacts of the past and present, there is the inevitable “show” aspect which is to some extent present in every exhibition, due to the fact that a context which would render an exhibition in some sense meaningful, even just intelligible, must be to a significant degree supplied by the viewers themselves; otherwise an exhibition is simply a collection of curiosities, a mere juxtaposition of things often indistinguishable from the most random grouping of objects.

The scenic does not exist alone, but in relation to that which defines it as scenic.  Before photography became an instrument of such definition, the guidebook had already sketched its main features.  John Murray’s Red Book, which appeared in 1836 and covered Holland, Belgium and the Rhine region, was the first guidebook to evaluate sights by means of the “star system,” according to which the more stars received, the more worthy the sight is of being seen.  Through imitation of Murray’s approach, Karl Baedeker’s books attained the popularity which made the name Baedeker synonymous with the guidebook for so long.

The guidebook is more than one among several pieces of equipment that accompany tourists on their travels; as the text that channels intentions, desires and awareness itself into preselected paths, the guidebook is a primary historical root of the scenic.  As the annually updated catalogue of tourism’s world department store, it creates the demand for that which is described, thus falling within the scope of advertising, both in function and in use of language.  In general, it is a descriptive inventory of possibilities, relegating to non-existence what is not listed simply by virtue of not being listed.  In this sense the guidebook is an encyclopedia of the scenic: though modestly renouncing all claims to completeness, by the mere fact of its catalogue form the guidebook effects the reduction of what is there to what is listed, what is described.  James Buzard, referring to a 1975 article in the New Yorker, notes: “German bombing attacks on British cultural institutions in 1942 earned the name ‘Baedeker raids’ because the bombs appeared to be aimed at all the guidebook’s starred attractions” (The Beaten Track: European Tourism, Literature, and the Ways of Culture 1800-1918, 1993).  The supposed objectivity of the map is surreptitiously invoked.  (It is noteworthy that some maps have come to approximate the tour guide: places of interest are prominently marked, scenic routes indicated, sometimes with photographs or cartoon-like illustrations of what is “typical” for a given location provided.)

E. M. Forster’s novel A Room with a View, first published in 1908, contains a concise portrait of the guidebook mentality.  Lucy Honeychurch, a young English tourist in Florence, Italy, finds herself without her Baedeker and is at a loss as to what she should do.  Irritated and upset, she enters the church Santa Croce: “Of course, it must be a wonderful building.  But how like a barn! And how very cold! Of course, it contained frescoes by Giotto, in the presence of whose tactile values she was capable of feeling what was proper.  But who was to tell her which they were?  She walked about disdainfully, unwilling to be enthusiastic over monuments of uncertain authorship or date.  There was no one even to tell her which, of all the sepulchral slabs that paved the nave and transepts, was the one that was really beautiful, the one that had been most praised by Mr. Ruskin.”

This helplessness in the face of the thing itself – and Forster is by no means exaggerating here – is a classic example of what Edward Said referred to in his book Orientalism (1978) as the “textual attitude.”  The textual attitude is in a sense a way of not seeing.  By subordinating our perceptions to the interpretive schema of a book, we short-circuit the process of finding out as much as we can by ourselves.  The concomitant blind spots in our receptivity to the world are the subjective counterparts of the fineness of mesh which an “authoritative” text has.  It is by means of this filtering function in the text itself that some aspects of the world are allowed to pass through to us and others not.  “Travel books or guidebooks are about as ‘natural’ a kind of text, as logical in their composition and in their use, as any book one can think of, precisely because of this human tendency to fall back on a text when the uncertainties of travel in strange parts seem to threaten one’s equanimity. […] [T]he book (or text) acquires a greater authority, and use, even than the actuality it describes.”  As long as the author succeeds, or at least doesn’t fail too noticeably, the reader confirms the veracity of the author’s claims by reading more books by him/her/them, recommending them to others, etc.

The foreign and the exotic are sought out in the hope of shaking off the constraints of the familiarities of home.  The romance of travel and adventure has, after all, fed the Western imagination for centuries (e.g., Homer’s Odyssey), and since the European voyages of discovery there has been a growing body of travel literature.  The era of the European drives toward colonial expansion (primarily) from the 15th through the 19th century saw the aesthetic appropriation of the foreign and exotic in painting, the novel and poetry.  (That all of this has in our time been dwarfed, at least in lavishness of detail, by the exotica presented on film, television and the internet, is evident.)  Likewise, the importance of the Grand Tour for the education of young European men in the 18th century has been noted by many, including the author of the article “Travel” (original: “Voyage”) in the Encyclopedia of Diderot and d’Alembert: “Travelers develop and raise the level of the mind, enrich it through knowledge, and cure it of national prejudices.  Such study cannot be replaced by books or by the tales told by others.  Men, places, and things one has to judge by oneself.”  (Enlightenment indeed.)  In its description, categorization and enumeration of the elements and basic regularities of the various things it dealt with, the Encyclopedia (1751 – 1772) was in fact the predecessor of both the travel guide book and another phenomenon related to it: the retail catalogue.  First appearing at the end of the 19th century, by 1906 the Sears mail-order catalogue had become the “consumer’s bible.”  In its commodification of places, monuments and customs, the guide book can be regarded as a retail catalogue of the foreign and exotic.

Until the middle of the 19th century, travel for its own sake was the privilege of the few.  By the time the organized group tour (Thomas Cook), the printed guidebook and the tourist hotel had come into prominence in the last decades of that century, “seeing the world” had been, at least for the middle class and the better-paid strata of the working class, democratized to a significant extent.  What this meant for the realm of the imagination, the inner landscape where dreams of the far away are enacted, was a shift from being the closed domain of the unrealizable to becoming the initial planning phase of real possibilities.  Hopes are kindled that the World, as opposed to home, will allow one to shed one’s everydayness, at least temporarily.  As Dean MacCannell notes: “[…] somewhere, only not right here, not right now, perhaps just over there someplace, in another country, in another life-style, in another social class, perhaps, there is genuine society.”

It is in this attempt to escape one’s own everydayness that one of the paradoxes of tourism becomes manifest: the foreign and far away come to resemble the familiar and everyday in as many aspects as possible.  The easy access which new modes of transport created brought about the flourishing of the hotel business throughout most of the world.  Increasing comfort while enjoying the foreign and far-away is the principle here; accessibility itself is, in the context of tourism’s domestication of the foreign, an aspect of this comfort.  After all, what else is “comfort” in this context other than the protective bubble of familiar everydayness?  “Home away from home” and receptivity to what is unfamiliar cease to indifferently coexist after a certain threshold of comfort and convenience is reached: thereafter, comfort and convenience are present to the extent that the foreign and unfamiliar are absent.

The comfort which is sought is not only physical.  The uneasiness in the face of the unfamiliar gives rise to various defensive strategies.  Not only the activity of taking pictures but speech itself serves as a repository of the familiar in those innumerable discussions among tourists about the logistics and accoutrements of travel.  Such conversations, rather than exemplifying a concern for detail indicative of an active curiosity about one’s environment, are defenses against an alien environment – in short, they are expressions of homesickness.  But this homesickness is not that of being too long away from home, but rather that of having ventured into what is foreign in the first place.

Another aspect of tourism’s inherently paradoxical nature reveals itself in exactly the opposite of the comfort just depicted: the search for the native, the unspoiled and authentic.  An attempt is made to come into contact with what is genuinely foreign, undefiled by the quest for home away from home.  The tourist seeks to escape tourism, to be simply a visitor, a “traveler.”  He or she is often hostile to others who seem to fit into the category of “tourist.”  As James Buzard has amply documented, the traveler/tourist distinction (always with disapproval of what the latter term was taken to signify) dates back to the early nineteenth century.  He also quotes the twentieth-century novelist Evelyn Waugh: “Every Englishman abroad, until it is proven to the contrary, likes to consider himself a traveler and not a tourist.” “The term ‘tourist’ is increasingly used as a derisive label for someone who seems content with his obviously inauthentic experiences.”  There is a wish to become invisible as a tourist, a desire to pass as a “local,” if possible.  But as Jonathan Culler has persuasively written, “The desire to distinguish between tourists and real travelers is a part of tourism – integral to it rather than outside it or beyond it” (Framing the Sign: Criticism and its Institutions, 1988).

The flight from tourism has itself become an extremely popular mode of tourism, and has generated a whole range of “alternative” guidebooks, guided tours, hotels, hostels, etc.  But there is a grim truth concerning the very essence of tourism: the massive search for the unspoiled and authentic is one of the primary causes of its destruction.  As Buzard observes, “No tourist ‘intends’ the transformation or violation of visited places; yet, in complicity with powerful social, cultural and economic forces, each tourist helps to effect such transformation.”  In Tristes Tropiques (1955), the anthropologist Claude Lévi-Strauss notes: “The fact is that these primitive peoples, the briefest contact with whom can satisfy the traveler […] are all, in their different ways, enemies of our society, which pretends to itself that it is investing them with nobility at the very time it is completing their destruction, whereas it viewed them with terror and disgust when they were genuine adversaries.”

The search for the scenic, and the accumulative approach toward experience it involves, compromise from the outset those contacts with native residents which, precisely because experience here is predominantly experience of things, are quite limited to begin with.  “A country’s humanity disappears to the exclusive benefit of its monuments” (Roland Barthes, Mythologies).  To the extent that contacts with native residents do occur, they tend to be within tourism’s pre-established socio-economic context, in which the local resident is the merchant, the hotel clerk, the bartender, the waiter or the local tour guide, and the tourist is always the customer.  The tourist industry is, after all, an interlocking cluster of service industries, which in most cases means that the ultimate consumer of the product or service is personally waited on.  The server-served relationship is by no means unique to tourism; but what is striking here is the pronounced and nearly unavoidable tendency toward the exclusion of other kinds of relationship.  The tourist as customer is there to be helped, accommodated, fed and entertained.  He or she may act according to whim, since what has been purchased is not simply a meal, a room, a seat or a souvenir, but the unimpeachable role of guest.  Though perhaps somewhat ill at ease due to lack of competence in the local language or to a vague and unexamined feeling of being out of place, the tourist nonetheless carries the nothing-to-lose attitude of the self-invited guest.

The local resident whose job it is to serve the tourist does not have the luxury of such arbitrariness, since business and livelihood are at stake.  Moreover, in the anonymity of waiting on and being at the service of the tourist, the service industry worker is placed in the role of representative of his or her town, region or country.  The tourist addresses the person who serves him or her from a position of inherent superiority: the power to decide where and how to spend even a limited supply of vacation money is a crucial advantage.  This superiority of position never goes unnoticed by those who possess it.  It may or may not manifest itself as haughtiness, but even if it does not, the conditions for condescension and patronization have already been created.

The local service industry worker and the tourist who is served are thus bound to each other in a mutual embrace in which the instrumentality through which each views the other casts a shadow on any further attempts at communication between them, as if engaged in a parody of the dialectic of master and slave in Hegel’s Phenomenology of Spirit.  Although the resulting ambiguity of their simultaneous attraction/repulsion is mutual, the initiative lies with the tourist, as does the advantage throughout.