Archive for the ‘Society and polity’ Category
Imagine there’s no countries
Imagine there’s no countries
It isn’t hard to do
Nothing to kill or die for
And no religion too
Imagine all the people living life in peace, you
You may say I’m a dreamer
But I’m not the only one
I hope some day you’ll join us
And the world will be as one
—John Lennon (1940–1980)
Earthrise
In many of the 2018 midterm election campaigns, some candidates are described as advocating open borders. Although there are few, if any, political candidates in the United States actually advocating open borders of the sort that exist in the European Union, it is interesting to think of what it would be like to have an agreement among all the countries in the Americas that would allow people to move freely anywhere in the American continents and the Caribbean islands to pursue a livelihood. What would it be like if workers could move from one country to another as easily as corporations do? This thought experiment can be taken one step further. What if there were no nations at all and therefore no borders to cross?
On Christmas Eve in 1968, in the course of the Apollo 8 mission to orbit the moon, the lunar module pilot Bill Anders took a photograph of the distant earth with the surface of the moon in the foreground. The photo, entitled “Earthrise,” has been called by the nature photographer Galen Rowell “the most influential environmental photograph ever taken.” Nearly fifty years after the photograph was taken, in a PBS program called Earthrise: The First Lunar Voyage, members of the crew of Apollo 8 reflected on the impact that seeing the earth from a lunar orbit had on them.
One of the observations that several astronauts who have seen the earth from the moon or from the International Space Station have made is that when our planet is seen from a distance, it is possible to see natural features such as oceans and large lakes and mountain ranges and deserts, but it is not possible to see human-made features such as nations, states and counties. When seen from that perspective, it is apparent that the earth is surrounded by dark space and that there is nothing nearby on which the inhabitants of the planet can call for help. If the inhabitants of the earth are to survive, they must do so by cooperating with one another. Within the human race, that cooperation may best be achieved if a focus on differences—differences in nationality, ethnicity, ideology and religion—is not allowed to take priority over a focus on basic common needs. As the human race interacts with other species, the common needs of human beings are best met by remembering that we human beings are only one of countless interdependent lifeforms on this planet. It is because the photograph Earthrise makes all that cooperation and interdependence easier to grasp that it has been called the most influential environmental photograph ever taken.
The role of mythology
In his book Homo Deus: A Brief History of Tomorrow, the historian Yuval Noah Harari discusses the role that fictitious narratives play in human history. Examples of the fictions he explores are money, corporations, nations, ethnicity, personal identity and freedom. Harari points out that while it may be easy for a modern person to regard the Sumerian deity Enlil as a fiction and to see as fictitious narrative the belief that all the lands and crops and precious artifacts offered to Enlil are private property owned by Enlil, it may be more difficult to see that a corporation is also a fictitious entity and that it is a socially constructed fiction that the corporation owns lands. Most people probably do not regard it as preposterous to believe that Canada is real and that it owns part of the Arctic or that a British-Australian multinational corporation named Rio Tinto Group is real and that it owns the Bingham Canyon open pit copper mine in Utah. Why, then, would they regard it as preposterous that the Sumerians believed that all the land around them was owned by the god Enlil?
Harari does not advocate banishing fictitious narratives from our lives. Rather, he advocates recognizing that they are fictions. They are myths that give our lives meaning and that facilitate large-scale social co-operation. It would be very difficult, if not impossible, for human beings to co-operate with large numbers of total strangers whom they have never met and never will meet without some kind of shared mythology. Mythology is therefore not to be avoided altogether, but it is important to realize that myths serve specific purposes under particular circumstances. As circumstances and human needs change, successfully meeting those needs may require a change in the fictitious stories we tell one another so that we can work together.
A question worth thinking about today is whether the fiction of nation-states is still serving the collective needs of humanity. There may have been a time when the story of having countries to kill or die for served a useful purpose. It may well be that we have entered a time when it is increasingly counterproductive to believe that a nation has a right to withdraw from co-operating with other nations to address such global predicaments as the warming of the planet through the combustion of fossil fuels. Perhaps it is time for international co-operation to give way to a kind of co-operation in which the very idea of a nation no longer plays a role, some kind of post-national co-operation. This is a possibility I have explored elsewhere.
Hanging on to the past
“The world of dew
is the world of dew.
And yet, and yet–”
—Kobayashi Issa
When I lived in Hiroshima from the autumn of 1977 until the spring of 1979, I often passed by the iconic A-Bomb Dome, the ruined remains of the former Hiroshima Prefectural Industrial Promotion Hall. Someone pointed out to me that the structure of the building was so compromised that it would collapse if measures were not taken to keep it standing. While fully aware of and sympathetic to the purpose of keeping a visual reminder of the devastation wrought by an atomic bomb, I also reflected on the irony of putting effort into keeping something perpetually in a state of being on the verge of collapse. How does that differ from, say, putting a brain-dead person on life support on display for no other reason than to serve as a reminder that life ends in death?
In recent years I have been a volunteer in two organizations that monitor archaeological and historical sites in New Mexico. Monitors visit sites periodically to see whether damage has been done by natural occurrences such as fire or water, or by burrowing animals, or by human campers or treasure hunters. Any damage found is reported to an archaeologist, who then surveys the site more carefully to see whether steps need to be taken to restore the site to the state it was in just before the damage was done. The state a site was in just prior to being damaged, of course, is usually a state of collapse. Sites that were villages built by ancestral Pueblo peoples in the thirteenth century are now piles of stone and scattered pot sherds or flakes from the manufacture of lithic tools. Eventually the occupants of those villages moved on to other locations, probably taking with them whatever was both useful and portable. During the past century, however, and even more during the past twenty years, efforts have been made to keep those piles of stones and middens and pottery and tool scatter, so far as possible, in the condition in which we now find them. Archaeologists used to do much of their research by digging, which of course altered the nature of the site being researched. The tendency now is to use tools, such as ground-penetrating radar, that leave a site intact while gathering information about it.
I would not participate in the endeavor of site monitoring if I did not value knowing about how people lived in the past and if I did not respect the Pueblo peoples who still live in New Mexico and their ancestors who have lived here for many centuries. At the same time, however, I am struck by how holding on to things from the past runs counter to the fact that everything in this universe is constantly changing and that the material of any given epoch is the same material as that of previous epochs. Materials are constantly being reused, transformed, and repurposed. There is something unnatural about preservation, not only physically but psychologically. It is, I am inclined to think, generally speaking more psychologically healthy to let the past slide into forgottenness than to hold on to it. Anything that is truly useful or valued is bound to survive somehow, or to be rediscovered if it goes missing for a while and turns out to be indispensable. This being the case, there is a part of me that is inclined simply to let nature, including human nature, take its course, knowing that the course nature takes is always destruction of the old to make way for the new. There is, however, another part of me that says with Kobayashi Issa, “And yet…. And yet….”
The Sea of Hype
hype (informal) noun 1. extravagant or intensive publicity or promotion. 2. deception carried out for the sake of publicity. Origin: 1920s (originally in the sense ‘shortchange, cheat,’ or ‘person who cheats, etc.’); of unknown origin.
Last night as I was watching a current affairs program on one of the commercial television channels, I was struck by how many commercial breaks there were. It seemed as though the pattern was that the announcer would say a few intriguing words about a news story that would be coming up in just a few minutes, then two or three commercial messages would come on, followed by a brief news story, half of which had already been given in the “preview” to the story, after which two or three more commercial messages would follow. Most of the featured stories consisted of politicians delivering sound bites, about which one conservative and one liberal panel member made a partisan pronouncement. What struck me in particular about this format was that the entire program from beginning to end consisted of almost nothing but hype—extravagant or intensive publicity or promotion. The commercial messages were, of course, promoting products or services. The politicians were promoting a political agenda. The commentators were trying to persuade the viewer that the agenda being promoted by the politician was either exactly what the country needs right now or would be a complete disaster for all concerned.
What is missing in hype, it hardly needs to be said, is a careful weighing of evidence and an impartial assessment of the evidence considered. Advertising agencies are paid handsomely, not to offer an impartial assessment of a product based on scientific tests but to convince the viewer that this product is preferable to similar products made by a competing company. Political campaigning is all about making the case that a particular candidate is the best person for the job and will do the most for the citizens—all citizens, not just those who vote for the candidate making the pitch. Rarely these days is a politician not campaigning. When elected and “serving,” a politician must keep an eye on the next election, which requires persuading the voting public that the policy the politician is advocating is one that will benefit the voters. The partisan commentators who participate in panels on news analysis programs continue to carry out the endeavor of persuading, an endeavor that nearly always involves at least some degree of deception or distraction or oversimplification.
What struck me as I was watching the current affairs program last night was not just that this program was mostly hype but that almost everything one is exposed to all day long is hype. Hype is the very fabric of modern culture. (Perhaps it has always been so. Perhaps hype is the very fabric of being human. Not knowing whether that is so, let me focus only on modern culture.) To change the metaphor, hype is the very sea in which we swim.
While reflecting on the ubiquity of hype, I was reminded of a conversation I had decades ago with a friend who had just returned to Montreal from seven years of living in a Buddhist forest monastery in Thailand. He reported that as he walked along the streets of the city he felt as though everything was reaching out and trying to grab his arm to get his attention. In every shop window, on every lamppost, in every Metro station, at every bus stop there were posters advertising goods and services, every one of which he had learned he could live without. He reported finding it an exhausting experience to take even a short walk in the city, such was the feeling of being assaulted from all sides by persuaders. After a few months, he noticed himself growing used to it, and we had a conversation about how unfortunate it is that we who live in contemporary society simply grow used to all the hype rather than feeling outraged by it. Being outraged by something that one is for the most part powerless to change, we concluded, is probably even more detrimental to one’s well-being than being slowly poisoned by omnipresent hype.
I was surprised to learn in consulting several dictionaries that the origin of the word “hype” is unknown. I had always assumed that it was an abbreviation of the rhetorical term “hyperbole,” which according to Wikipedia comes from the Greek ὑπέρ (hupér, “above”) and βάλλω (bállō, “I throw”). The article goes on to say:
In poetry and oratory, it emphasizes, evokes strong feelings, and creates strong impressions. As a figure of speech, it is usually not meant to be taken literally.
I am quite fond of hyperbole, or overstatement, as a rhetorical device. A good deal of humor employs it. A bit of hyperbole adds spice to conversation. Like spice, it is best used sparingly, not as the main ingredient. (An exception to this rule, of course, is green chile in New Mexican cuisine.) I am concerned that the hype to which most of us are exposed these days has become the main ingredient of the main course and that as a result our minds are not receiving proper nourishment.
Fortunately, it is possible to find respite from the pervasiveness of hype, even in the United States, the country that hardly any politician can resist calling “the greatest country in the history of the world.” (Why limit oneself to just the world? Why not say it’s the greatest country in the history of the Milky Way?) One can watch PBS or listen to NPR to get some hype-reduced nutrition. One can read any number of works of fiction or non-fiction. One can have conversations with carefully selected friends in some non-commercial setting, such as a home or a relatively remote natural setting.
Now that I think of it, I suspect some version of hype may have been difficult to avoid during most of human history. In ancient Buddhist texts, written long before electronic technology overwhelmed us all, followers of the Buddha are advised to seek isolation (viveka), which is described as a place far enough away from a populated area that one can no longer hear the sound of people’s voices. Presumably the chattering of birds and chipmunks and the occasional roaring of lions does less to undermine one’s concentration than exposure to human verbiage. I am not convinced, however, that birdsong is entirely free from hype, especially during the mating season. Be that as it may, the hyperbole that the flora and fauna broadcast to draw attention to themselves does not irritate most human beings as much as the hype put out by our own species.
Speaking only for myself—Heaven forfend that I would try to persuade anyone else to have the same taste as I—on most days I had rather listen to a male finch trying to attract a mate with his elaborate arrangement of notes than to a politician trying to attract my vote or to a pharmaceutical company trying to convince me that its product is the best remedy for moderate to severe jangled nerves caused by overexposure to hype.
The barren landscape of originalism
“The Constitution that I interpret and apply is not living but dead, or as I prefer to call it, enduring. It means today not what current society, much less the court, thinks it ought to mean, but what it meant when it was adopted.”—Justice Antonin Scalia (March 11, 1936–February 12/13, 2016)
Justice Scalia was one of the leading proponents of a method of interpreting the Constitution called originalism, a form of textual exegesis that uses historical and linguistic scholarship to determine what the authors of a text meant by their words or what the first readers most probably understood the words of the text to mean. The original meaning, once determined as well as scholarship allows, is then regarded as the only meaning of the text, unaffected by what later generations of readers of the text may believe. According to Scalia, while the patterns of thinking of society as a whole may change from one generation to the next, the meaning of the Constitution endures without change.
The method of textual interpretation called originalism is familiar to and widely practiced by scholars of ancient and medieval texts, even if that name is not commonly used by textual scholars. While I was being trained in the field of Buddhist studies, for example, students were advised to try to discover what the expressions found in the Pali Canon (the scriptures of the Theravāda school of Buddhism) probably meant to speakers of Indian languages at the time of the Buddha and to ignore what those same expressions came to mean to commentators in later centuries. The greater the temporal and geographical distance of a commentator from the time and location of the Buddha, the more that particular commentator was to be regarded with suspicion. What this meant in practice was that my fellow students of the Pali Canon and I were unlikely to turn even to Buddhaghosa (who probably lived in the same part of India in which the Buddha lived but nearly one thousand years later), let alone to Bhikkhu Buddhadasa (a Thai monk who lived from May 27, 1906 until May 25, 1993). Despite the fact that Buddhaghosa’s commentaries on the Pali Canon came to be the interpretation that prevailed from the twelfth century C.E. onward, and that Buddhadasa was regarded as the most authoritative interpreter since Buddhaghosa, an academic scholar of the Pali Canon trained in Toronto in the 1970s would studiously avoid being influenced by them. Similarly, when reading a Sanskrit text written by Nāgārjuna in the second century C.E., a student in Toronto in the 1970s would carefully steer clear of writings by modern Tibetan scholars, such as the Dalai Lama, from schools of Buddhism based on Nāgārjuna’s teaching. In other words, in my academic training in Buddhist studies, I was taught that academic respectability diminished to the extent that one’s method of interpretation of a Buddhist text deviated from textual originalism.
If I had never approached Buddhist texts in any way other than as a textual historian, I might hold originalism in high esteem. My interest in Buddhist texts, however, was never motivated principally by historical curiosity. What motivated me, probably to my detriment as a serious academic scholar, was a search for inspiration, a perhaps vain hope to find advice on how to lead a more useful life. As a seeker of inspiration, my practice was to read and reflect on whatever came into my hands and to let it have its way with me. Looking back now on my thinking of several decades ago, I realize how divided my mind was against itself. Without consciously setting out to compartmentalize my thinking, I unconsciously developed two distinct modes of reading. While in academic mode, I would read a classical Buddhist text one way; while in spiritual seeker mode, I would read that same text differently—sometimes only somewhat differently and sometimes radically differently. It was rarely possible to be in both modes at once, and it wasn’t always easy to discern which mode I was in at any given time. Flitting back and forth between the two modes became a way of life and probably caused almost as much confusion in the minds of my students and fellow Buddhists as it did in me. The confusion both for me and for others was much less when I was in the midst of Quakers, for my Quaker faith and practice was almost entirely uncontaminated by a scholarly approach to either the Bible or the writings of George Fox and other Quaker authors. Looking back on it all now, I think it may have been easier for all concerned had I been just a practicing Quaker with an intellectual curiosity about the history of Buddhist thought. That, however, is not what I was.
My bewildered and bewildering life as a scholar-practitioner has no doubt had an influence on how I think about the Constitution of the United States. My interest in that document has never been that of a historical scholar. If my interest had been purely scholarly, I would probably have been inclined to be sympathetic to some form of originalism. My interest in the Constitution, however, is much more like my interest in the writings of George Fox. I read and reflect on Fox that I might be a better Quaker, and I read and reflect on the Constitution that I might be a better citizen of the particular constitutional democracy into which I happened to be born. Given that orientation to reading the Constitution, I have relatively little interest in how the people who wrote the text, and those who voted to ratify it in 1789, saw the world. Since they lived and wrote, the world has been exposed to and enriched by the thinking of Charles Darwin and the tens of thousands of scientists who take his work as a point of departure; and to astronomical research that has resulted in a view of the universe that the Founding Fathers could not even imagine; and to quantum mechanics, which has resulted in a view of the universe that no one can imagine; and to neurophysiology, which has resulted in transformations of how we view personality, agency, responsibility, and what people used to call the self, the mind or the soul; and to Hegel and Nietzsche and Kierkegaard, who have made naivety in most matters impossible; and to Emerson, Thoreau, the Transcendentalists, the Pragmatists and numerous kinds of religious and philosophical pluralists; and to depth psychology; and to generations of brilliant litterateurs, social commentators, political thinkers and essayists. Since the Constitution was written, the United States has expanded across a continent, been blessed with waves of immigrants from all parts of Europe, Asia and Africa, and survived several devastating wars from which surely numerous important lessons have been (or should have been) learned. There is hardly any aspect of modern life that would be recognizable to the authors of the Constitution. Why should the way we think and act in the world be recognizable to them? To expect people today to ignore all that has happened in the past two and a quarter centuries and to eschew all the wisdom gained from those happenings and to hew to the world view of the Founding Fathers is as unreasonable and impractical as it would be to expect great-grandparents to continue thinking and acting as they did as toddlers.
There is no doubt in my mind that Justice Scalia was a deeply learned and highly intelligent scholar of the text of the Constitution. There is equally little doubt in my mind that an enduring or dead Constitution is no more than a historical artifact, as impractical in today’s world as a horse and buggy. What is needed is a method of interpretation of the Constitution that allows for changes in human thinking resulting from scientific discovery, developments in the humanities and social sciences and trends in the arts. The results of such interpretation would no doubt sometimes be wild and unpredictable, and occasionally discomfiting, exactly as life itself is. To be sure, not all change is for the better, but all change must be acknowledged to have taken place, for better or for worse. To ignore change is delusional. To resist it is futile. To embrace it is alone conducive to flourishing.
The culture of self-promotion
From both of my parents and all four of my grandparents, I inherited a distaste for self-promotion—even the indirect forms such as being patriotic or proud of one’s school or place of residence or of other members of one’s own family. Early childhood conditioning tends to be persistent, so to this day I inwardly cringe upon witnessing displays of self-referential praise.
Years ago, I was on an academic committee considering a faculty member for promotion. I knew and admired the candidate, and there was little doubt in my mind about his being worthy of promotion. That notwithstanding, I found myself put off by his supporting documentation. Rather than simply submitting the required teaching evaluations, he supplied an accompanying document quoting selected phrases from comments that students had made; these selected words of praise were isolated from the surrounding narrative by being placed in text boxes and formatted in a large and bold font so that there would be no missing how highly his students thought of him. Offprints of his publications were accompanied by a similar document, featuring laudatory remarks that reviewers had made of his work, also placed in text boxes and set in boldface type in a larger font size. The presentation felt like an advertising brochure, as though the candidate somehow believed that the only way to gain an academic promotion from others was to display a capacity for self-promotion.
I was not the only member of that committee to be put off by the presentation. An older colleague, nearing retirement age, commented that universities nowadays are almost forcing their employees to expunge all traces of modesty and humility from their behavior, if not from their mentality. Department chairs are expected to write annual reports assuring the university administration that their department is filled with world-class scholars and universally admired instructors. By the early 1990s, candidates for promotion in the academic world could no longer submit a simple letter and a typed resumé. They had to write 10-page descriptions of their goals as teachers and scholars, accompanied by ample evidence that they were accomplishing those goals and that their accomplishments were being recognized by others living near and far. When I sought my first promotion, it was still possible to submit a brief letter and a typed resumé. By the time I was at the stage of my career to seek a promotion to the next level, all that had changed dramatically. I found the new process so unpleasant to contemplate that I never sought another promotion after that first one—and I was amply rewarded by never getting another one. Putting together the expected sort of dossier was not worth the time and effort, but more to the point it was not worth the violence to my sense of dignity. Ironically, my sense of self-worth would have been undermined by having to present myself as worthy.
It is not only the academic world that has steadily gravitated toward a culture of vainglory. Far from being frowned upon as an unpleasant feature of a bloated ego, fulsome self-congratulation now seems to be expected. Just as no commercial product can afford to be presented simply as adequate to the task but must be portrayed as better than all its competitors and indispensable to the discriminating consumer, no person can afford to be seen as merely competent. Pretty good is just not considered good enough anymore.
During an election year in the United States of America, voters are treated to a parade of candidates who not only toot their own horns sans cesse but also boast about their country as the greatest country in the world, even as the greatest country in the history of the world. Some of the candidates go so far as to disparage political leaders who do not participate in their jingoistic frenzy; those not caught up in nationalistic fervor are characterized as actually hating their country and wanting to drag it down to the same level as ordinary countries. An ordinary country is one that has affordable health insurance and reasonably priced medical services and pharmaceutical products for everyone; reasonable tuition fees and generous food, housing and transportation subsidies for students pursuing a higher education; a modest-sized military of men and women trained mostly to help citizens cope with natural disasters such as floods, hurricanes and earthquakes; and a prison system designed to reform and educate miscreants rather than punish them. An ordinary country does not have a bloated military budget that is used to send personnel and materiel to countries all around the world and to build permanent military bases in more than a hundred other nations. Americans these days who long to live in an ordinary nation are advised to go live in Canada or Northern Europe, for the United States is a nation for those who wish to participate in excelsior.
In the 1950s, the psychologist Carl G. Jung said in an interview broadcast in English that the United States as a nation is “extraverted like hell.” The quiet reflection of the introvert is deprecated to such a degree that the system of public education is skewed in favor of gregarious doers whose energy is dedicated to making changes in the world rather than in their own attitudes and expectations. The thriving industry dedicated to selling products designed to help people realize their dreams of “self-improvement” tends to focus on how to be more self-confident, more assertive, more aggressive, more successful by external standards of assessment, more admired by the crowd. Jung chose his words carefully; an overly extraverted country truly is like hell.
Although the United States could be described as “extraverted like hell” in the 1950s, it appears not to have always been that way. Neither the New England where some of my ancestors were born and lived, nor the Midwest in which others of my ancestors made their way from the cradle to the grave, according to what I heard from family elders, had much room for the braggadocio and narcissism that have become so prevalent in today’s culture.
It is possible that my elders’ memories of the prevalent culture of their early days were faulty. Perhaps they were just getting old and slowing down, as I have managed to do a few decades after they passed on. Perhaps the world always seems too fast-paced, too forceful and too brash from the perspective of a rocking chair on the porch with a commanding view of an array of bird feeders. Or perhaps a culture of modesty and moderation really has been mostly replaced by a culture of excess and hubris.