The decline of religion in the west seems unprecedented in world history. Religion dates back to at least the Upper Paleolithic and, prior to the 20th century, was pretty much a cultural universal. But increasingly, people in developed societies are turning away from it. Or are they?
I’ve noted before that religion historically had three main functions:
1. Explaining the world.
2. Supporting the social order.
3. Soothing existential anxiety.
This is a simplification of function lists I’ve read from anthropologists and other scientists like Jared Diamond.
In the modern world, science has pretty much taken over 1. And modern societies have built a number of institutions to handle 2. (Related to 2, amanimal recently called my attention to a fascinating article comparing religion to sports and the social cohesion benefits of rituals and symbols.) Indeed, it’s not uncommon here in the US for our constitution, government, courts, etc, to be referred to as our “civil religion”.
3 has been the one that has taken the longest, from a historical perspective, to replace. But I think that replacement is happening. I don’t think it’s a coincidence that the countries with the strongest social safety nets, the ones that protect their citizens from the worst consequences of the vicissitudes of life, are also the ones at the forefront of religion’s decline. These social safety nets are reducing the existential anxiety that fueled the need for 3.
This raises an interesting question. Is religion so much in decline, or are the ancient supernatural religions simply in the process of being replaced? As I’ve written before, religion is a difficult beast to define. A historian centuries from now looking back may interpret what’s happening as more of a transition from one set of worldviews to a new set. They may see our modern emerging “religion” as a syncretization, a merging, of science and civil religion, including the social welfare state.
This is a view typically resisted by both religion’s advocates and its opponents. They see religion as inescapably linked to its traditional supernatural beliefs. Along those lines, maybe it’s not religion per se that’s in decline but supernatural beliefs. Except that many people who are not religious, even in Scandinavia, still hold supernatural beliefs, often retaining belief in a hazy “universal spirit” or “higher power”.
Of course, this may all be a matter of semantics. An argument could be made that words should be defined according to their common meanings. And by that measure, religion is in decline, and may, in decades to come, be in danger of extinction.
Could something reverse that decline? Given 3 above, I’d say yes. If life were to become harsh and unpredictable again in the west, I think we’d see a resurgence in traditional religion. All it would take is a devastating war, a natural catastrophe, or some form of economic collapse. If any of these happened with sufficient magnitude that civic institutions were overwhelmed, I think it would be a boon for religion.
An interesting thought experiment is to consider what might happen if these types of events happened after traditional religion had died out. Would totally new religions rise up? Or would people return to the old ones?
What do you think? Is religion headed for extinction? Or is it too hardwired into the human psyche and we’re only seeing a temporary lull? Could we avoid falling back into religion if civilization collapsed or declined?
Here in the United States, daylight saving time ended today. We got an extra hour of sleep (yay!). But this is only a temporary reprieve. It’ll be back in the spring, when we’ll have to “spring forward” and start waking up an hour earlier again.
So, given that the science is, at best, inconclusive and, at worst, shows that daylight saving time doesn’t accomplish its aims, why do we in the US continue to do it? I suspect the primary reason is retail sales. I don’t know whether it’s true, but most retailers seem to believe that their sales are higher during daylight and drop after sunset. I think this is why Congress continues to be lobbied for daylight saving time, and why they actually extended it a few years ago.
But I suspect that society has largely adapted to daylight saving time, which is why the effects have become elusive. As I understand it, switching off of it in the winter is supposed to prevent that adaptation, but we’ve been on it so long now, and extended the portion of the year it’s in effect by so much, that it has essentially become baked into people’s habits and businesses’ schedules.
I don’t mind the fall transition, but the spring transition is onerous, often taking me a week or two to get used to. It’s gotten worse as I’ve aged. From what I read, I’m a mild case. Many people find it a pointless burden and wish we’d just stop it. I have to agree.
But this also gets into why we have the overall time zone system in the first place. I often wonder if the world wouldn’t be better off all switching to Greenwich Mean Time, or some other universal standard. Yes, it would be weird at first to wake up at 13:00, but once we got used to it, it would just be our new normal, and a lot of time zone confusion would be gone.
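To make the time zone confusion concrete, here’s a minimal Python sketch (using the standard-library `zoneinfo` module, available since Python 3.9) showing how a single unambiguous UTC instant maps to different local wall-clock times. The zone names are IANA identifiers; the specific date is just an example:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library since Python 3.9

# One unambiguous instant, expressed in UTC.
instant = datetime(2014, 11, 2, 13, 0, tzinfo=timezone.utc)

# The same instant as local wall-clock time in two zones. Each zone's
# offset depends on its own rules, including any DST transitions.
new_york = instant.astimezone(ZoneInfo("America/New_York"))
london = instant.astimezone(ZoneInfo("Europe/London"))

print(instant.isoformat())   # 2014-11-02T13:00:00+00:00
print(new_york.isoformat())  # 2014-11-02T08:00:00-05:00 (EST, after the fall-back)
print(london.isoformat())    # 2014-11-02T13:00:00+00:00 (GMT)
```

On a universal standard, everyone would simply quote the first form; under the current system, each party has to know (and trust) the other’s zone rules to coordinate anything.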
Earlier today, John Zande clued me into an awesome science fiction novella by David Brin called ‘Stones of Significance’, which I highly recommend to anyone interested in AI, post-singularity fiction, and the nature of reality.
By coincidence, the author, David Brin, tweeted this video earlier which, given the epic online fights that have happened recently, should be watched by many people. There’s a good amount of wisdom here.
It’s a bit ironic that he posted this on Twitter, not exactly a forum conducive to these techniques, but still an effective one for getting a message out.
Worried the world is going to hell in a handbasket? You’ll feel better after watching this video, and also get some insight into why you might have thought it was.
Watching this video also reminded me of something I learned years ago, not to trust numbers given in isolation. Whenever I hear about the increase in some number, or I’m shown a large scary number, as the media is prone to do, I pretty much automatically discount that as evidence unless the numbers are given in their overall context.
For instance, here in the US it’s common to hear about the total US public debt, which is always some scary astronomical number much higher than in the past, but rare to hear it as a percentage of the total income of the economy, and how that compares historically, where it’s not nearly as scary.
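As a quick illustration of why the ratio matters (the numbers below are made-up round figures for the sake of the arithmetic, not actual US statistics), a debt that balloons in absolute terms can be a far smaller change once scaled by the economy’s income:

```python
# Hypothetical round numbers for illustration, not actual US statistics.
debt_then, gdp_then = 0.9e12, 1.0e12    # past: $0.9T debt on a $1.0T economy
debt_now, gdp_now = 18.0e12, 17.5e12    # present: $18T debt on a $17.5T economy

ratio_then = debt_then / gdp_then   # 90% of GDP
ratio_now = debt_now / gdp_now      # ~103% of GDP

# The raw debt grew twenty-fold, but relative to income it grew only
# modestly; the headline number alone wildly overstates the change.
print(f"then: {ratio_then:.0%} of GDP, now: {ratio_now:.0%} of GDP")
```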
I am sure of my position. Doubtless, death is a loss. It deprives us of experiences and milestones, of time spent with our spouse and children. In short, it deprives us of all the things we value.
But here is a simple truth that many of us seem to resist: living too long is also a loss. It renders many of us, if not disabled, then faltering and declining, a state that may not be worse than death but is nonetheless deprived. It robs us of our creativity and ability to contribute to work, society, the world. It transforms how people experience us, relate to us, and, most important, remember us. We are no longer remembered as vibrant and engaged but as feeble, ineffectual, even pathetic.
While I find the age of 75 to be a bit arbitrary, I think there’s a lot of wisdom in Emanuel’s position, and it’s an attitude toward life and death I’ve read from many physicians and other people involved in medical fields. They seem to be much more aware than the average person that there are worse things that can happen to someone than death.
As a result, many of them, when diagnosed with a terminal illness, rather than fight death to the bitter end with every last treatment option, instead accept only palliative care (i.e. treatment of pain) until they expire. Having watched a couple of relatives (one of whom was my mother) go through aggressive cancer treatments, with all the suffering and loss of dignity involved, before ultimately dying anyway, I’ve long wondered if the extra life bought with those treatments was worth it.
I think this subject actually reveals something about our society’s attitude toward death, and how costly it can be, both in terms of life satisfaction and finances, particularly since the most expensive health care is usually delivered in the last six months of life.
I know many people who expend great effort making sure they live a healthy lifestyle, eating the right foods, scrupulously avoiding the wrong foods, exercising several hours per week, and many other activities. I’d say that one of my relatives probably devotes most of their free time to this. When they’re not exercising, growing their own food, or purchasing only organic food, they’re researching how to do these things better and stressing about what’s in the food that they do buy.
One of the things I learned decades ago in business school that I have found useful in many aspects of life is to look at things in terms of the cost-benefit ratio, and to keep in mind the Pareto principle, the 80-20 rule, which says that you often get 80% of the benefits with only 20% of the effort. I think it pays to examine exactly what we’re buying when we attempt to live healthy.
For me, the first thing that sticks in my mind is the stark fact that we are all mortal. Short of either a religious rapture or a technological singularity (neither of which I think is likely), we’re all going to die someday. So when we engage in healthy living activities, we’re really going for two things: increased quality of living while we are alive, and hopefully a few more years in which to live.
Of course, as Emanuel points out in his article, many people haven’t really thought this out and are, consciously or unconsciously, going for immortality. Unfortunately, we have to face up to the fact that immortality has yet to be achieved (at least in this world), so chances are, we are going to fail. As one of my uncles used to say (usually with a beer in hand), “Something always gets you in the end so why not enjoy life.”
So, with this in mind, how much investment in healthy living is worth it? Certainly it pays not to live destructively: smoking, drinking excessively, overeating, eating with zero regard for health, being a complete couch potato, etc. The record is pretty clear that when you indulge in these activities, you will probably live an abbreviated life, often decades short of what most people can achieve, or you run the risk of living decades with a dramatically reduced quality of life. Some people are fine with this, at least until it comes time to pay the piper, but most of us would like to live at least the average life span in reasonably good health.
But once you aren’t smoking or drinking, are eating a moderately healthy diet, and getting at least a moderate amount of exercise, how much benefit do you really get going beyond that? Certainly there will be some benefits. But along the lines of the 80-20 rule, we’ll see diminishing returns. It’s fine to exercise for hours on end if you enjoy it, but is it really worth it if you don’t?
Some might argue that they’ll take every month, every day that they can buy with increased healthy living, but as Emanuel points out in his article, this can often lead to an old age with a substantially diminished quality of life. How much pleasure in our current daily life are we willing to give up in order to live a few more years in our senescence?
I don’t agree with Emanuel on drawing a line at 75. I’d be fine continuing to live as long as I could have intellectual stimulation and good conversation, and if that ends up being denied me years before 75, my interest in life would probably fade. If it continues until 90 but with physical disabilities, I’d still find that a life worth living. But I do very much agree that it’s important to ponder what we desire in life, what in it gives us satisfaction, and to what extent life without those things would be worth experiencing.
Of course, in truth, when faced with the actual decision about whether or not to fight for life, giving up on it can be very hard. I remember reading a quote from a WWII medic who noted that many soldiers told him in training that if they lost a limb or suffered severe disfigurement, they would prefer to die, but that in actual combat, when these things happened, everyone fought for life.
But I think it’s important for us to think about these issues well ahead of time. The cost of not doing so can be high, in terms of personal suffering, burdens on our families, and costs to society.
This strip reminds me of something I heard someone say in a presentation on communication strategies several years ago. It was a concept the speaker referred to as “the curse of knowledge”. The curse of knowledge is the idea that when you know something, it is often very difficult to see things from the perspective of people who don’t know that thing, particularly if it’s something you’ve known for a long time.
For example, once you understand the vast array of cultural norms and practices that exist across humanity, many of our cultural taboos become obviously arbitrary, such as the one against homosexuality. When interacting with someone who does not understand that, it can be very difficult to put yourself in their shoes and understand where they’re coming from. It can also be very easy to demonize them, when often their main failing is a simple lack of knowledge.
I often find this to be an important thing to remember when dealing with younger people, which as a manager at a university, I have lots of opportunity to do. I regularly have to remind myself of how limited my own understanding of many things was when I was 20 years old, particularly my understanding of human relations. When I do succeed in that reminder, I’m often struck by how much more intelligent and knowledgeable today’s young people are than my generation was at that age. (We had TV to teach us, but not the internet.)
The curse of knowledge is important to keep in mind when you’re trying to convince someone of anything. Often their resistance is rooted in things they don’t know, things we may have come to see as obvious. Success in convincing them may hinge on figuring out where that lack of knowledge exists, and finding a respectful and effective way to bring them up to speed. (Of course, as the strip suggests, it’s always possible that their resistance is rooted in something we don’t know; a possibility we always have to be open to.)
The other day, I did a post asking what religion is. This TED talk by Kwame Anthony Appiah seems to be in much the same theme, pointing out that making accurate generalizations about religion is difficult since there is no one definition of it.
I do think that perhaps Appiah may be hiding behind exceptions to the rule in order to dodge many of the criticisms of religion. That said, critics of religion could often be a little more precise about who exactly they’re criticizing when they attack fundamentalism or other extreme cults and practices.
I think I’ve mentioned before that I only recently came to the realization that the scientific revolution was more a matter of increased communication than a breakthrough in method. Along the lines of this realization, I have a few thoughts about communication and its effects on human history.
Humans are social animals. Communication between and among us is a vital aspect of the human condition. With each major advance in our ability to communicate, progress on the human condition has accelerated. Each advance, each age of communication, has built on the past advances.
The age of language
The first age of communication began with spoken language. All intelligent social animals, such as chimpanzees, elephants, or dolphins, have culture to one degree or another, but without language, their cultures are far simpler than anything that happens in human societies. Language allows us to communicate the state of our minds with each other, to pass along knowledge, and for societies to organize in larger groups than anything seen in other primates or social animals.
When exactly language began is unknown. Some archaeologists and linguists believe it happened in the relatively recent past around 50,000 years ago or just before the migrations out of Africa, perhaps as a result of some genetic mutation. These scientists point to the sudden appearance in the archaeological record of evidence of behavioral modernity, including symbolic thought in art and refined tools, and posit that language could have been the reason for the change.
However, most scholars and scientists now believe that language developed gradually over time, starting perhaps with the alarm signals and calls made by many primates. It now appears that Neanderthals had the required physiological adaptations for speech, putting the development of such adaptations, and possibly of language, before our evolutionary lines separated hundreds of thousands of years ago. Although, if we could hear it, we would probably regard the early language of Homo heidelbergensis, the common ancestor of Homo sapiens and Neanderthals, as more proto-language than language itself.
Whenever language did start, it’s an attribute that every human society possesses, no matter how primitive that society. Along with other cultural universals such as art, religion, music, and cooking, it is a significant attribute that separates us from other animals. Indeed, it may be necessary for some or all of the other attributes to exist. There’s no doubt that language has given humans a major evolutionary advantage over other species.
The age of writing
The second age of communication began with writing. The earliest known writing may turn out to be symbols found in Paleolithic art dating back tens of thousands of years, although whether or not these early symbols communicate information in the manner usually ascribed to writing remains speculative.
The earliest true writing is usually said to have appeared in Mesopotamia during the period from 3500 to 3100 B.C., although long before that, pictograms and numeric notations were in use in Mesopotamia, Egypt, and other regions. Indeed, the first writing appears to have been motivated by the need to keep tax records and perform other mundane tasks. The first historical or mythological narratives didn’t appear until several centuries afterward.
Writing in these early centuries was difficult, both to do and to learn, with reading and writing usually left to a class of scribes. Still, writing enabled the formation of the first civilizations, kingdoms, and empires.
The second age of writing
I think what I’m calling the second age of writing began with the development of the Canaanite and Phoenician alphabets in the late second millennium and early first millennium B.C. These alphabets probably made reading and writing easier to learn. Phoenician trading expeditions carried their alphabet far and wide, and its influence spread to writing systems throughout the Mediterranean and southern Asia.
It’s probably no coincidence that the centuries after the spread of this alphabet, from roughly 800 B.C. to 300 B.C., became a period that some scholars now refer to as the Axial Age, the period where much of the philosophical and spiritual foundations of modern civilization were established. Judaism, Zoroastrianism, Buddhism, Jainism, Hinduism, Confucianism, Greek philosophy, and many other intellectual and spiritual movements began during this time.
History is usually said to have begun with written records around 3000 B.C., but our knowledge of these early times (3000-500 B.C.) is sketchy and often based more on archaeology than on the skimpy historical narratives from then. The actual field of history begins in the fifth century B.C., with Herodotus and especially Thucydides. From this period forward, our knowledge of events, at least in literate societies, becomes progressively more detailed.
The Axial Age is a controversial concept, but to whatever extent it was a pivotal age, it was probably due to the spread of writing throughout the ancient world, for the first time enabling the thoughts of prophets, philosophers, and other thinkers to be recorded for the ages. Many of the most sacred scriptures, and the most influential philosophical treatises, date from this period.
The age of print
Writing enabled the detailed thoughts of previous times to be preserved accurately for future generations. Scholars of one generation could read the thoughts of scholars of previous generations and build on their ideas. Progress in human thought could be made. But reading the work of previous scholars was not easy. It was often necessary to visit libraries, monasteries, or other centers of learning in order to read that writing.
This was because every manuscript had to be laboriously copied by hand, making manuscripts very expensive and of widely varying accuracy and quality. The transmission of ideas was slow and uncertain.
Then, in the 15th century, a German blacksmith and goldsmith named Johannes Gutenberg invented the most pivotal technology of the second millennium, the printing press. Suddenly manuscripts could be copied on a mass scale, with greater accuracy and speed, and with far less labor. The rate at which ideas could be shared, debated, and built upon increased dramatically.
It’s no coincidence that, in the centuries after the printing revolution, the Renaissance, the Protestant Reformation, the Scientific Revolution, and the European Age of Discovery all took place. The modern world was forged in the aftermath of the printing revolution.
The age of the internet
The internet is the culmination of many different technologies, from computing, which has roots in the 19th century with Charles Babbage‘s analytical engine, to electronics and both wired and wireless communications. As a computer nerd in the early 80s, I participated in network services like CompuServe and BBSs (bulletin board systems), but those were isolated, toy-like pockets of connectivity compared to the distributed network of networks that is the modern internet.
Those of us old enough to remember the world prior to the internet can see the profound effect it has already had on society, and that it is continuing to have. The world today is more connected than ever. The fact that you are reading this blog post, probably within a few hours of it being published, likely in a different region of the world from the author, speaks to the ease and speed of modern communications and collaboration.
As profound as the change has been so far, the history of the previous ages of communication shows that this is most likely only the beginning, that we are just laying the foundations of this new age. What new intellectual movements will begin as a result of this new medium of communication? What ancient philosophies will be altered, dispensed with, or enhanced? How different will humanity be after a couple of centuries in this new age of communication?
Only time will tell. Our perspective at this point may be far too limited to make any prediction that will be anything but amusing to future generations.