The iron rule of science?

I’m always interested in new takes on the demarcation between science and non-science, so after seeing the New Yorker write-up on Michael Strevens’ new book, The Knowledge Machine: How Irrationality Created Modern Science, it seemed like something I needed to read.

Strevens begins by examining the two leading theories of science: Karl Popper’s falsifiability criterion, and Thomas Kuhn’s paradigm shifts. Falsifiability is the idea that science is primarily in the business of falsifying theories, and that for a theory to be scientific, it must be falsifiable, that is, it must make statements that could, at least in principle, be shown to be false.

Paradigms refer to the prevailing assumptions during a particular period about how science works and what attributes a theory has to have to be considered science. These paradigms hold for long periods, until they begin to show cracks as scientists push their limits, eventually failing and giving way to a scientific revolution and a new paradigm.

The problem with both views, Strevens argues, is that they don’t really account for what scientists actually do. His frequent example is Arthur Eddington’s 1919 experiment measuring the positions of certain stars near the sun during a solar eclipse, an experiment that has historically been reported as a major victory for Einstein’s general relativity, demonstrating its predictions to be more accurate than Newton’s.

But the reality is that Eddington’s results were a muddle that could have plausibly been interpreted as favoring either theory. For various reasons, Eddington had a strong preference for Einstein’s theory, so he spun the results, downplaying the results from one telescope that seemed to confirm Newtonian physics, and making allowances for another that seemed more friendly toward Einstein. History now recalls it as a validation for general relativity because the results held up in subsequent experiments. If they hadn’t, we’d likely remember it as a cautionary tale, if we remembered it at all.

Strevens cites many other instances where scientists played fast and loose with the data, including some pretty famous names. He also describes an observational study of lab work, and just how subjective and ad hoc many of the day to day evaluations of data turn out to be. Science is often said to be self correcting, which raises the question: where does that self correction happen?

Strevens’ answer is that it happens in the “official” conversation, in venues like scientific journals. There, only one type of argument is allowed: empirical. He characterizes this as “the iron rule”, an uncompromising rule rigidly followed in science communication. The self correction comes when subsequent papers fail to find the same results.

Note that this iron rule isn’t an assertion about how scientists reason personally, but about how they argue in these venues. For their own reasoning, pretty much any source of ideas is allowed: intuition, philosophy, theology, cultural memes, notions of theoretical beauty, personal biases, competition with other scientists, etc.

All of these influence how scientists do their work and which ideas they pursue. And many, particularly competition, provide crucial motivation for doing the often exceedingly tedious work of scientific observation and calculation. Scientists may cite some of them in informal or popular writing. But not in their actual scientific papers. There, Strevens argues, only empirical reasoning is allowed.

When reading this, I initially wondered how theoretical papers fit into this framework. But one of Strevens’ prime examples is Isaac Newton’s Principia Mathematica, pretty much the epitome of a theoretical work. So I think the thesis is that even theoretical work is going to have an empirical orientation.

Another aspect of the rule is that requiring only empirical justification allows for shallow explanations: explanations that pass the iron rule’s test but fail many philosophical ones. Newton’s description of how gravity worked, for example, left unanswered what gravity itself actually was. (An omission that bothered many of Newton’s contemporaries.)

Strevens characterizes the iron rule as extremely counterintuitive. It doesn’t seem reasonable. Why not allow reasoning from all sources? This is why he thinks humanity went so long before discovering it. He sees it beginning with Newton, in the way Newton personally segregated his empirical and mathematical reasoning from his more philosophical and theological pursuits, although Strevens recognizes precursors in people like Tycho Brahe, Galileo, Francis Bacon, and Robert Boyle. But it only became the iron rule after Newton’s success and advocacy of what belonged in “experimental philosophy.”

Why did the iron rule develop when it did, in the 17th century? Strevens points to the development of nations as distinct from religious identities, and the need many had to keep the modes of thought from these two domains separate. He argues that what was natural for Newton may have spread and taken hold in that period exactly because it was a discipline that many people living at that time already had to develop for other reasons.

Toward the end of the book, Strevens ponders the future of science and argues that the iron rule shouldn’t be tampered with. He notes the argument from people like Richard Dawid for a post-empirical science. He admits that he can’t be sure it would be disastrous, but sees the possibility as too dangerous to play with.

This is an interesting book, and I think it has some important insights, notably in the distinction between how science is conducted and how arguments are put forth in scientific writing. But the idea that science developed a central dogma, the iron rule, in the 17th century, and that we must adhere to it or risk disaster, makes me uncomfortable.

It is true that investigators became much more serious about observation during the scientific revolution. It was a strong contrast between early modern scientists and medieval philosophers, who tended to rate the authority of ancient figures over observation. But Strevens admits that Aristotle, millennia before the revolution, saw compatibility with observation as crucial. The issue is that Aristotle didn’t see it as the only criterion.

To be fair, Aristotle didn’t have the printing press, a mechanism for rapidly disseminating and responding to ideas. It was arguably this feedback mechanism in the 16th and 17th centuries, particularly with the rise of scientific journals, that made it far more obvious to early modern thinkers what worked and what didn’t.

Arguments grounded in empiricism led to lasting reliable results. Arguments grounded in other things didn’t. We can see in the various statements made by thinkers throughout those centuries leading up to Newton, a gathering consensus on what worked. (With some holdouts, like Descartes, arguing for rationalism.) It seems like if there is a central guiding idea in science, it’s focusing on what works.

And framing something as “the iron rule” makes it sound like science hasn’t changed since 1687. The fact is, what is acceptable in scientific papers is constantly evolving and varies tremendously by field. In many of those fields, what passed muster a century ago might have serious trouble today. In some cases, there have been changes just in the last few years.

Finally, despite the assumptions I made above about theoretical papers, I remain unsure exactly how they fit in with the iron rule. Many theoretical papers make no mention of empirical tests. They leave that to experimentalists. In some cases, it may be decades, or longer, before anyone figures out a way to test a theory. We can say those theories shouldn’t be taken as reliable until they pass empirical tests, so maybe they could be seen as pre-rule arguments, not to be fully accepted until they do pass the iron rule’s test.

What do you think? Is Strevens on to something? Or are my concerns valid?

51 thoughts on “The iron rule of science?”

  1. Well, I have not read the book, although I took note of it when I saw it was available. It seems there is some confusion between how science works and how scientists work. The Eddington example shows how scientists work poorly but science works well. (Popper’s work and Kuhn’s work do not actually impinge upon how scientists work, so I wonder about the connection there.) Eddington’s experiment, the one described, showed a case of scientist bias that we would all hope does not happen, but does from time to time (I was involved in one), and shows off quite well the self corrective aspect of science (not necessarily scientists).

    I think the argument for softening the preference for empirical reasoning is way too close to a preference for “other ways of knowing” so near and dear to theists. Currently scientists are perfectly free to write speculative pieces that are more than a little fanciful (In this corner I give you Freeman Dyson …). They just are not necessarily formal science papers. I read a wonderful letter in Physics Today back in the day in which one physicist was acknowledging the loss of a bet with another, the bet being that the winner wouldn’t hold a real job in the previous ten years. Such things are hardly “sciency” but quite human. The human spirit of scientists is not being repressed. And it is not as if we are failing to make progress at a recognizable rate. There are only a few areas that seem to be making glacial progress, if any, (understanding quantum mechanics, understanding consciousness, etc.) but I also question how many people are actually working on these problems. It is not as if we have put out a Manhattan Project level of effort on each of these to no avail.

    The book sounds as if it is closer to “interesting” than it is to “profound.” I suppose I will have to get around to reading it at some point.

    1. Sorry, didn’t realize your comment had been snagged by the spam folder until this evening.

      I have mixed feelings about the book. But he doesn’t advocate softening the preference for empirical reasoning.

      Unless you’re referring to my concern about being dogmatic about it. I prefer what works. Empirically grounded arguments work, so I’m onboard. But empirical data always requires interpretation and so is always theory laden. So it’s a complex thing requiring judgment. And replacing religious dogma with a new dogma is just trying to convert people to a new faith, and I don’t think that’s what we want to do.


  2. Vast topic! Science is one of many fundamental but ambiguous terms in our vocabulary. I guess it will never be defined precisely and forever. The idea that science could be based only on empiricism cannot hold. Is mathematical physics a science? There are scientific journals dedicated to mathematical physics. Mathematical physics itself is a sibling of theoretical physics.
    The falsifiability criterion is, probably, the best so far. However, the reality is that an overwhelming majority of scientific works (in scientific journals) are very specialized and non-significant, and therefore are never repeated/tested for falsifiability.
    Paradigm shifts, on the other hand, are so rare, that in practical work in science scholars almost never mention them in articles.

    1. It is indeed a vast topic. I did think Strevens didn’t elaborate enough on exactly what he meant by the iron rule, particularly in terms of theoretical work, which as you note, is often mathematical. Although it could be argued that it’s always grounded in some empirical framework, as opposed to, say, pure mathematical theorems.

      On falsifiability, I think it’s a good standard, but not without its issues. People often seem to forget that it’s falsifiability in principle, not just in current practice, and that what is or isn’t falsifiable takes a lot of judgment.

      One thing Strevens does point out is that really fleshing out an area requires partisans: those who are either passionately convinced a theory is correct and want to show it, or those who are adamantly convinced it’s false and want to demonstrate that. As you note, a lot of papers never generate that kind of interest, so they may sit for a long time with no attempt at replication.

      1. In general, the major incentive to work on the replication of other scientists’ experiments is the importance of the experiment’s findings. Of course, importance is a subjective term.
        Also, sometimes, researchers publish falsified results. It highly depends on which field of science we’re talking about. You could look up (1) the Wikipedia article –, and (2) How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data –


        1. Ah, I don’t mean on a technical level. I understand how it’s supposed to work.

          Let me give you an example from a very soft “science” – sociology. There is a ton of research in sociology about postmodernism and critical theory trying to explain social phenomena. The predictive power of this is, generally speaking, very low. The internal consistency of this is, in many cases, lacking. Much of it is almost certainly wrong. However, the theory is very popular and many people are very invested in “wishing it true,” even going so far as to recommend the reformation of data gathering in order to eliminate unwanted findings. (See Alice Dreger’s book “Galileo’s Middle Finger”)

          When, and by what mechanism, would you predict the field ends up correcting itself and moving on to a new fad?


          1. Let us try to separate two different meanings of the term science: (a) an area of people’s activities and (b) a way to earn a living. Item (b) is subject to all our good or bad habits. On a scale between hard and soft science, the closer a specific field of science is towards hard (precise) science, the smaller the impact of (b) on (a)’s output, and the faster the correction. For example, on the soft side, the manipulation by hundreds of researchers of data that “fat is bad for the human body” lasted for decades.
            The other important parameter is the rate of change in the areas where knowledge of a specific science could/should be applied. Nowadays technology is the fastest-changing area of mankind’s activity. If new information from hard science promises an opportunity or an obstacle in technology usage, then it will be tested much sooner than in any soft science. That would not be tens of years. It would be years or sooner. It would also depend on how easy, hard, costly, or dangerous it is to carry out the experiments.

  3. Well, as I pointed out recently, facts always win in the end, and that’s all science really is — a search for the facts of existence. It’s self-correcting because those facts always win in the end.

    Empirical just means sticking to the facts one can prove. Theory papers do straddle a line. Math can be proved, which is one thing, but linking math to physical phenomena is another. I think that does require tests that can be falsified.

    I think the “iron rule” is just the common gestalt, like a common language. It’s a way of avoiding what plagues political discourse: “my facts” versus “your facts”. Science can’t disagree on facts, so empiricism makes a good filter. Hard to argue with repeatable test results.

    1. I agree. I think what disturbs me about the iron rule isn’t the rule in and of itself. We have centuries of progress showing its usefulness. It’s the idea that it’s this magic thing we accidentally stumbled on in the 17th century, that we might fall out of at any time, and therefore we must be dogmatic about it.

      As you said, it’s more a gestalt of accumulated experience throughout the history of science. The scientific method(s) are themselves a result of science. People like Dawid may propose changes, but unless they can demonstrate their efficacy with new reliable knowledge, few are going to be interested.

  4. Sounds like a book I should read. I will say this much: the back and forth dialogue that happens in scientific journals is far more important than what’s said in any one paper alone. After a new paper comes out, there will be a response, and then a response to the response, and so on. Science is an ongoing debate, not a proclamation of facts.

    1. It definitely has its strengths. But similar to your point about scientific papers (which I agree with), books like this are part of the conversation.

      The conversation aspect is one thing a lot of people seem to miss. I often have people give me a link to a paper from 1998 or something as supposed evidence that science takes some proposition seriously. I always want to ask, what is the impact of that paper? Who’s citing it and why?

      It’s why, as a lay person, I actually prefer getting most of my feel for a science from current general science books or textbooks. The authors generally have a feel for which ideas are being taken seriously and which aren’t. Of course, those books age fast, and you want multiple perspectives.

      1. I have a hard time with older sources like that. Sometimes, I feel like they must be out of date. Other times, though, I’m not so sure. I sort of touched on that two weeks ago with my post about the Venus Reference Atmosphere. It’s from 1985, yet apparently planetary scientists are still using it.

        But of course, if you’re going to cite a source that’s that old, you really should check to see if any follow up research has been done.

        1. Older sources aren’t automatically out of date. And sometimes, even when they are, they’re still interesting as long as you know what you’re reading. But yeah, if you’re trying to learn the basics, if it’s more than ten years old, caution is warranted.

      2. This is fascinating. Admittedly I am much more into researching history and philosophy than I am electrical engineering or something like that, but I find older sources invaluable. People from the past are theory laden, but they are theory laden in different ways. It’s usually easy to see how the fashions and fads of their times affected their judgment and, by the same token, they make it easy to see how our fashions and fads are affecting our own judgment.


        1. How useful older materials are is definitely related to the particular field. If I’m researching World War II or the Thirty Years War, the most reliable sources are generally accounts written as close to the actual events as possible. But if I want to understand the brain, a neuroscience book written in the last ten years will be far better than anything you’ll find from the 19th century.

          I do agree there’s value in exploring the history of thought, as long as we understand what we’re doing. If I read Darwin or James, it’s not to get the best insights into how evolution or the mind works. If I want that, there are much more modern sources with better information. But if I want insight into how the theories were developed, and how 19th century norms might have affected that development, those are definitely some of the places to go.

          1. Do you think Darwin helps us see our hangups and how our norms affect our theories or is that, in your opinion, tied up in history/philosophy/psychology/etc?

            For example, I would think Darwin would have a lot fewer hangups frankly researching any biological thing connected to identity groups. You’d have to be a very brave 21st century scientist to honestly research any of those things.


          2. I haven’t read Darwin, but from what I’ve heard, he wasn’t free of 19th century racial paradigms. It actually took a lot of courage not to be in that paradigm back then. Since then, the Overton window has decisively swung the other way, mostly in reaction to traditional attitudes. It’s definitely a difficult area to do science in. Some, like Pinker, do it anyway, but catch a lot of grief for it.

  5. Sounds like a very interesting book, and I’m interested to read it.

    My personal work in applied research involves modeling, testing, and innovation across several disciplines. One thing that has stood out to me over the years is the sharp difference between (1) answers that are reasonable and (2) those that are correct. In my own work and in that of others, I’ve seen over and over how 9 of the 10 steps may have been supported by evidence, and the 10th and final step seemed so reasonable, and that is where we often wind up being dead wrong. Reasonableness is just too easily satisfied. Prediction and testing separate the pros from the joes.

    On the extreme ‘joe’ end of things, we of course have conspiracy theorists, who feel that everything they say is entirely reasonable and makes perfect sense. Reasonableness as a filter is almost no filter at all.

    The longer I work at science myself, the wiser Feynman seems: “The first principle is that you must not fool yourself — and you are the easiest person to fool.”

    I like the iron rule personally – a good answer for Feynman’s first principle. An environment with other researchers replicating experiments and interpreting data, where empirical results are the only cash currency, effectively defeats the problem of humans unwittingly using “reasonableness” and fooling themselves.

    1. Good points. I spent most of my career doing programming, where I learned that humans generally stink at logic. Something we feel is clear as day turns out to be dead wrong too often.

      And that’s assuming the starting point of the reasoning is valid. Often our starting assumptions are problematic. Or we have implicit assumptions we’re not even conscious of, assumptions that render our reasoning impotent.

      I agree that empiricism serves as a crucial reality check.

      And ideas which are so slippery that they can’t be subjected to that kind of check are suspect. Although I think we have to be careful about applying that standard too early or stringently. Developing scientific theories need space to work outside of those confines. But we shouldn’t regard them as reliable knowledge until key portions of their predictions make an empirical difference.

  6. Mike, this is an engaging essay. You have most definitely whetted my appetite to read this book. The philosophy and history of science are so new and exciting. Even the word—science—is a term of modern coinage. I still wrestle with the ideas of Popper or Kuhn. (Although I totally agree with Popper that Sigmund Freud’s ideas were mostly pseudoscience.)

    I assume that what is meant (in part) by Strevens’ “Iron Rule” is that, fundamentally, modern science rests on the (then) innovative idea that nature is a closed system. I think the study of science as a closed system became firmly planted (although not fully accepted) by those we now call scientists in the 17th century. That is, natural phenomena must be understood, in turn, by natural causes—which, of course, necessarily requires observation, testing and mathematical expression. It’s so fundamental now that I think we forget what a huge step that was in the history of science, especially nowadays where many would even argue that science is the exclusive way to know anything about reality. In short, I think this is a book worth reading.

    1. Thanks Matti. Glad you found it useful. Hopefully my ambivalence about the book came through in the post. I think it’s interesting, but can’t give it a full throated recommendation. Still, if you find the philosophy of science interesting, it’s definitely a part of that conversation.

      I’m not quite sure what you mean by a closed system. It’s worth noting that the iron rule only requires empiricism. It doesn’t require any other philosophical test. Scientists have actually limited themselves when they did require such tests.

      For example, Newton’s predecessors, in attempting to understand why the planets moved as they did, tied themselves in knots attempting to find a solution under the mechanical philosophy. Newton’s solution, involving action at a distance, violated that philosophy. Likewise, when quantum theorists were figuring out how to move forward with what they had, they had to let go of some philosophical ideas of how reality worked.

      Of course, an adherent of those philosophies can point to later solutions that seemed to rescue the philosophy: general relativity in the case of gravity, and deterministic interpretations in the case of quantum mechanics (although they come with other costs). But those alternatives required conceptual work. Scientists prior to those conceptual breakthroughs, in order to move forward, had to develop pragmatic placeholder theories.

      All of which is to say, most scientists do expect naturalistic physicalist explanations, but what counts as naturalistic physicalist explanations is subject to change on new evidence.

      1. Mike, your ambivalence comes across. I like that. As I said, I’m also wrestling with the ideas of the likes of Karl Popper and Thomas Kuhn.

        What I mean by a closed system is, from my understanding, what is foundational for empiricism and modern science. That is, by the 17th century an epistemological shift had developed. Natural phenomena had to be understood in terms of cause and effect, rejecting Aristotle’s so-called four causes. Moreover, the natural world would be assumed to be a closed system in that natural phenomena had to be described solely in terms of natural causes. That, I think, was an important step. And, when you think about it, foundational for empiricism.

        Prior to that shift natural philosophers (scientists) would regularly whistle up God, other supernatural causes, or authority to explain difficult gaps in their chain of reasoning.


        1. There was definitely a change in what was expected from an explanation. As you mention, medieval and classical natural philosophers were expected to explain all of the four causes. As the scientific revolution progressed, that fell by the wayside, until only the efficient or moving cause was necessary. In some ways, this was imported from the field of engineering.

          Renaissance engineers were a class between manual laborers and natural philosophers. They had a tradition of only figuring out what worked. No one expected them to worry about things like material, formal, or final causes. A cannon either did what it was supposed to or it didn’t. If it did, that was good enough. If it didn’t, the philosophy was irrelevant.

          It seems like early scientific figures like Tycho Brahe, Johannes Kepler, William Gilbert, and Galileo, were heavily influenced by engineering as put forth by earlier engineers like Niccolo Tartaglia. (Galileo himself, in addition to being a natural philosopher and mathematician, was also an engineer.)


    2. I found this post and the associated New Yorker article quite interesting as well Matti. And like Mike, it does also leave me a bit ambivalent. To me “The Iron Rule of Empiricism” might serve as a marketing tool for a relatively obvious position. My hope is that some day someone distinguished will finally “go big” here, as I consider myself to. So try this:

      When natural philosophy (or “science”) emerged from the rest of philosophy around four centuries ago, it critically left behind the topics of metaphysics, epistemology, and axiology to carry on as before. Science went on to develop various communities of respected professionals with generally accepted ideas that seemed to work in all sorts of areas, but not so the left behind areas in philosophy. The basic problem as I see it is that scientists in general need effective principles of metaphysics, epistemology, and axiology in order to do their jobs better. Thus science today displays associated faults, though most substantially in its mental and behavioral areas (perhaps given that “harder” forms of science are effectively able to bypass the void in axiology).

      I believe that a new breed of philosopher must emerge (like Sabine Hossenfelder?) which has a singular mission — to develop effective principles that each member is able to endorse for scientists and people in general to use. Thus we’d have a traditional “artesian philosophy” for timeless appreciation and pondering, as well as this new form defined by a mandate to shore up the foundation upon which science itself rests.

      In my second principle of epistemology I don’t discuss anything as limited as “science”, but rather an exclusive means by which anything conscious (which is to say, possesses qualia) is able to consciously figure anything out. It takes what it thinks it knows (or “evidence”) and uses this to test what it’s not so sure about (or “a model”). As a given model continues to remain consistent with evidence, it tends to become more believed.

      But how useful could this “figuring out” business be in science when terms are commonly taken in separate ways? Thus I also provide my first principle of epistemology, or that it’s an assessor’s obligation to accept both the implicit and explicit definitions found in any given proposal in the attempt to grasp and assess what’s being said.

      Regarding metaphysics my observation is that science itself becomes obsolete to the extent that causality fails (or remains “open”). Thus I propose a brand of science where causality is presumed absolute, as well as a brand where scientists are also permitted to propose supernatural solutions. Conversely today science is mixed up in this regard, and so one side will be tainted by the other.

      Then finally there is the question of “purpose” or “value”. I believe that this community will need to formally state that “feeling good/bad” defines the personal value of existing for anything anywhere. I consider this stuff to drive the conscious form of function, somewhat like electricity drives the computers that we build. This position has long been presumed in the specialized science of economics, though the social tool of morality seems to make it difficult for central behavioral sciences, such as psychology, to take this step as well. If this proposed community were to become generally validated then science should thus progress, and I think most notably in our softer forms of science given an axiological premise from which to build.

      1. Eric,

        Your ideas are a bit overwhelming. That is my own inadequacy for sure. Frankly, my training in philosophy was very traditional or, perhaps, old school. I gravitated to this site because I discovered interesting debates well outside my own well trod philosophical territory. I decided to jump in, read, maybe contribute (I’ve done too much of that), but mostly learn. I do have some opinions regarding the relationship of science to philosophy beyond my rudimentary knowledge of the history of modern science. But as certain as I think I am, I suspect my conclusions are not fully formed at present. Mike’s book review did inspire me to order the book though.

        1. I’m quite sure that any “overwhelming” perception that you have of my proposal is not due to inadequacy on your part whatsoever Matti. It’s simply that I’m proposing something far bigger than what Strevens is proposing. I could have brought this up with Mike, though he’s heard it from me often enough before and may not have a whole lot more to say on the matter that he considers interesting. I’m always looking for intelligent people to potentially assess my ideas. This brings me hope that I’ll receive effective criticism, or might even interest someone.

          I suspect that you’ll find the Strevens book interesting, and that I would as well, since I’m all for evidence-based science. Regarding my own proposal however, I believe that science today doesn’t work nearly as well as it might, since it lacks a community of respected professionals able to provide scientists with various accepted principles of epistemology, metaphysics, and axiology to help guide their work. Traditional philosophers often take exception to such speculation, perhaps because it implies that they haven’t done a good enough job. Regardless, they’ll sometimes cry “Scientism!”, or may even claim that there are both philosophy and science kinds of stuff to ponder — effectively “epistemic dualism”. If so then I guess I’d be making a category error, though I think not.

          To hopefully get around such worries I mean for philosophy as it stands today to be left alone. Instead I propose that a new community be created with a mandate to develop agreed upon principles of philosophy which scientists in general find useful. Furthermore above I did provide four principles that I consider pretty good, though we needn’t get into them. More importantly I wonder if you think that science does struggle without accepted philosophical principles to use, and so we should try to build the new community that I propose?


          1. Eric,

            You ask: Do I think science struggles without accepted philosophical principles? If so, should we try to build a new community?

            Since I myself “struggle” to comprehend your first question, I’ll ignore your second question for the time being.

            Since the middle of the twentieth century science has been funded to conduct research in a way unheard of in previous centuries. Governments, universities and industry commit astronomical sums for general research as well as defense-related research. So, first, I hardly think science struggles with any material resources.

            The relatively new studies of the history and philosophy of science are still in their early years of debating how science actually proceeds, what areas of inquiry lend themselves to a scientific approach, and what basic or philosophical principles appear to guide science research. They have already provided fruitful tools of analysis and criticism. But they are still very much a work in progress.

            Second, science and a scientific way of describing reality, rather than struggling for public acceptance, is in fact the overwhelmingly dominant world view or weltanschauung. In fact, it is criticized—as a result of this early and ongoing debate—for its hubris and its attempts to crowd out and depreciate or dismiss other useful avenues of inquiry about the human condition, which are often labeled as mere opinion in contrast to a truly truth-seeking scientific inquiry.

            So, I suspect your word struggle is intended to mean that it proceeds in a wrong-headed direction. And that—if that is your true meaning—is what I struggle with. So, in short, what do you mean by struggle? I don’t see much of a struggle myself from where I sit.


          2. Fortunately Matti, I don’t consider science “wrong-headed”. This is to say that I don’t consider it to be misguided or of bad judgment. Beyond the development of spoken language hundreds of millennia ago, then specialized occupations maybe ten millennia ago, and then written language a bit over five millennia ago, I consider the emergence of hard forms of science less than half a millennium ago to be the fourth of these instrumental revolutions of human power. And as you’ve implied, I’d also characterize this relatively modern endeavor as a “work in progress”.

            So rather than “wrong-headed”, by “struggle” I meant that science is “restricted”, “constrained”, “hampered”, and the like, given that we have no respected community of professionals who provide scientists with generally accepted understandings regarding epistemology, metaphysics, and axiology. This void is sometimes used by certain scientists to criticize the entire endeavor of philosophy as a waste of time. Here it may be noted that there aren’t any handbooks which describe what philosophers generally believe, specifically because no reasonable consensus exists in the field. I conversely would like the field to continue on indefinitely as it already has for two and a half millennia. What I’m further suggesting however is that scientists need various basic principles of metaphysics, epistemology, and axiology in order to do their work more effectively than they do today, and so I propose that we attempt to create a second and initially quite small community whose only purpose is to develop generally accepted principles for scientists to use.

            Sabine Hossenfelder is a physicist who has long decried mainstream theoretical physicists’ substitution of “beauty” for “evidence”, contrary to Michael Strevens’ supposed “iron rule of empiricism”. Here’s a particularly sharp post that she did on the matter recently:

            Apparently physics has progressed so far that today many in the field find it convenient to drop the requirement for evidence-based assessments. But more importantly as I see it, what about the various “soft sciences” that have never even glimpsed Newtonian levels of reduction? What about the circus associated with “consciousness”, for example? My position is that science in general suffers without accepted principles of philosophy to use, and that “harder” forms of science have merely been superior because they happen to be less vulnerable in certain ways.

            Regardless, does my speculation sound plausible to you? And if you instead believe that there can be no basic principles of metaphysics, epistemology, or axiology by which science could function better than it does today, could you talk about why that might be the case?


  7. Eric,

    So, the word ‘struggle’ means that there is “no respected community of professionals who provide scientists with generally accepted understandings regarding epistemology, metaphysics, and axiology.” Holy cow! Well, either you are yanking my chain or there is a mountain of material to unpack in that statement. Let me understand: Epistemology (the study of knowledge), metaphysics (the study of being, ontology, and causality), and axiology (the study of all theories of value from ethics to aesthetics). So, apart from that trio, there’s really not much left is there?

    First of all, why would a scientist (or even an accountant or a dentist, or a lawyer, etc.) need some community of professionals to define virtually all of reality for him or her? I’m sure an astrophysicist could do his or her work quite nicely even though he or she was a practicing Catholic, observant Jew, atheist, Marxist, monarchist, vegetarian, carnivore, etc., or any combination thereof. I admit we might get into some dodgy issues if our astrophysicist is some flavor of dualist, idealist or physicalist. But perhaps not anything that can’t be worked around. I fail to see a problem. I’m not even getting into how one could possibly structure such a community of professionals to provide scientists with these generally accepted understandings. As Desi used to say, “Lucy, you got a lot of ‘splainin’ to do!”

    The physicist Sabine Hossenfelder appears to be engaged in a debate internal to the particle physics community. This, from the little I got from her YouTube presentation, seems natural and healthy for physics or any discipline for that matter. But what relevance does it have for your concern that “science struggles without some group imposing a collection of all-embracing philosophical principles”?


    1. Matti,
      I’m quite pleased that you seem to grasp the magnitude of what I’m proposing! Furthermore, given the extent to which I’m challenging your current perspective, I see that you’re extremely skeptical that I’ll be able to properly fasten all the associated loose ends back together in a sensible way. Fortunately, trying to effectively “splain” my position should help demonstrate where it might be solid, as well as where various elements might need to be altered or abandoned. (And damn it, somehow my first version of this response disappeared before I was able to publish! Unfortunately I’ve found that writing this “do over” has been frustrating.)

      It’s not quite that I consider us to need a respected community of professionals to “define virtually all of reality”. Instead I’m just saying that this proposed community would need to set up metaphysical, epistemological, and axiological rules from which to do science more effectively than it’s generally been done in the past. But evaluating my proposal in a mere hypothetical capacity should be challenging. Thus I’m providing some rules that I propose to at least give you a reasonably concrete example of what I’m talking about, and even if it turns out that they aren’t very good.

      My single principle of metaphysics:

      To the extent that causality fails, nothing exists to even potentially understand.

      What this should effectively do, if accepted, is divide science into a purely causal standard form, as well as a second variety of science which is open to non-causal dynamics. Though naturalistic explanations in modern science are of course preferred, supernatural speculation does seem to exist here as well. This principle however would effectively segregate the two, and not because we can establish that one happens to be more true than the other. Instead this would be because each resides under a separate metaphysical classification of function. The open question for a given scientist to ponder here would thus be, “Even if true, why should I attempt to grasp that which is proposed to lack any causal dynamics to potentially grasp?” This is a question that I’d like to ask people like David Chalmers.

      My first principle of epistemology:

      There are no true or false definitions for any given term, but rather more and less useful definitions in a given context. Thus here it shall be the evaluator’s obligation to accept both the explicit and implicit definitions associated with a given proposal.

      I’ve noticed that one of the most common challenges in science is that people tend to talk past each other by means of conflicting definitions. This principle should help remedy that by mandating exactly who shall have definitional right of way. And imagine how ridiculous things would become if the opposite were to prevail, with the proposer mandated to conform to whatever definitions a given audience happens to prefer. No, the proposer must have definitional freedom, though here the “usefulness” of any given definition shall remain open for potential criticism.

      My second principle of epistemology:

      There is only one process by which anything conscious, consciously figures anything out: It takes what it thinks it knows (or “evidence”) and uses this to assess what it’s not so sure about (or “a model”). As a given model continues to remain consistent with evidence, it tends to become progressively more believed.

      What this would do is formalize not just a procedure from which to do science, but decree a fundamental process by which qualia-based machines (like us) function. Often philosophers try to demarcate science from pseudoscience, a task which I consider ineffective. I’ve instead taken this question back to something that might be fundamental enough. This is the principle which I’d use to help Sabine’s cause in the physics community.
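      To make this evidence-and-model dynamic concrete, it can be sketched as a simple Bayesian update, where a model’s credence grows as it keeps matching evidence. (This is purely illustrative, and the numbers are arbitrary.)

```python
# Toy sketch: belief in a model rises as evidence stays consistent with it.

def update_belief(prior, p_evidence_if_true, p_evidence_if_false):
    """One Bayesian update: probability the model is right
    after seeing one new piece of evidence."""
    numerator = prior * p_evidence_if_true
    return numerator / (numerator + (1 - prior) * p_evidence_if_false)

belief = 0.5  # start unsure about the model
for _ in range(5):  # five observations consistent with the model
    # evidence is likely if the model is true, unlikely otherwise
    belief = update_belief(belief, 0.9, 0.3)

print(round(belief, 3))  # prints 0.996: the model becomes progressively more believed
```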

      My single principle of axiology:

      Qualia serve as a punishment/reward dynamic which constitutes all that’s valuable to anything, thus effectively motivating the conscious form of operation that we know of. It’s somewhat like the “non-value” manner in which electricity drives the computers that we build. With qualia however, “value” may indeed transpire.

      I consider this not yet formally accepted position in academia to largely explain why our mental and behavioral sciences remain as soft as they do. How might one effectively reduce our function in a psychological capacity, without addressing our motivation? Observe that “hard” forms of science needn’t concern themselves with axiology, and so haven’t been thusly impeded.

      If this principle does happen to be valid, then why might it be difficult for scientists to acknowledge that feeling good/bad constitutes the value of existing? I suspect because this lies in contrast with the social tool of morality. This is to say that we tend to be socially punished for formally repeating such hedonistic notions, given that our self-interested nature mandates that we instead portray ourselves in altruistic ways to thus reap various social benefits.

      This axiological position is formally known as “psychological egoism”. The science of economics does happen to be founded upon it, which is apparently permitted because economics is situated far enough from the central field of psychology to not challenge the social tool of morality.

      Whether or not you decide that any of my four principles might help science advance, can you effectively argue that science needs no community of respected professionals who provide scientists with at least some such guidance, and thus archeologists, biologists, sociologists and so on should instead be left to their own devices regarding these matters?

      Secondly, you’ve wondered how the community that I propose might be structured if my plan does happen to be pretty good. My answer is: the same way that the rest of science is structured. Popularity seems to be what matters here. So if science had some kind of overriding authority armed with rules from which to suggest “Yes, that’s kosher”, or “No, that’s bullshit”, then I suspect that’s all we’d need in order for such rules to be implemented.

      By the way I don’t know exactly how you’re defining “physicalism”, but what do you consider wrong with this position from your definition?


        1. I don’t mean to be overtly flattering here Matti, but “Fuck yeah!” That’s exactly what a guy like me wants to hear. We all measure each other up almost involuntarily. Sometimes, for the people that we meet, we’re pleasantly surprised to need an extended measuring tape.


          1. Eric,

            I have read and reread your thesis and I think I should just wave a white flag, respectfully bracket your thoughts, and let them simmer on my mental back burner. I should repeat a comment I’ve made previously. I’m traditionally trained in philosophy. I gravitated to this site because I discovered interesting debates well outside my own well-trod philosophical territory. I still need to get my sea legs on some of the issues discussed here.

            Regarding your proposition, I reiterate that I simply cannot see a problem, and hence a need, for a “…respected community of professionals who provide scientists with generally accepted understandings regarding epistemology, metaphysics, and axiology.” We more or less had such an arrangement when the West was dominated by a universal Christian church. And that arrangement hardly calls to me now. Moreover, I’m really not worried that we need a set of “…principles [that] might help science advance”, as you say. On the contrary, my concern is that science, especially in the last century, has attempted to define or reduce too many problems surrounding the human condition to issues that lend themselves to scientific or pseudoscientific solutions. That is, the scientific mindset has, in many respects, engaged in overreach. So, I may not be an ally in your project. But, for now, I’ll keep an open mind.


          2. Thanks for your efforts Matti. It sounds like you’re here to expand your traditionally trained philosophical perspective, and I think you’ll find this to be a great place for that. If you ever decide that your sea legs are stable enough to dabble in “the hard problem of consciousness”, I’d love for you to assess my “thumb pain” thought experiment. I consider it more direct and thus potentially more effective than John Searle’s Chinese room.

            Though I’m certainly not traditionally trained in philosophy, I have developed at least some appreciation for the stuff. I used to blog quite a bit under professors Massimo Pigliucci (CUNY) and Daniel Kaufman (Missouri State). Though Massimo would rarely mix it up with me himself, often Dan couldn’t help himself, which made for some interesting exchanges. I’m quite sure that Massimo would chuckle to himself from time to time about how difficult it was for his esteemed colleague to come out on top, let alone others over there.

            I consider the worry about scientists venturing where they shouldn’t to somewhat help make my case. If there are things which scientists have no business attempting to grasp, then who shall be charged with clarifying these limits? If philosophers were to be the authority on the matter (which seems reasonable to me), then wouldn’t it make sense to have a respected community of them armed with a relatively agreed upon message regarding what’s beyond the scope of science? If not, how might standard scientists gain such an education? (And I’d say in the past with Christianity that we “less” had a community like the one I propose, since it was the rise of science itself which brought the progressive fall of theism in academia.)

            I’m aware that certain scientists belittle philosophy as a general waste of time. But how many of them argue that pursuits like music, dance, and literature are also wastes of time? Few, I think. It seems to me that philosophy has developed just as strong a place in western culture over its two and a half millennia as these others. Thus I’d consider it a tragedy for the modern rise of science to forever alter the traditional exploration of philosophy.

            The way I’d help shelter traditional philosophy from the turbulence of progression in science is to demand that a separate community be developed to help improve science. It matters not to me what name this second community takes, but merely that its mission would be to develop accepted principles of metaphysics, epistemology, and axiology which scientists in general find effective to guide their work. So I’m not actually talking about changing philosophy, but rather adding something to science so that it might function better than it has in the past. Perhaps a good name for this new discipline would be “meta science”? And maybe certain traditional philosophers would decide to join this community in addition to their own? Most wouldn’t, I suppose, or wouldn’t even be allowed in, given various conflicting positions.

            I brought up a skirmish in the physics community earlier, though in truth I merely find this convenient for my own purposes. I don’t consider humanity to actually suffer given that our traditionally hardest science has gone so far that it’s now striving to replace empirical evidence. But how might a creature which functions by means of a value dynamic learn about its nature, if a social tool exists which effectively punishes it for honestly acknowledging what’s valuable to it? This, I think, helps explain why psychologists have found it so challenging to develop effective general reductions regarding our nature. The concept of value is where the bulk of my ideas reside, and in an amoral capacity.


  8. “It seems like if there is a central guiding idea in science, it’s focusing on what works.”

    This is something that, to me, sounds very much out of William James. His central argument in “Essays on Pragmatism” is that the fundamental “truth,” whatever that is, is not what actually matters to us. He disposes of the Kantian noumenon by suggesting that fundamental truth can be ignored if we focus on pragmatic truth – a thing he defines as “something that improves our explanatory power.” In this sense, the Helios’ Chariot Theory of Solar Movement is more true than pre-calendar methods but less true than relativity, even though all three are likely “wrong” in the fundamental sense of describing that burning ball of gas in the sky.

    This is what I actually believe ^^^^^^^ The stuff below this line is just a train of thought I think might amuse you and your readers.

    The scientific process of empirical checking you describe could be interpreted as a Schopenhauerian “will to power.” In Schopenhauerian terms, as far as I understand him, the will to power is not at all restricted to human beings. It is, instead, the “will” everything in the universe has to alter its surroundings and conditions. The falsification and paradigm shift models mentioned above seem to operate in this sense as a mechanism for collective subjectivity (empiricism) to overpower individual subjectivities at regular intervals.


    1. I think science is relentlessly pragmatic. Not necessarily individual scientists, who bring all their individual beliefs, hang ups, biases, conceits, etc, into the process. But for any meaningful assertion they might make, there are others with different biases, etc, to challenge them. What comes out is often the most reliable knowledge available at that point.

      But to your point, most reliable is all we ever get. We can never know if we’ve actually hit truth. Although I think it’s reasonable to think we’re getting progressively closer to it, despite occasional wrong turns and dead ends.

      I’m not familiar enough with Schopenhauer’s will to power to comment intelligently, but his use of “will” seems equivalent to “force”.


      1. Do you think it’s “most reliable,” “closest to fundamental truth” or “most useful” that scientists usually coalesce around? (I vote 3)

        As for Schopenhauer, it’s basically a drive to make the environment conform to self. Rocks can have this, according to him. 🙂


        1. On the scientists, I think the answer depends on the individual scientist. There won’t be homogeneity. A Niels Bohr might go for what’s most useful, depending on exactly what it’s useful for. An Albert Einstein wants fundamental truth. They’ll arbitrate between each other based on what can most reliably be demonstrated.

          As I understand it, Schopenhauer was an idealist, so that fits.


  9. I have some interest in philosophy of science, but I do not know enough about the issue. I think that this really depends, since we can’t really assert “what works” in fields such as sociology, where research is pretty much still grounded in classical thinkers such as Comte, Marx, Durkheim, and Weber. The way scientists do science in one field differs from the way it is done in others, so I don’t think we can really assert that there is such a thing as an iron rule.
    Dupré and Pigliucci are known for arguing in favor of a less strict concept of science, where we treat it as a family resemblance in Wittgensteinian terms. Science is a family resemblance concept with no strict definition, but this does not necessarily entail that we are doomed to lack full knowledge of its nature.


    1. Can’t say I’m too familiar with sociology. I do recall an article some years ago from a sociologist arguing that the field should rethink its commitment to positivism, that is, its commitment to empirical justification. (Not to be confused with the now-dead logical positivism movement.) As I recall, his call got a lot of criticism. I think most in the field recognize that their status as a science hinges on their commitment to empiricism.

      People in the positivism tradition tend to be thought of as scientists. Those in the anti-positivist tradition, such as Hegel and Marx, are now seen as philosophers.

      So we could see sociology’s commitment to positivism, to the extent it exists, as its version of the iron rule. But I agree that we’re talking more about what works rather than some dogmatic article of faith.


  10. Hi Mike, I was interested in this part of your article:

    “Paradigms refer to the prevailing assumptions during a particular period about how science works and what attributes a theory has to have to be considered science. These paradigms hold for long periods, until they begin to show cracks as scientist push their limits, eventually leading to their failure and a new scientific revolution and paradigm shift.”

    It’s my sense that we are approaching another major paradigm shift.

    For the past few centuries at least, the “prevailing assumption” of science culture has been that the purpose of science is to seek more knowledge. It seems to be taken as an obvious given unworthy of examination that scientists should seek as much new knowledge as possible, as fast as possible, limited only by available resources such as budgets. This paradigm came to prominence in response to a long era of knowledge scarcity, and made sense as a response to that scarcity.

    The paradigm shift I see coming is the dawning realization that we no longer live in the old era of knowledge scarcity, but in a revolutionary new era characterized by knowledge exploding in every direction at an ever accelerating rate. Yes, everyone will say that they know this, but it doesn’t seem to have occurred to very many in the science community (and larger culture) that we are attempting to navigate today’s knowledge explosion era using a “more is better” paradigm which was designed to address a very different situation than we currently reside in. Technically, we are very advanced today. But the “more is better” paradigm today’s science is built upon is a philosophical relic of the 19th century and earlier.

    The old “more is better” paradigm is built upon the assumption that human beings can successfully manage any amount of new knowledge, and thus power, delivered at any rate. That was a reasonable assumption a century and more ago when the pace of change was slower than today and the powers being unleashed were of a smaller scale. That era ended on August 6, 1945 at 8:15 am over Hiroshima, Japan.

    Seventy years later we still have no idea how to protect ourselves from the first technology of existential scale, which can erase our civilization in minutes. And because scientists are still operating from the old “more is better” paradigm, they’ve learned nothing from the splitting of the atom, and are instead investing billions of dollars in unlocking the secrets of even more fundamental particles. On top of that, we’re deeply engaged in creating even more powers of existential scale like AI and genetic engineering. Our plan seems to be to create as many powers of existential scale as possible, and see what happens, as if that were impossible to predict.

    Sooner or later one or more of these vast powers is going to slip from our control and create unspeakable suffering on a never before imagined scale. And when that happens even those most deeply addicted to the old “more is better” paradigm are going to start questioning their loyalty to that worldview.

    The bottom line is that human beings are not gods, but limited creatures, like every other species on the planet. And because we are limited, there is some limit to how much power we can successfully manage. Once that is understood it should quickly become clear that the “more is better” relationship with knowledge paradigm is due for a serious review.

    After writing about this for years in many places, I’ve come to the conclusion that reason alone will not be sufficient for the creation of a new, more modern paradigm. It’s going to take some level of pain to break out of the old paradigm. But perhaps this has always been the case. For example, the Enlightenment may have unfolded centuries ago for the simple reason that human beings finally got so fed up with living in mud huts that they were willing to consider a new paradigm beyond the Catholic Church, which had dominated our culture for the previous 1,000 years.


    1. Hi Phil,
      Given that the etymological root of “science” is the Latin word for knowledge, we probably shouldn’t be too surprised that scientists see it as an unquestioned goal. And honestly, I’m mostly in that camp myself. I think the best way to deal with problematic knowledge is with additional knowledge.

      On the other hand, you might find this post interesting. It discusses an article that asks what we do with knowledge that is existentially dangerous.

      The return of heretical thought?


  11. Thanks for your reply Mike. You’ll find me OBSESSED with our relationship with knowledge. If I go on about it too much, please feel free to tell me to dial it back.

    The irony is that I’m arguing for more knowledge too. Knowledge about how to take control of the knowledge explosion. And critics of my perspective are typically arguing we shouldn’t acquire such knowledge. Who is the Luddite?

    Anyway, if your time and interest should permit, you can see my argument by clicking on my name. If this is something you would like to discuss you’ll find me very receptive. Maybe you’d like to write an article on the topic and we could discuss it there? Or, you should also feel free to republish my article if you prefer.

