Students showing up at college understanding the fact-value distinction is a good thing.

Justin P. McBrayer, an ethics and philosophy of religion professor, has an opinion piece in the New York Times bemoaning the fact that students are showing up for college not believing that moral rules are facts.

What would you say if you found out that our public schools were teaching children that it is not true that it’s wrong to kill people for fun or cheat on tests? Would you be surprised?

I was. As a philosopher, I already knew that many college-aged students don’t believe in moral facts. While there are no national surveys quantifying this phenomenon, philosophy professors with whom I have spoken suggest that the overwhelming majority of college freshman in their classrooms view moral claims as mere opinions that are not true or are true only relative to a culture.

…A few weeks ago, I learned that students are exposed to this sort of thinking well before crossing the threshold of higher education. When I went to visit my son’s second grade open house, I found a troubling pair of signs hanging over the bulletin board. They read:

Fact: Something that is true about a subject and can be tested or proven.

Opinion: What someone thinks, feels, or believes.

McBrayer obviously intended his post to be an alarmist piece, and perhaps a call to arms.  But I have to admit to finding it reassuring.  If students are showing up for college understanding the distinction between facts and opinions, including values, then that says a lot about the caliber and open-mindedness of today’s college students.  (Unfortunately, I tend to think McBrayer is exaggerating how many students actually show up with these views.)

McBrayer points out that a fact is also an opinion, which is true enough, but it’s also true that it’s not just an opinion.  He asserts toward the end of his piece that facts are just opinions that are true.  Well yes, but what he is either missing or choosing to ignore is the issue of how we know that an opinion is true, that it is a fact.  The definition above that he finds so disturbing is not perfect, but as a working definition to separate fact from mere opinion, I find it far superior to his definition.

McBrayer asserts that understanding this distinction is inconsistent with enforcing ethical rules, particularly school rules on cheating.  I think that’s silly.  Whether ethical rules are right or wrong may be relative to a society (or in this case a particular school), but within the scope of that society, their existence as rules is a fact that students have to deal with.  McBrayer asserts that understanding of the fact-value distinction is responsible for the increase in student cheating and other immorality.  But he offers no evidence for this assertion, no test or proof that his assertion is, well, a fact rather than just an opinion.

I can certainly understand the strong desire for moral precepts to be facts similar to mathematical truths or scientific conclusions.  I wish they were myself.  It would make ethical debates so much easier.  It would merely be a matter of testing a proposition or perhaps putting together a logical proof.  But moral values can only be proven in relation to other moral values.  Eventually, as you dig down through the moral axioms, you unavoidably hit a wall of subjectivity.

Now, I’ve written before about why I think seeing moral rules as arbitrary whimsical opinions is wrong.  Moral conclusions are often deeply felt, visceral, intense reactions.  They arise from various moral foundation instincts, evolved biological impulses fine-tuned by culture, that are often contradictory.  Unfortunately, the strengths of those various foundations vary among individuals and societies, which means that different people will conscientiously disagree on matters of morality.  Moral values are more than just whimsical opinion, but they don’t rise to the level of being absolute facts.

Often, many of the people insisting that moral precepts are facts are arguing for their particular moral values.  There are considerable dangers in that approach.  Ask any gay person currently living in a state that bans gay marriage, for instance.  Understanding that moral imperatives aren’t absolute facts makes a person far more open minded, and reduces the probability of discriminating against people whose outlooks don’t match their personal worldview.

Ultimately, fundamental values can’t be proven, they can only be advocated for.  If moral values are facts, we appear to have no reliable way to ascertain them.  We have no choice but to do the hard work of finding social rules that the majority of us can live with.  Accepting this reality isn’t decadence, it’s maturity.  And I’m happy to hear that at least a portion of the current generation of college students has it.

46 thoughts on “Students showing up at college understanding the fact-value distinction is a good thing”

  1. Hmm, cheating on tests is wrong? Really? Were we not treated recently to the situation in China where a crackdown on cheating during high-stakes testing resulted in major pushback by students and parents? They claimed that students have always been allowed to cheat and that forbidding it now puts them at a great disadvantage.

    And killing people for fun is clearly wrong, but it is okay for profit or access to oil, right?


    1. Hmmm, hadn’t heard about the China cheating thing. But I know Confucian cultures have a tradition of high-stakes tests. I’d imagine the incentive to cheat would be very strong. I wonder if it’s one of those things that are enforced inconsistently, with certain classes (which aren’t supposed to exist in modern China) being given more of a pass.

      People are killed for oil? (Just kidding; don’t answer that.)


  2. As far as moral precepts being facts or not, I tend to lean toward your view, SAP. But aren’t ethical philosophers all over the map on meta-ethics? I think some do believe moral facts exist. I’m not sure exactly what their evidence/reasoning is for believing that, but it does seem to be a view expressed by some I’ve seen (e.g. Shelly Kagan, Michael Martin, Erik Wielenberg, and perhaps even our very own ausomeawestin). I’m curious about this because I’m in the midst of writing a post on morality and want to make sure I’ve got the different viewpoints properly understood.

    1. Howie, you’re totally right that philosophers are all over the map on this.
      A slight majority are actually moral realists (see number 14 on the linked list). I would describe my own position as semi-realist, which to many realists is the same as being an anti-realist. I totally understand a realist’s desire for moral imperatives to be facts, but I’ve never had one demonstrate it convincingly to me. Moral imperatives are descriptive social or cultural facts, but only in relation to that society or culture.

      I’ve found this Wikipedia article informative. It succinctly summarizes what I’ve read elsewhere.

      Looking forward to your post!

  3. I think you bring out the point that students might hold this opinion even if they were not told moral judgements are opinions. Being told that we ought to be tolerant of other ways of life because it’s hard to know who is right might lead to the (faulty, mind you) inference that there are no facts of the matter. Given that it really truly is hard to know who is right, assuming there are moral facts, we should bite the bullet and embrace tolerance based in nihilism rather than risk intolerance.

    You’re also quite right that it is rubbish to call it contradictory to tell students there are no moral facts but that they ought not to cheat on tests. That there is social utility to following rules that make fair institutions in a society function smoothly is a reason to follow those rules. Whether there are moral facts above and beyond that is a separate question — and whether that is all there is to moral reality is likewise another question.

    Thanks for sharing this interesting piece and your thoughtful comments on it.


    1. Thanks ausomeawestin! Good point. There may well be a difference between the ontological status of moral rules and the epistemological status. But until we figure out a way to bridge that divide, we should be tolerant, when possible. (Obviously we won’t tolerate mass murderers, rapists, and the like.)


    2. “Being told that we ought to be tolerant of other ways of life because it’s hard to know who is right might lead to the (faulty, mind you) inference that there are no facts of the matter.”

      That right there, and the sentence that follows it, may be what’s really bugging McBrayer. I know it bugs the crap outta me! Our social desire for tolerance and acceptance seem to have led to an “it’s all good” philosophy that leads directly (in my eyes) to a loss of moral compass.


  4. Lol, those philosophers’ imaginations are so twisted.

    How can fact and opinion be “properly” defined, anyway?


  5. Excellent post!

    There was a saying back in the day: “It’s good to have an open mind, but not so open your brains leak out.” Contrary to popular assertion: No! It’s not all good.

    There is a spectrum from ideas about the world to facts about the world. There is also a difference between facts that are true by definition (“unmarried men are bachelors”) and facts that are true by measurement (“that building has 12 floors”). The former cannot be denied without contradiction; the latter may be false. Plus, some facts are true for everyone (“the Moon orbits the Earth”) whereas others are individually true (“sausage is the best pizza topping”).

    Our moral precepts fall somewhere between mere opinion and measurable fact. Culture and environment play a big role — in one of your linked posts you talk about the differences between hunter, farming, and urban societies. These may need different rules for prospering. But are there underlying principles that might unify them? (Just as the EM, weak, and strong forces might be unified.)

    It’s hard to deny that rape or slavery or first-degree murder are wrong. Kant’s categorical imperative turns out to be a fairly good parsing tool for determining the good-bad value of a specific action. (I think it’s a lot less useful in parsing real world situations.) Perhaps what’s important with moral “facts” isn’t measuring or proving them (which is impossible) but in demonstrating their coherency.

    It may be that fundamental moral perceptions give rise to social rules, and those social rules express differently in different contexts. On a trivial level, a moral principle might be “do the right thing” where “right thing” is defined in a social context.

    Something you assert frequently is: “We have no choice but to do the hard work of finding social rules that the majority of us can live with.” I agree without reservation! The pity is that so few even begin that work. (One social advantage of the idea of god — regardless of the reality — is that it does present a moral compass for those not interested in, or not capable of, doing the work.)


    1. Thanks Wyrd.

      “But are there underlying principles that might unify these?”
      I think that’s the goal of Jonathan Haidt’s work. The moral foundations is his answer. But the foundations are not entirely satisfying for someone looking for guidance, since different people feel those foundations in different combinations of intensities.

      I do use that phrase frequently. I’m pretty sure I didn’t originate it, but I can’t recall who I heard it from.

      In many societies, their gods effectively serve as metaphors for how the society thinks about itself. The problem, of course, is agreeing on what exactly God defines as good or bad. The other problem is that once people believe that God commands a moral rule, getting them to reconsider that rule becomes very difficult.


      1. Right, which is why I think about all you can really take from the idea of god is the first principle that we’re all “children of god” — that we’re all equal. Once you have a foundation for equality, moral principles have a grounding. Otherwise you’re stuck with nature, which is pretty clear on the fact that we’re not equal.

        Medicine is a human practice that was filled with superstition and guesswork, but rather than turn our backs on it, we updated it to make it consistent with what we’ve learned about the world. I’ve always thought it was a pity we find that such a challenge with religion.


        1. “I’ve always thought it was a pity we find that such a challenge with religion.”

          Maybe we are. It might all depend on what you’re willing to call a religion. Maybe we’re creating new religions, but calling them “not religion”. Humanism once briefly referred to itself as a religion, although most adherents strongly resist that designation today.

          And then there are religious naturalism movements.


          1. For the record: I generally equate religious ideas with spiritual ideas. When I talk about updating religion I specifically mean a modern spiritual view. Modern progressive Christian thinking, for example, is at least taking a stab at this.


          2. You might check out Reverend Michael Dowd in the linked post. I think he sees himself as a deeply spiritual practitioner.

            Unless of course by “spiritual” you mean supernatural, then his views might leave you cold. (This is one reason why I’m not a big fan of the word “spiritual”; it gets thrown around these days for everything from awe of the universe, love of your fellow human beings, to supernatural beliefs.)


          3. Depends on what you mean by “supernatural.” 😛

            Rather than pursue that, I’ll try tying it back to your post’s point about opinion versus fact and whatever the territory is between them. One fairly big stick in my craw is the so-called “science” channel (ha!) running shows about ghost hunters or shows with the title, “Could Ghosts Be Real?”

            No, gaddam it. They can’t. End of discussion. Some stuff isn’t just wrong. It’s — as the saying goes — “not even wrong.”

            This ties back to what I said about minds so open the brains are leaking all over the floor and making a mess. (And I, for one, am not cleaning it up.) This kind of blind open-mindedness leads to moral relativism and even the denial of the idea of right and wrong.


          4. Ugh. Don’t get me started on cable channels. I’ve watched the Science channel’s deterioration with sadness. I knew it had to happen eventually since all the others did as well. Sigh. I suppose there’s always PBS.

          5. True, although even Nova seems to be leaning increasingly to glitz over content. I saw a recent one about the upgrade of the LHC at CERN, and it was almost entirely content-free.


  6. McBrayer is letting his religious belief influence him here, I think:

    I don’t think moral rules are whimsical opinions. They are very strongly held opinions. But they are opinions. They are not facts.

    Many humans have an opinion that it’s wrong to kill, and they elevate that opinion into a moral opinion, which is an opinion they feel, for various reasons, should be formed into a rule: you should not kill. The only facts here are related to the evolutionary, biological and cultural effects that cause humans to feel so strongly about an opinion they hold. These are facts about the humans holding the opinion, not moral facts themselves.

    If ‘killing is wrong’ were a moral fact about the world then it should be discoverable. But we can imagine that if a species evolved without our empathy then killing might be a good thing – though there would have to be some evolutionary context that allowed that to be propagated.

    With no discoverable fact, ‘killing is wrong’, and with countless evolutionary examples where killing is a necessity, it seems ‘killing is wrong’ is an arbitrary opinion, in cosmological terms, but an evolved and developed opinion in humans.


    1. What about the fact that we nowhere find a perfect circle, but we do find imperfect circles that lead us to the idea of a perfect circle? In general, nowhere do we find actual mathematics in nature, but we find everywhere facts that lead to mathematics?


      1. Right, we never see these perfect items: points, circles, cubes, …, ‘chair’. Even when we do the math we only work with approximations – significant figures, or symbols that represent some fictitious notion. Models with many terms even drop less significant terms as intentional approximations, in order to simplify the models, and then, wow, some complex system nearly fits reality.

        Platonism gets it the wrong way round. Material objects are not crude examples of pure platonic forms; idealised models are approximations to real forms. They are simplistic models, compared to an accurate model of any particular real object. And since all real objects are unique (I think) the ideal models are like statistical models – analogous to the ideal gas laws, for example.

        I think the platonic perception comes from the human brain history: we awake and become self aware, and aware of our thinking, but at that time we know nothing of evolution. We come ready made. The material world seems messy, and the mental world seems cleaner, more perfect. And we can imagine things we don’t see. So the mental world is an indication of a more perfect reality. The primacy of the mental over the physical is established, so the imagined pure things must be real. But that turns out to be fiction, dreams, imagination, idealistic approximations. We are no more than evolved complex chemistry, sloppy biology, that survives.

        Why should we expect the universe to be pure or perfect in any way? Why should we think there’s any sort of reality to perfect models?

        1. I think the main thing is that the human mind is a pattern matching engine. We see those patterns and give them names. Platonism strikes me as a bit of a reification of that innate tendency. Of course, the patterns do exist, but most of them are emergent from lower level patterns.

        2. I think the better word here is “abstractions” — which are, in a sense, simplifications, but are not approximations. In fact, any real world circle is the approximation of the abstraction of a circle. Those approximations lead us to a theory of circles — which in some sense was there to be discovered all along.

          These kinds of mathematical abstractions are different than the abstractions that underlie things like chairs. It’s not hard to imagine a race of intelligent beings that never invent chairs (perhaps they don’t sit). And the ‘chair’ abstraction is notoriously difficult to define. But it’s very difficult to imagine an intelligent race failing to discover a theory of circles. A circle is easy to define.

          Likewise, it’s hard to imagine an intelligent race failing to discover math. The natural abstraction necessary to enable counting (as you say, all material objects are different in their details, so counting requires the notion of sets and set membership) naturally leads to mathematics.

          We simplify mathematical models to make them usable, but that doesn’t mean the math behind the behavior doesn’t fit — it fits very well. But proper calculation requires an analog computer with a complexity order equal to the real world. In fact, the real world is that analog computer; it calculates the three-body problem and weather predictions with ease.

          It’s chaos theory that tells us any attempt to round off our measurements — which is necessary to have numbers we can calculate with — destroys our ability to precisely calculate with those numbers. And certain mathematical models, even without the chaos problem, require so much precision and time as to make them intractable. But the math behind these theories is still precise.

          Note that I’m not claiming Plato’s realm of perfect forms has ontological reality, but I think it has an almost undeniable epistemological reality. If you grant that any intelligent species will discover mathematics and geometry, it almost has to have it.

          “Why should we expect the universe to be pure or perfect in any way? Why should we think there’s any sort of reality to perfect models?”

          It appears we differ on the reality of ideas. Reality is a tricky subject. Are the rules of baseball “real”? If I design a building, but don’t build it, is the building real? Isn’t there some form of reality to the design? Isn’t the building itself just a realization of the design?
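The chaos point a few paragraphs up, that rounding off our measurements destroys the ability to calculate precisely, is easy to see concretely. Here is a minimal Python sketch (my own illustration, not part of the original discussion; the function name and constants are invented for this example) using the logistic map, a standard chaotic toy system:

```python
# Logistic map: x_{n+1} = r * x_n * (1 - x_n); fully chaotic at r = 4.
def trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.2)         # original starting value
b = trajectory(0.2 + 1e-9)  # same value with a rounding-scale error

# The two runs begin essentially identical...
early_gap = abs(a[1] - b[1])
# ...but the tiny error roughly doubles each step, so by the later
# iterations the two trajectories bear no resemblance to each other.
late_gap = max(abs(x - y) for x, y in zip(a[35:], b[35:]))
```

A one-part-in-a-billion difference in the starting value, far below any realistic measurement precision, leaves the early steps indistinguishable yet makes the later steps completely unrelated.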

  7. While I do think that the phrase “moral facts” is wrong (for the reasons you’ve stated) I think the bigger issue pointed out by that article is that students come to college with only two categories: facts and opinions. Ideas are either correct/incorrect or else you cannot evaluate them at all. This deprives them of the willingness and courage to examine their own values – if they think their ideas are just opinions, then they’re no more arguable than their preferences and tastes.

    Even something like the premise “morality is subjective” or “Socrates’ view is too harsh” is treated either dismissively or taken as a given. “In my opinion…” (and the forced third person versions students develop to subvert the ban on “I” in papers) is almost never followed by deeper explanation, and neither is “It is a fact that…”. My biggest challenge with students is getting them to treat their thoughts with rigor but without the arbitrary pronouncement of facticity. Maybe college is the time to introduce that, but I can tell it is frustrating for students who have been conditioned to categorize everything as either “correct fact” or “dismissible opinion.”

    But I think your post tackles this a bit better than the original article!

    1. Good point. It’s important to understand whether something is fact or opinion, but not all opinions are equal. There’s a danger of being logically positivistic and regarding anything that isn’t a provable or testable fact as not worthy of consideration. I agree that such a viewpoint surrenders far too much of human contemplation to emotion or cultural moods.

      It seems like we have a limitation of language here. There needs to be a word for a substantive opinion (such as a doctor’s medical opinion) as opposed to my opinion of what the best color is. I often use the word “conclusion” because it implies careful thought, at least to me.


      1. I think that conclusion is good – I often use “argument” as opposed to “opinion”, or distinguish between what students “think” and what they “feel” or “prefer”. I’m not sure that’s the best way to go, though.

      2. I also try to distinguish between values that need to be shared and values that can be held alone. What are we happy to just believe on our own, what beliefs do we wish other people shared – and why should we share them / should we want to share them?

        1. That’s an interesting distinction. It seems like what values someone is comfortable holding alone and which ones they’d like others to adopt is itself often a personal value.

          It seems like there are values most people are comfortable holding alone, such as our favorite dessert. Then there are values most people want others to share, such as whether we should have capital punishment. But I know people who seem compelled to bring everyone around to their choice of dessert, or movies, or whatever. And there are anarchists who basically don’t care what others value as long as it doesn’t affect them.

          Society works because most of us can agree, or at least accept, what the minimal shared values must be. I think what many find disturbing is that this is an ever changing arrangement.

            1. Yeah, my approach is a bit limited – it works for me because my classroom approach is geared towards helping students identify, explore, and explain their own values, rather than instill in them anything in particular.

    2. I was also struck by that phrase—moral facts. I got stuck there, wondering what the phrase meant. Does it refer to something that can be stated in a sentence or two? Is it something objective? Something universal? I just find it peculiar, and I’m not a relativist in the extreme sense (although I do think some moral truths are relative to culture and situation).

      Interesting point about students using the third person. I learned that in philosophy, it was actually beneficial for me to use the first person in a lot of cases, mostly for clarity. It’s a shame that students aren’t given all the writing tools at their disposal and taught various ways of using them.

  8. The part at the very beginning of the post, which mentioned scholastic cheating, recalled a high school memory you might get a kick out of. It’s one of my most cherished high school memories.

    It was a civics or history test we were taking when the teacher got a call over the intercom to come to the office. The moment he was out the door, a couple of students went up to his desk and through his desk drawers and found the answer key for the test.

    They passed this around the room. When it came to me, I just passed it to the next person without looking at it. I’ve never felt the urge or need to cheat on tests (for a number of reasons). By the time the teacher returned, everything looked normal.

    When we got our tests back, a good portion of the class got “Fail” marks. Turns out the call from the office was a setup. The answer key was a plant. Not only did it provide the wrong answers, it provided very specific wrong answers that clearly identified the cheaters.

    You don’t get to see justice served up quite that nicely very often. I loved it! The only thing that could have made it better is if it had been an ethics class! XD

  9. Great post and interesting comments! I don’t have much to contribute to the discussion without writing a treatise, which I don’t have the capacity for right now. 🙂 I will drop a hint about what I think: Moral values may not ever rise to the level of facts, but I think there is a more objective/universal appeal to health/well-being that can serve as a guide and prevents us from making moral values whimsical or arbitrary. Not to say we’re in perfectly tranquil, clear waters here.

    In the case of cheating, I do think it’s a more powerful argument to tell students that cheating will harm them, and that’s why they shouldn’t do it, rather than to say it’s a rule imposed on them by the school and that’s why they shouldn’t cheat.


    1. Thanks Tina. On objective morality, one example I often use is the tension between safety and freedom. Some people are more willing to give up freedom for safety, while others are the reverse, with most people’s sweet spot being somewhere on the spectrum. There’s no right or wrong answer to what is the correct compromise. We can only debate it until we find a (hopefully) satisfactory compromise for most of us.

      On students, I think both arguments in tandem probably help. I never really cheated in school, partially because I knew it would ultimately be self defeating, partially because I knew there would be academic consequences if I were caught, but mostly because I knew if I were caught that the shame before friends and family would be devastating.

      Incidentally, my master’s thesis was a study on student attitudes toward using technology to cheat. Of course, their attitudes about it were often markedly different from those of most faculty. For example, most students saw no problem with googling homework answers, even if explicitly told not to by instructors. But they were in agreement that googling for test answers was unethical.


        1. Good point about safety/freedom. I think a lot of morality is like this in the sense that there’s a sort of continuum and we all differ in how we’d place ourselves on it. Another one I think of is abortion. Most people have a hard time calling themselves “pro-abortion,” for instance—such a label would creep out most of us—but they’d say it’s a woman’s right to choose or something like that. And a lot of people would have a hard time saying that life begins a day after conception (although some would, of course). It seems like most of us want to draw the line somewhere inside the two extremes, but the line has to be somewhat arbitrary, a place that feels comfortable for the majority of us.

        Yes, there is something to be said for just throwing all reasons at the students and hoping one or all of them stick.

        Perhaps I said that because I used to let people cheat off of my work all the time, which I did because I wanted people to like me and I took pride in being trusted by my peers. I actually did this on a near daily basis, and it got to the point where a group of kids would come to depend on me for their work. The idea that I might get caught and punished for it only heightened the experience. I might be able to break the “brown-nosing” label if only I got caught! Oddly, it never occurred to me that I was harming anyone in letting them cheat, that I was complicit in depriving them of an education. If I had thought of this in HS, I definitely would have stopped. (In middle school the desire to be liked would have had more force than the desire to do the right thing, at least for me.)

        Your thesis sounds fascinating. I wonder how students justified using Google after their instructors told them not to? Did they think: “The instructor is stupid in thinking we won’t use what’s at our disposal…everyone uses Google. If I don’t use Google, I’ll be at a disadvantage”? Which reminds me of how people think about taxes.

        Speaking of technology, I remember a math class in college in which the instructor said we could use any and all calculators. I nearly curled up in fetal position because I knew what that meant.


        1. Your comments about safety/freedom remind me of how the Greeks defined virtue, as a moderate stance between extremes. For example courage exists somewhere between the vices of cowardice and foolhardiness.

          I agree that abortion is an excellent example. The abortion debate is largely a tension between the freedom of the mother to control her body and the possible wellbeing of the embryo / fetus. It’s hard to see a zygote winning against the mother (although religious conservatives would disagree), but it might be a more difficult call on a 20-week-old fetus. Actually, when religious conservatives do object to aborting zygotes, I think it’s more about a sanctity violation involving souls than concern about the wellbeing of a mindless collection of cells.

          I have to admit if we’re going back to high school and earlier that my “never cheated” starts to become less honest. What 14 year old can resist helping their friends out?

          On googling homework answers, we didn’t ask, but I suspect the students thought the instructors were being pointlessly strict, since the purpose of homework is to learn, not be tested. I guess it depends on what the homework involved. If it’s something you’re supposed to work through for yourself, I could see Google shortchanging that. As a college student, I probably would have tried to do the problem without googling, then checked my answer against it. (As a middle or high school student, I probably would have just googled the answers.)

          The other thing that used to make us quake with fear was when the instructor said a test would be open book. In grad school, one professor encouraged us to bring our laptops and look up things on the web during the final, which was awful.
