Reconciling the disorder definition of entropy

In last week’s post on entropy and information, I started off complaining about the most common definition of entropy as disorder or disorganization. One of the nice things about blogging is you often learn something in the subsequent discussion.

My chief complaint about the disorder definition was that it’s value-laden. I asked: disordered according to whom? That, as it turns out, was the wrong question. Often asking the wrong question can leave us stuck, and what we really need to do to move forward is figure out the right question. The one I was asking treated the issue as if it involved some kind of ethical or aesthetic value. Instead I should have focused on a more instrumental kind of value.

The question I should have asked then is, disordered or disorganized for what? Asked that way, the answer becomes obvious when we remember that Rudolf Clausius coined the word “entropy” from the Greek word for transformation. What we’re talking about is disordered or disorganized for transformation. Or in more engineering terms, disorganized for work.

So a low entropy system is organized for transformation, for change. Transformation requires energy gradients. In a low entropy system, the components are arranged such that the gradients are maximized, or such that they support, reinforce, and/or magnify each other. But the act of transformation, of causal action, inevitably reduces that organization. If you think about it, it can’t be any other way, and the second law of thermodynamics is a natural consequence.

In a high entropy system, the gradients have become separated, fragmented, and no longer able to reinforce each other. That fragmentation effectively minimizes any possible transformations. The first law of thermodynamics still applies. The system still has the same amount of energy, but much of it is now unavailable for transformation, for work.

The classic example of this is mixing cream into coffee. Prior to the mixing, the two are in separated states. When we first pour the cream into the black coffee, the combined system is still in a relatively low entropy state, in the sense that it’s organized for transformation. Specifically, the cream is primed to mix with the coffee. As that mixture happens, as the transformation ensues, the entropy increases, and the mixture becomes less organized for change, at least without new energy from outside.

It just so happens that this disordered, disorganized, fragmented state corresponds to one with a large number of microstates, a system with a high degree of uncertainty, one that requires a larger amount of information to describe than a lower entropy system. We can use various information compression techniques to efficiently describe an organized system, but that becomes increasingly infeasible with ones that are effectively random and disorganized.
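To make that concrete, here’s a quick Python sketch (my own toy illustration) comparing a highly ordered byte string with an effectively random one. The ordered one compresses to almost nothing, while the random one is already near the 8-bits-per-byte maximum and barely compresses at all:

```python
import math
import random
import zlib
from collections import Counter

def shannon_entropy_bits_per_byte(data: bytes) -> float:
    """Average number of bits needed per byte to encode the data."""
    counts = Counter(data)
    n = len(data)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h + 0.0  # normalize a possible -0.0 for the fully uniform case

# An "organized" sequence: one symbol repeated over and over.
ordered = b"A" * 10_000

# A "disorganized" sequence: uniformly random bytes.
rng = random.Random(0)
disordered = bytes(rng.randrange(256) for _ in range(10_000))

for label, data in [("ordered", ordered), ("disordered", disordered)]:
    compressed = zlib.compress(data, 9)  # maximum compression effort
    print(f"{label:10s} entropy ~ {shannon_entropy_bits_per_byte(data):4.2f} bits/byte, "
          f"compressed {len(data)} bytes down to {len(compressed)}")
```

The ordered string shrinks to a tiny fraction of its original size; the random one actually comes out slightly larger once the compression overhead is added.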

Maybe what I’m saying here is obvious to most of you, but for me, it’s a conceptual breakthrough. Now I won’t have to be annoyed when a popular science article talks about entropy as disorder, except possibly about the ambiguity.

There is one important point worth mentioning here that may be less obvious. It’s important not to assign too much general goodness or badness to order and disorder. I’m currently reading Anil Seth’s new book, Being You: A New Science of Consciousness, and one of his points is that brains aren’t really maximally ordered or maximally disordered systems. While they contain a large amount of information, and therefore entropy, the necessity to integrate that information, to process it holistically, requires that they maintain an ongoing balance between order and disorder.

Of course, brains can work with large amounts of entropy because they’re not closed systems. They have a constant stream of energy coming in, and constant metabolic processing to remove waste. Information processing is thermodynamically costly, a reminder of how physical it is.

Anyway, I feel like I have a much better handle on common conceptions of entropy now. Unless of course I’m still missing something?


37 thoughts on “Reconciling the disorder definition of entropy”

    1. It seems like, for whether the CDs are sorted or unsorted to make a difference in entropy, we have to include us and our knowledge and desires in the mix. So if the collection is sorted in English and I desire to play a particular album, then the me+sorted collection does seem like a system with less entropy than the me+unsorted collection, in the sense that the system with the sorted version is more organized for work. The me+unsorted seems like it would require more energy.

      On the other hand, if the collection is sorted in Chinese, then the me+sorted collection probably has roughly the same entropy as the me+unsorted one. Or if my only desire is just to play any kind of music, then the sorting doesn’t make a difference, at least for that day.

      Of course, involving me and my brain states in the mix really muddles the picture, because my knowledge and desires constantly change. Maybe I sorted the collection by artist yesterday, but now a friend has come over and I really want to play the ones in a particular genre or sub-genre for the occasion. The me+friend+collection entropy varies according to our current desires and knowledge.


      1. Yes, involving you and your brain states does muddle things up! Those are all irrelevant to the analogy, which is strictly about the sorted state of the CDs and how it quantifies entropy.

        If the collection was in Chinese, that you (or I) don’t recognize the sort order doesn’t matter. Chinese-speaking people would. What matters is that a sorting protocol exists and the collection can be judged in terms of that protocol. The CD collection is an analogy for a system with a well-defined zero-entropy state and well-defined macro-states that quantify the degree of entropy.

        Work enters this picture in the amount of it required to sort the CD collection. Spending the energy to do the work of sorting, or correcting a sorting error, generates entropy (waste heat) elsewhere. Reducing the entropy of a system always has a cost, right?


        1. I might be missing something crucial in the analogy. Or maybe I’m just taking it too literally? How does the CD collection, in and of itself, as a physical system, have higher or lower entropy due to being sorted according to some disconnected system? What if my randomly sorted collection just happens, by sheer chance, to be sorted in the language of the Phrift-thak, an alien race 10^500 light years from here? It’s hard to see how that could make a difference in the CD collection itself.

          Definitely exerting the effort to put the CD collection in a sorted state increases entropy somewhere. Although doesn’t exerting energy to move it from one random state to another carefully selected but still random state also do that? In principle, it seems like we can pay the price of increasing the universe’s entropy without necessarily getting a local drop in it.


          1. I think you are taking it too literally; the CD analogy is an abstraction. (The entropy of the physical CDs, or how they’re used, is a different discussion.) The key, I think, is to see entropy as a view-dependent way we can quantify a system. It’s not something fundamental like energy; it’s an emergent view that’s relative to a definition of low- and high-entropy states.

            Entropy being view-dependent is maybe why it’s so easily aligned with such a user-defined notion as “disorder” — both supervene on a definition.

            So the CD collection has a well-defined zero-entropy state under a specific definition of perfectly sorted (of “order”). Other definitions of perfectly sorted (e.g. by title, by release date, by byte size, by some other language) map to the same quantification of entropy states. There’s perfectly sorted (zero entropy), there’s one CD misplaced, then two, three, and so forth, until there’s no order left. (We could illustrate the same thing with a list of numbers, but I thought the CD collection was more evocative.)

            Exerting energy moving a (truly) random state to a “carefully selected” state of any kind (no matter how apparently random) is still moving the system from a high-entropy state to a low-entropy state. Think of the former as maximum uncertainty where any given CD is and the latter as minimum uncertainty. Sorting increases the entropy of whatever does the sorting — presumably that mechanism consumes energy and emits waste.

            In terms of work, given the initial work of sorting, a perfectly sorted collection (zero-entropy) takes the least work to find a given CD. At a minimum, a binary search works, and that has worst-case (and average) effort of log n, which is great for large collections. A random collection, however, has a worst-case effort of n (it was the last CD) and an average of n/2 — much less great for large collections. The work reduction applies to other uses of the collection. Inserting is essentially searching, but listing the collection is trivial if it’s sorted, whereas if randomized one essentially has to sort it first.
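            A tiny Python sketch (my own toy version, with integers standing in for the CDs) makes that difference concrete:

```python
import random

# A hypothetical collection of 1,000 "CDs", identified by number.
sorted_collection = list(range(1000))
shuffled_collection = sorted_collection[:]
random.shuffle(shuffled_collection)

def linear_search_steps(collection, target):
    """Scan until found; effort grows linearly with the collection size."""
    for steps, item in enumerate(collection, start=1):
        if item == target:
            return steps
    return len(collection)

def binary_search_steps(collection, target):
    """Count comparisons for a binary search on a sorted collection (~log2 n)."""
    lo, hi, steps = 0, len(collection), 0
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2
        if collection[mid] < target:
            lo = mid + 1
        elif collection[mid] > target:
            hi = mid
        else:
            return steps
    return steps

target = 617
print("binary search, sorted:", binary_search_steps(sorted_collection, target), "comparisons")
print("linear scan, shuffled:", linear_search_steps(shuffled_collection, target), "comparisons")
```

            Typically that’s about 10 comparisons for the binary search versus a few hundred for the scan, and the gap only widens as the collection grows.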


    1. I concur.
      High potential for change (low entropy) vs low potential for change (high entropy).
      Lower informational density vs higher informational density.

      Does the act of adding information increase entropy? In a closed system, I suppose that might be true. Increasing entropy adds information, right?

      Classifying data adds a layer of abstraction (more information) but simplifies the understanding of the data (less information). If energy was expended to perform the classification, did that extra energy lower the entropy, thereby increasing the potential for change?


      1. Thanks.

        On adding information increasing entropy, it seems like it does. It might help to remember that information is differentiation. So we’re increasing entropy by increasing the level of differentiation in the system.

        On adding a layer of abstraction, I’m not sure. Possibly if we include both the original system and the one where the abstraction is stored, entropy for the combined system might be lowered. But I think it’s worth remembering that information processing of any significance is going to require incoming energy to the system, as well as waste disposal. That can enable the system to operate at a much higher entropy than if it was a closed system.


        1. In creating a PageRank index of searchable content, does Google increase or decrease overall entropy? Even if it leaves all the existing content where it’s found, that content is now easily accessible, whereas previously finding anything would have been an exhausting task.
          Of course, the process of indexing must be costly. But one entity indexing, vs millions of searchers randomly seeking information must be a net reduction in energy expense.
          Data spread randomly is at its lowest potential, highest entropy right?
          Meta-organizing that data increases its potential thereby lowering the data’s entropy?
          (Sorry if I’m making a metaphorical mess with these concepts here.)


          1. Google famously generates a lot of entropy (waste heat) doing its thing. All big computer centers do. FWIW, I think you guys might be conflating energy and entropy. They’re very similar but energy is a real thing whereas entropy is just a measurement.


          2. It’s easy to see them as the same. Useful energy with the ability to do work always comes in some kind of low-entropy form, and work always produces some kind of high-entropy waste product. But entropy ultimately is just a measurement.


          3. I’m not sure, but my guess is that Google increases the overall entropy. Of course, as Wyrd noted, these aren’t closed systems. They crucially depend on a lot of incoming energy, and generate a lot of waste heat. This is true even for just servicing query requests. So they’re certainly increasing entropy overall. The question is what the role of the generated index data might be.

            Again, I think we have to be careful not to equate entropy with badness. We need systems with a certain degree of entropy for dynamic work. The trick is to ride that functional tension between order and disorder.


  1. I like how you are thinking in terms of change (work and transformation) rather than eternal stasis.

    Also, I like how you say entropy is not necessarily the bad guy. In times past, it was popular in literary circles to think of it as a motif of doom. “Oh, entropy is going to get you. Everything you are and do is going to turn to dust.”

    But entropy is a necessary part of change. You can’t make something new unless you simultaneously take apart some of the old.

    I suspect that a lot of the confusion lies with trying to merge entropy and information theories. Information is probably one type of the opposite of disorder, but it isn’t the only opposite. And that can introduce all kinds of confusion.

    So for myself, I consider such a merger of theories to be an “unfinished” project. A lot of what we read is going to be speculation rather than rooted in stuff confirmed by experiment . . . for now, anyway.


    1. Thanks Deal. Your remarks on the previous thread helped me a lot. And I agree with entropy being a necessary part of change and not the enemy. Certainly too much entropy is undesirable, but no entropy, just maximizing regularities, can be rigid and unyielding, not allowing for dynamism.

      I don’t think information is the opposite of entropy, at least not Shannon information. A high entropy system is one that would take a lot of information to describe. Put another way, it contains a lot of information. It’s a counter-intuitive conclusion from the mathematical convergence between statistical entropy and information theory.
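      For reference, the formal parallel is just the standard formulas: Boltzmann’s count of micro-states, the Gibbs probabilistic version, and Shannon’s information measure all have the same shape:

      $$ S = k_B \ln W \qquad S = -k_B \sum_i p_i \ln p_i \qquad H = -\sum_i p_i \log_2 p_i $$

      With equal probabilities the last two reduce to the logarithm of the number of possibilities, which is why the convergence falls out of the math.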

      On the other hand, semantic information, in order to maintain its semantics, might require a necessary degree of order. I don’t think I’d describe such a system as a low entropy one however, but as Seth noted, one that exists in the functional tension between order and disorder.


  2. My thinking is that it gets into the difference between ordered and organized.

    Say that you have a deck of cards. To “order” them means that you arrange them by following a rule or some math. Maybe it’s to put them in sequential order, or it’s to lay them on a table and turn over every third card. You can describe what you are doing with math, including perhaps the math of information theory.

    But say that you build a house of cards with the cards. Now they are organized. But what is comparable to using math to describe an ordering? How can we scientifically talk about an organization? With organization, there is no simple pattern repeated over and over to describe with the math.

    Well, we can realize that it took work to put the house of cards together. Work is measured as energy expenditure. And how much work (energy) did it take to build the house of cards? Okay, we can undo the organization and see what happens. We can knock over the house of cards while measuring what occurs. More exactly, we can stick the house of cards (more usually a molecule that has been assembled) into a calorimeter and turn the whole thing into just a bunch of heat energy. Heat energy is easily measured. So how much energy we thereby record will be a measure of how much energy it took to put the house of cards together. And now we can use that number on the organization, to make other predictions.

    So entropy is a hugely valuable way of measuring how organized a thing is. But at least so far information theory is talking about how ordered the thing is. It is about the math of ordering instead of the measurement of organization. So that is the type of problem I am having with merging the two theories, entropy and information.

    I haven’t checked to confirm this, but I think that Clausius was talking about the transformation of energy into its various forms. Energy can change from being heat energy to chemical energy to light energy to mechanical energy, etc. But how much energy does it take to make the change? Well, we can convert it all to heat energy and measure that, and that will tell us how much work (how much energy) was required to do the transformation.

    Intuitively, it seems that information must be something that organization contains. And I relish how biologists use the word “information” in this manner. But I balk at using the math of ordering to describe organization. Information theory works for what it was designed for, without having to call it entropy.

    A highly-disordered state might take a lot of energy to describe every unique piece, but you can’t get a lot of use out of it. To me, information is something useful, not merely the work required to describe a state in detail.

    But wow, you keep asking all the right questions to get to the heart of a matter.


    1. Thanks. I see what you’re getting at with the distinction between order and organization. An ordering might be simple regularities. But an organization could be sublimely complex. The problem is that a random disorganized state can be sublimely complex too. How to distinguish them mathematically?

      What Shannon entropy converges with is statistical entropy, so it seems like this difficulty should exist for it as well. The problem is most of the material out there talks in terms of gases for statistical entropy and transmission of information for Shannon entropy. I’m not sure what the answer here is, or if there is a well-accepted one yet.


      1. Mathematically an ordering is an organization. A regularity is typically a repeating pattern with a simple order: ABC.ABC.ABC.… Both a sorted deck and a house of cards have zero entropy if those are the only states that qualify as sorted or as the intended house. One card out of place has low entropy; a random pile has high. In terms of uncertainty, a sorted deck or a finished house provide certainty where every card is.

        A highly disordered state does take a lot of energy to describe, because describing it fully effectively orders it (or gives you certainty about its parts). A key aspect of entropy is the difference between the easily described macro-states of a system (e.g. temperature, pressure, sort order, building state) and its full micro-state description.


  3. Would a concrete example help? The Wiki page uses a 100-coin example with heads-tails states. I used an analogy with CDs. Mathematically it can be reduced to just bits. The system macro-states, respectively, are the number of heads, the number of CDs out of place, or the number of one bits. In each case a single number. The micro-states are the actual patterns of 100 coins, 100 CDs, or 100 bits.

    Here are the first ten macro states:

    [100,0] 1 (0.000)
    [100,1] 100 (4.605)
    [100,2] 4950 (8.507)
    [100,3] 161700 (11.993)
    [100,4] 3921225 (15.182)
    [100,5] 75287520 (18.137)
    [100,6] 1192052400 (20.899)
    [100,7] 16007560800 (23.496)
    [100,8] 186087894300 (25.949)
    [100,9] 1902231808400 (28.274)

    The first two numbers are the N of the system (100 here) and how many heads or misplaced CDs or one-bits there are. The third number is the number of combinations possible with that many “disordered” coins/CDs/bits, i.e. the number of micro-states in that macro-state. Entropy, the fourth number, is the natural log of that (ignoring Boltzmann’s constant).

    The first state has zero entropy because there is only one no-heads/perfect-sort/no-one-bits state. With one out of order, there are 100, with two there are 4950, and so on. The maximum entropy macro-state of this system, an even mix of heads/tails, a fully scrambled order of CDs, or an even mix of 0/1 bits, has the maximum possible number of combinations: 100,891,344,545,564,193,334,812,497,256. Which has an entropy of 66.784.

    A system with 100 cards is pretty simple. With 200 cards, the maximum count is roughly 9.05 × 10^58, a 59-digit number with a natural log (raw entropy) of 135.753. With 300 cards, it jumps to an 89-digit number (with a natural log of 204.866).

    These are still tiny systems. A room might have about 10^26 molecules of air, and in these systems the numbers become truly vast (which is why the log of the number of micro-states is used).
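    For anyone who wants to reproduce these numbers, a short Python sketch (my quick version, using math.comb so the counts stay exact integers) does it:

```python
import math

N = 100  # coins, CDs, or bits

# Micro-state count for each macro-state (k items "disordered"),
# and its raw entropy ln(count): Boltzmann's S = k ln W without the constant.
for k in range(10):
    count = math.comb(N, k)
    print(f"[{N},{k}] {count} ({math.log(count):.3f})")

# The maximum-entropy macro-state is the 50/50 mix.
w_max = math.comb(N, N // 2)
print("maximum micro-states:", w_max, " raw entropy:", round(math.log(w_max), 3))
```

    Swapping in N = 200 or N = 300 reproduces the larger numbers as well.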

    Hope that helped.


  4. Perhaps a concrete example would help. When the eyes send information to the brain along the optic nerve—and the biologists do speak of that as information—are we to believe that this nerve impulse consists of extreme randomness (entropy)? Or is the nerve impulse extremely organized in a way that the brain can turn all of that information into the picture that we see?

    Math is great at describing regularities (order) but not so good at describing the unique ways that architecture is put together (organization). So entropy is a way of doing that through the back door, so to speak. We can destroy the organization and see how much heat is released. So entropy is a measure of organization.

    I agree that recent popular literature tries to finesse this issue. When people first tried to merge entropy with information, there was a lot of resistance along the lines I’ve discussed. But the advocates of the merger get high marks for persistence. Still, I would advise reading them skeptically. Whether entropy increases or not depends on how you set up the problem—it depends on what aspect of it you’re looking at—and a blanket general statement can lead to confusion.

    I can tell you that the difference between order and organization is a topic much discussed in thermodynamics. The difference is very real. But the idea is to find ways to work around it in terms of making predictions.

    If you impose a binary yes-or-no ordering onto how you look at a system, you are of course going to find just ordering in your result. But nature is organized.


  5. Just getting caught up on these interesting posts, Mike, and think your second post here does capture a key insight. And Wyrd points out as well that energy and entropy are indeed not the same, which can be a very tricky line. As a mechanical engineer, the first definition of entropy I was taught was related to the ability of a system to do work, and yes, followed by lots of handwaving about order and disorder and monkeys on typewriters and the probability of a Shakespearean play emerging from the tumult, etc., etc.

    But here’s a wild one: if the Big Bang is correct, then with all the matter and energy of the universe in some sort of singularity, IF there were no gradients within that state, then the possibility for transformation was zero, and the entropy (on paper) quite high. I think physicists have some way of calling this a low entropy state, but I can’t remember how they do it.

    (I suppose there was possibility for transformation, but not by energy flows from one part of this infinitesimally small universe to another? Certainly as the universe expands, the changing geometry with a finite amount of energy is itself a transformation, but it’s really weird/difficult to apply thoughts about entropy and energy that work on Earth, where we’re almost always talking one system or part of a system relative to another, to a universe which by definition is a closed system. Or maybe the singularity was in fact not a true singularity but a very small universe with energetic gradients in it after all?)

    After the Big Bang, as the universe expands (into… nothing at all?), the formation of gradients throughout space and time is required to produce the possibility of regions that have higher and lower entropies, relative to one another. It seems to me all transformation, and all life, depends on this zone of time in which such gradients formed and continue to exist.

    If there is an ultimate heat death, again the universe will be in a state in which there are no more energetic gradients. This entropy (on paper) is quite high, much like the original state. But now it’s more spread out. But what spread out means I don’t really know!

    That aside, if energy is not created or destroyed, which I suppose is an open question, then the energy of the universe is fixed while the entropy changes throughout time…

    In the middle, when energetic gradients throughout the universe do exist, then entropy is largely important as a “relative” metric between two regions of a system. A system such as a reservoir of water behind a dam could reach a stasis and we might call it a high entropy state, but when the discharge valve of the dam is opened and it is connected to another region, then suddenly it can do work. So where and how we define the boundaries of systems is important. Just by changing the boundary, we’ve changed the quantitative value of the entropy of a big puddle of water. One always has to have a reference point when speaking about energy and entropy I believe… this is why it’s hard for me to understand it for the universe as a whole.

    Michael


    1. Interesting questions Michael!

      On the Big Bang, I’m going off of memory here, but I think there are minute variations in the CMB. Those variations gave something for gravity to work with. In other words, there were very mild gradients which grew over time. Of course the question is, where did those initial infinitesimal gradients come from? The answer I think I remember is that inflation stretched out quantum fluctuations. The quantum fluctuations seem plausible enough, but I’m still not sure about inflation, although it may just be an extreme version of what we already observe today with dark energy.

      I think you’re right that everything we know depends on those gradients and our place in the timeline of energy finding an equilibrium within them. That’s why I said it’s a mistake to view entropy as bad. Without it, we wouldn’t exist. But just like sunlight, too much of something we crucially depend on can definitely be bad.

      I think you’re right that entropy applies to isolated systems. Technically I guess it also applies to open systems, but it seems like things are much more complicated. For an isolated system, all its transformation has to come from within, and entropy will only increase. But for an open system with energy coming in and some form of waste disposal, it can operate for extended periods at a relatively high entropy successfully. As Seth noted, it’s about finding the sweet spot in the tension between order and disorder.

      For the universe as a whole, the question might be, is it a closed isolated system? Where did inflation come from? Where does dark energy come from?


      1. So I couldn’t help myself. Based on a very quick Google search of not entirely trusted, but mutually agreeable resources, the universe as we know it today has a higher entropy than the big bang largely because of black holes. I should probably have done more reading, but what I gather quickly is that the entropy of a black hole grows with the square of its mass, and the black hole at the center of the Milky Way Galaxy, for instance, has WAY, WAY, WAY more possible states than the universe at the time of the Big Bang.
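        For reference, the standard Bekenstein-Hawking relation (a textbook result, not something I got from that search) ties the entropy to the horizon area, which scales with the square of the mass:

        $$ S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar}, \qquad A = \frac{16 \pi G^2 M^2}{c^4} \quad\Rightarrow\quad S_{\mathrm{BH}} = \frac{4 \pi k_B G M^2}{\hbar c} $$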

        My next question, of course, is how all the mass in the universe could have been in the Big Bang and it not been a black hole. I love how these threads can diverge into interesting questions. The answer, as I understand it from a text-only link at the University of California (Riverside), is that one deftly assumes an initial velocity to the energetic-matter-energy plasma at the time of the Big Bang. You assume it’s going really fast outwards, so it gets to not be a black hole. But not too, too fast. This is another Goldilocks parameter apparently. The universal mass-energy-plasma-whatever has to be going JUST fast enough (when time t=0) to get beyond the Schwarzschild radius but not so fast that it expands beyond gravity’s ability to create galaxies, stars, planets, etc.

        I will confess to being only partially satisfied, but not overly bothered, by the fact we have a little ad hoc mathematics going on at time t=0… 🙂

        I would love to know if you or others here have explored this before.

        Michael


          1. I should clarify, apparently the entropy of the black hole at the center of the Milky Way is only about 10^3 times greater than that of the universe at time t=0. But when you start adding up black holes, then it gets into the WAY, WAY, WAY zone…


        2. Hmmm. I don’t think it’s only black holes, although they would certainly supply a lot of it. As I understand it, black holes are maximum entropy systems. You can’t have any more entropy within the space they exist in, because as soon as you reach that point, you have a black hole. But it’s worth noting that eventually all the black holes are expected to evaporate in something like 10^100 years.

          Brian Greene discusses a lot of this in his book “The Fabric of the Cosmos”. Wikipedia also has a whole bunch of articles on the earliest stages of the big bang. https://en.wikipedia.org/wiki/Timeline_of_the_early_universe

          I’ve heard some physicists characterize it as we have a good handle on what happened after the first second of the universe; less than one second is when things get speculative and all kinds of problems arise. Although looking at the Wikipedia article, it’s before 10^-32 seconds where the controversial stuff is proposed.

          The beginning of the universe is one of those knowledge frontiers. Eventually we always hit the limits of our knowledge, where what we think we know is structured in terms of our ignorance. I think it’s important to acknowledge that when looking at this subject.


          1. Agreed on acknowledging ignorance. I was trying to say the same, albeit flippantly perhaps.

            According to Scholarpedia, Hawking radiation, which I think causes the evaporation you describe, is also associated with an increase in entropy in the theories that describe black holes.

            You could be right about black holes and maximum entropy, but what’s interesting is that, at least as far as we know, their entropy is finite and calculable in ways that conform to the Laws of Thermodynamics.

            If there’s more to it than that (and assuredly there is, of course!), more to the universe now having higher entropy than the conditions of the Big Bang at t=0, I’d be curious to know what it is.

            Another wild card is Poincaré’s recurrence theorem, or something like that, which predicts that certain classes of closed systems, given long enough, will return arbitrarily close to any previous state, including a low entropy one. So IF the universe is closed, this is an additional thought to ponder on the cosmological scale. The universe would have to be ergodic, which means (I definitely had to look this up) it is deterministic, and also free of random perturbations, noise, etc. That may not apply to our universe, but it’s interesting to consider. I can see systems being both deterministic and chaotic (e.g. random, or containing noise), and I know this to be a favorite topic of Wyrd, so I’ll stop there. While it seems pretty safe to say the universe is probably not ergodic, it’s also interesting that there appears to be a branch of study of “quantum ergodicity” so maybe the random outcomes of QM do not in and of themselves deny the possibility of ergodicity.

            Michael


          2. On there being more to it, I guess it depends on how high you consider the initial entropy of the universe to have been. It seems like entropy is constantly increasing everywhere. Stars are the most common culprits. Admittedly, they are minuscule contributors compared to a supermassive black hole.

            Poincaré’s theorem, I think, reminds us that the second law of thermodynamics is about a statistical trend, that entropy almost always increases. The probability of it not increasing is unimaginably tiny, with average timelines for such an occurrence going far beyond the heat death of the universe.

            This reminds me of Boltzmann brains, essentially minds that appear due to a spontaneous decrease in entropy. The Wikipedia timeline article notes that the mean time estimate for this happening in a vacuum is 10^10^50 years. Boltzmann brains are interesting because if they’re possible, then it’s likely that you are a Boltzmann brain right now, a flash of sentience appearing with a whole life history, that will disappear almost immediately. (It is uncomfortably difficult to show they are not possible.)

            Disagreeable Me, an occasional commenter here, once pointed out to me that each of us might actually be numerous Boltzmann brains separated across vast spacetime intervals, with each instance of our life existing at each time and location, albeit not in any particular order, but that it wouldn’t make any difference to our subjective timeline.

