Entropy and evolution

There’s an interesting article by Daniel Styer in the November 2008 issue of the American Journal of Physics.  The full article is behind a subscription wall, so if you’re not a subscriber you can’t see it.  The link should take you to the title and abstract, but in case it doesn’t, here they are:

Daniel Styer, “Entropy and Evolution,” American Journal of Physics, 76, 1031-1033 (November 2008).

Abstract: Quantitative estimates of the entropy involved in biological evolution demonstrate that there is no conflict between evolution and the second law of thermodynamics. The calculations are elementary and could be used to enliven the thermodynamics portion of a high school or introductory college physics course.

The article addresses a claim often made by creationists that the second law of thermodynamics is inconsistent with evolution.  The creationist argument goes like this: The second law of thermodynamics says that entropy always increases, which means that things always progress from order to disorder.  In biological evolution, order is created from disorder, so evolution contradicts the second law.

The creationist argument is wrong.  It rests on misunderstandings about what the second law of thermodynamics actually says.  Debunkings of the argument can be found in a bunch of places.  The point of Styer’s article is to assess quantitatively just how wrong it is.  This is a worthy goal, but Styer makes a mistake in his analysis that seriously distorts the physics. I feel a little bad about even mentioning this. After all, Styer is on the side of the (secular) angels here, and his final conclusion is certainly right: the creationists’ argument is without merit.  Fortunately, as I’ll show, Styer’s mistake can be fixed in a way that demonstrates this unambiguously.

This post is kind of long, so here’s the executive summary:

  1. The creationists’ argument that evolution and the second law are incompatible rests on misunderstandings.   (This part is just the conventional view, which has been written about many times before.)
  2. Styer’s attempt to make this point more quantitative requires him to guess a value for a certain number.  His guess is a wild underestimate, causing him to underestimate the amount of entropy reduction required for evolution and making his argument unpersuasive.  The reason it’s such a huge underestimate is at the heart of what statistical mechanics is all about, which is why I think it’s worth trying to understand it.
  3. We can fix up the argument, estimating things in a way that is guaranteed to overstate the degree of entropy reduction required for evolution.  This approach gives a quantitative and rigorous proof that there’s no problem reconciling evolution and the second law.

The conventional wisdom.  The main reason that the creationist argument is wrong is that the second law applies only to thermodynamically closed systems, that is to say to systems with no heat flow in or out of them.  Heat is constantly flowing into the Earth (from the Sun), and heat is constantly escaping from the Earth into space, so the Earth is not a thermodynamically closed system.  When you’re dealing with a system like that, the way to apply the second law is to think of the system you’re considering as part of a larger system that is (at least approximately) closed.  Then the second law says that the total entropy of the whole big system must increase.  It does not forbid the entropy of one subsystem from decreasing, as long as other parts increase still more.  In fact, when heat flows from a very hot thing (such as the Sun) to a colder thing (such as the Earth), that heat flow produces a huge increase in entropy, and as a result it’s very easy for decreases in entropy to occur in the rest of the system without violating the second law.
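
To attach a number to that last statement, here is a minimal sketch of the entropy bookkeeping for a single joule of heat making the trip from the Sun to the Earth. The round-number temperatures (roughly 5800 K for the solar surface, 300 K for the Earth) are my own assumptions, not anything taken from Styer's paper:

```python
# Entropy bookkeeping for heat flowing from a hot reservoir to a cold one.
# Assumed round numbers: T_sun ~ 5800 K (solar surface), T_earth ~ 300 K.
k_B = 1.380649e-23  # Boltzmann's constant, J/K

T_hot, T_cold = 5800.0, 300.0   # kelvin
Q = 1.0                          # one joule of heat transferred

dS_hot = -Q / T_hot              # entropy lost by the Sun
dS_cold = +Q / T_cold            # entropy gained by the Earth
dS_total = dS_hot + dS_cold      # net change for the combined (closed) system

print(f"Net entropy increase: {dS_total:.2e} J/K "
      f"(about {dS_total / k_B:.1e} k per joule)")
# ~3.2e-3 J/K, i.e. roughly 2e20 units of Boltzmann's constant per joule:
# an enormous budget against which local entropy decreases can be paid for.
```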

Imagine raking the leaves in your yard into a pile.  The entropy of a localized pile of leaves is less than the entropy of the leaves when they were evenly spread over your yard.  You managed to decrease that entropy by expending some energy, which was dissipated in the form of heat into the environment.  That energy dissipation increased the total entropy by more (much, much more, as it turns out) than the decrease in the entropy of the leaves.
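
Just to make "much, much more" concrete, here is a back-of-the-envelope sketch. All of the numbers in it (leaf count, area ratio, metabolic energy) are invented for illustration, but the conclusion is insensitive to them:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Made-up but plausible numbers for the raking example.
N_leaves = 5000          # leaves in the yard
area_ratio = 300         # yard area / pile footprint area
T = 300.0                # kelvin
E_dissipated = 2e5       # ~200 kJ of metabolic energy released as heat

# Configurational entropy drop: each leaf's allowed area shrinks by ~area_ratio.
dS_leaves = -N_leaves * k_B * math.log(area_ratio)   # about -4e-19 J/K

# Entropy generated by dumping the metabolic heat into the environment.
dS_heat = E_dissipated / T                            # about +700 J/K

print(f"leaves: {dS_leaves:.1e} J/K, heat: {dS_heat:.1e} J/K, "
      f"ratio ~ {abs(dS_heat / dS_leaves):.1e}")
# The heat term wins by some 21 orders of magnitude.
```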

There is a second reason the creationist argument is wrong, which is that entropy is not exactly the same thing as disorder.  Often, entropy can be thought of as disorder, but that correspondence is imperfect.  It seems clear that in some sense a world with lots of organisms in it is more “organized” than a world with just a primordial soup, but the exact translation of that vague idea into a statement about entropy is tricky.  The first reason (Earth is not a closed system) is much more important, though.

The main purpose of Styer’s article is to quantify these points. In particular, he quantifies the amount of entropy supplied by sunlight on the Earth and the amount of entropy decrease supposedly required for biological evolution.  The point is to show that the first number is much greater than the second, and so there’s no problem reconciling the second law and evolution.

What’s wrong with Styer’s argument.  The problem with his argument comes in Section III, in which he tries to estimate the change in entropy of living things due to evolution:

Suppose that, due to evolution, each individual organism is 1000 times “more improbable” than the corresponding individual was 100 years ago.  In other words, if Ω_i is the number of microstates consistent with the specification of an organism 100 years ago, and Ω_f is the number of microstates consistent with the specification of today’s “improved and less probable” organism, then Ω_f = 10^(-3) Ω_i. I regard this as a very generous rate of evolution, but you may make your own assumption.

Thanks!  I don’t mind if I do.

1000 is a terrible, horrible, ridiculous underestimate of this “improbability ratio.”  The reason gets at the heart of what statistical mechanics is all about. One of the most important ideas of statistical mechanics is that probability ratios like this, for macroscopic systems, are almost always exponentially large quantities.  It’s very hard to imagine anything you could do to a system as large as, say, a cell, that would change the number of available microstates (generally known as the “multiplicity”) by a mere factor of 1000.

If a single chemical bond is broken, for instance, then an energy of about E = 1 eV is absorbed from the system, reducing the multiplicity by a factor of about e^(E/kT), or about 10^17 at typical biological temperatures.  That’s not the sort of thing Styer is talking about, of course: he’s talking about the degree to which evolution makes things “more organized.”  But changes of that sort always result in reductions in multiplicity that are at least as drastic.  Let me illustrate with an example. [The fainthearted can feel free to skip the next paragraph — just look at the large numbers in the last couple of sentences.]

Suppose that the difference between an organism now and the same organism 100 years ago is that one additional protein molecule has formed somewhere inside the organism.  (Not one kind of protein — one single molecule.)  Suppose that that protein contains 20 amino acids, and that those amino acids were already present in the cell in the old organism, so that all that happened was that they got assembled in the right order.  That results in an enormous reduction of multiplicity: before the protein formed, the individual amino acids could have been anywhere in the cell, but afterwards, they have to be in a specific arrangement. A crude estimate of the multiplicity reduction is just the degree to which the multiplicity of a dilute solution of amino acids in water goes down when 19 amino acids are taken out of it (since those 19 have to be put in a certain place relative to the 20th).  The answer to that question is e^(-19μ/kT), where μ is the chemical potential of an amino acid.  Armed with a statistical mechanics textbook [e.g., equation (3.63) of the one by Schroeder that I’m currently teaching from], you can estimate the chemical potential.  Making the most pessimistic assumptions I can, I can’t get -μ/kT below about 10, which means that producing that one protein reduces the multiplicity of the organism (that is, makes it more “improbable”) by a factor of at least 10^80 or so (that’s about e^190).  And that’s to produce one protein.  If instead we imagine that a single gene gets turned on and produces a bunch of copies of this protein, then you have to raise this factor to the power of the number of protein molecules created.
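
For anyone who wants to check the arithmetic, here is a rough sketch of the estimate. I am not reproducing Schroeder's equation (3.63) verbatim; the code just uses the standard dilute-solution form μ = kT ln(n/n_Q), ignores internal degrees of freedom, and plugs in guessed values for the amino acid mass and concentration:

```python
import math

# Rough estimate of -mu/kT for an amino acid in dilute aqueous solution,
# treating it as an ideal solute: mu = kT * ln(n / n_Q), where
# n_Q = (2*pi*m*k*T / h^2)^(3/2) is the quantum concentration.
# The mass and concentration below are guesses, not values from the article.
k_B = 1.380649e-23    # J/K
h = 6.62607015e-34    # J s
u = 1.66053907e-27    # kg (atomic mass unit)

T = 310.0                  # roughly body temperature, K
m = 110 * u                # typical amino acid mass, ~110 u (assumed)
n = 0.1 * 6.022e23 * 1e3   # assumed concentration: 0.1 mol/L, in molecules/m^3

n_Q = (2 * math.pi * m * k_B * T / h**2) ** 1.5
minus_mu_over_kT = math.log(n_Q / n)
print(f"-mu/kT ~ {minus_mu_over_kT:.1f}")   # comes out well above 10

# Multiplicity reduction from fixing 19 amino acids in place:
log10_factor = 19 * minus_mu_over_kT / math.log(10)
print(f"multiplicity reduced by ~10^{log10_factor:.0f}")
# With the pessimistic bound -mu/kT ~ 10 used in the text, this would be ~10^82.
```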

That calculation is based on a bunch of crude assumptions, but nothing you do to refine them is going to change the fact that multiplicity changes are always by exponentially large factors in a system like this.  Generically, anything you do to a single molecule results in multiplicity changes given by e^(-μ/kT), where μ is always at least of order an eV and kT is only about 0.025 eV.  So the ante to enter this game is something like e^40, or 10^17.
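
A two-line sanity check of those round numbers (taking kT to be 0.025 eV, as above):

```python
import math

kT = 0.025                      # eV, the round number used above for biological temperatures
print(math.exp(1.0 / kT))       # e^40 ~ 2.4e17: the "ante" for doing anything to one molecule
print(19 * 10 / math.log(10))   # exponent check: e^190 is about 10^82, i.e. "10^80 or so"
```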

Fixing the problem.  It’s been a few paragraphs since I said this, so let me say it again: Despite this error, the overall conclusion that there is no problem with evolution and the second law of thermodynamics is still correct.  Here’s a way to see why.

Let’s make an obviously ridiculous overestimate of the total entropy reduction required to create life in its present form on Earth.  Even this overestimate is still far less than what we get from the Sun.

Suppose that we started with a planet exactly like Earth, but with no life on it.  I’ll call this planet “Dead-Earth.” The difference between Dead-Earth and actual Earth is that every atom in the Earth’s biomass is in Dead-Earth’s atmosphere in its simplest possible form.  Now imagine that an omnipotent being (a Maxwell’s demon, if you like) turned Dead-Earth into an exact replica of Earth, with every single molecule in the position that it’s in right now, by plucking atoms from the atmosphere and putting them by hand into the right place.  Let’s estimate how much that would reduce the entropy of Dead-Earth.  Clearly the entropy reduction required to produce an exact copy of life on Earth is much greater than the entropy reduction required to produce life in general, so this is an overestimate of how much entropy reduction the Sun has to provide us.

Earth’s total biomass has been estimated at 10^15 kg or so, which works out to about 10^41 atoms.  The reduction in multiplicity when you pluck one atom from the atmosphere and put it into a single, definite location is once again e^(-μ/kT).  This factor works out to at most about e^10.  [To get it, use the same equation as before, (3.63) in Schroeder’s book, but this time use values appropriate for the simple molecules like N2 in air, not for amino acids in an aqueous solution.]  That means that putting each atom in place costs an amount of entropy ΔS = 10k where k is Boltzmann’s constant.  To put all 10^41 atoms in place, therefore, requires an entropy reduction of 10^42 k.
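
Spelled out as a calculation, with an assumed mean atomic mass of about 10 u for biomass (it is mostly H, C, N, and O), the estimate looks like this:

```python
k_B = 1.380649e-23   # Boltzmann's constant, J/K
u = 1.66053907e-27   # kg (atomic mass unit)

biomass = 1e15             # kg, the estimate quoted above
mean_atomic_mass = 10 * u  # assumed average atom in biomass (mostly H, C, N, O)

N_atoms = biomass / mean_atomic_mass   # about 6e40 with these assumptions, i.e. of order 1e41
dS_per_atom = 10 * k_B                 # entropy cost ~10 k per atom placed
dS_total = N_atoms * dS_per_atom       # of order 1e42 k

print(f"atoms: {N_atoms:.1e}")
print(f"total entropy reduction: {dS_total / k_B:.1e} k = {dS_total:.1e} J/K")
```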

Styer calculates (correctly) the amount of entropy increase caused by the throughput of solar energy on Earth, finding it to be of order 10^37 k every second.  So the Sun supplies us with enough entropy every 100,000 seconds (about a day) to produce all of life on Earth.
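
Styer’s figure is easy to reproduce to within an order of magnitude. The sketch below uses a solar constant of about 1360 W/m², an Earth radius of 6.4 × 10^6 m, and round temperatures of 6000 K and 300 K; these are my round numbers, not necessarily his:

```python
import math

k_B = 1.380649e-23    # Boltzmann's constant, J/K

# Round numbers (mine, not necessarily Styer's):
solar_constant = 1360.0          # W/m^2 at the top of the atmosphere
R_earth = 6.4e6                  # m
T_sun, T_earth = 6000.0, 300.0   # K

P = solar_constant * math.pi * R_earth**2    # power intercepted by Earth, ~1.7e17 W
dS_dt = P * (1 / T_earth - 1 / T_sun)        # entropy production rate, J/(K s)
dS_dt_in_k = dS_dt / k_B                     # ~4e37 k per second

entropy_needed = 1e42                        # k, from the Dead-Earth estimate above
print(f"entropy rate: {dS_dt_in_k:.1e} k/s")
print(f"time to supply 1e42 k: {entropy_needed / dS_dt_in_k:.1e} s")
# A few times 1e4 s with these numbers, ~1e5 s with the rounder 1e37 k/s quoted
# above; about a day either way.
```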

The conclusion: Even if we allow ourselves a ridiculous overestimate of the amount of entropy reduction associated with biological evolution, the Sun supplies us with enough entropy to evolve all of life on Earth in just a couple of days (six days, if you like).

Published by

Ted Bunn

I am chair of the physics department at the University of Richmond. In addition to teaching a variety of undergraduate physics courses, I work on a variety of research projects in cosmology, the study of the origin, structure, and evolution of the Universe. University of Richmond undergraduates are involved in all aspects of this research. If you want to know more about my research, ask me!

12 thoughts on “Entropy and evolution”

  1. It’s a bit surprising that in the U.S. there seem to be professors who don’t understand the fundamentals of thermodynamics, as this article clearly indicates. Professor Bunn claims that the 2nd law applies only to “closed” systems. That is not true. The 2nd law is universal and it applies to all kinds of thermodynamic systems (open, closed and isolated (adiabatic)). The 2nd law can be formulated in many different ways, but some of the formulations are reasonable only within a certain system type. The universe itself can be regarded as an isolated (or closed) system, and per the definition used by Professor Bunn, the 2nd law should be applied everywhere (in the universe). The 2nd law also predicts the direction of all spontaneous (natural) processes, e.g. heat flow from a hotter to a colder object. Were the 2nd law not universal, the direction of natural processes would be unpredictable. A minor mistake is just the terminology used. A “closed” system can exchange heat with the environment but not matter, and an “isolated (adiabatic)” system cannot exchange heat or matter with the environment. (As per Finnish-language literature.)

    However, the mistakes made by Professor Bunn so far are not as alarming as his most serious mistake. He does not seem to understand the concept of entropy in thermodynamics. Professor Bunn claims that the entropy of piled leaves is less than that of evenly spread leaves. And he also claims (in practice) that the order of molecules in a system affects the entropy of the system.

    Entropy in thermodynamics can be understood in two ways, either as a state variable or as a statistical variable, which is a function of the available microstates for a molecule. Mathematically a state variable is a function of two variables. Temperature is another well known state variable. The most important equation for entropy change is dS = dQ/T. Applying this equation we can immediately see that if there is no heat transfer, then there is no entropy change. This applies to all practical situations. (The only exception is when an ideal gas expands into "empty space" and the entropy of the system increases without heat transfer.) This applies also to leaves. If there is no heat transfer, there is no entropy change. It does not matter how each leaf is situated relative to the others, i.e. whether piled or spread. Professor Bunn states correctly that entropy is not the same as order (or disorder), but only a few lines later he forgets his own statement and uses the concept of entropy as if it were equal to order. However, his leaves example is very good in demonstrating that even a simple type of order in the form of piled leaves needs someone to do the work (W=Fs) needed for piling (=order). And this worker needs to be intelligent enough to be able to make the piles. For some reason or another Professor Bunn does not mention the work which is needed, nor the intelligent agent which is also needed. Temperature, which is another state variable, does not depend on "the organization" of the leaves either. But if the temperature at night drops, then the entropy of the leaves also decreases.

    The same applies also to molecules. The entropy of a system does not depend on whether the molecules of the system are in the form of DNA or in some arbitrary form. The geometric arrangement of molecules does not affect the entropy of the system. But the number of available microstates for molecules does. A microstate, or thermodynamic probability, is the number of available energy states (levels) which give the same internal energy for the system.

    Entropy isn't a very easy concept to understand correctly, as we often see in different contexts. It is often confused with (geometric) order. I think that the easiest way to understand it is to think of it as a state variable, like temperature. Actually entropy and temperature are very closely related state variables, as they often (almost always) seem to follow each other.

    Here I try to illustrate the concept of entropy as a statistical variable with an example:

    Let's take a brick and imagine that the brick is divided into very many small identical pieces. Let's take 10 different temperatures and give each piece one of these temperatures randomly. Given that there is no heat flow in the brick, the internal energy and entropy of the brick are the same regardless of the arrangement of the small pieces in the brick. Even if these small pieces (of different temperature) formed a DNA-like structure in the brick, it would not affect the entropy or the internal energy of the brick. Only when heat flowed inside the brick and the temperature differences disappeared would the entropy change. We could compare the number of small pieces in the brick to the number of available energy states for a molecule. The entropy of the brick depends on the number of different small pieces, not on their geometric position. When the number of pieces with identical temperature gets bigger, entropy increases, and when the number of identical pieces gets smaller, entropy decreases.

    When entropy is defined as a statistical variable, it really is statistical. And statistics has nothing to do with geometry or order.

    Please forgive me any mistakes in terminology, since I'm neither a native English speaker nor a scientist.

    Raimo Lonka

  2. I don't have time to respond to everything here, so let me just pick out a couple of points.

    First, does the second law apply only to closed systems? There are a bunch of different ways to state the second law, but I think it was clear from context that I was talking about this one: The total entropy of a closed system can never decrease. And yes, you absolutely do need the qualifier "in a closed system" here. Imagine that you have heat flow from a hot object to a cold object. The two together are a closed system, whose entropy increases. The hot object, considered in isolation, is not a closed system, and its entropy decreases.

    Now about the pile of leaves. I suspect the issue here is where to draw the line between macroscopic and microscopic. I was thinking of the positions of the individual leaves as microscopic (just as we usually do with the positions of molecules when considering a gas). That is, two leaf piles that were overall the same shape but had different detailed arrangements of leaves would be different microstates but the same macrostate. In this case, there are definitely many more ways to arrange leaves more or less uniformly across your lawn than there are ways to arrange them in a pile. In other words, the pile has less entropy. If you don't believe me, try a thought experiment: Imagine a tornado (representing a thermodynamically irreversible process) passing across your lawn. It might turn a neat pile into a uniform strewing of leaves, but it's unlikely to go the other way around. That's the second law.
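
    If you want to see the counting made explicit, here is a toy version. Treat the lawn as M small cells and the pile as P cells, and compare the number of ways to distribute N indistinguishable leaves; the cell counts below are invented purely for illustration:

    ```python
    from math import comb, log

    # Toy counting: N indistinguishable leaves distributed over M lawn cells
    # versus P pile cells.  The numbers are invented for illustration only.
    N = 5000      # leaves
    M = 100000    # small cells covering the whole lawn
    P = 300       # cells covering the pile

    omega_lawn = comb(N + M - 1, N)   # ways to spread the leaves over the lawn
    omega_pile = comb(N + P - 1, N)   # ways to cram them into the pile

    # Entropy difference in units of Boltzmann's constant:
    print(log(omega_lawn) - log(omega_pile))   # roughly 2e4 k in favor of the lawn
    ```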

    If you regard the position of each individual leaf as part of the macrostate, then you're right that there's no entropy difference between a specific pile and a specific uniform spread of leaves.

    Let me finish with a bit of unsolicited advice. If you disagree with someone, even if you think the other person is a complete idiot, you're better off adopting an attitude of politeness rather than being insulting. There are a couple of reasons (aside from politeness being a good thing in itself). First, even if you're completely right and the other person is completely wrong, it's far more rhetorically effective, and even more satisfying, to take the person down with understated courtesy. (Try it some time: it feels awesome.) Second, if you happen to have made mistakes, as we all do from time to time, you won't look as bad.

  3. Thank you for the advice; I will try to learn from it, and I apologize for being insulting. It may be that I’ve been influenced by the general practice of forum debates here in Finland. Here it is quite usual that the intelligence and mental capabilities of others (those whom you think are wrong) are assessed in a very straightforward manner. When this kind of habit is usual, nobody really pays any attention to it. There is certainly a cultural difference. And I understand that even so that is not a very good excuse.

    But I still don’t accept your claim that “the second law applies only to thermodynamically closed systems,” because it is just not true. It is true that the entropy of an isolated (I wish to use correct definitions) system cannot decrease, but this does not mean that the 2nd law is only applicable to isolated systems. The 2nd law also says that to increase order, work must be done, e.g. if we wish to transfer heat from a colder object to a hotter object. In general we always need to do work if we want to reverse the direction of a spontaneous (natural) process. And the only reason why the entropy of an isolated system cannot decrease is that the system is not able to transfer heat to its surroundings. Please note that nothing else can reduce the entropy. Heat transfer is the only thing that affects the entropy of a system.

    You wrote: “I was thinking of the positions of the individual leaves…”. Thank you for the clarification. I’m not a mind reader, and therefore my response is based solely on what I read. However, I think that your leaves example is not very good at demonstrating the statistical nature of entropy, because entropy is purely statistical (based on the number of available microstates for a molecule) and the geometric arrangement of molecules has nothing to do with it.

    And I don’t quite understand what the message of your tornado example is. If you mean that a tornado is able to produce order, I can only wonder at the logic behind that idea. People living in the Midwest’s Tornado Alley would probably say that tornados usually cause havoc. And in order to restore the original order, someone must (again) do the work (W=Fs). I would say that “That’s the 2nd law.” Please do not forget the work.

  4. We seem to be talking past each other on the whole “closed systems” thing. I really mean something quite trivial by it: One way of stating the second law is that the entropy of a closed system cannot decrease. (Replace “closed” by “isolated” if you like. I don’t care.)

    With this choice in mind, the sentence “The second law applies only to closed systems” is equivalent to “The statement that the entropy of a closed system cannot decrease applies only to closed systems.” I presume that you agree with this statement.

    You might think that this statement is too trivial to bother making. That’s probably because you don’t live in the US, so you don’t often hear creationists make the mistake of claiming that the Earth’s entropy cannot decrease due to the second law.

    You say that “Heat transfer is the only thing that affects the entropy of a system.” This is an error. You cited an example in your earlier comment, namely the free expansion of a gas. There are other examples as well. For quasistatic processes, dS = dQ/T, so there’s no entropy change without heat flow. But for processes that aren’t quasistatic, that equality becomes an inequality, so entropy changes can (and frequently do) occur without heat flow.
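
    The free expansion case is easy to quantify, because entropy is a state function: compute the change along a reversible path between the same endpoints. For n moles of ideal gas doubling in volume at constant temperature, ΔS = nR ln 2, even though Q = 0 in the actual irreversible expansion. A quick check:

    ```python
    import math

    R = 8.314                  # gas constant, J/(mol K)
    n = 1.0                    # moles of ideal gas
    dS = n * R * math.log(2)   # entropy change for free expansion into double the volume
    print(dS)                  # ~5.8 J/K, with no heat flow at all
    ```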

    You’re also mistaken when you say that “geometric arrangement of molecules” has nothing to do with entropy. Entropy depends on the number of available states. If you have a system in which the molecules are geometrically constrained, and another system in which they’re not, the number of available states will be smaller in the former, and hence the entropy will be too.

    The tornado is an example of an irreversible process, which increases entropy. It turns the low-entropy system (leaf pile) into a high-entropy system (leaves spread out). It’s very unlikely to go the other way around. That’s the second law in action.

  5. Sorry for the delay in responding… I was about to give up. From our progress so far I can conclude that obviously we are not going to reach mutual understanding in all details regarding thermodynamics and the 2nd law. But the subject is interesting, so why not drop a few lines…

    Before commenting on your last post I’d like to take a brief look at the concept of entropy, which is an important part of the fundamentals of thermodynamics. We can identify at least three different kinds of entropy, which often come up in various discussions or debates on the subject, but which are not often separated from each other. This separation is essential, because otherwise we’ll have too broad and inaccurate a concept of entropy for proper scientific argumentation.

    1) Entropy is a measure of energy unavailable for work. This is the original definition of entropy (dS = dQ/T, Clausius 1865). In all processes there is “irreversible heat loss,” which reduces (any) system’s ability to do work. Entropy change is always related to heat flow.

    2) Entropy is a system property. Entropy is either a statistical variable or a state variable. Boltzmann (1877) formulated the statistical definition of entropy: S = k ln W. Boltzmann’s entropy is a function of the number (W) of microstates consistent with the given macrostate. Boltzmann’s entropy actually has very little to do with Clausius’s entropy. Boltzmann’s equation does not specify how entropy is changed. Mathematically, Boltzmann’s entropy is a function of one variable (W). As a state variable, entropy is a function of two variables and is comparable to temperature, which is another state variable. Actually, temperature and entropy are closely related to each other. If one of them changes, then the other changes too. But what causes entropy to change? From point 1 we learned that the most fundamental reason for entropy change is heat flow, dS = dQ/T. This applies also to the entropy of point 2. Basically that’s all we need to know about the cause of entropy change. But we can still take a more detailed look at the issue. The 1st law of thermodynamics states that the change of internal energy of a system is the sum of the heat and work transferred across the system boundary: dU = dQ + dW. From this equation we get another factor that causes system properties (state variables) to change, namely work (dW). Work is a form of energy just like heat. Work can be completely transformed into heat, but heat cannot be completely transformed into work (there is always a loss of heat, entropy ;-). When work is transferred across the system boundary it changes into the movement of molecules, i.e. heat. As a system property, entropy changes when the internal energy (U) of a system changes. So the change of entropy is always due to energy flow.

    Entropy is also often called a measure of disorder. This feature of entropy, too, can be divided into two different forms.

    1a) Entropy is a measure of disorder at the macroscopic or environmental level. We can have several systems with different state variable values; i.e. temperatures, entropies, etc. vary from system to system. When separate systems reach thermodynamic balance with each other, entropy increases, i.e. disorder increases. Finally we may have just one uniform system.

    2a) Entropy is a measure of disorder at the microscopic or system level. We focus our attention on only one system at a time. The amount of internal energy of a system is directly related to the number of microstates of the molecules in the system. More internal energy -> more microstates -> more "disorder". It's time to recall that the number of microstates is consistent with a certain macrostate (temperature, entropy, pressure etc.). Also, the geometry of molecule positions does not affect the (entropy) disorder of a system. A DNA-like structure has exactly the same entropy as an arbitrary mess of molecules with the same molecular mass, when the state variable values are equal.

    So far we have been dealing with the concept of entropy within thermodynamics. This is the mandate thermodynamics gives us to deal with entropy. But there is still another form of entropy that goes beyond that mandate, and this is the entropy related to order or information, i.e. things that we find in our everyday life. (We do not deal with microstates of molecules, because we do not see them easily.)

    3) Entropy is a measure of order/information. We just learned that entropy can be regarded as a measure of disorder. So, can we use it as a measure of order or information? The answer is no!
    Entropy does not care about order or information. It cares only about the internal energy and heat loss of a system. The entropy of a book page with text on it is equal to the entropy of a sheet of paper with an equal amount of ink on it. Only the values of state variables matter.

    But is there any connection between order/information and entropy? Yes, there is. There is the well known concept of work, which is the highest form of energy. Order and information can be created by applying work. When we create order we actually create different kinds of potential differences in the world. E.g. we can transfer heat from outside into our house using an air heat pump (very popular in Finland). Or we can write a letter. All potential energy differences (i.e. order or information) tend to diminish, and as a result entropy increases.
    To put it simply (energy cannot be created or destroyed, but it can be transformed):

    Work -> Potential energy (order & information) -> Entropy (as time goes by, heat loss)

    Ok, let's have a look at your last post:

    With this choice in mind, the sentence "The second law applies only to closed systems" is equivalent to "The statement that the entropy of a closed system cannot decrease applies only to closed systems." I presume that you agree with this statement.

    Good guess, but wrong! I do not agree with you. The 2nd law can be formulated in many different ways, but it applies to all kinds of thermodynamic systems. I can only wonder why you stick to the false claim that "the 2nd law applies only to isolated systems," because the formulation and the statement are not equivalent.

    I can also only wonder what kind of claims about the validity range of the 2nd law you would invent from these formulations of the 2nd law:

    "Heat does not flow spontaneously from cold object to hot object"
    "Work must be done to increase order (change the direction of all natural processes)"
    "Temperature difference (two heat resevoirs) is required for producing work"

    You say that "Heat transfer is the only thing that affects the entropy of a system." This is an error. You cited an example in your earlier comment, namely the free expansion of a gas. There are other examples as well. For quasistatic processes, dS = dQ/T, so there's no entropy change without heat flow. But for processes that aren't quasistatic, that equality becomes an inequality, so entropy changes can (and frequently do) occur without heat flow.

    Please look at the beginning of my post.

    You say so. Do you really have equations to calculate the entropy change in situations (during the process) that are not quasistatic? (My rather large collection of thermodynamics literature does not contain such equations…)

    I need to clarify my earlier statement about ideal gas expansion and the relation between the factor dQ and the entropy increase. The variable dQ does not disappear, but the equation for the entropy change can be formulated in another way. Entropy is a state variable and it can be expressed as a function of volume and temperature. The basic definition of entropy, dS = dQ/T, and the 1st law can be combined.

    dS = dQ/T = (dU + pdV) / T

    and from the ideal gas equation of state we get

    p = νRT/V

    The differential of the internal energy U is

    dU = (f/2) νR dT

    and by combining these results we get

    dS = (f/2) νR dT/T + νR dV/V = νR [(f/2)(dT/T) + (dV/V)]

    dQ did not disappear; it was only written in a different form.

    You're also mistaken when you say that "geometric arrangement of molecules" has nothing to do with entropy. Entropy depends on the number of available states. If you have a system in which the molecules are geometrically constrained, and another system in which they're not, the number of available states will be smaller in the former, and hence the entropy will be too.

    1) Please look at the beginning of my post.
    2) Ok, please give an equation for entropy change that takes geometry into account. And I really want a proper reference for this! (A web site devoted to the origins debate is out of the question.)

    Let's have a look at the formulation of the 2nd law you seem to love so much: "The entropy of an isolated system cannot decrease." This simple formulation gives us valuable information about thermodynamics and the 2nd law, but that information is not the one you'd love to have: "The 2nd law applies only to isolated systems." Sorry about that.

    Q: When does the entropy of an isolated system increase?
    A: When there are irreversible processes in the system. Actually, all real-world processes are irreversible (there is a heat loss), i.e. ideal processes are only theoretical.

    Q: When does the entropy of an isolated system remain the same?
    A: When there are no processes occurring in the system (the system is in thermodynamic equilibrium) or when there are only reversible (ideal) processes in the system.

    Q: Why does the entropy of an isolated system never decrease?
    A: Because heat cannot escape from the system. Isolation keeps it inside.

    Q: Does the entropy of an isolated system decrease when spontaneous order or information develops in it?
    A: No. Please look at the formulation and the earlier answers.

    Q: What is the most important thing we learn from this formulation?
    A: A system's entropy decreases only when heat is transferred out of the system.

    Finally I'd like to return to the relation between work and entropy, because while writing this post (it took several days, piece by piece) I got quite a new idea.
    I expressed the "flow" of energy in the following order (from the highest form to the lowest form):

    Work => Potential energy (order & information) => Entropy (heat)

    However, this logical order brings us up against a fundamental question/problem. Kelvin and Planck formulated the 2nd law: "Work cannot be produced without a temperature difference (i.e. with only one heat reservoir)."

    But the problem is that temperature differences, and other potential energy differences, do not appear spontaneously. According to the 2nd law all potential energy differences tend to diminish. The 2nd law dictates the direction of all spontaneous processes, and this direction is towards less order and more entropy.

    The big question is: What is the origin of the vast amount of potential energy in the world?
    Is there an unknown worker responsible for this?

  6. Thanks for this article; it seems like you have found a solution 🙂 Needless to say, I will now subscribe to your RSS feeds. Keep it up, and thanks for sharing.

    Manuela

  7. Hi Ted,

    Your argument assumes that adding energy to a system automatically results in organization. In most cases it doesn’t. I put gas in my car, it still breaks down; we eat, yet we will still become disorganized and die.

  8. Sorry, but I have no idea what you’re talking about here. I certainly don’t believe, nor does my argument assume, that adding energy to a system necessarily results in increased organization (whatever we mean by that last word).

  9. Thanks for sharing. I love physics so much and spend a lot of time learning physics. I’m a physics teacher from Indonesia. Hope someday I can study in the USA 🙂
    Can I subscribe to your RSS feeds?

Comments are closed.