Jonny B asked in Science & Mathematics › Physics · 1 decade ago

I have a few questions about entropy?

I know that entropy is the tendency for all things to reach equilibrium. Is this a logarithmic function? And how does one calculate or measure it? What is its value when equilibrium is attained? What are the units called?

2 Answers

  • 1 decade ago
    Favorite Answer

    Entropy isn't really the "tendency toward equilibrium" itself; the drift toward equilibrium is a consequence of entropy increasing. Most people tie it to some sense of 'disorder' in a system, but that's hard to measure.

    The real definition is tied to the number of 'states' a system can find itself in. The entropy S is defined as

    S = k*ln(number of states)

    where k is Boltzmann's constant, 1.380649×10⁻²³ joules per kelvin. So the units of entropy are energy per temperature: joules per kelvin.
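
    Counting states directly only works for toy systems, but a small Python sketch (the coin-flip system and the mole-sized example below are my own illustrations, not from the question) shows how S = k*ln(number of states) behaves:

        import math

        K_B = 1.380649e-23  # Boltzmann's constant, joules per kelvin

        def boltzmann_entropy(num_states):
            """S = k * ln(W) for a system with W accessible microstates."""
            return K_B * math.log(num_states)

        # Toy system: 10 coins, each heads or tails, so W = 2**10 states.
        print(boltzmann_entropy(2**10))    # ~9.57e-23 J/K

        # For a mole of two-state particles, W = 2**N is far too large to
        # form directly, so use ln(W) = N*ln(2) instead.
        N = 6.022e23
        print(K_B * N * math.log(2))       # ~5.76 J/K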

    It's still hard to imagine counting states. It turns out that entropy changes as heat energy flows into or out of a system, and the change in entropy is given by

    change in S = (amount of heat flowing reversibly into the system)/(temperature at which it flows)

    So one can tally up the total entropy change by adding up all the heat flowing into the system divided by the temperature at which each transfer takes place. This gets tricky because, in most cases, the temperature changes as the heat flows.
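
    For instance, heating water at (roughly) constant specific heat turns the tally into an integral with a closed form: summing dQ/T with dQ = m*c*dT gives change in S = m*c*ln(T2/T1). A quick sketch (the 1 kg mass and the temperatures are made up for illustration):

        import math

        def delta_s_heating(mass_kg, c, t1_k, t2_k):
            """Entropy change for heating at constant specific heat c:
            the sum of dQ/T with dQ = m*c*dT is m*c*ln(T2/T1)."""
            return mass_kg * c * math.log(t2_k / t1_k)

        # Heat 1 kg of water from 20 C (293.15 K) to 80 C (353.15 K);
        # c for water is about 4186 J/(kg*K).
        print(delta_s_heating(1.0, 4186.0, 293.15, 353.15))  # ~779 J/K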

    There are a few situations where heat can flow while the temperature stays constant, like boiling a pan of water or melting a block of ice. Those are easier calculations.
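
    The constant-temperature case is a single division. A sketch for melting ice (the latent-heat figure is a standard tabulated value; the half-kilogram mass is arbitrary):

        L_FUSION = 3.34e5  # latent heat of fusion for ice, J/kg
        T_MELT = 273.15    # melting point of ice, K

        def delta_s_melting(mass_kg):
            """All the heat flows at one fixed temperature, so the division
            happens once: delta_S = Q / T = m * L / T_melt."""
            return mass_kg * L_FUSION / T_MELT

        print(delta_s_melting(0.5))   # ~611 J/K for 0.5 kg of ice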

  • Anonymous
    1 decade ago

    Entropy is a measure of the disorder in a system. Equivalently, it is a measure of the information needed to describe the system's state. A "perfectly random" mixture is completely mixed, completely disordered, and sits at maximum entropy. In thermodynamics and chemistry we think in terms of the disorder: more disorder, more entropy. The units of entropy are energy per degree (joules per kelvin).

    Strictly, the third law of thermodynamics does fix a zero point for entropy (a perfect crystal at absolute zero), but in practice we use entropy relatively, as the change in entropy between two states, much as we usually care about energy differences rather than an absolute energy.

    In communication theory (see Shannon), the information view of entropy is used: the less predictable a source, the higher its entropy.
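
    As a hedged sketch of that information view (the probability distributions below are invented for illustration), Shannon's formula in Python:

        import math

        def shannon_entropy(probs):
            """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # A fair 8-sided die: perfectly random, so entropy is maximal.
        print(shannon_entropy([1/8] * 8))                 # 3.0 bits

        # A heavily biased source: mostly predictable, so low entropy.
        print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits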

    Measurements will relate both the energy and the entropy of a system in two different states (say, before and after), and the combination of the two is the "driving force" behind any change:

    E = Q - T*S

    where E is the "free energy" or "potential to change", Q is the energy difference between state two and state one, and S is the entropy change between the two states (T is the temperature, assumed here to be constant). In more standard notation this is ΔG = ΔH - T·ΔS; note the minus sign: a gain in entropy lowers the free energy and favors the change.
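
    Here is a worked sketch of that relation for melting ice (the per-mole numbers are rounded textbook values, used only for illustration):

        # Melting one mole of ice: delta_H ~ 6010 J/mol absorbed,
        # delta_S ~ 22.0 J/(mol*K) gained (rounded textbook values).
        DELTA_H = 6010.0   # J/mol
        DELTA_S = 22.0     # J/(mol*K)

        def delta_g(temp_k):
            """Free-energy change: delta_G = delta_H - T * delta_S."""
            return DELTA_H - temp_k * DELTA_S

        for t in (263.15, 273.15, 283.15):   # -10 C, 0 C, +10 C
            print(f"T = {t:.2f} K: delta_G = {delta_g(t):+.0f} J/mol")

        # delta_G is positive below 0 C (ice stays frozen), about zero at
        # the melting point, and negative above it (melting happens).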
