Entropy

Entropy (a Greek coinage, ἐντροπία [entropía], from ἐν [en] "in" and τροπή [tropē] "turn, transformation") is a fundamental thermodynamic state variable: via the second and third laws it decides whether a process can take place at all. In every physically possible process the total entropy stays constant or increases.

Coarse characterizations

  • In statistical physics and physical chemistry, entropy is a measure of the number of microstates by which the observed macroscopic state of a system can be realized. It can be interpreted as the phase-space volume accessible to the system from its current state without external influence (in classical thermodynamics, equilibrium states with the same entropy are adiabatically equivalent). After removal of a partition wall, for example, a gas has a larger room available to it; after the expansion more microstates exist, and the system therefore has a higher entropy.
  • Entropy is an extensive state variable, i.e. one that grows in direct proportion to the system size. Every state of a thermodynamic system can be assigned an entropy, which doubles when the particle number and the volume of the system are doubled.

Position within thermodynamics

Classical thermodynamics is a fundamental physical theory that describes the energetic interactions of systems with their environment. In principle a system can exchange energy with its environment in two ways: as heat and as work, where, depending on the system and the process conditions, different kinds of work exist, among them volume work, chemical work and magnetic work. In the course of such an energy exchange the entropies of the system and of the environment change. Only if the global sum of the entropy changes is positive, i.e. if system and environment together can reach more microstates after the change than before, does the change take place spontaneously.

Basics

The entropy S (unit J/K) is an extensive state variable of a physical system; like the volume, the electric charge or the amount of substance it adds up when several systems are combined. Dividing by the mass of the system yields the specific entropy s with the unit J/(kg·K) as an intensive state variable. The German physicist Rudolf Clausius introduced the term in 1865 to describe cyclic processes.

The differential dS (an upright, non-italic d is used to emphasize that it is a complete differential) is, following Clausius, for reversible processes the ratio of the transferred heat δQ to the absolute temperature T:

dS = δQrev / T     (1)
This entropy change is positive when heat is supplied and negative when heat is removed.

Because dS is a complete differential, changes of entropy are independent of the path; that is, the entropy is a state variable. In this context the reciprocal absolute temperature plays the role of an "integrating factor" which, mathematically, turns the reversibly supplied or removed heat δQ, an incomplete differential, into the associated exact differential dS.

In this respect one can define the entropy in reversible processes as the "thermal energy weighted with 1/T". Further below the question is treated to what extent the energy T dS can be converted into work.

But Clausius also treated irreversible processes. He showed that in an isolated thermodynamic system the entropy satisfies the following inequality:

dS ≥ 0     (2)

where the equality sign holds only for reversible processes.

Because a non-isolated system, such as the one considered in (1), can always be extended to an isolated system by adding an auxiliary system, whose (likewise additive) extra entropy corresponds to a reversible heat change, it follows from Eq. (2) that for the irreversible heat change in (1) the greater-than sign applies instead of the equality sign, even if the heat is not supplied but removed (the greater-than sign also makes sense for negative numbers). This would correspond to the notation still common in many textbooks,

dS ≥ δQ / T .
This outdated notation is imprecise: today one distinguishes more precisely between the transported entropy (first inequality; in isolated systems no heat can be transported) and the produced entropy (second inequality), and combines both inequalities into one equation that holds universally, i.e. also for non-adiabatic systems:

dS = δQ / T + δWdiss., inn. / T     (3)
Here δWdiss., inn. is the work dissipated inside the system, which is always positive and is therefore supplied to the system (for example frictional work). Such dissipative processes appear as heat (in the theory of electricity, for instance, one speaks of the "Joule heat" generated in an ohmic resistance) and contribute to the entropy. The "integrating denominator" T occurs in both terms. For an exact comparison, however, one must always state with respect to which precise initial state the changes are taken.

Equation (3) is a form of the second law of thermodynamics. The differential dS, however, exists only for quasi-static changes of state, i.e., to be precise, for sequences of states that can be followed by measurement (states with only small deviations from equilibrium). For the process shown in the figure for the adiabatic system, in which only the initial and the final state can be specified, this would not be the case. For an ideal gas, however, the entropy can easily be calculated by means of a reversible isothermal replacement process (delivery of work and absorption of heat from the environment), as is done in the Examples section.

The product TΔS represents the non-usable part ("heat") of the available energy ΔU in the isothermal production of work ΔA (isothermal meaning at fixed temperature). The maximum value of this work ΔA is determined by the so-called free energy, ΔF = ΔU − TΔS. This is an equivalent form of the second law.
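
A compact way to see this, under the usual sign convention that δW is the work done on the system (so that −δW is the work delivered), is the following short derivation:

dU = δQ + δW   and   δQ ≤ T dS   (second law)

⇒   −δW ≤ −dU + T dS = −d(U − TS)   at constant T.

The work delivered in an isothermal process is therefore at most −ΔF with F = U − TS; the remainder TΔS stays behind as heat.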

History of the term

In 1712 Thomas Newcomen's first steam engine was installed in a mine to pump water. The machine did its job but needed a great deal of fuel. At that time the relationship between energy and heat was unclear, and it took another 130 years before Julius Mayer published the first law of thermodynamics. From 1764 on, James Watt improved the steam engine and was able to more than double its efficiency to over 1 %, still without knowledge of the formal laws of thermodynamics. It was only about 60 years later that the young French engineer Sadi Carnot had the crucial idea, which he published in 1824. Inspired by his father's work on water mills, Carnot described a steam engine by a cyclic process in which heat flows from a hot source to a cold sink and thereby does work. The ratio of the extracted mechanical work ΔW to the supplied heat ΔQ he introduced as the efficiency η:

η = ΔW / ΔQ
In his original work Carnot was of the opinion that heat is a kind of imponderable substance which always flows from a hot to a cooler body, just as water always moves downhill. And just as falling water can do the more work the greater the drop, heat could do the more work the greater the temperature difference; in particular, the machine could not do more work than heat was supplied to it. Carnot later corrected himself and, a decade before Mayer, Joule and Thomson, already recognized the equivalence of heat and energy. He was thus ahead of his time, but he died young and his work at first went unnoticed. Only Clausius formulated the relationship between the temperature difference of source and sink and the efficiency of the Carnot engine, and showed that this efficiency cannot be exceeded by any other heat engine, since otherwise heat would flow spontaneously from a cold to a hot body. The impossibility of such a process in nature is now known as the second law of thermodynamics. Clausius formulated it by means of a cyclic process:

" There is no cyclic process whose only effect is to transfer heat from a colder reservoir to a warmer one. "
Put more simply, the second law states that temperature differences in nature cannot increase spontaneously. From this requirement Clausius derived the relation

∮ δQ / T ≤ 0

for arbitrary cyclic processes, where the equality sign holds only for reversible processes. Starting from this theorem of Clausius it is natural to define the quantity

dS = δQrev / T

differentially. Rudolf Clausius called this quantity entropy, a coinage modelled on the word energy and roughly translatable as "transformation content", in contrast to the heat content. With time it became customary to formulate the second law directly in terms of the entropy, which by itself does not lead to a deeper understanding. Only decades later did Ludwig Boltzmann, with his statistical mechanics, find an explanation of the entropy as a measure of the microstates accessible to the system. Heat distributes itself randomly over atoms and molecules, and energy flows from hot to cold because the reverse is extremely unlikely, as Max Planck noted in his textbook on thermodynamics.

In 1999 the theoretical physicists Elliott Lieb and Jakob Yngvason put the definition of entropy in phenomenological thermodynamics on a strictly axiomatic basis. This definition makes no use of quantities such as "heat" and "temperature", which cannot be defined precisely without entropy, but relies instead on the concept of adiabatic accessibility.

Generation of entropy in an iced drink

When ice melts, the ordered ice crystal is converted into a random motion of individual water molecules. Energy and entropy are thereby transferred from the drink to the water molecules of the ice cube. If the temperature of the released water molecules matched that of the drink exactly, the entropy of the isolated system of ice and drink would be strictly conserved. Since, however, a small temperature difference between the subsystems ice and drink is inevitable during the approach to equilibrium, the entropy of the melt water increases by more than the entropy of the drink decreases: in the establishment of thermodynamic equilibrium, entropy that did not exist before is generated.

To aid understanding

Energy, contrary to everyday speech, is not consumed in the physical sense but only converted, for example into mechanical work and heat (first law of thermodynamics, conservation of energy). Over one cycle a gasoline engine is therefore supplied with exactly as much chemical energy in the form of fuel as it gives off again as driving work and heat. Since the driving work is eventually also converted into heat by friction, the total energy content of the fuel finally ends up as heat in the surroundings, apart from any portions converted into potential energy or mechanically into deformation energy. The energy was not used up but merely transformed; it would be more appropriate to speak of an energy devaluation. A quantity was therefore needed to describe the working capacity of energy, since the amount of energy alone says nothing about the ability to do work. The oceans, for example, contain a huge amount of energy; but since it is present at ambient temperature, no work can be obtained from it. It therefore appears consistent to denote the heat weighted with 1/T according to equation (1), which also appears for instance in the differences of the free energy F = U − TS (U = internal energy), as "waste heat", "energy loss" or the like, as often happens in physics teaching.

Clausius found that a given amount of heat can be converted into work by a cyclic process the better, the higher the temperature at which it is supplied to the machine (see Carnot efficiency). In the engine example, the chemical energy of the fuel is supplied to the engine by combustion at about 2000-2500 °C and leaves it again, roughly one third each, through the exhaust gases at about 800 °C, through the radiator at about 50 °C and via the wheels. With the help of Clausius's equations one can predict how much work the engine cycle could deliver at most. The energy supplied in this case has a low entropy, while the waste heat has a high entropy; the difference determines the work that can be obtained. The statement of the second law, that the entropy is constant for a reversible cycle while it must increase for an irreversible cycle, is equivalent to the following statement:

For isothermal processes (T = constant) in which the free energy decreases, at most the work −ΔF can be obtained. (Note that it is the free energy ΔF, not the internal energy ΔU, that appears here.)
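
As a rough numerical illustration of this Carnot bound, the following minimal Python sketch converts the temperatures quoted in the engine example into an upper limit on the efficiency; the choice of the radiator temperature as the cold reservoir and the function name are illustrative assumptions, not part of the original example.

```python
# Minimal sketch: Carnot upper limit on efficiency for the engine example above.
# Assumption: the radiator/ambient temperature (~50 °C) serves as the cold reservoir.

def carnot_efficiency(t_hot_celsius: float, t_cold_celsius: float) -> float:
    """Return the Carnot efficiency 1 - T_cold/T_hot (temperatures converted to kelvin)."""
    t_hot = t_hot_celsius + 273.15
    t_cold = t_cold_celsius + 273.15
    return 1.0 - t_cold / t_hot

# Combustion at about 2000-2500 °C, heat rejected at about 50 °C:
for t_hot in (2000.0, 2500.0):
    print(f"T_hot = {t_hot:6.0f} °C  ->  Carnot limit eta = {carnot_efficiency(t_hot, 50.0):.2f}")

# Real engines stay well below this limit, because much of the heat leaves
# irreversibly through the exhaust and through friction.
```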

Problem

In popular science books, but also in many textbooks, entropy is equated with disorder. This analogy works for some systems; an ordered crystal, for example, has a much lower entropy than its melt. For other systems this view is problematic; an ordered biomembrane in water, for example, has a higher entropy than its disordered constituents dissolved in the water (see the applications of the entropy concept below). The main problem is that the colloquial term disorder is not clearly defined, and that entropy is not a measure of the symmetry of the system but of the number of microscopically accessible states, regardless of their degree of order, however that is defined. Especially in textbooks of theoretical physics the term disorder is therefore avoided.

Confusion also arises because the term entropy is used in different disciplines for different phenomena. In phenomenological thermodynamics, and thus in particular in chemistry, the interpretation discussed above, heat weighted with (1/T), is the relevant one. In statistical physics the statistical interpretation given above applies. Computer science considers the Shannon information entropy of abstract information, without direct reference to a physical realization; it corresponds to the statistical interpretation (see below). Entropy is therefore essentially a statistically defined quantity and can be used productively in many contexts. Nevertheless the definitions in the individual disciplines can differ. Norbert Wiener, for example, used the concept of entropy to describe information phenomena just as Claude Elwood Shannon did, but with a negative sign. That Shannon's convention has prevailed is due above all to the better technical usability of his work. This example makes clear, however, that an interdisciplinary use of the entropy concept calls for at least some caution and a careful analysis of the sources.
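
The correspondence between the information-theoretic and the statistical reading can be made concrete with a small sketch; the probability values below are arbitrary illustration data, and the factor kB·ln 2 per bit is the only physical input.

```python
import math

def shannon_entropy_bits(p):
    """Shannon entropy H = -sum p_i * log2(p_i) in bits (zero entries are skipped)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def gibbs_entropy(p, k_B=1.380649e-23):
    """Statistical (Gibbs) entropy S = -k_B * sum p_i * ln(p_i) in J/K."""
    return -k_B * sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]              # arbitrary example distribution
print(shannon_entropy_bits(p))             # 1.75 bits
print(gibbs_entropy(p))                    # the same quantity, rescaled by k_B * ln 2 per bit
```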

Unlike temperature and pressure, entropy, as a statistical quantity, is not directly measurable; only changes of entropy can be determined. It is also not a strictly conserved quantity like the energy, mass, particle number or charge of a system. This is a significant difference between the first and the second law of thermodynamics as well. While the first law is nothing other than the strictly valid law of energy conservation formulated in the language of thermodynamics, the second law is basically only a statement of probability theory. The probability of a violation of the second law in macroscopic systems is, however, extremely small. It cannot be derived directly from the microscopic equations but only probabilistically, and within classical mechanics it is even contradicted by Poincaré's recurrence theorem.

All these features lead to problems in the understanding of the concept of entropy.

Entropy in thermodynamics

An idealized process that can be reversed at any time without frictional losses is called reversible. Often the entropy remains unchanged during a process; a well-known example is the adiabatic compression and expansion in the cycle of a Carnot machine. Changes of state with constant entropy are called isentropic, but not all isentropic changes of state are adiabatic. If a process is adiabatic and reversible, however, it follows that it is also isentropic.

If in a cyclic process the heat Q1 is supplied at the temperature T1 and the amount of heat Q2 is given off again at T2, the entropy does not change,

Q1 / T1 − Q2 / T2 = 0 ,

provided that heat absorption and release take place reversibly.

From this the maximum work performed and the maximum efficiency can be derived.
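
With the quantities Q1, T1, Q2, T2 introduced above, a short sketch of that derivation (both heats counted positive):

Q1 / T1 − Q2 / T2 = 0   ⇒   Q2 = Q1 · T2 / T1

W = Q1 − Q2   ⇒   η = W / Q1 = 1 − T2 / T1 ,

which is the Carnot efficiency; no cyclic machine operating between T1 and T2 can exceed it.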

The picture on the right shows the mixing of a brown dye in water. At the beginning the dye is unevenly distributed; after waiting long enough, the water takes on a uniform colour.

Entropy is a measure of ignorance. When it is read as a measure of disorder one must pay close attention to the terminology. In the picture example the liquid in the right glass is indeed more "thoroughly" stirred, but because water and dye particles are completely intermixed there, the disorder is larger there. Consequently the entropy is higher than in the left glass. Of the dye we only know that in the right glass it is distributed throughout the water. The left picture tells us more: we can identify regions where dye is present in high concentration, or regions that are free of dye.

The entropy of mixing can be calculated. Josiah Willard Gibbs pointed out the apparent contradiction that an increase of entropy would also be predicted if, instead of the dye, water were poured into the glass of water (Gibbs paradox).

The number of arrangements of the dye molecules at the beginning is much smaller than when the dye can spread over the whole volume: at first the dye molecules are concentrated in a few regions, whereas afterwards they can occupy the entire glass. The entropy of the latter state is larger, so the system tends towards this uniform distribution over time.

The entropy can only remain unchanged if the processes are reversible. Real changes of state are always associated with dissipation of energy (e.g. by friction), which increases the entropy. A reduction of the total entropy in a closed system is not possible, but the entropy can decrease locally if it grows correspondingly at other places of the system.

Second and third law

Rudolf Julius Emanuel Clausius had recognized that the quantity defined differentially by

dS = δQrev / T

is an extensive state variable, i.e. independent of the reaction path and proportional to the system size. The notation δQ (instead of dQ) stresses that the heat absorbed or given off by the system, even when it is supplied reversibly, is, in contrast to the entropy, path-dependent (see for example a cyclic process), so that only

  • first, for a reversible conduct of the process, and
  • second, after multiplication by the "integrating factor" 1/T

does the expression given above acquire the property of a complete differential.

Clausius also found that in an isolated system the entropy grows monotonically:

dS ≥ 0
He formulated this observation in the second law of thermodynamics as the negation of the existence of a perpetual motion machine of the second kind:

" There is no cyclic process whose only effect is to transfer heat from a colder reservoir to a warmer reservoir. "

Otherwise one would obviously have constructed an inexhaustible source of energy. Equivalent to it is the formulation of William Thomson, later Lord Kelvin:

" There is no cyclic process which takes an amount of heat from a reservoir and converts it completely into work. "

In contrast to the familiar extensive quantities of thermodynamic systems, such as energy, volume and mass, entropy at first eluded a deeper understanding. It could only be explained satisfactorily, within the statistical mechanics of Ludwig Boltzmann, as a measure of the phase-space volume that can be reached by the phase trajectory of the system while the chosen macroscopic observables, such as temperature, volume or particle number, are held constant.

Pictorially, the entropy is thus a measure of the lack of information about the actual microstate when only a small number of observable quantities are available to characterize the macrostate. The ergodic hypothesis claims that the trajectory of the system does in fact cover, in the course of time, the entire phase volume measured by the entropy. Systems that show this behaviour are called ergodic, and only for these can the second law be meaningfully applied. Closely related to this is the irreversibility of processes in nature.

The third law (the so-called "Nernst heat theorem") fixes the entropy of a perfectly crystalline substance, in which for example no spin degeneracy occurs, at absolute zero as zero:

S(T = 0) = 0
One implication is, for example, that the heat capacity of a system vanishes at low temperatures, and in particular that absolute zero temperature cannot be reached (this also holds in the case of spin degeneracy).

Partial derivatives of the entropy

From the second law there also follow, as frequently needed, statements about the partial derivatives of the entropy, for example with respect to the temperature T or the volume V of a liquid or gaseous, non-magnetic system. According to the second law, for reversible changes of state dS = δQ/T. Together with the first law it then follows for the internal energy U, since according to the first law only the sum of the work supplied to the system and of the heat supplied, and not either quantity alone, is a state function (namely the "internal energy" of the system), that

dU = δQ + δW = T dS − p dV .

Here it was assumed that the changes of volume and temperature take place adiabatically slowly, so that no irreversible processes are created. This gives

(∂S/∂T)V = (1/T) (∂U/∂T)V = CV / T ,

where the definition of the heat capacity at constant volume, CV = (∂U/∂T)V, has been used, and likewise

(∂S/∂V)T = (1/T) [ (∂U/∂V)T + p ] .

Similar relations are obtained if, besides the density or the volume, the system depends on further variables, for example on electric or magnetic moments.

From the third law it now follows that both of these derivatives must vanish for T → 0, and indeed sufficiently rapidly, which (as can be shown) is only satisfied if at low temperatures not classical physics but quantum physics applies.

Applications of the entropy concept

In the introduction the overflow experiment of Gay-Lussac is described. It shows that an ideal gas overflowing into a larger volume undergoes no change of temperature, i.e. the internal energy of the gas has not changed (ΔU = 0). What, then, is the change of entropy in the experiment described? Since the entropy is a state variable, it is independent of the path. Instead of pulling out the partition, one can also push it slowly to the right until the final volume is reached. For an infinitesimal displacement the volume increases by dV and the entropy by dS. Since only volume work is done, the first law with ΔU = 0 gives

dS = δQ / T = p dV / T .

With the equation of state of the ideal gas (N the number of gas atoms),

p V = N kB T ,

it follows that

dS = N kB dV / V ,

and integration immediately yields

ΔS = N kB ln(V2 / V1) .

For the number of atoms given in the example above the corresponding numerical value follows directly. More realistic would be, for example, 1 mol of atoms, i.e. N = NA atoms; doubling the volume then gives

ΔS = NA kB ln 2 ≈ 5.76 J/K .

The entropy of a macrostate can also be determined via its statistical weight W (the number of its microstates). If N molecules are distributed over two halves of a volume such that N1 are in one half and N2 = N − N1 in the other, the statistical weight is

W = N! / (N1! · N2!) .

If a whole mole (N1 = NA) is in one half (and nothing in the other), then

W1 = NA! / (NA! · 0!) = 1   and therefore   S1 = kB ln W1 = 0 .

If the molecules are distributed uniformly over both halves (N1 = N2 = NA/2), the factorials can be approximated by the Stirling formula, where one may restrict oneself to ln N! ≈ N ln N − N. The logarithm of the statistical weight W2 is then

ln W2 = ln [ NA! / ((NA/2)!)² ] ≈ NA ln 2 ,

and thus

S2 = kB ln W2 ≈ kB NA ln 2 .

With kB = 1.3807·10⁻²³ J/K and NA = 6.0220·10²³ mol⁻¹ one obtains for the entropy of the expansion

ΔS = S2 − S1 = kB NA ln 2 ≈ 5.76 J/K .
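
Both routes to this value can be checked with a short Python sketch; the use of math.lgamma for the exact log-factorials (of which the Stirling formula is the large-N limit) is an illustrative choice.

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.02214076e23   # Avogadro constant, 1/mol

# Thermodynamic route: dS = N k_B ln(V2/V1) for doubling the volume of one mole.
dS_thermo = N_A * k_B * math.log(2.0)

# Statistical route: dS = k_B ln W2 with W2 = N!/((N/2)!)^2, the number of ways
# to place N molecules half-and-half into the two volume halves.
dS_stat = k_B * (math.lgamma(N_A + 1.0) - 2.0 * math.lgamma(N_A / 2.0 + 1.0))

print(f"thermodynamic: {dS_thermo:.3f} J/K")  # ~5.763 J/K
print(f"statistical:   {dS_stat:.3f} J/K")    # ~5.763 J/K
```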

In a system that exchanges neither mass nor energy with its environment, the entropy can never decrease spontaneously. For example, one kilogram of water has an entropy of 151 J/K at 10 °C, of 297 J/K at 20 °C and of 437 J/K at 30 °C. 1 kg of cold water (10 °C) and 1 kg of warm water (30 °C) in contact pass spontaneously into the state 2 kg of lukewarm water (20 °C), because the entropy of the initial state (151 + 437 = 588 J/K) is smaller than the entropy of the final state (297 + 297 = 594 J/K). The spontaneous reversal of this process is not possible, since in that case the entropy of the system consisting of 2 kg of water would have to decrease from 594 J/K to 588 J/K, which would contradict the second law of thermodynamics.
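
This balance can be cross-checked roughly with the specific heat of water instead of the tabulated entropy values; the constant value cp ≈ 4.19 kJ/(kg·K) is an assumption of this sketch.

```python
import math

c_p = 4190.0          # J/(kg K), specific heat of liquid water, assumed constant
m = 1.0               # kg per portion
T_cold, T_warm, T_final = 283.15, 303.15, 293.15   # 10 °C, 30 °C, 20 °C in kelvin

dS_cold = m * c_p * math.log(T_final / T_cold)   # > 0, the cold water is heated
dS_warm = m * c_p * math.log(T_final / T_warm)   # < 0, the warm water is cooled

print(f"cold portion: {dS_cold:+.1f} J/K")
print(f"warm portion: {dS_warm:+.1f} J/K")
print(f"total:        {dS_cold + dS_warm:+.1f} J/K")  # ~ +5 J/K, positive as required,
# of the same order as the ~ +6 J/K read off from the tabulated values above.
```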

If lipids, which occur for example as building blocks of biomembranes in living organisms, are added to water, closed membrane structures, so-called vesicles, form spontaneously. Since temperature and pressure are given here (heat bath and pressure ensemble), the thermodynamic potential that seeks a minimum is the free enthalpy ΔG = ΔH − TΔS. The enthalpy ΔH can be measured experimentally and is positive. Since the process runs spontaneously, ΔG must nevertheless be negative; that is, the entropy must increase. At first glance this is confusing, since entropy is usually what drives substances to mix (entropy of mixing). The increase of entropy is due to a special property of water: it forms hydrogen bonds between the individual water molecules, which fluctuate constantly and thus make a large contribution to the entropy of the water. When the lipids are dissolved in water, an extended region arises around their long fatty-acid chains in which no hydrogen bonds can be formed. In the regions around the fatty-acid chains the entropy contribution of the hydrogen bonds is missing, so that the total entropy decreases. This decrease is considerably larger than the increase expected from the mere mixing of water and lipid. When the fatty-acid chains assemble, more hydrogen bonds can be formed again and the entropy increases. One could also phrase this by saying that the ability of water to form fluctuating hydrogen bonds drives the lipids out of solution. This property is ultimately also responsible for the poor solubility of many non-polar substances that interfere with the formation of hydrogen bonds.

In a certain sense a living organism can be regarded as a thermodynamic machine that converts chemical energy into work and heat and at the same time appears to produce entropy. Whether an entropy can be assigned to a biological system at all is, in the present state of research, not settled, since such a system is not in a state of thermodynamic equilibrium.

In addition to its role as a fundamental state variable of phenomenological and statistical thermodynamics, entropy is used in other fields, in particular in information theory and in economics, where it has a meaning of its own. In astrophysics, for example, the concept of entropy is needed in the description of the birth of stars, of white dwarfs, neutron stars and black holes (which have the highest entropy of all known physical systems), of globular clusters, galaxies and galaxy clusters, and ultimately of the whole cosmos.

Statistical Physics

Around 1880 Ludwig Boltzmann was able to explain the entropy at the microscopic level within the statistical physics founded by him and James Maxwell. In statistical mechanics the behaviour of macroscopic thermodynamic systems is explained by the microscopic behaviour of their components, i.e. elementary particles and composite systems such as atoms and molecules. Classically, a microstate is given by specifying all positions and momenta of the particles belonging to the system. Such a microstate is therefore a point in a 6N-dimensional space, which in this context is called phase space. The canonical equations of classical mechanics describe the time evolution of the system, the phase trajectory.

All phase points accessible under given macroscopic constraints, such as total energy E, volume V and particle number N, form a connected phase-space volume. The entropy is a measure of the phase-space volume accessible under the given macroscopic boundary conditions, i.e. of the number of accessible states. The greater the entropy, the less determined is the microscopic state and the less information is known about the system. The fundamental postulate of statistical physics states that in equilibrium each of the accessible microstates of a completely closed system occurs with equal probability, and that the entropy is then maximal (see: maximum entropy method, microcanonical ensemble).

The entropy is proportional to the logarithm of the accessible phase-space volume (or, quantum mechanically, of the number of accessible states Ω) and in the SI system is calculated as

S = kB ln Ω

in units of J/K. To carry out this calculation concretely, the macroscopic observables of the system under consideration must be known. For the ideal gas the result is the Sackur-Tetrode equation. The constant kB is called the Boltzmann constant in recognition of Ludwig Boltzmann's achievements in the development of the statistical theory; he himself, however, never determined its value.
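
For the ideal gas just mentioned, the Sackur-Tetrode equation can be evaluated directly. The following sketch does so for one mole of helium at 298.15 K and 1 atm; the gas and the conditions are chosen purely as an illustration.

```python
import math

k_B = 1.380649e-23     # Boltzmann constant, J/K
h   = 6.62607015e-34   # Planck constant, J s
N_A = 6.02214076e23    # Avogadro constant, 1/mol
R   = k_B * N_A        # gas constant, J/(mol K)

def sackur_tetrode_molar(M: float, T: float, p: float) -> float:
    """Molar entropy of a monatomic ideal gas:
    S/n = R * [ ln( (V/N) * (2*pi*m*k_B*T/h**2)**1.5 ) + 5/2 ]."""
    m = M / N_A                    # mass of one atom (M in kg/mol)
    V_per_N = k_B * T / p          # V/N from the ideal gas law
    thermal = (2.0 * math.pi * m * k_B * T / h**2) ** 1.5
    return R * (math.log(V_per_N * thermal) + 2.5)

# Helium, M = 4.0026 g/mol, at 298.15 K and 101325 Pa:
print(f"{sackur_tetrode_molar(4.0026e-3, 298.15, 101325.0):.1f} J/(mol K)")   # ~126 J/(mol K)
```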

The entropy of thermodynamics equals the Shannon entropy (multiplied by a constant) of that distribution over the states which maximizes the Shannon entropy. The thermodynamic entropy is thus only a special case: that of equilibrium states with maximum ignorance under given boundary conditions.

Quantum mechanics

In quantum statistics a microstate (also called a pure state) is given by a vector in the Hilbert space of the system. Typically this space describes systems of roughly 10²³ particles, of the order of the Avogadro constant, and is correspondingly high-dimensional. The corresponding macrostate is described by a statistical operator, which is also referred to as the density operator.

It contains all the information about the system that is accessible through an ideal measurement (which is much less than for the pure state, the microstate). Classically, the macrostate is given by an ensemble of microstates that share certain "typical macroscopic values" of conserved quantities, such as energy, volume and particle number; the distribution of the microstates in phase space is given by a classical distribution function. In its place, the density operator appears in the quantum mechanical description:

ρ = Σi pi |ψi⟩⟨ψi| .
Here pi is the probability that the system under consideration is in the "pure" state |ψi⟩, provided the quantum states are all orthogonal. The number of degrees of freedom taken into account in this description is usually much smaller than in the full microscopic description.

The expectation value of an observable A in the mixed state described by the density operator ρ is given by a trace:

⟨A⟩ = Tr(ρ A) .

The trace of an operator A is defined as Tr(A) = Σa ⟨a|A|a⟩ for an arbitrary (complete) basis {|a⟩}.

The von Neumann entropy is defined as

S = − Tr(ρ ln ρ) .

Multiplying this dimensionless von Neumann entropy by the Boltzmann constant, one obtains an entropy with the ordinary unit.

In terms of the probabilities pi of the individual pure quantum mechanical states in the macrostate, the entropy is given by

S = − kB Σi pi ln pi ,     (*)

where pi is the probability of finding the system in the i-th microstate and kB is Boltzmann's constant. The probabilities can take values between 0 and 1, so the entropy is positive semidefinite. For pure states, for example p1 = 1, p2 = … = pN = 0, the entropy is zero, since on the one hand ln(1) = 0 and on the other hand x·ln x → 0 as x → 0. Non-zero values of the entropy are obtained by "increasing the ignorance" about the state of the system:

As an example consider a spin system of four electrons. Spin and magnetic moment are antiparallel; the magnetic moment of a spin pointing downward has the energy −μB in the external magnetic field B. Let the ground-state energy of the system be such that exactly one of the four spins points upward, which leads to the following four states:

(↑↓↓↓), (↓↑↓↓), (↓↓↑↓), (↓↓↓↑) .

It follows that the spin degeneracy is W = 4, and, as above, S = kB ln 4 holds here.

The general formula (*) above is, up to a constant factor, identical to the formula for the Shannon information entropy if the natural logarithm is replaced by the logarithm to base 2 (the dual logarithm ld). That is, the physical entropy is a measure of the information about the microstate that one lacks when only the macrostate is known.

Properties of the statistical entropy of a quantum mechanical state

Let ρ and σ be density operators on the Hilbert space H (a small numerical illustration follows the list below).

  • Gibbs inequality: S(ρ) ≤ − Tr(ρ ln σ), with equality exactly for ρ = σ
  • Invariance under unitary transformations of ρ (with U U† = 1): S(U ρ U†) = S(ρ)
  • Minimum: S(ρ) = 0 for pure states ρ = |ψ⟩⟨ψ|
  • Maximum: S(ρ) ≤ ln d, reached when all d orthogonal pure states occur with equal probability 1/d
  • Concavity: S(λ ρ + (1 − λ) σ) ≥ λ S(ρ) + (1 − λ) S(σ) for 0 ≤ λ ≤ 1
  • Triangle inequality (Araki-Lieb): |S(ρA) − S(ρB)| ≤ S(ρAB) ≤ S(ρA) + S(ρB) for a composite system AB
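
A small numerical sketch of the dimensionless von Neumann entropy and of two of the properties listed above (minimum for a pure state, maximum ln d for the maximally mixed state); numpy and the chosen test states are illustrative assumptions, and multiplying by kB would restore the physical unit.

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """Dimensionless von Neumann entropy S = -Tr(rho ln rho), from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]           # 0 * ln 0 -> 0
    return float(-np.sum(evals * np.log(evals)))

d = 4                                      # e.g. the four spin states of the example above
pure = np.zeros((d, d)); pure[0, 0] = 1.0  # a pure state: S = 0 (minimum)
mixed = np.eye(d) / d                      # maximally mixed state: S = ln d (maximum)

print(von_neumann_entropy(pure))                 # ~0.0
print(von_neumann_entropy(mixed), np.log(d))     # both ~1.386

# Concavity: the entropy of a mixture is at least the mixture of the entropies.
lam = 0.5
combo = lam * pure + (1.0 - lam) * mixed
assert von_neumann_entropy(combo) >= lam * 0.0 + (1.0 - lam) * np.log(d) - 1e-9
```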

Entropy at a glance

Thermodynamics describes the state of a substance by means of macroscopically measurable quantities such as temperature, pressure, internal energy and enthalpy, and also by the relative amounts of products and starting materials. There are many such quantities, and some state variables can be represented as functions of others (state functions). One of them is the entropy (unit: joule/kelvin).

While everyday experience connects quantifiable notions with quantities such as length, energy or power and their units, the unit "joule/kelvin" of the quantity entropy is less intuitive. Its meaning is that "a state of higher entropy establishes itself by itself with greater probability". The term was originally coined by Clausius and quantified with the aid of thermodynamic quantities, and was later derived by Boltzmann from a statistical consideration using the atomic and molecular picture.

1 Statistical formulation

What should one imagine by the probability of a state? Consider a substance consisting of many particles. Just as the same state of a patient (temperature, pulse rate) can have very different causes, which only emerge on closer examination, the same thermodynamic state can also come about through many, very different arrangements of the molecules. If one numbers all N molecules of a gas, for example, the state of uniform distribution over a volume could be realized by all molecules with low numbers (up to N/2) staying in one half and those with the higher numbers in the other half. The same state, however, also arises when all even numbers are in one half and all odd numbers in the other. The state "uniform distribution of the gas over the entire volume" is formed by an extremely large number of distribution patterns of the molecules. The state in which the molecules are concentrated in a partial volume while the rest is empty, on the other hand, is formed by considerably fewer configurations of the molecules, as in the following example:

Entropy and the distribution in space

Freely mobile molecules spread evenly over a room. This state has the maximum entropy, or in other words, it is the most probable. This can be illustrated with four molecules (figure) that constantly move back and forth in a beaker between the left and the right half. (To reduce the situation to its essentials, vertical changes of position are not considered.) Very different and constantly changing distributions result. In one snapshot, for example, only the second molecule from the top might be in the right half and the remaining three on the left; in another, molecules two and three might be on the left (highlighted in grey). Each arrangement variant, called a microstate, is equally probable. Stacked in columns are those microstates in which the same number of molecules (none up to a maximum of four) have gathered on one side. They give the overall system the same macroscopic properties (such as the pressure on the left or right wall) and together form one macrostate. The most frequent is the uniform distribution over both halves, since it can be realized by the largest number of microstates (here six). If there are more molecules in the beaker, if more spatial cells are available, or if vertical changes of position are also allowed, the variety of possible arrangements grows rapidly. But the uniform distribution always has the highest number of microstates and establishes itself most of the time, even if the molecules were initially crowded into one corner. This happens by itself and at random, simply because the molecules move.
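
The counting in this figure example can be reproduced directly; the following sketch enumerates all 2⁴ = 16 left/right microstates of four molecules and groups them into macrostates (the script itself is only an illustration).

```python
from itertools import product
from collections import Counter

n = 4  # four molecules, each either in the left (0) or the right (1) half of the beaker

# Every left/right assignment is one microstate; all 2**n of them are equally probable.
microstates = list(product((0, 1), repeat=n))

# A macrostate only records how many molecules are on the right.
macrostate_weights = Counter(sum(state) for state in microstates)

print(len(microstates))           # 16 microstates in total
print(dict(macrostate_weights))   # {0: 1, 1: 4, 2: 6, 3: 4, 4: 1}: the even split is the most frequent
```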

The number of microstates of a macrostate is its "statistical weight W". For gases, with their large particle numbers, W is a very large number (unlike a mathematical probability), but it can also be very small when strong forces act between the particles (zero-point entropy). The natural logarithm of the statistical weight W of a state, multiplied by the Boltzmann constant, gives its entropy S:

S = kB ln W
If the conditions imposed on a substance are changed (for example, the volume increases on mixing), the molecules settle into a new state, namely the one that is the most probable under the changed conditions. The reason for this lies in the motion of the molecules, which continually redistribute themselves over space and energy levels. The approach to the equilibrium state is characterized, not caused, by the increase of entropy. Entropy is a measure of the probability of a state; it is neither an energy nor a kind of substance, but the measure of a state of higher or lower probability.

Via the spontaneous establishment of the most probable macrostate with respect to the spatial distribution, the Joule-Thomson effect, osmosis and the constancy of temperature during melting can be interpreted.

In brief: in general a microstate is a particular distribution pattern of the molecules, not only in space but also over the degrees of freedom of motion (e.g. translation and vibration), over the energy levels and over the reactant and product molecules of a reaction. Many microstates can form the same macroscopic state, the "macrostate", which can then be characterized by macroscopically measurable thermodynamic quantities. Since the molecules are constantly buzzing about like a swarm of bees, all microstates are equally probable; the individual macrostates, however, are not, since they can be made up of very different numbers of microstates. The system approaches by itself the most probable state, the one with the largest number of microstates. From the statistical weight W the entropy can be calculated; it is a (logarithmic) measure of the number of ways W in which a system that is macroscopically the same can be realized microscopically.

2 Thermodynamic formulation

Looking at the Carnot cycle one finds that the higher the temperature, the more heat flows between the reservoir and the working medium, and vice versa. (This is understandable with the help of the molecular picture, since heat transfer rests on the kinetic energy of the molecules, which is proportional to the absolute temperature.) Even without the molecular picture Clausius used this proportionality: in the Carnot cycle the quantity Q/T, the reduced heat, has the same absolute value during the heat inflow (at high temperature) and during the cooling (at the lower temperature) of the working medium, so that, at least in the reversible case, its sum is zero.

In the first step of the Carnot process, the isothermal expansion, the supplied thermal energy QT1 is converted into potential energy during the expansion of the working medium and thus stored (an arrangement that accomplishes this is described there). With this stored energy the process could be reversed. If this conversion of QT1 into Epot does not succeed completely, perhaps because friction occurs (irreversible case), less work is stored (which would no longer suffice for a reversal) and the difference appears as additionally emitted (frictional) heat. This amount of heat counts as negative, so the sum of all Q/T contributions also becomes negative; the greater the degree of irreversibility, the more negative the sum. The quantity Q/T thus also provides a quantitative description of irreversibility. In this context Clausius defined the function S through

dS = δQrev / T .
In short: from macroscopically measurable thermodynamic quantities a state function can be derived that reflects the degree of irreversibility.

3 Equivalence of the entropy concepts

In the expansion of a gas, two different processes starting from the same initial state can lead to the same final state:

  • reversible, isothermal expansion, in which the gas does work against a piston and absorbs an equivalent amount of heat from the surroundings, and
  • irreversible expansion into a vacuum (overflow experiment), in which neither work is done nor heat is exchanged.
Despite the two different paths, the change that remains in the working medium at the end of the isothermal expansion is the same in both cases. Microscopically, the final state comprises a larger number of microstates and thus has a larger statistical weight W.

The probability expressed by the statistical weight plays in statistical thermodynamics the same role as the entropy in chemical thermodynamics, so a relationship must exist between the entropy and the statistical weight. To find it, consider two independent particle systems as one total system. The total entropy S must be equal to the sum of the individual entropies S1 + S2 (state functions are additive). The statistical weights W1 and W2, however, must be multiplied, since each microstate of the one system together with any microstate of the other forms a new microstate of the total system (W = W1 · W2). Both requirements are met if the entropy is a logarithmic function of the statistical weight W:

S = k* · ln W
The constant k* is at first chosen arbitrarily. It turns out to be equal to the Boltzmann constant, kB = R/NA = 1.38·10⁻²³ joule/kelvin, if the statistical entropy is to coincide with the thermodynamic entropy, as the comparison below shows. S = kB ln W thus has the same dimension as the thermodynamic state variable, namely joule/kelvin. While with macroscopically measurable quantities only entropy differences can be determined (e.g. before and after heating or a reaction), statistical thermodynamics allows the calculation of absolute entropies.

Clausius defined the entropy via dS = δQrev/T. Since in a reversible isothermal expansion from V1 to V2 the amount of heat Qrev = n R T ln(V2/V1) is transferred, the entropy change is

ΔS = Qrev / T = n R ln(V2/V1) ,

with the dimension J/K; for the doubling of the volume of one mole this amounts to R ln 2 ≈ 5.76 J/K.

If an ideal gas expands isothermally and irreversibly into a vacuum, the entropy change cannot be calculated in this way, since Clausius's definition of entropy refers to reversibly exchanged amounts of heat. As a state variable, however, the entropy must change by the same amount; all that is needed is the knowledge of the entropy before and after the irreversible expansion.

According to Planck, the entropy is given by S = k · ln W, with W the statistical weight of a macrostate, i.e. the number of its microstates. The argument starts from the microstates (figure in the section Entropy and the distribution in space) that four molecules can form, for example, when each of them can independently reside in the left or the right half of a beaker. One half is the initial volume V1; the whole beaker has twice that volume.

Apart from permutations of the particles among themselves, the arrangement in which all N molecules (N an arbitrary number) reside in the left half of the volume has the statistical weight W1 = 1. The entropy is

S1 = k ln W1 = k ln 1 = 0 .

After the irreversible expansion, the number of possible arrangements, and thus the statistical weight of a macrostate characterized by N1 and N2, is

W = N! / (N1! · N2!) ,

or, written as a binomial coefficient, "N choose N1".

N is the total number of molecules, N1 and N2 the numbers of molecules in the left and in the right sub-volume. To characterize the state after the expansion into the vacuum of the right half, all macrostates have to be added up, since less probable states may also occur for short times. Altogether the sum over all arrangements of the N molecules is

W2 = Σ (over N1 = 0 … N) N! / (N1! · (N − N1)!) .

This sum can be evaluated with the binomial theorem: the expression

(x + y)^N = Σ (over N1 = 0 … N) [ N! / (N1! · (N − N1)!) ] · x^N1 · y^(N−N1)

can be written out as a polynomial, and for the case x = 1, y = 1 it simplifies to

W2 = (1 + 1)^N = 2^N .

The entropy after the spreading of the molecules over twice the volume is therefore

S2 = k ln W2 = k N ln 2 .

Substituting for N the Avogadro constant NA, one obtains S2, the molar entropy after the expansion:

S2 = k NA ln 2 = R ln 2 ≈ 5.76 J/(mol·K) ,

and ΔS = S2 − S1 = R ln 2 (since S1 = 0).

According to classical thermodynamics, for a change of volume

ΔS = n R ln(V2/V1) ,

so for a doubling of the volume by isothermal reversible expansion likewise ΔS = R ln 2 per mole.

According to statistical thermodynamics, a particular macrostate in which n1, n2 and n3 particles occupy three equal sub-volumes comprises N!/(n1!·n2!·n3!) microstates, with the condition n1 + n2 + n3 = N. The sum of the microstates of all macrostates together can be determined with the multinomial theorem (polynomial theorem):

(1 + 1 + 1)^N = Σ N! / (n1! · n2! · n3!) = 3^N .

Thus, for example, an expansion to three times the volume gives 3^N for all microstates together, and hence S = k N ln 3.

In short, the microscopic and the macroscopic approach in thermodynamics lead to the same entropy.

4 Calculation and Use of tabulated entropy values

The molar entropy Smol at a certain temperature T2 and at constant pressure p is obtained with the aid of the molar heat capacity cp(T) by integration from absolute zero up to the current temperature:

Smol(T2) = ∫ (from 0 to T2) cp(T)/T dT .
In addition, entropy contributions from phase transitions have to be included. Following Planck, the entropy of an ideally crystallized, pure solid is set to zero at absolute zero (mixtures or frustrated crystals, on the other hand, retain a residual entropy). Under standard conditions one speaks of the standard entropy S0. Even from the statistical viewpoint, entropy and heat capacity are connected: a high heat capacity means that a molecule can store a lot of energy, which can rest, for example, on a large number of low-lying and therefore easily reachable energy levels. Correspondingly there are then many different ways for the molecules to distribute themselves over these levels, which also leads to a high entropy value for the most probable state.

In electrochemical reactions, the reaction entropy ΔS results from the measured change of the electromotive force E with temperature:

ΔS = z F (dE/dT)     (z = number of transferred electrons, F = Faraday constant).
The entropy change for ideal mixtures is obtained with the help of the mole fractions xi of the substances involved:

ΔSmix = − n R Σ xi ln xi ,

to which in real mixtures an additional excess entropy is added, because mixing changes the intermolecular forces.
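
A minimal sketch for the ideal mixing entropy just given; the composition used is arbitrary illustration data.

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def ideal_mixing_entropy(x, n_total=1.0):
    """Ideal entropy of mixing, dS_mix = -n R sum_i x_i ln x_i (n_total in mol)."""
    assert abs(sum(x) - 1.0) < 1e-9
    return -n_total * R * sum(xi * math.log(xi) for xi in x if xi > 0)

# Example: one mole of an equimolar binary mixture.
print(ideal_mixing_entropy([0.5, 0.5]))   # R ln 2, about 5.76 J/K
```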

If a chemical reaction produces new molecules, the highest entropy is reached in a particular state of equilibrium, in which the molecules can distribute themselves over both the reactant and the product levels. The equilibrium constant K can be calculated via the following relation, in which the differences of the standard entropies ΔS0 of the substances involved play an important role:

R T ln K = − ΔH0 + T ΔS0
(here Δ denotes the change of the quantity on complete conversion of the reaction). How strongly a spontaneous process (e.g. chemical reactions, dissolution and mixing processes, the establishment of phase equilibria and their temperature dependence, osmosis and others) runs can be estimated from the increase of the total entropy between the initial state and the equilibrium state, of the reactants and the surroundings taken together (→ chemical equilibrium). The spontaneous increase of entropy is in turn a consequence of the constant motion of the molecules.
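
The relation above can be evaluated numerically as follows; the values chosen for ΔH0 and ΔS0 are placeholder illustration data, not data for any specific reaction.

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def equilibrium_constant(dH0: float, dS0: float, T: float) -> float:
    """K from R*T*ln K = -dH0 + T*dS0, i.e. dG0 = dH0 - T*dS0 = -R*T*ln K."""
    return math.exp((-dH0 + T * dS0) / (R * T))

# Placeholder values: dH0 = -20 kJ/mol, dS0 = +50 J/(mol K), T = 298.15 K
print(f"K = {equilibrium_constant(-20_000.0, 50.0, 298.15):.3g}")   # K >> 1: products favoured
```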

In short: the standard entropy of substances can be calculated from the course of the heat capacity with temperature. Knowledge of tabulated entropy values (together with the reaction enthalpies) allows the chemical equilibrium to be predicted.

Bekenstein-Hawking entropy of black holes

Bekenstein and Hawking have shown that a formal entropy, and with it a formal temperature, can be attributed to the astrophysical objects known as "black holes" if this entropy is essentially identified with the surface area A of the so-called event horizon. Details can be found in the article mentioned.
