Thermodynamics is the branch of physics that deals with heat and temperature, and their relation to energy, work, radiation, and properties of matter. The behavior of these quantities is governed by the four laws of thermodynamics which convey a quantitative description using measurable macroscopic physical quantities, but may be explained in terms of microscopic constituents by statistical mechanics. Thermodynamics applies to a wide variety of topics in science and engineering, especially physical chemistry, chemical engineering and mechanical engineering, but also in fields as complex as meteorology.
Historically, thermodynamics developed out of a desire to increase the efficiency of early steam engines, particularly through the work of French physicist Nicolas Léonard Sadi Carnot (1824) who believed that engine efficiency was the key that could help France win the Napoleonic Wars. Scots-Irish physicist Lord Kelvin was the first to formulate a concise definition of thermodynamics in 1854 which stated, "Thermo-dynamics is the subject of the relation of heat to forces acting between contiguous parts of bodies, and the relation of heat to electrical agency."
The initial application of thermodynamics to mechanical heat engines was extended early on to the study of chemical compounds and chemical reactions. Chemical thermodynamics studies the nature of the role of entropy in the process of chemical reactions and has provided the bulk of expansion and knowledge of the field. Other formulations of thermodynamics emerged. Statistical thermodynamics, or statistical mechanics, concerns itself with statistical predictions of the collective motion of particles from their microscopic behavior. In 1909, Constantin Carathéodory presented a purely mathematical approach in an axiomatic formulation, a description often referred to as geometrical thermodynamics.
A description of any thermodynamic system employs the four laws of thermodynamics, which form an axiomatic basis. The first law specifies that energy can be exchanged between physical systems as heat and work. The second law defines the existence of a quantity called entropy, which describes the direction in which a system can thermodynamically evolve, quantifies the state of order of a system, and can be used to quantify the useful work that can be extracted from the system.
In thermodynamics, interactions between large ensembles of objects are studied and categorized. Central to this are the concepts of the thermodynamic system and its surroundings. A system is composed of particles, whose average motions define its properties, and those properties are in turn related to one another through equations of state. Properties can be combined to express internal energy and thermodynamic potentials, which are useful for determining conditions for equilibrium and spontaneous processes.
With these tools, thermodynamics can be used to describe how systems respond to changes in their environment. This can be applied to a wide variety of topics in science and engineering, such as engines, phase transitions, chemical reactions, transport phenomena, and even black holes. The results of thermodynamics are essential for other fields of physics and for chemistry, chemical engineering, corrosion engineering, aerospace engineering, mechanical engineering, cell biology, biomedical engineering, materials science, and economics, to name a few.
This article is focused mainly on classical thermodynamics which primarily studies systems in thermodynamic equilibrium. Non-equilibrium thermodynamics is often treated as an extension of the classical treatment, but statistical mechanics has brought many advances to that field.
The history of thermodynamics as a scientific discipline generally begins with Otto von Guericke who, in 1650, designed and built the world's first vacuum pump and demonstrated a vacuum using his Magdeburg hemispheres. Guericke was driven to make a vacuum in order to disprove Aristotle's long-held supposition that 'nature abhors a vacuum'. Shortly after Guericke, the English physicist and chemist Robert Boyle had learned of Guericke's designs and, in 1656, in coordination with English scientist Robert Hooke, built an air pump. Using this pump, Boyle and Hooke noticed a correlation between pressure, temperature, and volume. In time, Boyle's Law was formulated, which states that for a gas at constant temperature, pressure and volume are inversely proportional. Then, in 1679, based on these concepts, an associate of Boyle's named Denis Papin built a steam digester, which was a closed vessel with a tightly fitting lid that confined steam until a high pressure was generated.
Later designs implemented a steam release valve that kept the machine from exploding. By watching the valve rhythmically move up and down, Papin conceived of the idea of a piston and a cylinder engine. He did not, however, follow through with his design. Nevertheless, in 1697, based on Papin's designs, engineer Thomas Savery built the first engine, followed by Thomas Newcomen in 1712. Although these early engines were crude and inefficient, they attracted the attention of the leading scientists of the time.
The fundamental concepts of heat capacity and latent heat, which were necessary for the development of thermodynamics, were developed by Professor Joseph Black at the University of Glasgow, where James Watt was employed as an instrument maker. Black and Watt performed experiments together, but it was Watt who conceived the idea of the external condenser which resulted in a large increase in steam engine efficiency. Drawing on all the previous work led Sadi Carnot, the "father of thermodynamics", to publish Reflections on the Motive Power of Fire (1824), a discourse on heat, power, energy and engine efficiency. The book outlined the basic energetic relations between the Carnot engine, the Carnot cycle, and motive power. It marked the start of thermodynamics as a modern science.
The first thermodynamic textbook was written in 1859 by William Rankine, originally trained as a physicist and a civil and mechanical engineering professor at the University of Glasgow. The first and second laws of thermodynamics emerged simultaneously in the 1850s, primarily out of the works of William Rankine, Rudolf Clausius, and William Thomson (Lord Kelvin).
During the years 1873–76 the American mathematical physicist Josiah Willard Gibbs published a series of three papers, the most famous being On the Equilibrium of Heterogeneous Substances, in which he showed how thermodynamic processes, including chemical reactions, could be graphically analyzed. By studying the energy, entropy, volume, temperature and pressure of the thermodynamic system in such a manner, one can determine whether a process would occur spontaneously. Also in the 19th century, Pierre Duhem wrote about chemical thermodynamics. During the early 20th century, chemists such as Gilbert N. Lewis, Merle Randall, and E. A. Guggenheim applied the mathematical methods of Gibbs to the analysis of chemical processes.
The etymology of thermodynamics has an intricate history. It was first spelled in a hyphenated form as an adjective (thermo-dynamic) and from 1854 to 1868 as the noun thermo-dynamics to represent the science of generalized heat engines.
American biophysicist Donald Haynie claims that thermodynamics was coined in 1840 from the Greek root θέρμη therme, meaning heat and δύναμις dynamis, meaning power. However, this etymology has been cited as unlikely.
Pierre Perrot claims that the term thermodynamics was coined by James Joule in 1858 to designate the science of relations between heat and power; however, Joule never used that term, but instead used the term perfect thermo-dynamic engine in reference to Thomson's 1849 phraseology.
The study of thermodynamical systems has developed into several related branches, each using a different fundamental model as a theoretical or experimental basis, or applying the principles to varying types of systems.
Classical thermodynamics is the description of the states of thermodynamic systems at near-equilibrium, that uses macroscopic, measurable properties. It is used to model exchanges of energy, work and heat based on the laws of thermodynamics. The qualifier classical reflects the fact that it represents the first level of understanding of the subject as it developed in the 19th century and describes the changes of a system in terms of macroscopic empirical (large scale, and measurable) parameters. A microscopic interpretation of these concepts was later provided by the development of statistical mechanics.
Statistical mechanics, also called statistical thermodynamics, emerged with the development of atomic and molecular theories in the late 19th century and early 20th century, and supplemented classical thermodynamics with an interpretation of the microscopic interactions between individual particles or quantum-mechanical states. This field relates the microscopic properties of individual atoms and molecules to the macroscopic, bulk properties of materials that can be observed on the human scale, thereby explaining classical thermodynamics as a natural result of statistics, classical mechanics, and quantum theory at the microscopic level.
Equilibrium thermodynamics is the systematic study of transfers of matter and energy in systems as they pass from one state of thermodynamic equilibrium to another. The term 'thermodynamic equilibrium' indicates a state of balance. In an equilibrium state there are no unbalanced potentials, or driving forces, between macroscopically distinct parts of the system. A central aim in equilibrium thermodynamics is: given a system in a well-defined initial equilibrium state, and given its surroundings, and given its constitutive walls, to calculate what will be the final equilibrium state of the system after a specified thermodynamic operation has changed its walls or surroundings.
Non-equilibrium thermodynamics is a branch of thermodynamics that deals with systems that are not in thermodynamic equilibrium. Most systems found in nature are not in thermodynamic equilibrium because they are not in stationary states, and are continuously and discontinuously subject to flux of matter and energy to and from other systems. The thermodynamic study of non-equilibrium systems requires more general concepts than are dealt with by equilibrium thermodynamics. Many natural systems still today remain beyond the scope of currently known macroscopic thermodynamic methods.
Thermodynamics is principally based on a set of four laws which are universally valid when applied to systems that fall within the constraints implied by each. In the various theoretical descriptions of thermodynamics these laws may be expressed in seemingly differing forms, but the most prominent formulations are the following:
Zeroth law of thermodynamics: If two systems are each in thermal equilibrium with a third system, they are also in thermal equilibrium with each other.

This statement implies that thermal equilibrium is an equivalence relation on the set of thermodynamic systems under consideration. Systems are said to be in equilibrium if the small, random exchanges between them (e.g. Brownian motion) do not lead to a net change in energy. This law is tacitly assumed in every measurement of temperature. Thus, if one seeks to decide whether two bodies are at the same temperature, it is not necessary to bring them into contact and measure any changes of their observable properties in time. The law provides an empirical definition of temperature and a justification for the construction of practical thermometers.
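As a sketch of this equivalence-relation structure, the following hypothetical Python snippet groups bodies into "same temperature" classes from pairwise equilibrium observations; the class and method names are illustrative, not standard:

```python
# Sketch: thermal equilibrium as an equivalence relation (zeroth law).
# Bodies observed in pairwise equilibrium are merged into one class;
# by transitivity, any two bodies in the same class share a temperature.

class EquilibriumClasses:
    def __init__(self):
        self.parent = {}

    def find(self, body):
        # Union-find root lookup with path halving.
        self.parent.setdefault(body, body)
        while self.parent[body] != body:
            self.parent[body] = self.parent[self.parent[body]]
            body = self.parent[body]
        return body

    def observe_equilibrium(self, a, b):
        # Record that bodies a and b are in thermal equilibrium.
        self.parent[self.find(a)] = self.find(b)

    def same_temperature(self, a, b):
        # Same class implies same temperature, without direct contact.
        return self.find(a) == self.find(b)

eq = EquilibriumClasses()
eq.observe_equilibrium("A", "C")      # A in equilibrium with C
eq.observe_equilibrium("B", "C")      # B in equilibrium with C
print(eq.same_temperature("A", "B"))  # True, by the zeroth law
```

Here body "C" plays the role of the thermometer: once A and B each equilibrate with it, they are known to be at the same temperature without ever touching.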
The zeroth law was not initially named as a law of thermodynamics, as its basis in thermodynamical equilibrium was implied in the other laws. The first, second, and third laws had been explicitly stated prior and found common acceptance in the physics community. Once the importance of the zeroth law for the definition of temperature was realized, it was impracticable to renumber the other laws, hence it was numbered the zeroth law.
The first law of thermodynamics is an expression of the principle of conservation of energy. It states that energy can be transformed (changed from one form to another), but cannot be created or destroyed.
The first law is usually formulated by saying that the change in the internal energy of a closed thermodynamic system is equal to the difference between the heat supplied to the system and the amount of work done by the system on its surroundings. It is important to note that internal energy is a state of the system (see Thermodynamic state) whereas heat and work modify the state of the system. In other words, a change of internal energy of a system may be achieved by any combination of heat and work added or removed from the system as long as those total to the change of internal energy. The manner by which a system achieves its internal energy is path independent.
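The path independence described above can be illustrated with a minimal sketch (numbers are illustrative): two different combinations of heat and work yield the same change of internal energy.

```python
# Sketch: path independence of internal energy (first law).
# Two hypothetical paths between the same end states exchange
# different heat Q and work W, but give the same Delta U = Q - W.

def delta_internal_energy(heat_in, work_out):
    # First law for a closed system; work_out is work done BY the system.
    return heat_in - work_out

path_1 = delta_internal_energy(heat_in=500.0, work_out=200.0)  # J
path_2 = delta_internal_energy(heat_in=800.0, work_out=500.0)  # J
print(path_1, path_2)  # both 300.0 J: Delta U depends only on end states
```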
The second law of thermodynamics is an expression of the universal principle of decay observable in nature. The second law is an observation of the fact that over time, differences in temperature, pressure, and chemical potential tend to even out in a physical system that is isolated from the outside world. Entropy is a measure of how much this process has progressed. The entropy of an isolated system which is not in equilibrium will tend to increase over time, approaching a maximum value at equilibrium. However, principles guiding systems that are far from equilibrium are still debated. One such principle is the maximum entropy production principle. It states that non-equilibrium systems behave in such a way as to maximize their entropy production.
In classical thermodynamics, the second law is a basic postulate applicable to any system involving heat energy transfer; in statistical thermodynamics, the second law is a consequence of the assumed randomness of molecular chaos. There are many versions of the second law, but they all have the same effect, which is to explain the phenomenon of irreversibility in nature.
The third law of thermodynamics is a statistical law of nature regarding entropy and the impossibility of reaching absolute zero of temperature. This law provides an absolute reference point for the determination of entropy. The entropy determined relative to this point is the absolute entropy. Alternate definitions are, "the entropy of all systems and of all states of a system is smallest at absolute zero," or equivalently "it is impossible to reach the absolute zero of temperature by any finite number of processes".
Absolute zero, at which all molecular activity would cease if it could be reached, is −273.15 °C (degrees Celsius), or −459.67 °F (degrees Fahrenheit), or 0 K (kelvin), or 0 °R (degrees Rankine).
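These equivalent values can be checked with small conversion helpers, a minimal sketch using the standard scale offsets (the function names are mine):

```python
# Sketch: the four temperature scales above, anchored at absolute zero.
# Conversion formulas are the standard ones.

def celsius_to_kelvin(c):
    return c + 273.15

def celsius_to_fahrenheit(c):
    return c * 9 / 5 + 32

def fahrenheit_to_rankine(f):
    return f + 459.67

abs_zero_c = -273.15
print(round(celsius_to_kelvin(abs_zero_c), 2))       # 0.0 K
print(round(celsius_to_fahrenheit(abs_zero_c), 2))   # -459.67 F
print(round(fahrenheit_to_rankine(-459.67), 2))      # 0.0 R
```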
An important concept in thermodynamics is the thermodynamic system, which is a precisely defined region of the universe under study. Everything in the universe except the system is called the surroundings. A system is separated from the remainder of the universe by a boundary which may be a physical boundary or notional, but which by convention defines a finite volume. Exchanges of work, heat, or matter between the system and the surroundings take place across this boundary.
In practice, the boundary of a system is simply an imaginary dotted line drawn around a volume within which there is going to be a change in the internal energy of that volume. Anything that passes across the boundary and effects a change in the internal energy of the system needs to be accounted for in the energy balance equation. The volume can be the region surrounding a single atom resonating energy, such as Max Planck defined in 1900; it can be a body of steam or air in a steam engine, such as Sadi Carnot defined in 1824; it can be the body of a tropical cyclone, such as Kerry Emanuel theorized in 1986 in the field of atmospheric thermodynamics; or it could be just one nuclide (i.e. a system of quarks), as hypothesized in quantum thermodynamics, or the event horizon of a black hole.
Boundaries are of four types: fixed, movable, real, and imaginary. For example, in an engine, a fixed boundary means the piston is locked at its position, within which a constant volume process might occur. If the piston is allowed to move, that boundary is movable, while the cylinder and cylinder head boundaries are fixed. For closed systems, boundaries are real, while for open systems boundaries are often imaginary. In the case of a jet engine, a fixed imaginary boundary might be assumed at the intake of the engine, fixed boundaries along the surface of the case, and a second fixed imaginary boundary across the exhaust nozzle.
Generally, thermodynamics distinguishes three classes of systems, defined in terms of what is allowed to cross their boundaries:
|Type of system|Mass flow|Work|Heat|
|Open|Yes|Yes|Yes|
|Closed|No|Yes|Yes|
|Thermally isolated|No|Yes|No|
|Mechanically isolated|No|No|Yes|
|Isolated|No|No|No|
As time passes in an isolated system, internal differences of pressures, densities, and temperatures tend to even out. A system in which all equalizing processes have gone to completion is said to be in a state of thermodynamic equilibrium.
Once in thermodynamic equilibrium, a system's properties are, by definition, unchanging in time. Systems in equilibrium are much simpler and easier to understand than systems which are not in equilibrium. Often, when analysing a dynamic thermodynamic process, the simplifying assumption is made that each intermediate state in the process is at equilibrium. Thermodynamic processes which develop so slowly as to allow each intermediate step to be an equilibrium state are said to be reversible processes.
When a system is at equilibrium under a given set of conditions, it is said to be in a definite thermodynamic state. The state of the system can be described by a number of state quantities that do not depend on the process by which the system arrived at its state. They are called intensive variables or extensive variables according to how they change when the size of the system changes. The properties of the system can be described by an equation of state which specifies the relationship between these variables. State may be thought of as the instantaneous quantitative description of a system with a set number of variables held constant.
A thermodynamic process may be defined as the energetic evolution of a thermodynamic system proceeding from an initial state to a final state. It can be described by process quantities. Typically, each thermodynamic process is distinguished from other processes in energetic character according to what parameters, such as temperature, pressure, or volume, are held fixed. Furthermore, it is useful to group these processes into pairs, in which each variable held constant is one member of a conjugate pair.
Several commonly studied thermodynamic processes are:

- Adiabatic process: occurs without loss or gain of energy by heat
- Isenthalpic process: occurs at a constant enthalpy
- Isentropic process: a reversible adiabatic process, occurs at a constant entropy
- Isobaric process: occurs at constant pressure
- Isochoric process: occurs at constant volume
- Isothermal process: occurs at a constant temperature
There are two types of thermodynamic instruments: the meter and the reservoir. A thermodynamic meter is any device which measures any parameter of a thermodynamic system. In some cases, the thermodynamic parameter is actually defined in terms of an idealized measuring instrument. For example, the zeroth law states that if two bodies are in thermal equilibrium with a third body, they are also in thermal equilibrium with each other. This principle, as noted by James Clerk Maxwell in 1872, asserts that it is possible to measure temperature. An idealized thermometer is a sample of an ideal gas at constant pressure. From the ideal gas law pV = nRT, the volume of such a sample can be used as an indicator of temperature; in this manner it defines temperature. Although pressure is defined mechanically, a pressure-measuring device, called a barometer, may also be constructed from a sample of an ideal gas held at a constant temperature. A calorimeter is a device which is used to measure and define the internal energy of a system.
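The constant-pressure gas thermometer can be sketched numerically, assuming an ideal gas and illustrative values; the measured volume indicates temperature through pV = nRT:

```python
# Sketch: an idealized constant-pressure gas thermometer using the
# ideal gas law pV = nRT. All numerical values are illustrative.

R = 8.314  # J/(mol K), molar gas constant

def temperature_from_volume(p, V, n):
    # Solve pV = nRT for T: the observed volume indicates temperature.
    return p * V / (n * R)

p = 101325.0       # Pa, held constant
n = 1.0            # mol of ideal gas in the sample
V_sample = 0.0248  # m^3, observed volume of the sample
T = temperature_from_volume(p, V_sample, n)
print(round(T, 1))  # temperature in kelvin indicated by the volume
```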
A thermodynamic reservoir is a system which is so large that its state parameters are not appreciably altered when it is brought into contact with the system of interest. When the reservoir is brought into contact with the system, the system is brought into equilibrium with the reservoir. For example, a pressure reservoir is a system at a particular pressure, which imposes that pressure upon the system to which it is mechanically connected. The Earth's atmosphere is often used as a pressure reservoir. If ocean water is used to cool a power plant, the ocean is often a temperature reservoir in the analysis of the power plant cycle.
The central concept of thermodynamics is that of energy, the ability to do work. By the First Law, the total energy of a system and its surroundings is conserved. Energy may be transferred into a system by heating, compression, or addition of matter, and extracted from a system by cooling, expansion, or extraction of matter. In mechanics, for example, energy transfer equals the product of the force applied to a body and the resulting displacement.
Conjugate variables are pairs of thermodynamic concepts, with the first being akin to a "force" applied to some thermodynamic system, the second being akin to the resulting "displacement," and the product of the two equalling the amount of energy transferred. The common conjugate variables are:

- Pressure–volume (the mechanical parameters);
- Temperature–entropy (the thermal parameters);
- Chemical potential–particle number (the material parameters).
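A minimal numerical sketch of the "force times displacement" pairing, using the standard pressure–volume and temperature–entropy pairs with illustrative values:

```python
# Sketch: energy transferred equals a conjugate "force" times the
# conjugate "displacement". Numbers are purely illustrative.

transfers = {
    "work (pressure x volume change)": 101325.0 * 0.001,  # p * dV, J
    "heat (temperature x entropy change)": 300.0 * 2.0,   # T * dS, J
}

for name, energy in transfers.items():
    print(name, "=", round(energy, 3), "J")
```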
Thermodynamic potentials are different quantitative measures of the stored energy in a system. Potentials are used to measure the energy changes in systems as they evolve from an initial state to a final state. The potential used depends on the constraints of the system, such as constant temperature or pressure. For example, the Helmholtz and Gibbs energies are the energies available in a system to do useful work when the temperature and volume or the pressure and temperature are fixed, respectively.
The five most well known potentials are:
Internal energy: U
Helmholtz free energy: A = U − TS
Enthalpy: H = U + pV
Gibbs free energy: G = U + pV − TS
Landau potential (grand potential): Ω = U − TS − μN
Thermodynamic potentials can be derived from the energy balance equation applied to a thermodynamic system. Other thermodynamic potentials can also be obtained through Legendre transformation.
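A minimal sketch of the standard definitions of these potentials, computed from illustrative state values (the function and variable names are mine):

```python
# Sketch: the common thermodynamic potentials computed from state
# variables U, T, S, p, V, mu, N. All numbers are illustrative.

def potentials(U, T, S, p, V, mu=0.0, N=0.0):
    H = U + p * V               # enthalpy
    A = U - T * S               # Helmholtz free energy
    G = U + p * V - T * S       # Gibbs free energy
    Omega = U - T * S - mu * N  # Landau (grand) potential
    return {"U": U, "H": H, "A": A, "G": G, "Omega": Omega}

state = potentials(U=1000.0, T=300.0, S=2.0, p=101325.0, V=0.001)
# Consistency checks implied by the definitions:
#   G = H - TS  and  G = A + pV
print(round(state["G"], 3))
```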
The following titles are more technical:
Atmospheric thermodynamics is the study of heat-to-work transformations (and their reverse) that take place in the earth's atmosphere and manifest as weather or climate. Atmospheric thermodynamics uses the laws of classical thermodynamics to describe and explain such phenomena as the properties of moist air, the formation of clouds, atmospheric convection, boundary layer meteorology, and vertical instabilities in the atmosphere. Atmospheric thermodynamic diagrams are used as tools in the forecasting of storm development. Atmospheric thermodynamics forms a basis for cloud microphysics and convection parameterizations used in numerical weather models and is used in many climate considerations, including convective-equilibrium climate models.

Black hole thermodynamics
In physics, black hole thermodynamics is the area of study that seeks to reconcile the laws of thermodynamics with the existence of black-hole event horizons. As the study of the statistical mechanics of black-body radiation led to the advent of the theory of quantum mechanics, the effort to understand the statistical mechanics of black holes has had a deep impact upon the understanding of quantum gravity, leading to the formulation of the holographic principle.

Chemical thermodynamics
Chemical thermodynamics is the study of the interrelation of heat and work with chemical reactions or with physical changes of state within the confines of the laws of thermodynamics. Chemical thermodynamics involves not only laboratory measurements of various thermodynamic properties, but also the application of mathematical methods to the study of chemical questions and the spontaneity of processes.
The structure of chemical thermodynamics is based on the first two laws of thermodynamics. Starting from the first and second laws of thermodynamics, four equations called the "fundamental equations of Gibbs" can be derived. From these four, a multitude of equations relating the thermodynamic properties of the thermodynamic system can be derived using relatively simple mathematics. This outlines the mathematical framework of chemical thermodynamics.

Critical point (thermodynamics)
In thermodynamics, a critical point (or critical state) is the end point of a phase equilibrium curve. The most prominent example is the liquid–vapor critical point, the end point of the pressure–temperature curve that designates conditions under which a liquid and its vapor can coexist. At higher temperatures, the gas cannot be liquefied by pressure alone. At the critical point, defined by a critical temperature Tc and a critical pressure pc, phase boundaries vanish. Other examples include the liquid–liquid critical points in mixtures.

Entropy
In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number Ω of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Under the assumption that each microstate is equally probable, the entropy is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant kB. Formally (assuming equiprobable microstates),

S = kB ln Ω.
Macroscopic systems typically have a very large number Ω of possible microscopic configurations. For example, the entropy of an ideal gas is proportional to the number of gas molecules N. The number of molecules in twenty liters of gas at room temperature and atmospheric pressure is roughly N ≈ 6×10²³ (the Avogadro number). At equilibrium, each of the Ω ≈ e^N configurations can be regarded as random and equally likely.
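A minimal sketch of the Boltzmann relation for a macroscopic microstate count; since Ω ≈ e^N overflows floating point, the code works with ln Ω directly:

```python
# Sketch: Boltzmann entropy S = kB ln(Omega) for equiprobable
# microstates. For a macroscopic system, Omega ~ e^N is far too large
# to represent directly, so we pass ln(Omega) ~ N instead.

k_B = 1.380649e-23  # J/K, Boltzmann constant (exact SI value)

def boltzmann_entropy(ln_omega):
    return k_B * ln_omega

N = 6.02214076e23          # roughly one mole of molecules
S = boltzmann_entropy(N)   # ln(Omega) ~ N when Omega ~ e^N
print(round(S, 3))         # about 8.314 J/K, i.e. N*kB (the molar gas constant)
```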
The second law of thermodynamics states that the entropy of an isolated system never decreases over time. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that amount so that the total entropy increases. Entropy is a function of the state of the system, so the change in entropy of a system is determined by its initial and final states. In the idealization that a process is reversible, the entropy does not change, while irreversible processes always increase the total entropy.
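The entropy balance for spontaneous heat flow can be sketched with illustrative reservoir temperatures; the total entropy change is positive, as the second law requires:

```python
# Sketch: total entropy change when heat Q flows from a hot reservoir
# at T_hot to a cold one at T_cold. Numbers are illustrative.

def total_entropy_change(Q, T_hot, T_cold):
    # The hot reservoir loses entropy Q/T_hot; the cold one gains Q/T_cold.
    return -Q / T_hot + Q / T_cold

dS = total_entropy_change(Q=1000.0, T_hot=500.0, T_cold=300.0)
print(dS > 0)  # True: spontaneous hot-to-cold flow raises total entropy
```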
Because it is determined by the number of random microstates, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification. For this reason, it is often said that entropy is an expression of the disorder, or randomness of a system, or of the lack of information about it. The concept of entropy plays a central role in information theory.
Boltzmann's constant, and therefore entropy, have dimensions of energy divided by temperature, which has a unit of joules per kelvin (J⋅K⁻¹) in the International System of Units (or kg⋅m²⋅s⁻²⋅K⁻¹ in terms of base units). The entropy of a substance is usually given as an intensive property: either entropy per unit mass (SI unit: J⋅K⁻¹⋅kg⁻¹) or entropy per unit amount of substance (SI unit: J⋅K⁻¹⋅mol⁻¹).

First law of thermodynamics
The first law of thermodynamics is a version of the law of conservation of energy, adapted for thermodynamic systems. The law of conservation of energy states that the total energy of an isolated system is constant; energy can be transformed from one form to another, but can be neither created nor destroyed. The first law is often formulated as

ΔU = Q − W.
It states that the change in the internal energy ΔU of a closed system is equal to the amount of heat Q supplied to the system, minus the amount of work W done by the system on its surroundings. An equivalent statement is that perpetual motion machines of the first kind are impossible.

Heat
In thermodynamics, heat is energy in transfer to or from a thermodynamic system, by mechanisms other than thermodynamic work or transfer of matter. The mechanisms include conduction, through direct contact of immobile bodies, or through a wall or barrier that is impermeable to matter; or radiation between separated bodies; or isochoric mechanical work done by the surroundings on the system of interest; or Joule heating by an electric current driven through the system of interest by an external system; or a combination of these. When there is a suitable path between two systems with different temperatures, heat transfer occurs necessarily, immediately, and spontaneously from the hotter to the colder system. Thermal conduction occurs by the stochastic (random) motion of microscopic particles (such as atoms or molecules). In contrast, thermodynamic work is defined by mechanisms that act macroscopically and directly on the system's whole-body state variables; for example, change of the system's volume through a piston's motion with externally measurable force; or change of the system's internal electric polarization through an externally measurable change in electric field. The definition of heat transfer does not require that the process be in any sense smooth. For example, a bolt of lightning may transfer heat to a body.
Convective circulation allows one body to heat another, through an intermediate circulating fluid that carries energy from a boundary of one to a boundary of the other; the actual heat transfer is by conduction and radiation between the fluid and the respective bodies. Though spontaneous, convective circulation does not necessarily and immediately occur merely because of temperature difference; for it to occur in a given arrangement of systems, there is a threshold temperature difference that needs to be exceeded.
Like thermodynamic work, heat transfer is a process involving two systems, not a property of any one system. In thermodynamics, energy transferred as heat (a process function) contributes to change in the system's cardinal energy variable of state, for example its internal energy, or for example its enthalpy. This is to be distinguished from the ordinary language conception of heat as a property of the system.
Although heat flows spontaneously from a hotter body to a cooler one, it is possible to construct a heat pump: a refrigeration system expends work to transfer energy from a colder body to hotter surroundings. Such a device can also transfer energy from colder surroundings to a hotter body; these devices have internal temperatures that lie outside the range between those of the body and the surroundings. In contrast, a heat engine reduces an existing temperature difference to supply work to another system. Another thermodynamic type of heat transfer device is an active heat spreader, which expends work to speed up the transfer of energy from a hotter body, for example a computer component, to colder surroundings.

The amount of heat transferred in any process can be defined as the total amount of transferred energy excluding any macroscopic work that was done and any energy contained in transferred matter. For a precise definition of heat, it is necessary that it occur by a path that does not include transfer of matter. As an amount of energy (being transferred), the SI unit of heat is the joule (J). The conventional symbol used to represent the amount of heat transferred in a thermodynamic process is Q. Heat is measured by its effect on the states of interacting bodies, for example, by the amount of ice melted or by a change in temperature. The quantification of heat via the temperature change of a body is called calorimetry.

Latent heat
Latent heat is thermal energy released or absorbed, by a body or a thermodynamic system, during a constant-temperature process, usually a first-order phase transition.
Latent heat can be understood as heat energy in hidden form which is supplied or extracted to change the state of a substance without changing its temperature. Examples are latent heat of fusion and latent heat of vaporization involved in phase changes, i.e. a substance condensing or vaporizing at a specified temperature and pressure.

The term was introduced around 1762 by British chemist Joseph Black. It is derived from the Latin latere (to lie hidden). Black used the term in the context of calorimetry, where a heat transfer caused a volume change in a body while its temperature was constant.
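The latent-heat relation Q = mL, contrasted with ordinary heating Q = mcΔT, can be sketched with standard approximate constants for water (a hedged illustration, not from the source):

```python
# Sketch: latent heat of melting ice at 0 degrees C versus ordinary
# heating of the resulting melt-water. Standard approximate constants.

L_fusion = 334000.0  # J/kg, latent heat of fusion of ice (approx.)
c_water = 4184.0     # J/(kg K), specific heat of liquid water (approx.)

def latent_heat(mass_kg):
    # Phase change at constant temperature: Q = m * L.
    return mass_kg * L_fusion

def heating(mass_kg, dT):
    # Ordinary heating with a temperature rise: Q = m * c * dT.
    return mass_kg * c_water * dT

m = 2.0                      # kg of ice at 0 degrees C
Q_melt = latent_heat(m)      # 668000.0 J, no temperature change
Q_warm = heating(m, 20.0)    # 167360.0 J to warm the melt-water by 20 K
print(Q_melt, Q_warm)
```

Note that melting the ice absorbs several times more energy than subsequently warming the water by 20 K, even though the melting produces no temperature change at all.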
In contrast to latent heat, sensible heat is a heat transfer that results in a temperature change in a body.

Laws of thermodynamics
The three laws of thermodynamics define physical quantities (temperature, energy, and entropy) that characterize thermodynamic systems at thermal equilibrium. The laws describe how these quantities behave under various circumstances, and preclude the possibility of certain phenomena (such as perpetual motion).
The three laws of thermodynamics are:
First law of thermodynamics: When energy passes, as work, as heat, or with matter, into or out from a system, the system's internal energy changes in accord with the law of conservation of energy. Equivalently, perpetual motion machines of the first kind (machines that produce work with no energy input) are impossible.
Second law of thermodynamics: In a natural thermodynamic process, the sum of the entropies of the interacting thermodynamic systems increases. Equivalently, perpetual motion machines of the second kind (machines that spontaneously convert thermal energy into mechanical work) are impossible.
Third law of thermodynamics: The entropy of a system approaches a constant value as the temperature approaches absolute zero. With the exception of non-crystalline solids (glasses), the entropy of a system at absolute zero is typically close to zero.
In addition, a "zeroth law" is conventionally added, which defines thermal equilibrium:
Zeroth law of thermodynamics: If two systems are each in thermal equilibrium with a third system, they are in thermal equilibrium with each other. This law helps define the concept of temperature.

There have been suggestions of additional laws, but none of them achieves the generality of the four accepted laws, and they are not mentioned in standard textbooks. The laws of thermodynamics are fundamental laws of physics and are applicable in other natural sciences.

Net generation
Net generation is the amount of electricity generated by a power plant that is transmitted and distributed for consumer use. Net generation is less than the total gross power generation as some power produced is consumed within the plant itself to power auxiliary equipment such as pumps, motors and pollution control devices. Thus
net generation = gross generation − usage within the plant (a.k.a. in-house loads)

Non-equilibrium thermodynamics
Non-equilibrium thermodynamics is a branch of thermodynamics that deals with physical systems that are not in thermodynamic equilibrium but can be described in terms of variables (non-equilibrium state variables) that represent an extrapolation of the variables used to specify the system in thermodynamic equilibrium. Non-equilibrium thermodynamics is concerned with transport processes and with the rates of chemical reactions. It relies on what may be thought of as degrees of nearness to thermodynamic equilibrium. Non-equilibrium thermodynamics is a work in progress, not an established edifice; what follows sketches some approaches to it and some concepts important for it.
Almost all systems found in nature are not in thermodynamic equilibrium, for they are changing or can be triggered to change over time, and are continuously and discontinuously subject to flux of matter and energy to and from other systems and to chemical reactions. Some systems and processes are, however, in a useful sense, near enough to thermodynamic equilibrium to allow description with useful accuracy by currently known non-equilibrium thermodynamics. Nevertheless, many natural systems and processes will always remain far beyond the scope of non-equilibrium thermodynamic methods due to the existence of non-variational dynamics, where the concept of free energy is lost.
The thermodynamic study of non-equilibrium systems requires more general concepts than are dealt with by equilibrium thermodynamics. One fundamental difference between equilibrium thermodynamics and non-equilibrium thermodynamics lies in the behaviour of inhomogeneous systems, which require for their study knowledge of rates of reaction which are not considered in equilibrium thermodynamics of homogeneous systems. Another fundamental and very important difference is the difficulty or impossibility, in general, of defining entropy at an instant of time in macroscopic terms for systems not in thermodynamic equilibrium; it can be done, to useful approximation, only in carefully chosen special cases, namely those that are throughout in local thermodynamic equilibrium.

Second law of thermodynamics
The second law of thermodynamics states that the total entropy of an isolated system can never decrease over time. The total entropy of a system and its surroundings can remain constant in ideal cases where the system is in thermodynamic equilibrium, or is undergoing a (fictive) reversible process. In all processes that occur, including spontaneous processes, the total entropy of the system and its surroundings increases and the process is irreversible in the thermodynamic sense. The increase in entropy accounts for the irreversibility of natural processes, and the asymmetry between future and past.

Historically, the second law was an empirical finding that was accepted as an axiom of thermodynamic theory. Statistical mechanics, classical or quantum, explains the microscopic origin of the law.
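As a numerical illustration of this entropy bookkeeping (the heat amount and reservoir temperatures below are chosen arbitrarily), heat Q flowing from a hot reservoir at temperature T_hot to a cold one at T_cold changes the total entropy by Q/T_cold − Q/T_hot, which is positive whenever T_hot > T_cold:

```python
# Total entropy change when heat Q flows from a hot reservoir (t_hot_k)
# to a cold one (t_cold_k): dS = Q/T_cold - Q/T_hot.
def entropy_change(q_joules: float, t_hot_k: float, t_cold_k: float) -> float:
    return q_joules / t_cold_k - q_joules / t_hot_k

# 1000 J flowing from 400 K to 300 K (illustrative values):
ds = entropy_change(1000.0, 400.0, 300.0)
# ds = 1000/300 - 1000/400 = +0.833 J/K > 0, so the flow is allowed.
# The reverse flow would give -0.833 J/K, which the second law forbids
# for a spontaneous process in an isolated system.
```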
The second law has been expressed in many ways. Its first formulation is credited to the French scientist Sadi Carnot, who in 1824 showed that there is an upper limit to the efficiency of the conversion of heat to work in a heat engine.

Statistical mechanics
Statistical mechanics is one of the pillars of modern physics. It is necessary for the fundamental study of any physical system that has a large number of degrees of freedom. The approach is based on statistical methods, probability theory, and the microscopic physical laws.

It can be used to explain the thermodynamic behaviour of large systems. This branch of statistical mechanics, which treats and extends classical thermodynamics, is known as statistical thermodynamics or equilibrium statistical mechanics.
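A minimal sketch of the statistical approach, using a hypothetical two-level system with an arbitrarily chosen energy gap: in equilibrium statistical mechanics the probability of a microstate with energy E is proportional to the Boltzmann factor exp(−E / kT):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_probabilities(energies_j, temperature_k):
    """Equilibrium occupation probabilities of discrete microstates."""
    weights = [math.exp(-e / (K_B * temperature_k)) for e in energies_j]
    z = sum(weights)  # the partition function normalizes the weights
    return [w / z for w in weights]

# Hypothetical two-level system with a 4e-21 J gap, at room temperature:
p_ground, p_excited = boltzmann_probabilities([0.0, 4e-21], 300.0)
# The lower-energy state is the more probable one at any finite temperature,
# and the two probabilities approach each other as the temperature grows.
```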
Statistical mechanics shows how the concepts from macroscopic observations (such as temperature and pressure) are related to the description of a microscopic state that fluctuates around an average state. It connects thermodynamic quantities (such as heat capacity) to microscopic behavior, whereas in classical thermodynamics the only available option would be to measure and tabulate such quantities for various materials.

Statistical mechanics can also be used to study systems that are out of equilibrium. An important subbranch known as non-equilibrium statistical mechanics deals with microscopically modelling the speed of irreversible processes that are driven by imbalances. Examples of such processes include chemical reactions and flows of particles and heat. The fluctuation–dissipation theorem is the basic result obtained from applying non-equilibrium statistical mechanics to the simplest non-equilibrium situation, a steady-state current flow in a system of many particles.

Temperature
Temperature is a physical quantity expressing how hot or cold matter is. It is measured with a thermometer calibrated in one or more temperature scales. The most commonly used scales are the Celsius scale (formerly called centigrade, denoted °C), the Fahrenheit scale (denoted °F), and the Kelvin scale (denoted K). The kelvin (the word is spelled with a lower-case k) is the unit of temperature in the International System of Units (SI), in which temperature is one of the seven base quantities. The Kelvin scale is widely used in science and technology.
Theoretically, the coldest a system can be is when its temperature is absolute zero, at which point the thermal motion in matter would be zero. However, an actual physical system or object can never attain a temperature of absolute zero. Absolute zero is denoted as 0 K on the Kelvin scale, −273.15 °C on the Celsius scale, and −459.67 °F on the Fahrenheit scale.
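The three representations of absolute zero quoted above can be checked with the standard scale conversions, °C = K − 273.15 and °F = °C × 9/5 + 32:

```python
def kelvin_to_celsius(t_k: float) -> float:
    """Celsius temperature is the Kelvin temperature shifted by 273.15."""
    return t_k - 273.15

def celsius_to_fahrenheit(t_c: float) -> float:
    """Fahrenheit uses a 9/5 scale factor and a 32-degree offset."""
    return t_c * 9.0 / 5.0 + 32.0

abs_zero_c = kelvin_to_celsius(0.0)             # -273.15 degC
abs_zero_f = celsius_to_fahrenheit(abs_zero_c)  # -459.67 degF
```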
For an ideal gas, temperature is proportional to the average kinetic energy of the random microscopic motions of the constituent microscopic particles.
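This proportionality can be written as ⟨E⟩ = (3/2) k_B T for the average translational kinetic energy per particle. A short sketch, with room temperature chosen purely for illustration:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def mean_kinetic_energy(temperature_k: float) -> float:
    """Average translational kinetic energy of an ideal-gas particle: (3/2) k_B T."""
    return 1.5 * K_B * temperature_k

# At 300 K each particle carries about 6.2e-21 J on average;
# doubling the absolute temperature doubles the average kinetic energy:
e_300 = mean_kinetic_energy(300.0)
e_600 = mean_kinetic_energy(600.0)
```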
Temperature is important in all fields of natural science, including physics, chemistry, Earth science, medicine, and biology, as well as most aspects of daily life.

Thermochemistry
Thermochemistry is the study of the heat energy associated with chemical reactions and/or physical transformations. A reaction may release or absorb energy, and a phase change may do the same, such as in melting and boiling. Thermochemistry focuses on these energy changes, particularly on the system's energy exchange with its surroundings. Thermochemistry is useful in predicting reactant and product quantities throughout the course of a given reaction. In combination with entropy determinations, it is also used to predict whether a reaction is spontaneous or non-spontaneous, favorable or unfavorable.
Endothermic reactions absorb heat, while exothermic reactions release heat. Thermochemistry coalesces the concepts of thermodynamics with the concept of energy in the form of chemical bonds. The subject commonly includes calculations of such quantities as heat capacity, heat of combustion, heat of formation, enthalpy, entropy, free energy, and calories.

Thermodynamic equilibrium
Thermodynamic equilibrium is an axiomatic concept of thermodynamics. It is an internal state of a single thermodynamic system, or a relation between several thermodynamic systems connected by more or less permeable or impermeable walls. In thermodynamic equilibrium there are no net macroscopic flows of matter or of energy, either within a system or between systems. In a system in its own state of internal thermodynamic equilibrium, no macroscopic change occurs. Systems in mutual thermodynamic equilibrium are simultaneously in mutual thermal, mechanical, chemical, and radiative equilibria. Systems can be in one kind of mutual equilibrium, though not in others. In thermodynamic equilibrium, all kinds of equilibrium hold at once and indefinitely, until disturbed by a thermodynamic operation. In a macroscopic equilibrium, microscopic exchanges occur that are almost or exactly balanced; this is the physical explanation of the notion of macroscopic equilibrium.
A thermodynamic system in its own state of internal thermodynamic equilibrium has a spatially uniform temperature. Its intensive properties, other than temperature, may be driven to spatial inhomogeneity by an unchanging long range force field imposed on it by its surroundings.
In non-equilibrium systems, by contrast, there are net flows of matter or energy. If such changes can be triggered to occur in a system in which they are not already occurring, it is said to be in a metastable equilibrium.
Though it is not a widely named law, it is an axiom of thermodynamics that states of thermodynamic equilibrium exist. The second law of thermodynamics states that when a body of material starts from an equilibrium state in which portions of it are held in different states by more or less permeable or impermeable partitions, and a thermodynamic operation removes or makes the partitions more permeable while the body is isolated, then it spontaneously reaches its own new state of internal thermodynamic equilibrium, accompanied by an increase in the sum of the entropies of the portions.

Thermodynamic system
A thermodynamic system is a group of material and/or radiative contents. Its properties may be described by thermodynamic state variables such as temperature, entropy, internal energy, and pressure.
The simplest state of a thermodynamic system is a state of thermodynamic equilibrium, as opposed to a non-equilibrium state. A system is defined as a quantity of matter or a region in space chosen for study. Everything external to the system is its surroundings. A thermodynamic system is always separated from its surroundings by a boundary, which may or may not correspond to a physical wall.
When the state of its content varies in space, the system can be considered as many systems located next to each other, each being a different thermodynamic system.
A thermodynamic system is subject to external interventions called thermodynamic operations; these alter the system's walls or its surroundings; as a result, the system undergoes thermodynamic processes according to the principles of thermodynamics. (This account mainly refers to the simplest kind of thermodynamic system; compositions of simple systems may also be considered.)
The thermodynamic state of a thermodynamic system is its internal state as specified by its state variables. In addition to the state variables, a thermodynamic account also requires a special kind of quantity called a state function, which is a function of the defining state variables. For example, if the state variables are internal energy, volume and mole amounts, that special function is the entropy. These quantities are inter-related by one or more functional relationships called equations of state, and by the system's characteristic equation. Thermodynamics imposes restrictions on the possible equations of state and on the characteristic equation. The restrictions are imposed by the laws of thermodynamics.
According to the permeabilities of the walls of a system, transfers of energy and matter occur between it and its surroundings, which are assumed to be unchanging over time, until a state of thermodynamic equilibrium is attained. The only states considered in equilibrium thermodynamics are equilibrium states. Classical thermodynamics includes equilibrium thermodynamics. It also considers: (a) systems considered in terms of cyclic sequences of processes rather than of states of the system; such were historically important in the conceptual development of the subject; and (b) systems considered in terms of processes described by steady flows; such are important in engineering.
In 1824 Sadi Carnot described a thermodynamic system as the working substance (such as the volume of steam) of any heat engine under study. The very existence of such thermodynamic systems may be considered a fundamental postulate of equilibrium thermodynamics, though it is only rarely cited as a numbered law. According to Bailyn, the commonly rehearsed statement of the zeroth law of thermodynamics is a consequence of this fundamental postulate.

In equilibrium thermodynamics the state variables do not include fluxes, because in a state of thermodynamic equilibrium all fluxes have zero values by postulation. Equilibrium thermodynamic processes may involve fluxes, but these must have ceased by the time a thermodynamic process or operation is complete, bringing the system to its eventual thermodynamic state. Non-equilibrium thermodynamics allows its state variables to include non-zero fluxes, which describe transfers of mass, energy, or entropy between a system and its surroundings.

Third law of thermodynamics
The third law of thermodynamics is sometimes stated as follows, regarding the properties of closed systems in thermodynamic equilibrium: The entropy of a system approaches a constant value as its temperature approaches absolute zero.
This constant value cannot depend on any other parameters characterizing the closed system, such as pressure or applied magnetic field. At absolute zero (zero kelvin) the system must be in a state with the minimum possible energy. Entropy is related to the number of accessible microstates, and there is typically one unique state (called the ground state) with minimum energy. In such a case, the entropy at absolute zero will be exactly zero. If the system does not have a well-defined order (if its order is glassy, for example), then some finite entropy may remain as the system is brought to very low temperatures, either because the system becomes locked into a configuration with non-minimal energy or because the minimum-energy state is non-unique. The constant value is called the residual entropy of the system. Because entropy is a state function, this residual value is a definite property of the system's configuration of particles and can, in principle, be determined by measurements near 0 K.
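The link between entropy and the number of accessible microstates Ω is Boltzmann's relation S = k_B ln Ω: a unique ground state (Ω = 1) gives zero entropy, while a degenerate ground state leaves residual entropy. A sketch, using Pauling's classic estimate of roughly (3/2) configurations per molecule for the proton disorder in ice as an illustrative assumption:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
N_A = 6.02214076e23  # Avogadro constant, 1/mol (exact SI value)

def boltzmann_entropy(n_microstates: float) -> float:
    """S = k_B * ln(Omega) for a system with Omega accessible microstates."""
    return K_B * math.log(n_microstates)

# A unique ground state (Omega = 1) gives exactly zero entropy:
s_unique = boltzmann_entropy(1)  # 0.0 J/K

# Pauling's estimate for ice: ~(3/2) configurations per molecule, so per mole
# the residual entropy is N_A * k_B * ln(3/2) = R * ln(3/2), about 3.37 J/(mol K):
s_residual_molar = N_A * K_B * math.log(1.5)
```

The computed molar residual entropy agrees well with the value measured calorimetrically for ice, which is one of the classic confirmations of this statistical picture.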
The Nernst–Simon statement of the third law of thermodynamics concerns thermodynamic processes at a fixed, low temperature: The entropy change associated with any condensed system undergoing a reversible isothermal process approaches zero as the temperature at which it is performed approaches 0 K.
Here a condensed system refers to liquids and solids.
A classical formulation by Nernst (actually a consequence of the third law) is: It is impossible for any process, no matter how idealized, to reduce the entropy of a system to its absolute-zero value in a finite number of operations.
There also exists a formulation of the third law which approaches the subject by postulating a specific energy behavior: If the composite of two thermodynamic systems constitutes an isolated system, then any energy exchange in any form between those two systems is bounded.

Zeroth law of thermodynamics
The zeroth law of thermodynamics states that if two thermodynamic systems are each in thermal equilibrium with a third one, then they are in thermal equilibrium with each other. Accordingly, thermal equilibrium between systems is a transitive relation.
Two systems are said to be in the relation of thermal equilibrium if they are linked by a wall permeable only to heat and they do not change over time. As a convenience of language, systems are sometimes also said to be in a relation of thermal equilibrium if they are not linked so as to be able to transfer heat to each other, but would still not do so (even) if they were connected by a wall permeable only to heat.
The physical meaning is expressed by Maxwell in the words: "All heat is of the same kind". Another statement of the law is "All diathermal walls are equivalent".

The law is important for the mathematical formulation of thermodynamics, which needs the assertion that the relation of thermal equilibrium is an equivalence relation. This information is needed for a mathematical definition of temperature that will agree with the physical existence of valid thermometers.