# Ground state

The ground state of a quantum-mechanical system is its lowest-energy state; the energy of the ground state is known as the zero-point energy of the system. An excited state is any state with energy greater than the ground state. In quantum field theory, the ground state is usually called the vacuum state or the vacuum.

If more than one ground state exists, they are said to be degenerate. Many systems have degenerate ground states. Degeneracy occurs whenever there exists a unitary operator that acts non-trivially on a ground state and commutes with the Hamiltonian of the system.

According to the third law of thermodynamics, a system at absolute zero temperature exists in its ground state; thus, its entropy is determined by the degeneracy of the ground state. Many systems, such as a perfect crystal lattice, have a unique ground state and therefore have zero entropy at absolute zero. For systems that exhibit negative temperature, it is also possible for the highest excited state to correspond to absolute zero temperature.

*Energy levels for an electron in an atom: ground state and excited states. After absorbing energy, an electron may jump from the ground state to a higher-energy excited state.*

## Ground state has no nodes in one dimension

In one dimension, the ground state of the Schrödinger equation can be proven to have no nodes.[1]

Consider the average energy of a state with a node at x = 0; i.e., ψ(0) = 0. The average energy in this state would be

${\displaystyle \langle \psi |H|\psi \rangle =\int dx\,\left(-{\frac {\hbar ^{2}}{2m}}\psi ^{*}{\frac {d^{2}\psi }{dx^{2}}}+V(x)|\psi (x)|^{2}\right),}$

where V(x) is the potential.

Now, consider a small interval around ${\displaystyle x=0}$; i.e., ${\displaystyle x\in [-\epsilon ,\epsilon ]}$. Take a new (deformed) wave function ψ'(x) to be defined as ${\displaystyle \psi '(x)=\psi (x)}$, for ${\displaystyle x<-\epsilon }$; and ${\displaystyle \psi '(x)=-\psi (x)}$, for ${\displaystyle x>\epsilon }$; and constant for ${\displaystyle x\in [-\epsilon ,\epsilon ]}$. If ${\displaystyle \epsilon }$ is small enough, this is always possible to do, so that ψ'(x) is continuous.

Assuming ${\displaystyle \psi (x)\approx -cx}$ around ${\displaystyle x=0}$, one may write

${\displaystyle \psi '(x)=N{\begin{cases}|\psi (x)|,&|x|>\epsilon ,\\c\epsilon ,&|x|\leq \epsilon ,\end{cases}}}$

where ${\displaystyle N={\frac {1}{\sqrt {1+{\frac {4}{3}}|c|^{2}\epsilon ^{3}}}}}$ is the norm.
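As a quick numerical sanity check on this norm (a sketch using illustrative values of c and ε, not from the text): inside the interval, |ψ′|² = (cε)² replaces |ψ|² ≈ (cx)², and the excess probability integrates to (4/3)c²ε³, which is exactly the correction appearing in N.

```python
# Midpoint-rule check that the deformation adds (4/3) c^2 eps^3 of probability:
# integral over [-eps, eps] of c^2 * (eps^2 - x^2) dx
c, eps = 2.0, 0.01   # illustrative values
n = 100_000
dx = 2 * eps / n
excess = sum(c**2 * (eps**2 - (-eps + (i + 0.5) * dx)**2) * dx for i in range(n))

predicted = (4 / 3) * c**2 * eps**3
print(excess)     # ~ predicted, i.e. (4/3) * c^2 * eps^3
print(predicted)
```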

Note that the kinetic-energy density ${\displaystyle \left|{\frac {d\psi '}{dx}}\right|^{2}<\left|{\frac {d\psi }{dx}}\right|^{2}}$ everywhere because of the normalization. More significantly, the average kinetic energy is lowered by ${\displaystyle O(\epsilon )}$ by the deformation to ψ'.
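To see explicitly that the saving is of order ${\displaystyle \epsilon }$ (a rough estimate, not part of the original argument): with ${\displaystyle \psi \approx -cx}$ near the node, the kinetic-energy density of ψ inside the interval is approximately ${\displaystyle {\frac {\hbar ^{2}}{2m}}|c|^{2}}$, while that of the constant ψ' vanishes there, so

${\displaystyle \Delta T_{\text{avg}}\simeq -{\frac {\hbar ^{2}}{2m}}|c|^{2}\int _{-\epsilon }^{\epsilon }dx=-{\frac {\hbar ^{2}}{m}}|c|^{2}\epsilon ,}$

which is negative and of order ${\displaystyle \epsilon }$.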

Now, consider the potential energy. For definiteness, let us choose ${\displaystyle V(x)\geq 0}$. Then it is clear that, outside the interval ${\displaystyle x\in [-\epsilon ,\epsilon ]}$, the potential energy density is smaller for the ψ' because ${\displaystyle |\psi '|<|\psi |}$ there.

On the other hand, in the interval ${\displaystyle x\in [-\epsilon ,\epsilon ]}$ we have

${\displaystyle {V_{\text{avg}}^{\epsilon }}'=\int _{-\epsilon }^{\epsilon }dx\,V(x)|\psi '|^{2}={\frac {\epsilon ^{2}|c|^{2}}{1+{\frac {4}{3}}|c|^{2}\epsilon ^{3}}}\int _{-\epsilon }^{\epsilon }dx\,V(x)\simeq 2\epsilon ^{3}|c|^{2}V(0)+\cdots ,}$

which holds to order ${\displaystyle \epsilon ^{3}}$.

However, the contribution to the potential energy from this region for the state ψ with a node is

${\displaystyle V_{\text{avg}}^{\epsilon }=\int _{-\epsilon }^{\epsilon }dx\,V(x)|\psi |^{2}=|c|^{2}\int _{-\epsilon }^{\epsilon }dx\,x^{2}V(x)\simeq {\frac {2}{3}}\epsilon ^{3}|c|^{2}V(0)+\cdots ,}$

This is lower, but still of the same order ${\displaystyle O(\epsilon ^{3})}$ as for the deformed state ψ', and subdominant to the lowering of the average kinetic energy. Therefore, the potential energy is unchanged up to order ${\displaystyle \epsilon ^{2}}$ if we deform the state ${\displaystyle \psi }$ with a node into a state ψ' without a node, and the change can be ignored.

We can therefore remove all nodes and reduce the energy by ${\displaystyle O(\epsilon )}$, which implies that ψ cannot be the ground state. Thus the ground-state wave function cannot have a node. This completes the proof. (The average energy may then be further lowered by eliminating undulations, to the variational absolute minimum.)

## Examples

*Initial wave functions for the first four states of a one-dimensional particle in a box*
• The wave function of the ground state of a particle in a one-dimensional box is a half-period sine wave, which goes to zero at the two edges of the well. The energy of the particle is given by ${\displaystyle {\frac {h^{2}n^{2}}{8mL^{2}}}}$, where h is the Planck constant, m is the mass of the particle, n is the energy state (n = 1 corresponds to the ground-state energy), and L is the width of the well.
• The wave function of the ground state of a hydrogen atom is a spherically symmetric distribution centred on the nucleus, which is largest at the center and reduces exponentially at larger distances. The electron is most likely to be found at a distance from the nucleus equal to the Bohr radius. This function is known as the 1s atomic orbital. For hydrogen (H), an electron in the ground state has energy −13.6 eV, relative to the ionization threshold. In other words, 13.6 eV is the energy input required for the electron to no longer be bound to the atom.
• The exact definition of one second of time since 1997 has been the duration of 9192631770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom at rest at a temperature of 0 K.[2]
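The two energy formulas above are easy to evaluate numerically. A minimal sketch (CODATA constants hardcoded; the 1 nm box width is an illustrative choice, not from the text):

```python
# Particle in a box: E_n = h^2 n^2 / (8 m L^2)
h = 6.62607015e-34      # Planck constant, J s
m_e = 9.1093837015e-31  # electron mass, kg
eV = 1.602176634e-19    # J per eV

def box_energy_eV(n, L=1e-9):
    """Energy of level n for an electron in a box of width L (metres), in eV."""
    return h**2 * n**2 / (8 * m_e * L**2) / eV

# Hydrogen (Bohr model): E_n = -13.6 eV / n^2 relative to the ionization threshold
def hydrogen_level_eV(n):
    return -13.6 / n**2

print(box_energy_eV(1))      # ground state of a 1 nm box, ~0.376 eV
print(hydrogen_level_eV(1))  # -13.6: 13.6 eV is needed to unbind the electron
```

Note that the box energies scale as n², so the n = 2 state lies at four times the ground-state energy.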

## Notes

1. ^ See, for example, Cohen, M. (1956). "Appendix A: Proof of non-degeneracy of the ground state". The energy spectrum of the excitations in liquid helium (PDF) (Ph.D.). California Institute of Technology. Published as Feynman, R. P.; Cohen, Michael (1956). "Energy Spectrum of the Excitations in Liquid Helium". Physical Review. 102 (5): 1189. Bibcode:1956PhRv..102.1189F. doi:10.1103/PhysRev.102.1189.
2. ^ "Unit of time (second)". SI Brochure. International Bureau of Weights and Measures. Retrieved 2013-12-22.

## Related articles

### Absolute zero

Absolute zero is the lowest limit of the thermodynamic temperature scale, a state at which the enthalpy and entropy of a cooled ideal gas reach their minimum value, taken as 0. The fundamental particles of nature have minimum vibrational motion, retaining only quantum mechanical, zero-point energy-induced particle motion. The theoretical temperature is determined by extrapolating the ideal gas law; by international agreement, absolute zero is taken as −273.15° on the Celsius scale (International System of Units), which equals −459.67° on the Fahrenheit scale (United States customary units or Imperial units). The corresponding Kelvin and Rankine temperature scales set their zero points at absolute zero by definition.

It is commonly thought of as the lowest temperature possible, but it is not the lowest enthalpy state possible, because all real substances depart from ideal-gas behaviour when cooled, as they approach the change of state to liquid and then to solid; the sum of the enthalpy of vaporization (gas to liquid) and the enthalpy of fusion (liquid to solid) exceeds the ideal gas's change in enthalpy down to absolute zero. In the quantum-mechanical description, matter (solid) at absolute zero is in its ground state, the point of lowest internal energy.

The laws of thermodynamics indicate that absolute zero cannot be reached using only thermodynamic means, because the temperature of the substance being cooled approaches the temperature of the cooling agent asymptotically, and a system at absolute zero still possesses quantum mechanical zero-point energy, the energy of its ground state at absolute zero. The kinetic energy of the ground state cannot be removed.

Scientists and technologists routinely achieve temperatures close to absolute zero, where matter exhibits quantum effects such as superconductivity and superfluidity.

### Aufbau principle

The aufbau principle states that in the ground state of an atom or ion, electrons fill atomic orbitals of the lowest available energy levels before occupying higher levels. For example, the 1s subshell is filled before the 2s subshell is occupied. In this way, the electrons of an atom or ion form the most stable electron configuration possible. An example is the configuration 1s2 2s2 2p6 3s2 3p3 for the phosphorus atom, meaning that the 1s subshell holds 2 electrons, the 2s subshell holds 2 electrons, and so on.

Aufbau is a German noun that means construction or "building-up". The aufbau principle is sometimes called the building-up principle or the aufbau rule.

The details of this "building-up" tendency are described mathematically by atomic orbital functions. Electron behavior is elaborated by other principles of atomic physics, such as Hund's rule and the Pauli exclusion principle. Hund's rule asserts that if multiple orbitals of the same energy are available, electrons will occupy different orbitals singly before any are occupied doubly. If double occupation does occur, the Pauli exclusion principle requires that electrons which occupy the same orbital must have different spins (+1/2 and −1/2).

As we pass from one element to another of next higher atomic number, one electron is added each time to the atom.

The maximum number of electrons in any shell is 2n², where n is the principal quantum number.

The maximum number of electrons in a subshell (s, p, d or f) is equal to 2(2ℓ+1) where ℓ = 0, 1, 2, 3...

Thus these subshells can have a maximum of 2, 6, 10 and 14 electrons respectively.

In the ground state the electronic configuration can be built up by placing electrons in the lowest available orbitals until the total number of electrons added is equal to the atomic number. Thus orbitals are filled in the order of increasing energy, using two general rules to help predict electronic configurations:

1. Electrons are assigned to orbitals in order of increasing value of (n+ℓ).

2. For subshells with the same value of (n+ℓ), electrons are assigned first to the subshell with lower n.

A version of the aufbau principle known as the nuclear shell model is used to predict the configuration of protons and neutrons in an atomic nucleus.
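The two rules above can be turned into a small script. A sketch of the naive Madelung ordering (note that real ground-state configurations have well-known exceptions, e.g. chromium and copper, which this simple rule does not capture):

```python
# Madelung (n + l) ordering: sort subshells by n + l, breaking ties by n.
# A subshell with azimuthal quantum number l holds 2*(2*l + 1) electrons.
L_LABELS = "spdf"

subshells = [(n, l) for n in range(1, 8) for l in range(min(n, 4))]
subshells.sort(key=lambda nl: (nl[0] + nl[1], nl[0]))

def configuration(electrons):
    """Predicted ground-state configuration for a given electron count."""
    parts = []
    for n, l in subshells:
        if electrons <= 0:
            break
        capacity = 2 * (2 * l + 1)
        k = min(capacity, electrons)
        parts.append(f"{n}{L_LABELS[l]}{k}")
        electrons -= k
    return " ".join(parts)

print(configuration(15))  # phosphorus: 1s2 2s2 2p6 3s2 3p3
```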

### Chromophore

A chromophore is the part of a molecule responsible for its color.

The color that is seen by our eyes is the one not absorbed within a certain wavelength spectrum of visible light. The chromophore is a region in the molecule where the energy difference between two separate molecular orbitals falls within the range of the visible spectrum. Visible light that hits the chromophore can thus be absorbed by exciting an electron from its ground state into an excited state. In biological molecules that serve to capture or detect light energy, the chromophore is the moiety that causes a conformational change of the molecule when hit by light.

### Density functional theory

Density functional theory (DFT) is a computational quantum mechanical modelling method used in physics, chemistry and materials science to investigate the electronic structure (or nuclear structure) (principally the ground state) of many-body systems, in particular atoms, molecules, and the condensed phases. Using this theory, the properties of a many-electron system can be determined by using functionals, i.e. functions of another function, which in this case is the spatially dependent electron density. Hence the name density functional theory comes from the use of functionals of the electron density. DFT is among the most popular and versatile methods available in condensed-matter physics, computational physics, and computational chemistry.

DFT has been very popular for calculations in solid-state physics since the 1970s. However, DFT was not considered accurate enough for calculations in quantum chemistry until the 1990s, when the approximations used in the theory were greatly refined to better model the exchange and correlation interactions. Computational costs are relatively low when compared to traditional methods, such as exchange only Hartree–Fock theory and its descendants that include electron correlation.

Despite recent improvements, there are still difficulties in using density functional theory to properly describe: intermolecular interactions (of critical importance to understanding chemical reactions), especially van der Waals forces (dispersion); charge transfer excitations; transition states, global potential energy surfaces, dopant interactions and some strongly correlated systems; and in calculations of the band gap and ferromagnetism in semiconductors. The incomplete treatment of dispersion can adversely affect the accuracy of DFT (at least when used alone and uncorrected) in the treatment of systems which are dominated by dispersion (e.g. interacting noble gas atoms) or where dispersion competes significantly with other effects (e.g. in biomolecules). The development of new DFT methods designed to overcome this problem, by alterations to the functional or by the inclusion of additive terms, is a current research topic.

### Electron configuration

In atomic physics and quantum chemistry, the electron configuration is the distribution of electrons of an atom or molecule (or other physical structure) in atomic or molecular orbitals. For example, the electron configuration of the neon atom is 1s2 2s2 2p6, using the notation explained below.

Electronic configurations describe each electron as moving independently in an orbital, in an average field created by all other orbitals. Mathematically, configurations are described by Slater determinants or configuration state functions.

According to the laws of quantum mechanics, for systems with only one electron a level of energy is associated with each electron configuration, and under certain conditions electrons are able to move from one configuration to another by the emission or absorption of a quantum of energy, in the form of a photon.

Knowledge of the electron configuration of different atoms is useful in understanding the structure of the periodic table of elements. This is also useful for describing the chemical bonds that hold atoms together. In bulk materials, this same idea helps explain the peculiar properties of lasers and semiconductors.

### Energy level

A quantum mechanical system or particle that is bound—that is, confined spatially—can only take on certain discrete values of energy. This contrasts with classical particles, which can have any energy. These discrete values are called energy levels. The term is commonly used for the energy levels of electrons in atoms, ions, or molecules, which are bound by the electric field of the nucleus, but can also refer to energy levels of nuclei or vibrational or rotational energy levels in molecules. The energy spectrum of a system with such discrete energy levels is said to be quantized.

In chemistry and atomic physics, an electron shell, or a principal energy level, may be thought of as an orbit followed by electrons around an atom's nucleus. The closest shell to the nucleus is called the "1 shell" (also called "K shell"), followed by the "2 shell" (or "L shell"), then the "3 shell" (or "M shell"), and so on farther and farther from the nucleus. The shells correspond with the principal quantum numbers (n = 1, 2, 3, 4 ...) or are labeled alphabetically with letters used in the X-ray notation (K, L, M, …).

Each shell can contain only a fixed number of electrons: the first shell can hold up to two electrons, the second shell can hold up to eight (2 + 6) electrons, the third shell can hold up to 18 (2 + 6 + 10), and so on. The general formula is that the nth shell can in principle hold up to 2n² electrons. Since electrons are electrically attracted to the nucleus, an atom's electrons will generally occupy outer shells only if the more inner shells have already been completely filled by other electrons. However, this is not a strict requirement: atoms may have two or even three incomplete outer shells. (See Madelung rule for more details.) For an explanation of why electrons exist in these shells, see electron configuration.

If the potential energy is set to zero at infinite distance from the atomic nucleus or molecule, the usual convention, then bound electron states have negative potential energy.

If an atom, ion, or molecule is at the lowest possible energy level, it and its electrons are said to be in the ground state. If it is at a higher energy level, it is said to be excited; electrons that have higher energy than the ground state are likewise said to be excited. If more than one quantum mechanical state has the same energy, the energy levels are "degenerate"; they are then called degenerate energy levels.

### Excited state

In quantum mechanics, an excited state of a system (such as an atom, molecule or nucleus) is any quantum state of the system that has a higher energy than the ground state (that is, more energy than the absolute minimum). Excitation is an elevation in energy level above an arbitrary baseline energy state. In physics there is a specific technical definition for energy level which is often associated with an atom being raised to an excited state. The temperature of a group of particles is indicative of the level of excitation (with the notable exception of systems that exhibit negative temperature).

The lifetime of a system in an excited state is usually short: spontaneous or induced emission of a quantum of energy (such as a photon or a phonon) usually occurs shortly after the system is promoted to the excited state, returning the system to a state with lower energy (a less excited state or the ground state). This return to a lower energy level is often loosely described as decay and is the inverse of excitation.

Long-lived excited states are often called metastable. Long-lived nuclear isomers and singlet oxygen are two examples of this.

### Ground State (Angel)

"Ground State" is episode 2 of season 4 in the television show Angel. In this episode, Wesley, now a hardened demon-hunter/killer with his own gang, leads the search to Denza who tells them that the Axis talisman will aid him in their search for Cordelia. Meanwhile, Gwen Raiden, a young mercenary with uncontrollable electrical abilities, is also looking for the talisman at the request of her employer, a wealthy businessman with connections to Wolfram & Hart.

### Hydrogen line

The hydrogen line, 21-centimeter line or H I line refers to the electromagnetic radiation spectral line that is created by a change in the energy state of neutral hydrogen atoms. This electromagnetic radiation is at the precise frequency of 1,420,405,751.7667 ± 0.0009 Hz, which is equivalent to the vacuum wavelength of 21.1061140542 cm in free space. This wavelength falls within the microwave region of the electromagnetic spectrum, and it is observed frequently in radio astronomy, since those radio waves can penetrate the large clouds of interstellar cosmic dust that are opaque to visible light.

The microwaves of the hydrogen line come from the atomic transition of an electron between the two hyperfine levels of the hydrogen 1s ground state that have an energy difference of ≈ 5.87433 µeV. It is called the spin-flip transition. The frequency, ν, of the quanta that are emitted by this transition between two different energy levels is given by the Planck–Einstein relation E = hν. According to that relation, the photon energy of a 1,420,405,751.7667 Hz photon is ≈ 5.87433 µeV. The constant of proportionality, h, is known as the Planck constant.
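The numbers quoted above are mutually consistent, as a quick check with E = hν and λ = c/ν shows (a sketch with CODATA constants hardcoded):

```python
# Planck-Einstein check for the 21 cm hyperfine line
h = 6.62607015e-34    # Planck constant, J s
c = 299792458.0       # speed of light, m/s
eV = 1.602176634e-19  # J per eV

nu = 1420405751.7667  # Hz, hyperfine transition frequency
E = h * nu / eV       # photon energy in eV
wavelength = c / nu   # metres

print(f"E = {E * 1e6:.5f} micro-eV")          # close to the quoted 5.87433
print(f"lambda = {wavelength * 100:.4f} cm")  # close to the quoted 21.1061
```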

### Ionization

Ionization, or ionisation, is the process by which an atom or a molecule acquires a negative or positive charge by gaining or losing electrons, often in conjunction with other chemical changes. The resulting electrically charged atom or molecule is called an ion. Ionization can result from the loss of an electron after collisions with subatomic particles, collisions with other atoms, molecules and ions, or through the interaction with electromagnetic radiation. Heterolytic bond cleavage and heterolytic substitution reactions can result in the formation of ion pairs. Ionization can occur through radioactive decay by the internal conversion process, in which an excited nucleus transfers its energy to one of the inner-shell electrons, causing it to be ejected.

### Isotopes of caesium

Caesium (55Cs) has 40 known isotopes, making it, along with barium and mercury, the element with the most isotopes. The atomic masses of these isotopes range from 112 to 151. Only one isotope, 133Cs, is stable. The longest-lived radioisotopes are 135Cs with a half-life of 2.3 million years, 137Cs with a half-life of 30.1671 years and 134Cs with a half-life of 2.0652 years. All other isotopes have half-lives less than 2 weeks, most under an hour.

Beginning in 1945 with the commencement of nuclear testing, caesium isotopes were released into the atmosphere where caesium is absorbed readily into solution and is returned to the surface of the earth as a component of radioactive fallout. Once caesium enters the ground water, it is deposited on soil surfaces and removed from the landscape primarily by particle transport. As a result, the input function of these isotopes can be estimated as a function of time.

### Metastability

In physics, metastability is a stable state of a dynamical system other than the system's state of least energy.

A ball resting in a hollow on a slope is a simple example of metastability. If the ball is only slightly pushed, it will settle back into its hollow, but a stronger push may start the ball rolling down the slope. Bowling pins show similar metastability by either merely wobbling for a moment or tipping over completely. A common example of metastability in science is isomerisation. Higher energy isomers are long lived as they are prevented from rearranging to their preferred ground state by (possibly large) barriers in the potential energy.

During a metastable state of finite lifetime, all state-describing parameters reach and hold stationary values. In isolation:

• the state of least energy is the only one the system will inhabit for an indefinite length of time, until more external energy is added to the system (unique "absolutely stable" state);
• the system will spontaneously leave any other state (of higher energy) to eventually return (after a sequence of transitions) to the least energetic state.

The metastability concept originated in the physics of first-order phase transitions. It then acquired new meaning in the study of aggregated subatomic particles (in atomic nuclei or in atoms) or in molecules, macromolecules or clusters of atoms and molecules. Later, it was borrowed for the study of decision-making and information transmission systems.

Many complex natural and man-made systems can demonstrate metastability.

Metastability is common in physics and chemistry – from an atom (many-body assembly) to statistical ensembles of molecules (viscous fluids, amorphous solids, liquid crystals, minerals, etc.) at molecular levels or as a whole (see Metastable states of matter and grain piles below). The abundance of states is more prevalent as the systems grow larger and/or if the forces of their mutual interaction are spatially less uniform or more diverse.

In dynamic systems (with feedback) like electronic circuits, signal trafficking, decisional systems and neuroscience – the time-invariance of the active or reactive patterns with respect to the external influences defines stability and metastability (see brain metastability below). In these systems, the equivalent of thermal fluctuations in molecular systems is the "white noise" that affects signal propagation and the decision-making.

### Nuclear isomer

A nuclear isomer is a metastable state of an atomic nucleus caused by the excitation of one or more of its nucleons (protons or neutrons). "Metastable" describes nuclei whose excited states have half-lives 100 to 1000 times longer than the half-lives of the excited nuclear states that decay with a "prompt" half-life (ordinarily on the order of 10⁻¹² seconds). The term "metastable" is usually restricted to isomers with half-lives of 10⁻⁹ seconds or longer. Some references recommend 5 × 10⁻⁹ seconds to distinguish the metastable half-life from the normal "prompt" gamma-emission half-life. Occasionally the half-lives are far longer than this and can last minutes, hours, or years. For example, the ¹⁸⁰ᵐTa nuclear isomer survives so long that it has never been observed to decay (its half-life is at least 10¹⁵ years).

Sometimes, the gamma decay from a metastable state is referred to as an isomeric transition, but this process typically resembles shorter-lived gamma decays in all external aspects, with the exception of the long-lived nature of the metastable parent nuclear isomer. The longer lives of nuclear isomers' metastable states are often due to the larger degree of nuclear spin change which must be involved in their gamma emission to reach the ground state. This high spin change causes these decays to be forbidden transitions and delayed. Delays in emission are also caused by low or high available decay energy.

The first nuclear isomer and decay-daughter system (uranium X2/uranium Z, now known as ²³⁴ᵐPa/²³⁴Pa) was discovered by Otto Hahn in 1921.

### Quantum annealing

Quantum annealing (QA) is a metaheuristic for finding the global minimum of a given objective function over a given set of candidate solutions (candidate states) by a process using quantum fluctuations. Quantum annealing is used mainly for problems where the search space is discrete (combinatorial optimization problems) with many local minima, such as finding the ground state of a spin glass. It was formulated in its present form by T. Kadowaki and H. Nishimori in "Quantum annealing in the transverse Ising model", though a proposal in a different form had been made earlier by A. B. Finnila, M. A. Gomez, C. Sebenik and J. D. Doll in "Quantum annealing: A new method for minimizing multidimensional functions".

Quantum annealing starts from a quantum-mechanical superposition of all possible states (candidate states) with equal weights. The system then evolves following the time-dependent Schrödinger equation, the natural quantum-mechanical evolution of physical systems. The amplitudes of all candidate states keep changing, realizing a quantum parallelism, according to the time-dependent strength of the transverse field, which causes quantum tunneling between states. If the rate of change of the transverse field is slow enough, the system stays close to the ground state of the instantaneous Hamiltonian (see also adiabatic quantum computation). If the rate of change of the transverse field is accelerated, the system may leave the ground state temporarily but produce a higher likelihood of concluding in the ground state of the final problem Hamiltonian, i.e., diabatic quantum computation. The transverse field is finally switched off, and the system is expected to have reached the ground state of the classical Ising model that corresponds to the solution to the original optimization problem. An experimental demonstration of the success of quantum annealing for random magnets was reported immediately after the initial theoretical proposal.

Introductions to combinatorial optimization (NP-hard) problems, the general structure of quantum annealing-based algorithms, example algorithms for the max-SAT and Minimum Multicut problems, and an overview of the quantum annealing systems manufactured by D-Wave Systems are available in the literature.
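Classically, "finding the ground state of a spin glass" means minimizing an Ising energy function E(s) = Σ J_ij s_i s_j + Σ h_i s_i over spins s_i = ±1. Exhaustive search works only for tiny instances, since the state space grows as 2^N; that exponential wall is why heuristics such as quantum annealing are of interest. A brute-force sketch (the couplings J and fields h below are hypothetical, chosen only for illustration):

```python
import itertools

# Hypothetical couplings and fields for a 4-spin Ising "spin glass"
J = {(0, 1): 1.0, (1, 2): -0.5, (2, 3): 0.8, (0, 3): -1.2}
h = [0.1, -0.3, 0.0, 0.2]

def energy(s):
    """Ising energy of a spin configuration s (tuple of +/-1)."""
    e = sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    e += sum(hi * si for hi, si in zip(h, s))
    return e

# Enumerate all 2^4 = 16 configurations and pick the minimum-energy one
best = min(itertools.product((-1, 1), repeat=4), key=energy)
print(best, energy(best))
```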

### Quantum machine

A quantum machine is a human-made device whose collective motion follows the laws of quantum mechanics. The idea that macroscopic objects may follow the laws of quantum mechanics dates back to the advent of quantum mechanics in the early 20th century. However, as highlighted by the Schrödinger's cat thought experiment, quantum effects are not readily observable in large-scale objects. Consequently, quantum states of motion have only been observed in special circumstances at extremely low temperatures. The fragility of quantum effects in macroscopic objects may arise from rapid quantum decoherence. Researchers created the first quantum machine in 2009, and the achievement was named the "Breakthrough of the Year" by Science in 2010.


### Term symbol

In quantum mechanics, the term symbol is an abbreviated description of the (total) angular momentum quantum numbers in a multi-electron atom (however, even a single electron can be described by a term symbol). Each energy level of an atom with a given electron configuration is described by not only the electron configuration but also its own term symbol, as the energy level also depends on the total angular momentum including spin. The usual atomic term symbols assume LS coupling (also known as Russell-Saunders coupling or spin-orbit coupling). The ground state term symbol is predicted by Hund's rules.

The use of the word term for an energy level is based on the Rydberg-Ritz combination principle, an empirical observation that the wavenumbers of spectral lines can be expressed as the difference of two terms. This was later explained by the Bohr quantum theory, which identified the terms (multiplied by hc, where h is the Planck constant and c the speed of light) with quantized energy levels and the spectral wavenumbers (again multiplied by hc) with photon energies.

Tables of atomic energy levels identified by their term symbols have been compiled by the National Institute of Standards and Technology. In this database, neutral atoms are identified as I, singly ionized atoms as II, etc. Neutral atoms of the chemical elements have the same term symbol for each column in the s-block and p-block elements, but may differ in d-block and f-block elements, if the ground state electron configuration changes within a column. Ground state term symbols for chemical elements are given below.

### Third law of thermodynamics

The third law of thermodynamics is sometimes stated as follows, regarding the properties of closed systems in thermodynamic equilibrium: The entropy of a system approaches a constant value as its temperature approaches absolute zero.

This constant value cannot depend on any other parameters characterizing the closed system, such as pressure or applied magnetic field. At absolute zero (zero kelvin) the system must be in a state with the minimum possible energy. Entropy is related to the number of accessible microstates, and there is typically one unique state (called the ground state) with minimum energy. In such a case, the entropy at absolute zero will be exactly zero. If the system does not have a well-defined order (if its order is glassy, for example), then there may remain some finite entropy as the system is brought to very low temperatures, either because the system becomes locked into a configuration with non-minimal energy or because the minimum energy state is non-unique. This constant value is called the residual entropy of the system. Because entropy is a state function, the residual entropy is an intrinsic property of the system's configuration of atoms and molecules, and it can be determined near 0 K.
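The link between ground-state degeneracy and entropy at absolute zero is the Boltzmann relation S = k_B ln Ω, where Ω counts the minimum-energy microstates. A minimal sketch:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def residual_entropy(omega):
    """Entropy S = k_B * ln(omega) for omega ground-state microstates."""
    return k_B * math.log(omega)

print(residual_entropy(1))  # unique ground state -> S = 0.0, per the third law
print(residual_entropy(2))  # degenerate ground state -> finite residual entropy
```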

The Nernst–Simon statement of the third law of thermodynamics concerns thermodynamic processes at a fixed, low temperature: The entropy change associated with any condensed system undergoing a reversible isothermal process approaches zero as the temperature at which it is performed approaches 0 K.

Here a condensed system refers to liquids and solids.

A classical formulation by Nernst (actually a consequence of the Third Law) is: It is impossible for any process, no matter how idealized, to reduce the entropy of a system to its absolute-zero value in a finite number of operations.

There also exists a formulation of the Third Law which approaches the subject by postulating a specific energy behavior: If the composite of two thermodynamic systems constitutes an isolated system, then any energy exchange in any form between those two systems is bounded.

### Ultraviolet–visible spectroscopy

Ultraviolet–visible spectroscopy or ultraviolet–visible spectrophotometry (UV–Vis or UV/Vis) refers to absorption spectroscopy or reflectance spectroscopy in part of the ultraviolet and the full, adjacent visible spectral regions. This means it uses light in the visible and adjacent ranges. The absorption or reflectance in the visible range directly affects the perceived color of the chemicals involved. In this region of the electromagnetic spectrum, atoms and molecules undergo electronic transitions. Absorption spectroscopy is complementary to fluorescence spectroscopy, in that fluorescence deals with transitions from the excited state to the ground state, while absorption measures transitions from the ground state to the excited state.


This page is based on a Wikipedia article written by its contributors.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.