In theoretical physics, quantum field theory (QFT) is a framework that combines classical field theory, special relativity, and quantum mechanics, and is used to construct physical models of subatomic particles (in particle physics) and quasiparticles (in condensed matter physics).
QFT treats particles as excited states (also called quanta) of their underlying fields, which are—in a sense—more fundamental than the basic particles. Interactions between particles are described by interaction terms in the Lagrangian involving their corresponding fields. Each interaction can be visually represented by Feynman diagrams, the formal computational tools of relativistic perturbation theory.
As a successful theoretical framework today, quantum field theory emerged from the work of generations of theoretical physicists spanning much of the 20th century. Its development began in the 1920s with the description of interactions between light and electrons, culminating in the first quantum field theory — quantum electrodynamics. A major theoretical obstacle soon followed with the appearance and persistence of various infinities in perturbative calculations, a problem only resolved in the 1950s with the invention of the renormalization procedure. A second major barrier came with QFT's apparent inability to describe the weak and strong interactions, to the point where some theorists called for the abandonment of the field theoretic approach. The development of gauge theory and the completion of the Standard Model in the 1970s led to a renaissance of quantum field theory.
The earliest successful classical field theory is one that emerged from Newton's law of universal gravitation, despite the complete absence of the concept of fields from his 1687 treatise Philosophiæ Naturalis Principia Mathematica. The force of gravity as described by Newton is an "action at a distance" — its effects on faraway objects are instantaneous, no matter the distance. In an exchange of letters with Richard Bentley, however, Newton stated that "it is inconceivable that inanimate brute matter should, without the mediation of something else which is not material, operate upon and affect other matter without mutual contact." It was not until the 18th century that mathematical physicists discovered a convenient description of gravity based on fields — a numerical quantity (a vector) assigned to every point in space indicating the action of gravity on any particle at that point. However, this was considered merely a mathematical trick.
Fields began to take on an existence of their own with the development of electromagnetism in the 19th century. Michael Faraday coined the English term "field" in 1845. He introduced fields as properties of space (even when it is devoid of matter) having physical effects. He argued against "action at a distance", and proposed that interactions between objects occur via space-filling "lines of force". This description of fields remains to this day.
The theory of classical electromagnetism was completed in 1862 with Maxwell's equations, which described the relationship between the electric field, the magnetic field, electric current, and electric charge. Maxwell's equations implied the existence of electromagnetic waves, a phenomenon whereby electric and magnetic fields propagate from one spatial point to another at a finite speed, which turns out to be the speed of light. Action-at-a-distance was thus conclusively refuted.
Despite the enormous success of classical electromagnetism, it was unable to account for the discrete lines in atomic spectra, nor for the distribution of blackbody radiation across different wavelengths. Max Planck's study of blackbody radiation marked the beginning of quantum mechanics. He treated atoms, which absorb and emit electromagnetic radiation, as tiny oscillators with the crucial property that their energies can only take on a series of discrete, rather than continuous, values. These are known as quantum harmonic oscillators. This process of restricting energies to discrete values is called quantization. Building on this idea, Albert Einstein proposed in 1905 an explanation for the photoelectric effect, that light is composed of individual packets of energy called photons (the quanta of light). This implied that electromagnetic radiation, while being waves in the classical electromagnetic field, also exists in the form of particles.
In 1913, Niels Bohr introduced the Bohr model of atomic structure, wherein electrons within atoms can only take on a series of discrete, rather than continuous, energies. This is another example of quantization. The Bohr model successfully explained the discrete nature of atomic spectral lines. In 1924, Louis de Broglie proposed the hypothesis of wave-particle duality, that microscopic particles exhibit both wave-like and particle-like properties under different circumstances. Uniting these scattered ideas, a coherent discipline, quantum mechanics, was formulated between 1925 and 1926, with important contributions from de Broglie, Werner Heisenberg, Max Born, Erwin Schrödinger, Paul Dirac, and Wolfgang Pauli.
In the same year as his paper on the photoelectric effect, Einstein published his theory of special relativity, built on Maxwell's electromagnetism. New rules, called Lorentz transformations, were given for the way time and space coordinates of an event change under changes in the observer's velocity, and the distinction between time and space was blurred. It was proposed that all physical laws must be the same for observers at different velocities, i.e. that physical laws be invariant under Lorentz transformations.
Two difficulties remained. Observationally, the Schrödinger equation underlying quantum mechanics could explain the stimulated emission of radiation from atoms, where an electron emits a new photon under the action of an external electromagnetic field, but it was unable to explain spontaneous emission, where an electron spontaneously decreases in energy and emits a photon even without the action of an external electromagnetic field. Theoretically, the Schrödinger equation could not describe photons and was inconsistent with the principles of special relativity — it treats time as an ordinary number while promoting spatial coordinates to linear operators.
Quantum field theory naturally began with the study of electromagnetic interactions, as the electromagnetic field was the only known classical field as of the 1920s.
Through the works of Born, Heisenberg, and Pascual Jordan in 1925-1926, a quantum theory of the free electromagnetic field (one with no interactions with matter) was developed via canonical quantization by treating the electromagnetic field as a set of quantum harmonic oscillators. With the exclusion of interactions, however, such a theory was still incapable of making quantitative predictions about the real world.
In his seminal 1927 paper The quantum theory of the emission and absorption of radiation, Dirac coined the term quantum electrodynamics (QED), a theory that adds to the terms describing the free electromagnetic field an interaction term between the electric current density and the electromagnetic vector potential. Using first-order perturbation theory, he successfully explained the phenomenon of spontaneous emission. According to the uncertainty principle in quantum mechanics, quantum harmonic oscillators cannot remain stationary, but they have a non-zero minimum energy and must always be oscillating, even in the lowest energy state (the ground state). Therefore, even in a perfect vacuum, there remains an oscillating electromagnetic field having zero-point energy. It is this quantum fluctuation of electromagnetic fields in the vacuum that "stimulates" the spontaneous emission of radiation by electrons in atoms. Dirac's theory was hugely successful in explaining both the emission and absorption of radiation by atoms; by applying second-order perturbation theory, it was able to account for the scattering of photons, resonance fluorescence, as well as non-relativistic Compton scattering. Nonetheless, the application of higher-order perturbation theory was plagued with problematic infinities in calculations.
In 1928, Dirac wrote down a wave equation that described relativistic electrons — the Dirac equation. It had the following important consequences: the spin of an electron is 1/2; the electron g-factor is 2; it led to the correct Sommerfeld formula for the fine structure of the hydrogen atom; and it could be used to derive the Klein-Nishina formula for relativistic Compton scattering. Although the results were fruitful, the theory also apparently implied the existence of negative energy states, which would cause atoms to be unstable, since they could always decay to lower energy states by the emission of radiation.
The prevailing view at the time was that the world was composed of two very different ingredients: material particles (such as electrons) and quantum fields (such as photons). Material particles were considered to be eternal, with their physical state described by the probabilities of finding each particle in any given region of space or range of velocities. On the other hand, photons were considered merely the excited states of the underlying quantized electromagnetic field, and could be freely created or destroyed. It was between 1928 and 1930 that Jordan, Eugene Wigner, Heisenberg, Pauli, and Enrico Fermi discovered that material particles could also be seen as excited states of quantum fields. Just as photons are excited states of the quantized electromagnetic field, so each type of particle had its corresponding quantum field: an electron field, a proton field, etc. Given enough energy, it would now be possible to create material particles. Building on this idea, Fermi proposed in 1932 an explanation for β decay known as Fermi's interaction. Atomic nuclei do not contain electrons per se, but in the process of decay, an electron is created out of the surrounding electron field, analogous to the photon created from the surrounding electromagnetic field in the radiative decay of an excited atom.
It was realized in 1929 by Dirac and others that negative energy states implied by the Dirac equation could be removed by assuming the existence of particles with the same mass as electrons but opposite electric charge. This not only ensured the stability of atoms, but it was also the first proposal of the existence of antimatter. Indeed, the evidence for positrons was discovered in 1932 by Carl David Anderson in cosmic rays. With enough energy, such as by absorbing a photon, an electron-positron pair could be created, a process called pair production; the reverse process, annihilation, could also occur with the emission of a photon. This showed that particle numbers need not be fixed during an interaction. Historically, however, positrons were at first thought of as "holes" in an infinite electron sea, rather than a new kind of particle, and this theory was referred to as the Dirac hole theory. QFT naturally incorporated antiparticles in its formalism.
Robert Oppenheimer showed in 1930 that higher-order perturbative calculations in QED always resulted in infinite quantities, such as the electron self-energy and the vacuum zero-point energy of the electron and photon fields, suggesting that the computational methods at the time could not properly deal with interactions involving photons with extremely high momenta. It was not until 20 years later that a systematic approach to remove such infinities was developed.
A series of papers was published between 1934 and 1938 by Ernst Stueckelberg that established a relativistically invariant formulation of QFT. In 1947, Stueckelberg also independently developed a complete renormalization procedure. Unfortunately, these achievements were not understood or recognized by the theoretical community.
Faced with these infinities, John Archibald Wheeler and Heisenberg proposed, in 1937 and 1943 respectively, to supplant the problematic QFT with the so-called S-matrix theory. Since the specific details of microscopic interactions are inaccessible to observations, the theory should only attempt to describe the relationships between a small number of observables (e.g. the energy of an atom) in an interaction, rather than be concerned with the microscopic minutiae of the interaction. In 1945, Richard Feynman and Wheeler daringly suggested abandoning QFT altogether and proposed action-at-a-distance as the mechanism of particle interactions.
In 1947, Willis Lamb and Robert Retherford measured the minute difference in the 2S1/2 and 2P1/2 energy levels of the hydrogen atom, also called the Lamb shift. By ignoring the contribution of photons whose energy exceeds the electron mass, Hans Bethe successfully estimated the numerical value of the Lamb shift. Subsequently, Norman Myles Kroll, Lamb, James Bruce French, and Victor Weisskopf again confirmed this value using an approach in which infinities cancelled other infinities to result in finite quantities. However, this method was clumsy and unreliable and could not be generalized to other calculations.
The breakthrough eventually came around 1950 when a more robust method for eliminating infinities was developed by Julian Schwinger, Feynman, Freeman Dyson, and Shinichiro Tomonaga. The main idea is to replace the initial, so-called "bare", parameters (mass, electric charge, etc.), which have no physical meaning, by their finite measured values. To cancel the apparently infinite parameters, one has to introduce additional, infinite, "counterterms" into the Lagrangian. This systematic computational procedure is known as renormalization and can be applied to arbitrary order in perturbation theory.
By applying the renormalization procedure, calculations were finally made to explain the electron's anomalous magnetic moment (the deviation of the electron g-factor from 2) and vacuum polarisation. These results agreed with experimental measurements to a remarkable degree, thus marking the end of a "war against infinities".
At the same time, Feynman introduced the path integral formulation of quantum mechanics and Feynman diagrams. The latter can be used to organise terms in the perturbative expansion visually and intuitively, and to help compute them. Each diagram can be interpreted as paths of particles in an interaction, with each vertex and line having a corresponding mathematical expression, and the product of these expressions gives the scattering amplitude of the interaction represented by the diagram.
It was with the invention of the renormalization procedure and Feynman diagrams that QFT finally arose as a complete theoretical framework.
Given the tremendous success of QED, many theorists believed, in the few years after 1949, that QFT could soon provide an understanding of all microscopic phenomena, not only the interactions between photons, electrons, and positrons. Contrary to this optimism, QFT entered yet another period of depression that lasted for almost two decades.
The first obstacle was the limited applicability of the renormalization procedure. In perturbative calculations in QED, all infinite quantities could be eliminated by redefining a small (finite) number of physical quantities (namely the mass and charge of the electron). Dyson proved in 1949 that this is only possible for a small class of theories called "renormalizable theories", of which QED is an example. However, most theories, including the Fermi theory of the weak interaction, are "non-renormalizable". Any perturbative calculation in these theories beyond the first order would result in infinities that could not be removed by redefining a finite number of physical quantities.
The second major problem stemmed from the limited validity of the Feynman diagram method, which is based on a series expansion in perturbation theory. In order for the series to converge and low-order calculations to be a good approximation, the coupling constant, in which the series is expanded, must be a sufficiently small number. The coupling constant in QED is the fine-structure constant α ≈ 1/137, which is small enough that only the simplest, lowest order, Feynman diagrams need to be considered in realistic calculations. In contrast, the coupling constant in the strong interaction is roughly of the order of one, making complicated, higher order, Feynman diagrams just as important as simple ones. There was thus no way of deriving reliable quantitative predictions for the strong interaction using perturbative QFT methods.
With these difficulties looming, many theorists began to turn away from QFT. Some focused on symmetry principles and conservation laws, while others picked up the old S-matrix theory of Wheeler and Heisenberg. QFT was used heuristically as a guiding principle, but not as a basis for quantitative calculations.
In 1954, Yang Chen-Ning and Robert Mills generalised the local symmetry of QED, leading to non-Abelian gauge theories (also known as Yang-Mills theories), which are based on more complicated local symmetry groups. In QED, (electrically) charged particles interact via the exchange of photons, while in non-Abelian gauge theory, particles carrying a new type of "charge" interact via the exchange of massless gauge bosons. Unlike photons, these gauge bosons themselves carry charge.
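The contrast between the two coupling constants can be made concrete with a short numeric sketch; the order-one series coefficients assumed here are purely illustrative, not a real QED calculation:

```python
# Toy illustration: with order-one coefficients, the n-th order term of a
# perturbative series scales as g**n, where g is the coupling constant.
alpha = 1 / 137.036   # fine-structure constant, the QED expansion parameter
g_strong = 1.0        # strong coupling at low energies, roughly order one

qed_terms = [alpha**n for n in range(1, 5)]
strong_terms = [g_strong**n for n in range(1, 5)]

# In QED each successive order is suppressed by ~1/137, so truncating
# the series at low order is an excellent approximation.
assert all(t2 / t1 < 0.01 for t1, t2 in zip(qed_terms, qed_terms[1:]))
# For the strong interaction every order contributes comparably,
# so no finite truncation is trustworthy.
assert all(t2 / t1 == 1.0 for t1, t2 in zip(strong_terms, strong_terms[1:]))
```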
Sheldon Glashow developed a non-Abelian gauge theory that unified the electromagnetic and weak interactions in 1960. In 1964, Abdus Salam and John Clive Ward arrived at the same theory through a different path. This theory, nevertheless, was non-renormalizable.
Peter Higgs, Robert Brout, and François Englert proposed in 1964 that the gauge symmetry in Yang-Mills theories could be broken by a mechanism called spontaneous symmetry breaking, through which originally massless gauge bosons could acquire mass.
By combining the earlier theory of Glashow, Salam, and Ward with the idea of spontaneous symmetry breaking, Steven Weinberg wrote down in 1967 a theory describing electroweak interactions between all leptons and the effects of the Higgs boson. His theory was at first mostly ignored, until it was brought back to light in 1971 by Gerard 't Hooft's proof that non-Abelian gauge theories are renormalizable. The electroweak theory of Weinberg and Salam was extended from leptons to quarks in 1970 by Glashow, John Iliopoulos, and Luciano Maiani, marking its completion.
Harald Fritzsch, Murray Gell-Mann, and Heinrich Leutwyler discovered in 1971 that certain phenomena involving the strong interaction could also be explained by non-Abelian gauge theory. Quantum chromodynamics (QCD) was born. In 1973, David Gross, Frank Wilczek, and Hugh David Politzer showed that non-Abelian gauge theories are "asymptotically free", meaning that under renormalization, the coupling constant of the strong interaction decreases as the interaction energy increases. (Similar discoveries had been made numerous times prior, but they had been largely ignored.) Therefore, at least in high-energy interactions, the coupling constant in QCD becomes sufficiently small to warrant a perturbative series expansion, making quantitative predictions for the strong interaction possible.
These theoretical breakthroughs brought about a renaissance in QFT. The full theory, which includes the electroweak theory and chromodynamics, is referred to today as the Standard Model of elementary particles. The Standard Model successfully describes all fundamental interactions except gravity, and its many predictions have been met with remarkable experimental confirmation in subsequent decades. The Higgs boson, central to the mechanism of spontaneous symmetry breaking, was finally detected in 2012 at CERN, marking the complete verification of the existence of all constituents of the Standard Model.
The 1970s saw the development of non-perturbative methods in non-Abelian gauge theories. The 't Hooft–Polyakov monopole was discovered by 't Hooft and Alexander Polyakov, flux tubes by Holger Bech Nielsen and Poul Olesen, and instantons by Polyakov and collaborators. These objects are inaccessible through perturbation theory.
Supersymmetry also appeared in the same period. The first supersymmetric QFT in four dimensions was built by Yuri Golfand and Evgeny Likhtman in 1970, but their result failed to garner widespread interest due to the Iron Curtain. Supersymmetry only took off in the theoretical community after the work of Julius Wess and Bruno Zumino in 1973.
Among the four fundamental interactions, gravity remains the only one that lacks a consistent QFT description. Various attempts at a theory of quantum gravity led to the development of string theory, itself a type of two-dimensional QFT with conformal symmetry. Joël Scherk and John Schwarz first proposed in 1974 that string theory could be the quantum theory of gravity.
Although quantum field theory arose from the study of interactions between elementary particles, it has been successfully applied to other physical systems, particularly to many-body systems in condensed matter physics.
Historically, the Higgs mechanism of spontaneous symmetry breaking was a result of Yoichiro Nambu's application of superconductor theory to elementary particles, while the concept of renormalization came out of the study of second-order phase transitions in matter.
Soon after the introduction of photons, Einstein performed the quantization procedure on vibrations in a crystal, leading to the first quasiparticle — the phonon. Lev Landau claimed that low-energy excitations in many condensed matter systems could be described in terms of interactions between a set of quasiparticles. The Feynman diagram method of QFT was naturally well suited to the analysis of various phenomena in condensed matter systems.
Gauge theory is used to describe the quantization of magnetic flux in superconductors, the resistivity in the quantum Hall effect, as well as the relation between frequency and voltage in the AC Josephson effect.
A classical field is a function of spatial and time coordinates. Examples include the gravitational field in Newtonian gravity g(x, t) and the electric field E(x, t) and magnetic field B(x, t) in classical electromagnetism. A classical field can be thought of as a numerical quantity assigned to every point in space that changes in time. Hence, it has infinitely many degrees of freedom.
Many phenomena exhibiting quantum mechanical properties cannot be explained by classical fields alone. Phenomena such as the photoelectric effect are best explained by discrete particles (photons), rather than a spatially continuous field. The goal of quantum field theory is to describe various quantum mechanical phenomena using a modified concept of fields.
The simplest classical field is a real scalar field — a real number at every point in space that changes in time. It is denoted as ϕ(x, t), where x is the position vector, and t is the time. Suppose the Lagrangian of the field, L, is

    L = ∫ d³x [ ½ (∂ϕ/∂t)² − ½ (∇ϕ)² − ½ m²ϕ² ],

where ∇ is the gradient operator and m is a real parameter (the "mass" of the field). Applying the Euler–Lagrange equation to this Lagrangian, we obtain the equation of motion for the field, which describes the way it varies in time and space:

    ∂²ϕ/∂t² − ∇²ϕ + m²ϕ = 0.

This is known as the Klein–Gordon equation.
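The equation of motion obtained this way is the Klein–Gordon equation, ∂²ϕ/∂t² − ∇²ϕ + m²ϕ = 0. As a sketch of what it describes, a short finite-difference evolution (one spatial dimension; grid and step sizes are arbitrary choices for illustration) confirms that a single Fourier mode oscillates with the relativistic frequency ω = √(k² + m²):

```python
import numpy as np

# Leapfrog finite-difference evolution of the 1D Klein-Gordon equation
#   d2(phi)/dt2 - d2(phi)/dx2 + m**2 * phi = 0
# on a periodic box. A single mode phi = cos(k*x)*cos(w*t) should
# oscillate with w = sqrt(k**2 + m**2), the relativistic dispersion.
m = 1.0
N, box = 256, 2 * np.pi
dx, dt, steps = box / N, 1e-3, 1000
x = np.arange(N) * dx
k = 3.0                                  # wavenumber fitting the box

def laplacian(f):
    return (np.roll(f, -1) - 2 * f + np.roll(f, 1)) / dx**2

phi_prev = np.cos(k * x)                 # phi at t = 0, at rest
acc = laplacian(phi_prev) - m**2 * phi_prev
phi = phi_prev + 0.5 * dt**2 * acc       # first step, zero initial velocity
for _ in range(steps - 1):
    acc = laplacian(phi) - m**2 * phi
    phi, phi_prev = 2 * phi - phi_prev + dt**2 * acc, phi

omega = np.sqrt(k**2 + m**2)             # predicted frequency
# At x = 0 the exact solution is cos(omega * t), with t = steps * dt.
assert abs(phi[0] - np.cos(omega * steps * dt)) < 1e-2
```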
The quantisation procedure for the above classical field is analogous to the promotion of a classical harmonic oscillator to a quantum harmonic oscillator.
The displacement of a classical harmonic oscillator is described by

    x(t) = (1/√(2ω)) a e^(−iωt) + (1/√(2ω)) a* e^(iωt),
where a is a complex number (normalised by convention), and ω is the oscillator's frequency. Note that x is the displacement of a particle in simple harmonic motion from the equilibrium position, which should not be confused with the spatial label x of a field.
For a quantum harmonic oscillator, x(t) is promoted to a linear operator x̂(t):

    x̂(t) = (1/√(2ω)) â e^(−iωt) + (1/√(2ω)) â† e^(iωt).

The complex numbers a and a* are replaced by the annihilation operator â and the creation operator â†, respectively, where † denotes Hermitian conjugation. Their commutation relation is

    [â, â†] = 1.

The vacuum state |0⟩, which is the lowest energy state, is defined by

    â|0⟩ = 0.

Any quantum state of a single harmonic oscillator can be obtained from |0⟩ by successively applying the creation operator â†:

    |n⟩ ∝ (â†)ⁿ |0⟩.
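These relations can be checked numerically by representing the ladder operators as matrices in a truncated Fock basis (the basis size below is an arbitrary choice for illustration):

```python
import numpy as np

# Ladder operators of the quantum harmonic oscillator in a truncated
# Fock basis {|0>, |1>, ..., |n_max>}, where a|n> = sqrt(n) |n-1>.
n_max = 20
a = np.diag(np.sqrt(np.arange(1, n_max + 1)), k=1)   # annihilation operator
adag = a.T.conj()                                    # creation operator

# Hamiltonian H = N + 1/2 in units where hbar = omega = 1
H = adag @ a + 0.5 * np.eye(n_max + 1)

# The vacuum |0> is annihilated by a, yet carries zero-point energy 1/2
vac = np.zeros(n_max + 1)
vac[0] = 1.0
assert np.allclose(a @ vac, 0)
assert np.isclose(vac @ H @ vac, 0.5)

# [a, a_dagger] = 1 holds exactly away from the truncation edge
comm = a @ adag - adag @ a
assert np.allclose(comm[:-1, :-1], np.eye(n_max))
```

The non-zero ground-state energy of 1/2 is exactly the zero-point energy that drives spontaneous emission in Dirac's 1927 analysis.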
By the same token, the aforementioned real scalar field ϕ, which corresponds to x in the single harmonic oscillator, is also promoted to a field operator ϕ̂(x), while a_p and a_p* are replaced by the annihilation operator â_p and the creation operator â_p† for a particular momentum p, respectively:

    ϕ̂(x) = ∫ d³p/(2π)³ (1/√(2ω_p)) ( â_p e^(ip·x) + â_p† e^(−ip·x) ),   where ω_p = √(|p|² + m²).

Their commutation relations are:

    [â_p, â_p'†] = (2π)³ δ³(p − p'),   [â_p, â_p'] = [â_p†, â_p'†] = 0,

where δ is the Dirac delta function. The vacuum state |0⟩ is defined by

    â_p|0⟩ = 0   for all p.

Any quantum state of the field can be obtained from |0⟩ by successively applying creation operators â_p†, e.g.

    â_p₃† â_p₂† â_p₁† |0⟩.

Although the field appearing in the Lagrangian is spatially continuous, the quantum states of the field are discrete. While the state space of a single quantum harmonic oscillator contains all the discrete energy states of one oscillating particle, the state space of a quantum field contains the discrete energy levels of an arbitrary number of particles. The latter space is known as a Fock space, which can account for the fact that particle numbers are not fixed in relativistic quantum systems. The process of quantising an arbitrary number of particles instead of a single particle is often also called second quantisation.
The preceding procedure is a direct application of non-relativistic quantum mechanics and can be used to quantise (complex) scalar fields, Dirac fields, vector fields (e.g. the electromagnetic field), and even strings. However, creation and annihilation operators are only well defined in the simplest theories that contain no interactions (so-called free theory). In the case of the real scalar field, the existence of these operators was a consequence of the decomposition of solutions of the classical equations of motion into a sum of normal modes. To perform calculations on any realistic interacting theory, perturbation theory would be necessary.
The Lagrangian of any quantum field in nature would contain interaction terms in addition to the free theory terms. For example, a quartic interaction term could be introduced to the Lagrangian of the real scalar field:

    L = ∫ d³x [ ½ ∂_μϕ ∂^μϕ − ½ m²ϕ² − (λ/4!) ϕ⁴ ],

where μ is a spacetime index, ∂₀ = ∂/∂t, ∂₁ = ∂/∂x¹, etc. The summation over the index μ has been omitted following the Einstein notation. If the parameter λ is sufficiently small, then the interacting theory described by the above Lagrangian can be considered as a small perturbation from the free theory.
The path integral formulation of QFT is concerned with the direct computation of the scattering amplitude of a certain interaction process, rather than the establishment of operators and state spaces. To calculate the probability amplitude for a system to evolve from some initial state |ϕ_I⟩ at time t = 0 to some final state |ϕ_F⟩ at t = T, the total time T is divided into N small intervals. The overall amplitude is the product of the amplitude of evolution within each interval, integrated over all intermediate states. Let H be the Hamiltonian (i.e. generator of time evolution), then

    ⟨ϕ_F| e^(−iHT) |ϕ_I⟩ = ∫ Dϕ(t) exp( i ∫₀ᵀ dt L ),

where L is the Lagrangian involving ϕ and its derivatives with respect to spatial and time coordinates, obtained from the Hamiltonian H via Legendre transform. The initial and final conditions of the path integral are respectively

    ϕ(0) = ϕ_I,   ϕ(T) = ϕ_F.
In other words, the overall amplitude is the sum over the amplitude of every possible path between the initial and final states, where the amplitude of a path is given by the exponential in the integrand.
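The time-slicing construction can be sketched numerically. For numerical stability the sketch below works in imaginary (Euclidean) time, an assumption beyond the real-time formula above; the oscillatory exponential then becomes a decaying one, each short-time kernel becomes a matrix on a position grid, and the ground-state energy of the harmonic oscillator can be read off from the largest eigenvalue of one slice:

```python
import numpy as np

# Time-sliced path integral for the harmonic oscillator in imaginary
# (Euclidean) time. One short-time kernel, discretised on a position
# grid, acts as a transfer matrix; the N-interval amplitude is its N-th
# matrix power, and its largest eigenvalue equals exp(-E0*dt), so the
# ground-state energy E0 = 1/2 (hbar = mass = omega = 1) is recovered.
x = np.linspace(-6, 6, 401)
dx, dt = x[1] - x[0], 0.05
V = 0.5 * x**2                          # harmonic potential
X, Y = np.meshgrid(x, x)
K = np.exp(-((X - Y)**2 / (2 * dt) + dt * (V[None, :] + V[:, None]) / 2))
K *= dx / np.sqrt(2 * np.pi * dt)       # free-kernel normalisation

E0 = -np.log(np.linalg.eigvalsh(K)[-1]) / dt
assert abs(E0 - 0.5) < 0.01             # zero-point energy recovered
```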
Now we assume that the theory contains interactions whose Lagrangian terms are a small perturbation from the free theory.
In calculations, one often encounters expressions such as

    ⟨Ω| T{ϕ(x) ϕ(y)} |Ω⟩,

where x and y are position four-vectors, T is the time-ordering operator (namely, it orders x and y according to their time component, later time on the left and earlier time on the right), and |Ω⟩ is the ground state (vacuum state) of the interacting theory. This expression, known as the two-point correlation function or the two-point Green's function, represents the probability amplitude for the field to propagate from y to x.
In canonical quantisation, the two-point correlation function can be written as:

    ⟨Ω|T{ϕ(x)ϕ(y)}|Ω⟩ = lim_{T→∞(1−iε)} ⟨0| T{ ϕ_I(x) ϕ_I(y) exp[ −i ∫₋ᵀᵀ dt H_I(t) ] } |0⟩ / ⟨0| T{ exp[ −i ∫₋ᵀᵀ dt H_I(t) ] } |0⟩,

where ε is an infinitesimal positive number, ϕ_I is the field operator in the free theory (interaction picture), H_I is the Hamiltonian of the interaction term, and |0⟩ is the vacuum of the free theory.
Since λ is a small parameter, the exponential function exp can be expanded into a Taylor series in λ and computed term by term. This equation is useful in that it expresses the field operator and ground state in the interacting theory, which are difficult to define, in terms of their counterparts in the free theory, which are well defined.
In the path integral formulation, the two-point correlation function can be written as:

    ⟨Ω|T{ϕ(x)ϕ(y)}|Ω⟩ = lim_{T→∞(1−iε)} ∫ Dϕ ϕ(x)ϕ(y) exp[ i ∫₋ᵀᵀ d⁴x′ 𝓛 ] / ∫ Dϕ exp[ i ∫₋ᵀᵀ d⁴x′ 𝓛 ],

where 𝓛 is the Lagrangian density. As in the previous paragraph, the exponential factor involving the interaction term can also be expanded as a series in λ.
According to Wick's theorem, any n-point correlation function in the free theory can be written as a sum of products of two-point correlation functions. For example,

    ⟨0|T{ϕ(x₁)ϕ(x₂)ϕ(x₃)ϕ(x₄)}|0⟩
      = ⟨0|T{ϕ(x₁)ϕ(x₂)}|0⟩ ⟨0|T{ϕ(x₃)ϕ(x₄)}|0⟩
      + ⟨0|T{ϕ(x₁)ϕ(x₃)}|0⟩ ⟨0|T{ϕ(x₂)ϕ(x₄)}|0⟩
      + ⟨0|T{ϕ(x₁)ϕ(x₄)}|0⟩ ⟨0|T{ϕ(x₂)ϕ(x₃)}|0⟩.
Since correlation functions in the interacting theory can be expressed in terms of those in the free theory, only the latter need to be evaluated in order to calculate all physical quantities in the (perturbative) interacting theory.
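Wick's theorem is at bottom a property of Gaussian integrals, so it can be illustrated with ordinary zero-mean Gaussian random variables standing in for a free field; the covariance matrix below is an arbitrary choice for the sketch:

```python
import numpy as np

# Monte Carlo check of Wick's theorem for a "free" (Gaussian) theory:
# the 4-point moment of zero-mean jointly Gaussian variables equals the
# sum over the three pairings of products of 2-point moments.
rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.4, 0.2, 0.1],
                [0.4, 1.0, 0.3, 0.2],
                [0.2, 0.3, 1.0, 0.4],
                [0.1, 0.2, 0.4, 1.0]])
s = rng.multivariate_normal(np.zeros(4), cov, size=1_000_000)

four_point = np.mean(s[:, 0] * s[:, 1] * s[:, 2] * s[:, 3])
wick_sum = (cov[0, 1] * cov[2, 3]      # pairing (12)(34)
            + cov[0, 2] * cov[1, 3]    # pairing (13)(24)
            + cov[0, 3] * cov[1, 2])   # pairing (14)(23)
assert abs(four_point - wick_sum) < 0.02   # Monte Carlo agreement
```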
Either through canonical quantisation or path integrals, one can obtain the Feynman propagator of the free real scalar field:

    D_F(x − y) = ⟨0|T{ϕ(x)ϕ(y)}|0⟩ = ∫ d⁴p/(2π)⁴ · i e^(−ip·(x−y)) / (p² − m² + iε).

Correlation functions in the interacting theory can be written as a perturbation series. Each term in the series is a product of Feynman propagators in the free theory and can be represented visually by a Feynman diagram. For example, the λ¹ term in the two-point correlation function in the ϕ⁴ theory is

    (−iλ/4!) ∫ d⁴z ⟨0|T{ϕ(x)ϕ(y)ϕ(z)ϕ(z)ϕ(z)ϕ(z)}|0⟩.

After applying Wick's theorem, one of the terms is

    (−iλ/2) ∫ d⁴z D_F(x − z) D_F(y − z) D_F(z − z),

whose corresponding Feynman diagram consists of a line from the external point x to an internal point z, a line from z to the external point y, and a closed loop attached at z.
Every point corresponds to a single ϕ field factor. Points labelled with x and y are called external points, while those in the interior are called internal points or vertices (there is one in this diagram). The value of the corresponding term can be obtained from the diagram by following "Feynman rules": assign the factor −iλ ∫ d⁴z to every vertex at position z and the Feynman propagator D_F(x₁ − x₂) to every line with end points x₁ and x₂. The product of factors corresponding to every element in the diagram, divided by the "symmetry factor" (2 for this diagram), gives the expression for the term in the perturbation series.
In order to compute the n-point correlation function to the k-th order, list all valid Feynman diagrams with n external points and k or fewer vertices, and then use Feynman rules to obtain the expression for each term. To be precise,

    ⟨Ω|T{ϕ(x₁)⋯ϕ(xₙ)}|Ω⟩

is equal to the sum of (expressions corresponding to) all connected diagrams with n external points. (Connected diagrams are those in which every vertex is connected to an external point through lines. Components that are totally disconnected from external lines are sometimes called "vacuum bubbles".) In the ϕ⁴ interaction theory discussed above, every vertex must have four legs.
In realistic applications, the scattering amplitude of a certain interaction or the decay rate of a particle can be computed from the S-matrix, which itself can be found using the Feynman diagram method.
Feynman diagrams devoid of "loops" are called tree-level diagrams, which describe the lowest-order interaction processes; those containing n loops are referred to as n-loop diagrams, which describe higher-order contributions, or radiative corrections, to the interaction. Lines whose end points are vertices can be thought of as the propagation of virtual particles.
Feynman rules can be used to directly evaluate tree-level diagrams. However, naïve computation of loop diagrams such as the one shown above will result in divergent momentum integrals, which seems to imply that almost all terms in the perturbative expansion are infinite. The renormalisation procedure is a systematic process for removing such infinities.
Parameters appearing in the Lagrangian, such as the mass m and the coupling constant λ, have no physical meaning — m, λ, and the field strength ϕ are not experimentally measurable quantities and are referred to here as the bare mass, bare coupling constant, and bare field, respectively. The physical mass and coupling constant are measured in some interaction process and are generally different from the bare quantities. While computing physical quantities from this interaction process, one may limit the domain of divergent momentum integrals to be below some momentum cut-off Λ, obtain expressions for the physical quantities, and then take the limit Λ → ∞. This is an example of regularisation, a class of methods to treat divergences in QFT, with Λ being the regulator.
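The role of the cut-off can be sketched on a schematic log-divergent integral; the integrand below is a toy stand-in for a loop integral, not a specific QED amplitude:

```python
import numpy as np

# A schematic log-divergent "loop" integral with momentum cut-off Lam:
#   I(Lam) = integral_0^Lam dk k/(k**2 + m**2) = (1/2)*ln(1 + Lam**2/m**2),
# which grows without bound as Lam -> infinity. The cut-off regulator
# renders each integral finite, so expressions can be manipulated before
# the Lam -> infinity limit is taken.
m = 1.0

def I(lam, n=200_000):
    k = np.linspace(0.0, lam, n)
    f = k / (k**2 + m**2)
    return np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(k))  # trapezoid rule

# Numeric integral matches the closed form for several cut-offs
for lam in (10.0, 100.0, 1000.0):
    assert abs(I(lam) - 0.5 * np.log(1 + lam**2 / m**2)) < 1e-3

# Each decade in Lam adds about ln(10): the logarithmic divergence
assert I(1000.0) - I(100.0) > 2.0       # close to ln(10) ~ 2.3
```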
The approach illustrated above is called bare perturbation theory, as calculations involve only the bare quantities such as mass and coupling constant. A different approach, called renormalised perturbation theory, is to use physically meaningful quantities from the very beginning. In the case of ϕ4 theory, the field strength is first redefined:
where ϕ is the bare field, ϕr is the renormalised field, and Z is a constant to be determined. The Lagrangian density becomes:
where mr and λr are the experimentally measurable, renormalised, mass and coupling constant, respectively, and
are constants to be determined. The first three terms are the ϕ4 Lagrangian density written in terms of the renormalised quantities, while the latter three terms are referred to as "counterterms". Since the Lagrangian now contains more terms, the Feynman diagrams must include additional elements, each with their own Feynman rules. The procedure is outlined as follows. First select a regularisation scheme (such as the cut-off regularisation introduced above or dimensional regularisation); call the regulator Λ. Compute Feynman diagrams, in which divergent terms will depend on Λ. Then, define δZ, δm, and δλ such that Feynman diagrams for the counterterms will exactly cancel the divergent terms in the normal Feynman diagrams when the limit Λ → ∞ is taken. In this way, meaningful finite quantities are obtained.:323-326
It is only possible to eliminate all infinities to obtain a finite result in renormalisable theories, whereas in non-renormalisable theories infinities cannot be removed by the redefinition of a small number of parameters. The Standard Model of elementary particles is a renormalisable QFT,:719–727 while quantum gravity is non-renormalisable.:798:421
The renormalisation group, developed by Kenneth Wilson, is a mathematical apparatus used to study the changes in physical parameters (coefficients in the Lagrangian) as the system is viewed at different scales.:393 The way in which each parameter changes with scale is described by its β function.:417 Correlation functions, which underlie quantitative physical predictions, change with scale according to the Callan–Symanzik equation.:410-411
As an example, the coupling constant in QED, namely the elementary charge e, has the following β function:
where Λ is the energy scale at which the measurement of e is performed. This differential equation implies that the observed elementary charge increases as the scale increases. The renormalised coupling constant, which changes with the energy scale, is also called the running coupling constant.:420
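The running implied by this β function can be made explicit numerically. A minimal sketch, assuming the one-loop, electron-loop-only form β(e) = e³/(12π²) and the measured low-energy value e² = 4πα with α ≈ 1/137:

```python
import numpy as np

# One-loop QED β function (electron loop only): β(e) = e³ / (12π²).
def beta(e):
    return e**3 / (12 * np.pi**2)

# Start from the measured low-energy charge, e² = 4πα with α ≈ 1/137,
# and integrate de/d(ln Λ) upward with a simple forward-Euler step.
e = np.sqrt(4 * np.pi / 137.036)
t, dt = 0.0, 1e-3              # t = ln(Λ / m_e)
history = [(t, e)]
while t < 25.0:                # roughly eleven decades of energy
    e += beta(e) * dt
    t += dt
    history.append((t, e))

print(f"e at ln(Λ/m_e) =  0: {history[0][1]:.5f}")
print(f"e at ln(Λ/m_e) = 25: {history[-1][1]:.5f}")  # larger: the charge grows
```

The numerical flow agrees with the closed-form one-loop solution, in which 1/e² decreases linearly in ln Λ.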
In quantum chromodynamics, the coupling constant g has, at one loop, the β function β(g) = −g³(11 − 2Nf/3)/(16π²), where Nf is the number of quark flavours. In the case where Nf ≤ 16 (the Standard Model has Nf = 6), the β function is negative, so the coupling constant g decreases as the energy scale increases. Hence, while the strong interaction is strong at low energies, it becomes very weak in high-energy interactions, a phenomenon known as asymptotic freedom.:531
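The sign of the β function translates directly into the running of the measured coupling. A sketch using the standard one-loop solution α_s(Q) = α_s(Q₀)/(1 + b₀ α_s(Q₀) ln(Q²/Q₀²)) with b₀ = (33 − 2Nf)/(12π), and assuming the commonly quoted reference value α_s(M_Z) ≈ 0.118:

```python
import math

def alpha_s(Q, Q0=91.19, a0=0.118, nf=6):
    """One-loop QCD running coupling, referenced to α_s(M_Z) ≈ 0.118.
    b0 = (33 - 2*nf)/(12π) is positive for nf ≤ 16, so the coupling
    shrinks as the energy Q grows (asymptotic freedom)."""
    b0 = (33 - 2 * nf) / (12 * math.pi)
    return a0 / (1 + b0 * a0 * math.log(Q**2 / Q0**2))

print(alpha_s(10.0))     # stronger at 10 GeV
print(alpha_s(91.19))    # the reference value at M_Z
print(alpha_s(1000.0))   # weaker at 1 TeV
```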
Conformal field theories (CFTs) are special QFTs that admit conformal symmetry. They are insensitive to changes in the scale, as all their coupling constants have vanishing β function. (The converse is not true, however — the vanishing of all β functions does not imply conformal symmetry of the theory.) Examples include string theory and N = 4 supersymmetric Yang–Mills theory.
According to Wilson's picture, every QFT is fundamentally accompanied by its energy cut-off Λ, i.e. the theory is no longer valid at energies higher than Λ, and all degrees of freedom above the scale Λ are to be omitted. For example, the cut-off could be the inverse of the atomic spacing in a condensed matter system, and in elementary particle physics it could be associated with the fundamental "graininess" of spacetime caused by quantum fluctuations in gravity. The cut-off scale of theories of particle interactions lies far beyond current experiments. Even if the theory were very complicated at that scale, as long as its couplings are sufficiently weak, it must be described at low energies by a renormalisable effective field theory.:402-403 The difference between renormalisable and non-renormalisable theories is that the former are insensitive to details at high energies, whereas the latter do depend on them.:2 According to this view, non-renormalisable theories are to be seen as low-energy effective theories of a more fundamental theory. The failure to remove the cut-off Λ from calculations in such a theory merely indicates that new physical phenomena appear at scales above Λ, where a new theory is necessary.:156
The quantisation and renormalisation procedures outlined in the preceding sections are performed for the free theory and ϕ4 theory of the real scalar field. A similar process can be done for other types of fields, including the complex scalar field, the vector field, and the Dirac field, as well as other types of interaction terms, including the electromagnetic interaction and the Yukawa interaction.
As an example, quantum electrodynamics contains a Dirac field ψ representing the electron field and a vector field Aμ representing the electromagnetic field (photon field). (Despite its name, the quantum electromagnetic "field" actually corresponds to the classical electromagnetic four-potential, rather than the classical electric and magnetic fields.) The full QED Lagrangian density is:
where γμ are Dirac matrices, ψ̄ ≡ ψ†γ0 is the Dirac adjoint, and Fμν ≡ ∂μAν − ∂νAμ is the electromagnetic field strength. The parameters in this theory are the (bare) electron mass m and the (bare) elementary charge e. The first and second terms in the Lagrangian density correspond to the free Dirac field and free vector fields, respectively. The last term describes the interaction between the electron and photon fields, which is treated as a perturbation from the free theories.:78
Shown above is an example of a tree-level Feynman diagram in QED. It describes an electron and a positron annihilating into an off-shell photon, which then decays into a new electron–positron pair. Time runs from left to right. Arrows pointing forward in time represent the propagation of positrons, while those pointing backward in time represent the propagation of electrons. A wavy line represents the propagation of a photon. Each vertex in QED Feynman diagrams must have an incoming and an outgoing fermion (positron/electron) leg as well as a photon leg.
If the following transformation to the fields is performed at every spacetime point x (a local transformation), then the QED Lagrangian remains unchanged, or invariant:
where α(x) is any function of spacetime coordinates. If a theory's Lagrangian (or more precisely the action) is invariant under a certain local transformation, then the transformation is referred to as a gauge symmetry of the theory.:482–483 Gauge symmetries form a group at every spacetime point. In the case of QED, the successive application of two different local symmetry transformations e^{iα(x)} and e^{iα′(x)} is yet another symmetry transformation e^{i[α(x)+α′(x)]}. For any α(x), e^{iα(x)} is an element of the U(1) group, thus QED is said to have U(1) gauge symmetry.:496 The photon field Aμ may be referred to as the U(1) gauge boson.
U(1) is an Abelian group, meaning that the result is the same regardless of the order in which its elements are applied. QFTs can also be built on non-Abelian groups, giving rise to non-Abelian gauge theories (also known as Yang–Mills theories).:489 Quantum chromodynamics, which describes the strong interaction, is a non-Abelian gauge theory with an SU(3) gauge symmetry. It contains three Dirac fields ψi, i = 1,2,3 representing quark fields as well as eight vector fields Aa,μ, a = 1,...,8 representing gluon fields, which are the SU(3) gauge bosons.:547 The QCD Lagrangian density is::490-491
where Dμ is the gauge covariant derivative:
and fabc are the structure constants of SU(3). Repeated indices i,j,a are implicitly summed over following Einstein notation. This Lagrangian is invariant under the transformation:
where U(x) is an element of SU(3) at every spacetime point x:
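The Abelian/non-Abelian distinction can be made concrete with small matrices. A sketch using SU(2), the simplest non-Abelian group, as a stand-in for SU(3): U(1) phases always commute, while finite SU(2) rotations, built from the identity exp(iθ n·σ) = cos θ I + i sin θ n·σ, generally do not:

```python
import numpy as np

# U(1): group elements are phases e^{iα}; multiplication always commutes.
a, b = np.exp(0.3j), np.exp(1.1j)
print(np.isclose(a * b, b * a))          # True

# SU(2), the simplest non-Abelian group (QCD itself uses SU(3)).
# Finite rotations: exp(iθ n·σ) = cos θ · I + i sin θ · n·σ.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
theta = 0.7
g1 = np.cos(theta) * np.eye(2) + 1j * np.sin(theta) * sx
g2 = np.cos(theta) * np.eye(2) + 1j * np.sin(theta) * sy
print(np.allclose(g1 @ g2, g2 @ g1))     # False: order matters
```

The non-commutativity of the group elements is what produces the fabc structure-constant terms in the QCD Lagrangian above.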
The preceding discussion of symmetries is on the level of the Lagrangian. In other words, these are "classical" symmetries. After quantisation, some theories will no longer exhibit their classical symmetries, a phenomenon called an anomaly. For instance, in the path integral formulation, despite the invariance of the Lagrangian density under a certain local transformation of the fields, the measure of the path integral may change.:243 For a theory describing nature to be consistent, it must not contain any anomaly in its gauge symmetry. The Standard Model of elementary particles is a gauge theory based on the group SU(3) × SU(2) × U(1), in which all anomalies exactly cancel.:705-707
The theoretical foundation of general relativity, the equivalence principle, can also be understood as a form of gauge symmetry, making general relativity a gauge theory based on the Lorentz group.
Noether's theorem states that every continuous symmetry, i.e. the parameter in the symmetry transformation being continuous rather than discrete, leads to a corresponding conservation law.:17-18:73 For example, the U(1) symmetry of QED implies charge conservation.
Gauge transformations do not relate distinct quantum states. Rather, they relate two equivalent mathematical descriptions of the same quantum state. As an example, the photon field Aμ, being a four-vector, has four apparent degrees of freedom, but the actual state of a photon is described by its two degrees of freedom corresponding to the polarisation. The remaining two degrees of freedom are said to be "redundant" — apparently different ways of writing Aμ can be related to each other by a gauge transformation and in fact describe the same state of the photon field. In this sense, gauge invariance is not a "real" symmetry, but a reflection of the "redundancy" of the chosen mathematical description.:168
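This redundancy can be illustrated numerically: shifting the potential by the gradient of an arbitrary gauge function α leaves the field strength, and hence the physics, untouched. A finite-difference sketch in two dimensions, with an arbitrarily chosen potential and gauge function:

```python
import numpy as np

# A gauge transformation A_μ → A_μ + ∂_μ α changes the potential but not
# the field strength F_xy = ∂_x A_y − ∂_y A_x.  Finite-difference sketch
# in two dimensions; the potential and gauge function α are arbitrary.
x = np.linspace(0.0, 1.0, 50)
X, Y = np.meshgrid(x, x, indexing="ij")
Ax, Ay = np.sin(3 * X) * Y, np.cos(2 * Y) + X * Y
alpha = np.sin(5 * X * Y)

dx = x[1] - x[0]
d = lambda f, axis: np.gradient(f, dx, axis=axis)   # ∂ along one axis

F_before = d(Ay, 0) - d(Ax, 1)
F_after = d(Ay + d(alpha, 1), 0) - d(Ax + d(alpha, 0), 1)
print(np.allclose(F_before, F_after))   # True: F is unchanged
```

The cancellation works because the mixed derivatives of α commute, which is exactly why the transformed Aμ describes the same photon state.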
To account for the gauge redundancy in the path integral formulation, one must perform the so-called Faddeev–Popov gauge fixing procedure. In non-Abelian gauge theories, such a procedure introduces new fields called "ghosts". Particles corresponding to the ghost fields are called ghost particles, which cannot be detected externally.:512-515 A more rigorous generalisation of the Faddeev–Popov procedure is given by BRST quantization.:517
To illustrate the mechanism, consider a linear sigma model containing N real scalar fields, described by the Lagrangian density:
where μ and λ are real parameters. The theory admits an O(N) global symmetry:
The lowest energy state (ground state or vacuum state) of the classical theory is any uniform field ϕ0 satisfying
Without loss of generality, let the ground state be in the N-th direction:
The original N fields can be rewritten as:
and the original Lagrangian density as:
where k = 1,...,N-1. The original O(N) global symmetry is no longer manifest, leaving only the subgroup O(N-1). The larger symmetry before spontaneous symmetry breaking is said to be "hidden" or spontaneously broken.:349-350
Goldstone's theorem states that under spontaneous symmetry breaking, every broken continuous global symmetry leads to a massless field called the Goldstone boson. In the above example, O(N) has N(N-1)/2 continuous symmetries (the dimension of its Lie algebra), while O(N-1) has (N-1)(N-2)/2. The number of broken symmetries is their difference, N-1, which corresponds to the N-1 massless fields πk.:351
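Both the generator counting and the displaced vacuum can be checked with a few lines of arithmetic. A sketch, writing the Mexican-hat potential as V = −½ μ²|ϕ|² + ¼ λ|ϕ|⁴ (sign conventions vary between texts):

```python
import numpy as np

def dim_O(n):
    """Dimension of the Lie algebra of O(n): independent rotation planes."""
    return n * (n - 1) // 2

N = 4
n_goldstone = dim_O(N) - dim_O(N - 1)
print(n_goldstone)                      # N - 1 = 3 massless fields

# Vacuum of a Mexican-hat potential, written here (sign conventions
# vary between texts) as V = -½ μ² |ϕ|² + ¼ λ |ϕ|⁴: the minimum sits
# at |ϕ| = μ/√λ rather than at the origin.
mu, lam = 2.0, 0.5
r = np.linspace(0.0, 5.0, 100_001)
V = -0.5 * mu**2 * r**2 + 0.25 * lam * r**4
v = r[np.argmin(V)]
print(v, mu / np.sqrt(lam))             # both ≈ 2.828
```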
On the other hand, when a gauge (as opposed to global) symmetry is spontaneously broken, the resulting Goldstone boson is "eaten" by the corresponding gauge boson by becoming an additional degree of freedom for the gauge boson. The Goldstone boson equivalence theorem states that at high energy, the amplitude for emission or absorption of a longitudinally polarised massive gauge boson becomes equal to the amplitude for emission or absorption of the Goldstone boson that was eaten by the gauge boson.:743-744
In the QFT of ferromagnetism, spontaneous symmetry breaking can explain the alignment of magnetic dipoles at low temperatures.:199 In the Standard Model of elementary particles, the W and Z bosons, which would otherwise be massless as a result of gauge symmetry, acquire mass through spontaneous symmetry breaking driven by the Higgs field, a process called the Higgs mechanism.:690
All experimentally known symmetries in nature relate bosons to bosons and fermions to fermions. Theorists have hypothesised the existence of a type of symmetry, called supersymmetry, that relates bosons and fermions.:795:443
The Standard Model obeys Poincaré symmetry, whose generators are spacetime translation Pμ and Lorentz transformation Jμν.:58–60 In addition to these generators, supersymmetry in (3+1)-dimensions includes additional generators Qα, called supercharges, which themselves transform as Weyl fermions.:795:444 The symmetry group generated by all these generators is known as the super-Poincaré group. In general there can be more than one set of supersymmetry generators, QαI, I = 1, ..., N, which generate the corresponding N = 1 supersymmetry, N = 2 supersymmetry, and so on.:795:450 Supersymmetry can also be constructed in other dimensions, most notably in (1+1) dimensions for its application in superstring theory.
The Lagrangian of a supersymmetric theory must be invariant under the action of the super-Poincaré group.:448 Examples of such theories include: Minimal Supersymmetric Standard Model (MSSM), N = 4 supersymmetric Yang–Mills theory,:450 and superstring theory. In a supersymmetric theory, every fermion has a bosonic superpartner and vice versa.:444
Supersymmetry is a potential solution to many current problems in physics. For example, the hierarchy problem of the Standard Model — why the mass of the Higgs boson is not radiatively corrected (under renormalisation) to a very high scale such as the grand unified scale or the Planck scale — can be resolved by relating the Higgs field and its superpartner, the Higgsino. Radiative corrections due to Higgs boson loops in Feynman diagrams are cancelled by corresponding Higgsino loops. Supersymmetry also offers a path to the grand unification of all gauge coupling constants in the Standard Model, as well as an explanation of the nature of dark matter.:796-797
Nevertheless, as of 2018, experiments have yet to provide evidence for the existence of supersymmetric particles. If supersymmetry were a true symmetry of nature, then it must be a broken symmetry, and the energy scale of symmetry breaking must be higher than that achievable by present-day experiments.:797:443
The ϕ4 theory, QED, QCD, as well as the whole Standard Model all assume a (3+1)-dimensional Minkowski space (3 spatial and 1 time dimensions) as the background on which the quantum fields are defined. However, QFT a priori imposes no restriction on the number of dimensions nor the geometry of spacetime.
In condensed matter physics, QFT is used to describe (2+1)-dimensional electron gases. In high-energy physics, string theory is a type of (1+1)-dimensional QFT,:452 while Kaluza–Klein theory uses gravity in extra dimensions to produce gauge theories in lower dimensions.:428-429
where g^μν is the inverse of the metric gμν. For a real scalar field, the Lagrangian density in a general spacetime background is
where g = det(gμν), and ∇μ denotes the covariant derivative. The Lagrangian of a QFT, hence its calculational results and physical predictions, depends on the geometry of the spacetime background.
The correlation functions and physical predictions of a QFT depend on the spacetime metric gμν. For a special class of QFTs called topological quantum field theories (TQFTs), all correlation functions are independent of continuous changes in the spacetime metric.:36 QFTs in curved spacetime generally change according to the geometry (local structure) of the spacetime background, while TQFTs are invariant under spacetime diffeomorphisms but are sensitive to the topology (global structure) of spacetime. This means that all calculational results of TQFTs are topological invariants of the underlying spacetime. Chern–Simons theory is an example of TQFT. Applications of TQFT include the fractional quantum Hall effect and topological quantum computers.:1–5
Using perturbation theory, the total effect of a small interaction term can be approximated order by order by a series expansion in the number of virtual particles participating in the interaction. Every term in the expansion may be understood as one possible way for (physical) particles to interact with each other via virtual particles, expressed visually using a Feynman diagram. The electromagnetic force between two electrons in QED is represented (to first order in perturbation theory) by the propagation of a virtual photon. In a similar manner, the W and Z bosons carry the weak interaction, while gluons carry the strong interaction. The interpretation of an interaction as a sum of intermediate states involving the exchange of various virtual particles only makes sense in the framework of perturbation theory. In contrast, non-perturbative methods in QFT treat the interacting Lagrangian as a whole without any series expansion. Instead of particles that carry interactions, these methods have spawned such concepts as the 't Hooft–Polyakov monopole, the domain wall, the flux tube, and the instanton.
In spite of its overwhelming success in particle physics and condensed matter physics, QFT itself lacks a formal mathematical foundation. For example, according to Haag's theorem, there does not exist a well-defined interaction picture for QFT, which implies that perturbation theory of QFT, which underlies the entire Feynman diagram method, is fundamentally not rigorous.
Since the 1950s, theoretical physicists and mathematicians have attempted to organise all QFTs into a set of axioms, in order to establish the existence of concrete models of relativistic QFT in a mathematically rigorous way and to study their properties. This line of study is called constructive quantum field theory, a subfield of mathematical physics,:2 which has led to such results as the CPT theorem, the spin–statistics theorem, and Goldstone's theorem.
Compared to ordinary QFT, topological quantum field theory and conformal field theory are better supported mathematically — both can be classified in the framework of representations of cobordisms.
Algebraic quantum field theory is another approach to the axiomatisation of QFT, in which the fundamental objects are local operators and the algebraic relations between them. Axiomatic systems following this approach include the Wightman axioms and the Haag–Kastler axioms.:2-3 One way to construct theories satisfying the Wightman axioms is to use the Osterwalder–Schrader axioms, which give necessary and sufficient conditions for a real-time theory to be obtained from an imaginary-time theory by analytic continuation (Wick rotation).:10
Yang–Mills existence and mass gap, one of the Millennium Prize Problems, concerns the well-defined existence of Yang–Mills theories as set out by the above axioms.
Axiomatic quantum field theory is a mathematical discipline which aims to describe quantum field theory in terms of rigorous axioms. It is strongly associated with functional analysis and operator algebras, but has also been studied in recent years from a more geometric and functorial perspective.
There are two main challenges in this discipline. First, one must propose a set of axioms which describe the general properties of any mathematical object that deserves to be called a "quantum field theory". Then, one gives rigorous mathematical constructions of examples satisfying these axioms.
Correlation function (quantum field theory)
In quantum field theory, the (real space) n-point correlation function is defined as the functional average (functional expectation value) of a product of field operators at different positions
For time-dependent correlation functions, the time-ordering operator is included.
Correlation functions are also called simply correlators. Sometimes, the phrase Green's function is used not only for two-point functions, but for any correlators.
The correlation function can be interpreted physically as the amplitude for propagation of a particle or excitation between y and x. In the free theory, it is simply the Feynman propagator (for n = 2).
Effective field theory
In physics, an effective field theory is a type of approximation, or effective theory, for an underlying physical theory, such as a quantum field theory or a statistical mechanics model. An effective field theory includes the appropriate degrees of freedom to describe physical phenomena occurring at a chosen length scale or energy scale, while ignoring substructure and degrees of freedom at shorter distances (or, equivalently, at higher energies). Intuitively, one averages over the behavior of the underlying theory at shorter length scales to derive what is hoped to be a simplified model at longer length scales. Effective field theories typically work best when there is a large separation between the length scale of interest and the length scale of the underlying dynamics. Effective field theories have found use in particle physics, statistical mechanics, condensed matter physics, general relativity, and hydrodynamics. They simplify calculations, and allow treatment of dissipation and radiation effects.
Fermion
In particle physics, a fermion is a particle that follows Fermi–Dirac statistics. These particles obey the Pauli exclusion principle. Fermions include all quarks and leptons, as well as all composite particles made of an odd number of these, such as all baryons and many atoms and nuclei. Fermions differ from bosons, which obey Bose–Einstein statistics.
A fermion can be an elementary particle, such as the electron, or it can be a composite particle, such as the proton. According to the spin-statistics theorem in any reasonable relativistic quantum field theory, particles with integer spin are bosons, while particles with half-integer spin are fermions.
In addition to the spin characteristic, fermions have another specific property: they possess conserved baryon or lepton quantum numbers. Therefore, what is usually referred to as the spin–statistics relation is in fact a spin–statistics–quantum-number relation.
As a consequence of the Pauli exclusion principle, only one fermion can occupy a particular quantum state at any given time. If multiple fermions have the same spatial probability distribution, then at least one property of each fermion, such as its spin, must be different. Fermions are usually associated with matter, whereas bosons are generally force carrier particles, although in the current state of particle physics the distinction between the two concepts is unclear. Weakly interacting fermions can also display bosonic behavior under extreme conditions. At low temperatures fermions show superfluidity for uncharged particles and superconductivity for charged particles.
Composite fermions, such as protons and neutrons, are the key building blocks of everyday matter.
The name fermion was coined by English theoretical physicist Paul Dirac from the surname of Italian physicist Enrico Fermi.
History of quantum field theory
In particle physics, the history of quantum field theory starts with its creation by Paul Dirac, when he attempted to quantize the electromagnetic field in the late 1920s. Major advances in the theory were made in the 1940s and 1950s, and led to the introduction of renormalized quantum electrodynamics (QED). QED was so successful and accurately predictive that efforts were made to apply the same basic concepts for the other forces of nature. By the late 1970s, these efforts successfully utilized gauge theory in the strong nuclear force and weak nuclear force, producing the modern standard model of particle physics.
Efforts to describe gravity using the same techniques have, to date, failed. The study of quantum field theory is still flourishing, as are applications of its methods to many physical problems. It remains one of the most vital areas of theoretical physics today, providing a common language to several different branches of physics.
Klein–Gordon equation
The Klein–Gordon equation (Klein–Fock–Gordon equation or sometimes Klein–Gordon–Fock equation) is a relativistic wave equation, related to the Schrödinger equation. It is second order in space and time and manifestly Lorentz covariant. It is a quantized version of the relativistic energy–momentum relation. Its solutions include a quantum scalar or pseudoscalar field, a field whose quanta are spinless particles. Its theoretical relevance is similar to that of the Dirac equation. Electromagnetic interactions can be incorporated, forming the topic of scalar electrodynamics, but because common spinless particles like the pi mesons are unstable and also experience the strong interaction (with unknown interaction term in the Hamiltonian), the practical utility is limited.
The equation can be put into the form of a Schrödinger equation. In this form it is expressed as two coupled differential equations, each of first order in time. The solutions have two components, reflecting the charge degree of freedom in relativity. It admits a conserved quantity, but this is not positive definite. The wave function cannot therefore be interpreted as a probability amplitude. The conserved quantity is instead interpreted as electric charge and the norm squared of the wave function is interpreted as a charge density. The equation describes all spinless particles with positive, negative as well as zero charge.
Any solution of the free Dirac equation is, component-wise, a solution of the free Klein–Gordon equation.
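The link between the Klein–Gordon equation and the energy–momentum relation can be checked directly: a plane wave solves the free equation exactly when ω² = k² + m². A finite-difference sketch (units ħ = c = 1, parameters chosen arbitrarily):

```python
import numpy as np

# A plane wave ϕ(t, x) = exp(i(kx − ωt)) solves the free Klein–Gordon
# equation (∂t² − ∂x² + m²)ϕ = 0 exactly when ω² = k² + m², i.e. when
# the relativistic energy–momentum relation holds (units ħ = c = 1).
m, k = 1.0, 2.0
w = np.sqrt(k**2 + m**2)

phi = lambda t, x: np.exp(1j * (k * x - w * t))

# Second derivatives by central differences at one spacetime point:
dt = dx = 1e-4
t0, x0 = 0.3, 0.7
d2t = (phi(t0 + dt, x0) - 2 * phi(t0, x0) + phi(t0 - dt, x0)) / dt**2
d2x = (phi(t0, x0 + dx) - 2 * phi(t0, x0) + phi(t0, x0 - dx)) / dx**2
residual = d2t - d2x + m**2 * phi(t0, x0)
print(abs(residual))                    # ≈ 0 (discretisation error only)
```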
The equation does not form the basis of a consistent quantum relativistic one-particle theory. There is no known such theory for particles of any spin. For full reconciliation of quantum mechanics with special relativity, quantum field theory is needed, in which the Klein–Gordon equation reemerges as the equation obeyed by the components of all free quantum fields. In quantum field theory, the solutions of the free (noninteracting) versions of the original equations still play a role. They are needed to build the Hilbert space (Fock space) and to express the quantum field using complete sets (spanning sets of the Hilbert space) of wave functions.
Local quantum field theory
The Haag–Kastler axiomatic framework for quantum field theory, introduced by Haag and Kastler (1964), is an application to local quantum physics of C*-algebra theory. Because of this it is also known as algebraic quantum field theory (AQFT). The axioms are stated in terms of an algebra given for every open set in Minkowski space, and mappings between those.
Noncommutative quantum field theory
In mathematical physics, noncommutative quantum field theory (or quantum field theory on noncommutative spacetime) is an application of noncommutative mathematics to the spacetime of quantum field theory that is an outgrowth of noncommutative geometry and index theory in which the coordinate functions are noncommutative. One commonly studied version of such theories has the "canonical" commutation relation:
which means that (with any given set of axes), it is impossible to accurately measure the position of a particle with respect to more than one axis. In fact, this leads to an uncertainty relation for the coordinates analogous to the Heisenberg uncertainty principle.
Various lower limits have been claimed for the noncommutative scale (i.e. how accurately positions can be measured), but there is currently no experimental evidence in favour of such theories, nor grounds for ruling them out.
One of the novel features of noncommutative field theories is the UV/IR mixing phenomenon, in which the physics at high energies affects the physics at low energies; this does not occur in quantum field theories in which the coordinates commute.
Other features include violation of Lorentz invariance due to the preferred direction of noncommutativity. Relativistic invariance can however be retained in the sense of twisted Poincaré invariance of the theory. The causality condition is modified from that of the commutative theories.
Partition function (quantum field theory)
In quantum field theory, the partition function is the generating functional of all correlation functions, generalizing the characteristic function of probability theory.
It is usually expressed by the following functional integral:
where S is the action functional.
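For a free (Gaussian) action the functional integral can be made concrete on a lattice. A sketch for a real scalar field on a 1D periodic lattice, with the assumed discretisation S[ϕ] = ½ ϕᵀKϕ, K the discrete −∂² + m²; the two-point correlator the partition function generates is then the propagator (K⁻¹)ᵢⱼ, which direct sampling of exp(−S) reproduces:

```python
import numpy as np

# Free real scalar field on a 1D periodic lattice: Euclidean action
# S[ϕ] = ½ ϕᵀ K ϕ, with K the discrete −∂² + m².  The functional
# integral is then an ordinary Gaussian integral, and the two-point
# correlator ⟨ϕ_i ϕ_j⟩ predicted by Z is the propagator (K⁻¹)_ij.
N, m = 32, 0.5
K = np.zeros((N, N))
for i in range(N):
    K[i, i] = 2.0 + m**2
    K[i, (i + 1) % N] = K[i, (i - 1) % N] = -1.0

exact = np.linalg.inv(K)

# Cross-check: draw field configurations from exp(−S) (a Gaussian with
# covariance K⁻¹) and average ϕ_i ϕ_j over the samples.
rng = np.random.default_rng(1)
phi = rng.multivariate_normal(np.zeros(N), exact, size=100_000)
estimate = phi.T @ phi / len(phi)

print(np.max(np.abs(estimate - exact)))  # small: Monte Carlo error only
```

This is the lattice analogue of the statement in the text: the uncountable set of field values becomes a finite collection of lattice variables, and the functional integral becomes an ordinary multivariate one.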
The partition function in quantum field theory is a special case of the mathematical partition function, and is related to the statistical partition function in statistical mechanics. The primary difference is that the countable collection of random variables seen in the definition of such simpler partition functions has been replaced by an uncountable set, thus necessitating the use of functional integrals over a field.
Quantization (physics)
In physics, quantization is the process of transition from a classical understanding of physical phenomena to a newer understanding known as quantum mechanics. (It is a procedure for constructing a quantum field theory starting from a classical field theory.) This is a generalization of the procedure for building quantum mechanics from classical mechanics. One also speaks of field quantization, as in the "quantization of the electromagnetic field", where one refers to photons as field "quanta" (for instance as light quanta). This procedure is basic to theories of particle physics, nuclear physics, condensed matter physics, and quantum optics.
Quantum field theory in curved spacetime
In particle physics, quantum field theory in curved spacetime is an extension of standard, Minkowski space quantum field theory to curved spacetime. A general prediction of this theory is that particles can be created by time-dependent gravitational fields (multigraviton pair production), or by time-independent gravitational fields that contain horizons.
Quantum gravity
Quantum gravity (QG) is a field of theoretical physics that seeks to describe gravity according to the principles of quantum mechanics. It deals with environments in which quantum effects cannot be ignored, such as near compact astrophysical objects where the effects of gravity are strong.
The current understanding of gravity is based on Albert Einstein's general theory of relativity, which is formulated within the framework of classical physics. On the other hand, the other three fundamental forces of physics are described within the framework of quantum mechanics and quantum field theory, radically different formalisms for describing physical phenomena. It is sometimes argued that a quantum mechanical description of gravity is necessary on the grounds that one cannot consistently couple a classical system to a quantum one.
While a quantum theory of gravity may be needed to reconcile general relativity with the principles of quantum mechanics, difficulties arise when applying the usual prescriptions of quantum field theory to the force of gravity via graviton bosons. The problem is that the theory one gets in this way is not renormalizable (it predicts infinite values for some observable properties such as the mass of particles) and therefore cannot be used to make meaningful physical predictions. As a result, theorists have taken up more radical approaches to the problem of quantum gravity, the most popular approaches being string theory and loop quantum gravity. Although some quantum gravity theories, such as string theory, try to unify gravity with the other fundamental forces, others, such as loop quantum gravity, make no such attempt; instead, they make an effort to quantize the gravitational field while it is kept separate from the other forces.
Strictly speaking, the aim of quantum gravity is only to describe the quantum behavior of the gravitational field and should not be confused with the objective of unifying all fundamental interactions into a single mathematical framework. A quantum field theory of gravity that is unified with a grand unified theory is sometimes referred to as a theory of everything (TOE). While any substantial improvement into the present understanding of gravity would aid further work towards unification, the study of quantum gravity is a field in its own right with various branches having different approaches to unification.
One of the difficulties of formulating a quantum gravity theory is that quantum gravitational effects only appear at length scales near the Planck scale, around 10⁻³⁵ meters, a scale far smaller in length (and correspondingly far larger in energy) than those currently accessible to high-energy particle accelerators. Physicists therefore lack experimental data that could distinguish between the competing theories that have been proposed, and thought-experiment approaches have been suggested as a testing tool for these theories.
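To make the scale concrete, the Planck length is the unique length that can be built from the fundamental constants ℏ, G, and c. The short sketch below (an illustrative calculation, assuming CODATA values for the constants) reproduces the ~10⁻³⁵ m figure quoted above, along with the corresponding Planck energy.

```python
import math

# CODATA values of the fundamental constants (SI units)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # Newtonian constant of gravitation, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck length: the unique length scale built from hbar, G, and c
l_planck = math.sqrt(hbar * G / c**3)

# Planck energy: the corresponding energy scale
e_planck = math.sqrt(hbar * c**5 / G)        # in joules
e_planck_gev = e_planck / 1.602176634e-10    # 1 GeV = 1.602...e-10 J

print(f"Planck length: {l_planck:.3e} m")        # ~1.6e-35 m
print(f"Planck energy: {e_planck_gev:.3e} GeV")  # ~1.2e19 GeV
```

For comparison, the Large Hadron Collider probes energies of order 10⁴ GeV, some fifteen orders of magnitude below the Planck energy.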
S-duality

In theoretical physics, S-duality (short for strong–weak duality) is an equivalence of two physical theories, which may be either quantum field theories or string theories. S-duality is useful for doing calculations in theoretical physics because it relates a theory in which calculations are difficult to a theory in which they are easier.

In quantum field theory, S-duality generalizes a well-established fact from classical electrodynamics, namely the invariance of Maxwell's equations under the interchange of electric and magnetic fields. One of the earliest known examples of S-duality in quantum field theory is Montonen–Olive duality, which relates two versions of a quantum field theory called N = 4 supersymmetric Yang–Mills theory. Recent work of Anton Kapustin and Edward Witten suggests that Montonen–Olive duality is closely related to a research program in mathematics called the geometric Langlands program. Another realization of S-duality in quantum field theory is Seiberg duality, which relates two versions of a theory called N = 1 supersymmetric Yang–Mills theory.
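The electric–magnetic duality of classical electrodynamics mentioned above can be stated concretely. In vacuum (no charges or currents, natural units), Maxwell's equations read

```latex
\nabla \cdot \mathbf{E} = 0, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \frac{\partial \mathbf{E}}{\partial t},
```

and they are unchanged by the replacement (E, B) → (B, −E), as one can verify by substitution. Montonen–Olive duality generalizes this: it exchanges electrically charged states with magnetic monopoles and, schematically, inverts the coupling constant, g → 1/g, which is why it maps a strongly coupled theory to a weakly coupled one.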
There are also many examples of S-duality in string theory. The existence of these string dualities implies that seemingly different formulations of string theory are actually physically equivalent. This led to the realization, in the mid-1990s, that all five consistent superstring theories are just different limiting cases of a single eleven-dimensional theory called M-theory.

Scalar boson
A scalar boson is a boson whose spin equals zero. Boson means that it has an integer-valued spin; the scalar fixes this value to 0.
The name scalar boson arises from quantum field theory. It refers to the field's transformation properties under Lorentz transformations: a scalar field is left invariant by them.

Thermal quantum field theory
In theoretical physics, thermal quantum field theory (thermal field theory for short) or finite temperature field theory is a set of methods to calculate expectation values of physical observables of a quantum field theory at finite temperature.
In the Matsubara formalism, the basic idea (due to Felix Bloch) is that the expectation values of operators in a canonical ensemble,

⟨A⟩ = Tr[e^(−βH) A] / Tr[e^(−βH)],

may be written as expectation values in ordinary quantum field theory where the configuration is evolved by an imaginary time τ = it, with 0 ≤ τ ≤ β. One can therefore switch to a spacetime with Euclidean signature, where the above trace (Tr) leads to the requirement that all bosonic and fermionic fields be periodic and antiperiodic, respectively, with respect to the Euclidean time direction, with periodicity β (we are assuming natural units ℏ = k_B = 1, so that β = 1/T). This allows one to perform calculations with the same tools as in ordinary quantum field theory, such as functional integrals and Feynman diagrams, but with compact Euclidean time. Note that the definition of normal ordering has to be altered. In momentum space, this leads to the replacement of continuous frequencies by discrete imaginary (Matsubara) frequencies, ωₙ = 2πn/β for bosons and ωₙ = (2n+1)π/β for fermions, and, through the de Broglie relation, to a discretized thermal energy spectrum Eₙ = 2πnk_BT. This has been shown to be a useful tool in studying the behavior of quantum field theories at finite temperature. It has been generalized to theories with gauge invariance and was a central tool in the study of a conjectured deconfining phase transition of Yang–Mills theory. In this Euclidean field theory, real-time observables can be retrieved by analytic continuation.
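To make the discrete Matsubara frequencies concrete, here is a small numerical sketch (an illustrative example, not from the text): the standard bosonic sum T Σₙ 1/(ωₙ² + E²), with ωₙ = 2πnT, has the known closed form (1/2E)·coth(E/2T), and a truncated version of the sum converges to it.

```python
import math

def bosonic_matsubara_sum(E, T, n_max=100_000):
    """Truncated bosonic Matsubara sum T * sum_n 1/(w_n^2 + E^2),
    with w_n = 2*pi*n*T and n running over -n_max..n_max."""
    total = 0.0
    for n in range(-n_max, n_max + 1):
        w_n = 2.0 * math.pi * n * T
        total += 1.0 / (w_n**2 + E**2)
    return T * total

def closed_form(E, T):
    """Known closed form of the full (untruncated) sum: (1/2E) * coth(E/2T)."""
    return (1.0 / (2.0 * E)) / math.tanh(E / (2.0 * T))

# Compare the truncated sum with the exact result (natural units)
E, T = 1.0, 1.0
approx = bosonic_matsubara_sum(E, T)
exact = closed_form(E, T)
print(approx, exact)  # the two values agree to better than 1e-5
```

The coth(E/2T) factor is equivalent to 1 + 2n_B(E), with n_B the Bose–Einstein occupation number, which is how thermal occupation enters finite-temperature propagators.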
The alternative to the use of fictitious imaginary times is a real-time formalism, which comes in two forms. The first is a path-ordered approach, which includes the Schwinger–Keldysh formalism and more modern variants. It involves replacing the straight time contour running from a (large negative) real initial time t_i to t_i − iβ by one that first runs out to a (large positive) real time t_f and then suitably back to t_i − iβ. In fact, all that is needed is one section running along the real time axis, as the route to the end point t_i − iβ is less important. The piecewise composition of the resulting complex time contour leads to a doubling of fields and more complicated Feynman rules, but obviates the need for the analytic continuations of the imaginary-time formalism. The second approach to real-time formalisms is operator-based, using Bogoliubov transformations, and is known as thermo field dynamics. As well as Feynman diagrams and perturbation theory, other techniques such as dispersion relations and the finite-temperature analog of the Cutkosky rules can also be used in the real-time formulation.
An alternative approach, which is of interest to mathematical physics, is to work with KMS states.

Topological quantum field theory
A topological quantum field theory (or topological field theory or TQFT) is a quantum field theory which computes topological invariants.
Although TQFTs were invented by physicists, they are also of mathematical interest, being related to, among other things, knot theory and the theory of four-manifolds in algebraic topology, and to the theory of moduli spaces in algebraic geometry. Donaldson, Jones, Witten, and Kontsevich have all won Fields Medals for mathematical work related to topological field theory.
In condensed matter physics, topological quantum field theories are the low-energy effective theories of topologically ordered states, such as fractional quantum Hall states, string-net condensed states, and other strongly correlated quantum liquid states.
In dynamics, all continuous-time dynamical systems, with and without noise, are Witten-type TQFTs, and the phenomenon of the spontaneous breakdown of the corresponding topological supersymmetry encompasses such well-established concepts as chaos, turbulence, 1/f and crackling noises, and self-organized criticality.

Vacuum expectation value
In quantum field theory, the vacuum expectation value (also called condensate or simply VEV) of an operator is its average, or expected, value in the vacuum. The vacuum expectation value of an operator O is usually denoted by ⟨O⟩. One of the most widely used, but controversial, examples of an observable physical effect that results from the vacuum expectation value of an operator is the Casimir effect.
This concept is important for working with correlation functions in quantum field theory. It is also important in spontaneous symmetry breaking. Examples are the Higgs field, whose vacuum expectation value of about 246 GeV gives rise to the masses of the other elementary particles; the chiral condensate of quark–antiquark pairs in QCD; and the gluon condensate of QCD.
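The role of the VEV in spontaneous symmetry breaking can be illustrated with the standard textbook quartic ("Mexican hat") potential for a real scalar field (an illustrative example; the normalization conventions are chosen here):

```latex
V(\phi) = -\tfrac{1}{2}\mu^2 \phi^2 + \tfrac{1}{4}\lambda \phi^4,
\qquad
\frac{dV}{d\phi} = \phi\left(\lambda\phi^2 - \mu^2\right) = 0
\;\Longrightarrow\;
\langle \phi \rangle = v = \pm\,\mu/\sqrt{\lambda}.
```

The field settles into one of the two minima rather than φ = 0, so ⟨φ⟩ ≠ 0 even though the potential itself is symmetric under φ → −φ; this nonzero VEV is what "breaks" the symmetry.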
The observed Lorentz invariance of space-time allows only the formation of condensates which are Lorentz scalars and have vanishing charge. Thus fermion condensates must be of the form ⟨ψ̄ψ⟩, where ψ is the fermion field. Similarly a tensor field, Gμν, can only have a scalar expectation value such as ⟨GμνGμν⟩.
In some vacua of string theory, however, non-scalar condensates are found. If these describe our universe, then Lorentz symmetry violation may be observable.

Virtual particle
In physics, a virtual particle is a transient fluctuation that exhibits some of the characteristics of an ordinary particle, while having its existence limited by the uncertainty principle. The concept of virtual particles arises in the perturbation theory of quantum field theory, where interactions between ordinary particles are described in terms of exchanges of virtual particles. A process involving virtual particles can be described by a schematic representation known as a Feynman diagram, in which virtual particles are represented by internal lines.

Virtual particles do not necessarily carry the same mass as the corresponding real particle, although they always conserve energy and momentum. The longer the virtual particle exists, the closer its characteristics come to those of ordinary particles. They are important in the physics of many processes, including particle scattering and Casimir forces. In quantum field theory, even classical forces—such as the electromagnetic repulsion or attraction between two charges—can be thought of as due to the exchange of many virtual photons between the charges. Virtual photons are the exchange particle for the electromagnetic interaction.
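The statement that virtual particles need not carry the physical mass can be phrased concretely (a standard heuristic, not specific to this text): an internal line carrying four-momentum q need not satisfy the on-shell condition, and for exchange of a particle of mass m the resulting force has the Yukawa form, whose range is set by the inverse mass:

```latex
q^2 \neq m^2 c^2 \quad \text{(off shell)},
\qquad
V(r) \propto \frac{e^{-r/\lambda}}{r},
\qquad
\lambda = \frac{\hbar}{m c}.
```

A massless exchange particle such as the photon corresponds to λ → ∞, i.e. a long-range force, consistent with the Coulomb interaction described above.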
The term is somewhat loose and vaguely defined, in that it refers to the view that the world is made up of "real particles". It is not; rather, "real particles" are better understood to be excitations of the underlying quantum fields. Virtual particles are also excitations of the underlying fields, but are "temporary" in the sense that they appear in calculations of interactions, but never as asymptotic states or indices to the scattering matrix. The accuracy and use of virtual particles in calculations is firmly established, but as they cannot be detected in experiments, deciding how to describe them precisely is a topic of debate.

Wightman axioms
In physics, the Wightman axioms (also called Gårding–Wightman axioms), named after Lars Gårding and Arthur Wightman, are an attempt at a mathematically rigorous formulation of quantum field theory. Arthur Wightman formulated the axioms in the early 1950s, but they were first published only in 1964 after Haag–Ruelle scattering theory affirmed their significance.
The axioms exist in the context of constructive quantum field theory, and they are meant to provide a basis for the rigorous treatment of quantum fields and a strict foundation for the perturbative methods used. One of the Millennium Problems is to realize the Wightman axioms in the case of Yang–Mills fields.
Quantum field theory in curved spacetime