The photon is a type of elementary particle, the quantum of the electromagnetic field including electromagnetic radiation such as light, and the force carrier for the electromagnetic force (even when static via virtual particles). The photon has zero rest mass and always moves at the speed of light within a vacuum.
Like all elementary particles, photons are currently best explained by quantum mechanics and exhibit wave–particle duality: they display properties of both waves and particles. For example, a single photon may be refracted by a lens and exhibit wave interference with itself, yet it can also behave as a particle with a definite, finite measurable position or momentum, though not both at once, as dictated by Heisenberg's uncertainty principle. The photon's wave and quantum qualities are two observable aspects of a single phenomenon; they cannot be described by any mechanical model, and no representation of this dual property of light can assume that certain points on the wavefront are the seat of the energy. The quanta in a light wave are not spatially localized.
The modern concept of the photon was developed gradually by Albert Einstein in the early 20th century to explain experimental observations that did not fit the classical wave model of light. The benefit of the photon model is that it accounts for the frequency dependence of light's energy and explains the ability of matter and electromagnetic radiation to reach thermal equilibrium. The photon model accounts for anomalous observations, including the properties of black-body radiation, that others (notably Max Planck) had tried to explain using semiclassical models. In those models, light is described by Maxwell's equations, but material objects emit and absorb light in quantized amounts (i.e., they change energy only by certain particular discrete amounts). Although these semiclassical models contributed to the development of quantum mechanics, many further experiments, beginning with the phenomenon of Compton scattering of single photons by electrons, validated Einstein's hypothesis that light itself is quantized. In December 1926, the American physical chemist Gilbert N. Lewis coined the widely adopted name "photon" for these particles in a letter to Nature. After Arthur H. Compton won the Nobel Prize in 1927 for his scattering studies, most scientists accepted that light quanta have an independent existence, and the term "photon" came into general use.
In the Standard Model of particle physics, photons and other elementary particles are described as a necessary consequence of physical laws having a certain symmetry at every point in spacetime. The intrinsic properties of particles, such as charge, mass, and spin, are determined by this gauge symmetry. The photon concept has led to momentous advances in experimental and theoretical physics, including lasers, Bose–Einstein condensation, quantum field theory, and the probabilistic interpretation of quantum mechanics. It has been applied to photochemistry, high-resolution microscopy, and measurements of molecular distances. Recently, photons have been studied as elements of quantum computers, and for applications in optical imaging and optical communication such as quantum cryptography.
Interactions: electromagnetic, weak, gravity
Theorized: Albert Einstein (1905); the name "photon" is generally attributed to Gilbert N. Lewis (1926)
Mass: < 1×10−18 eV/c2
Electric charge: < 1×10−35 e
The word quanta (singular quantum, Latin for "how much") was used before 1900 to mean particles or amounts of different quantities, including electricity. In 1900, the German physicist Max Planck was studying black-body radiation, and specifically the ultraviolet catastrophe: he suggested that the experimental observations would be explained if the energy carried by electromagnetic waves could only be released in "packets" of energy. In his 1901 article in Annalen der Physik he called these packets "energy elements". In 1905, Albert Einstein published a paper in which he proposed that many light-related phenomena—including black-body radiation and the photoelectric effect—would be better explained by modelling electromagnetic waves as consisting of spatially localized, discrete wave-packets. He called such a wave-packet the light quantum (German: das Lichtquant).[a]
The name photon derives from the Greek word for light, φῶς (transliterated phôs). Arthur Compton used photon in 1928, referring to Gilbert N. Lewis, who coined the term in a letter to Nature on December 18, 1926. In fact, the same name was used earlier but was never widely adopted before Lewis: in 1916 by the American physicist and psychologist Leonard T. Troland, in 1921 by the Irish physicist John Joly, in 1924 by the French physiologist René Wurmser (1890–1993), and in 1926 by the French physicist Frithiof Wolfers (1891–1971). The name was suggested initially as a unit related to the illumination of the eye and the resulting sensation of light and was used later in a physiological context. Although Wolfers's and Lewis's theories were contradicted by many experiments and never accepted, the new name was adopted very soon by most physicists after Compton used it.[b]
In physics, a photon is usually denoted by the symbol γ (the Greek letter gamma). This symbol for the photon probably derives from gamma rays, which were discovered in 1900 by Paul Villard, named by Ernest Rutherford in 1903, and shown to be a form of electromagnetic radiation in 1914 by Rutherford and Edward Andrade. In chemistry and optical engineering, photons are usually symbolized by hν, which is the photon energy, where h is the Planck constant and the Greek letter ν (nu) is the photon's frequency. Much less commonly, the photon can be symbolized by hf, where its frequency is denoted by f.
A photon is massless,[c] has no electric charge, and is a stable particle. A photon has two possible polarization states. In the momentum representation of the photon, which is preferred in quantum field theory, a photon is described by its wave vector, which determines its wavelength λ and its direction of propagation. A photon's wave vector may not be zero and can be represented either as a spatial 3-vector or as a (relativistic) four-vector; in the latter case it belongs to the light cone. Different signs of the four-vector denote different circular polarizations, but in the 3-vector representation the polarization state must be accounted for separately; it is in fact a spin quantum number. In both cases the space of possible wave vectors is three-dimensional.
The photon is the gauge boson for electromagnetism, and therefore all other quantum numbers of the photon (such as lepton number, baryon number, and flavour quantum numbers) are zero. Also, the photon does not obey the Pauli exclusion principle, but instead obeys Bose–Einstein statistics.
Photons are emitted in many natural processes. For example, when a charge is accelerated it emits synchrotron radiation. During a molecular, atomic or nuclear transition to a lower energy level, photons of various energies will be emitted, ranging from radio waves to gamma rays. Photons can also be emitted when a particle and its corresponding antiparticle are annihilated (for example, electron–positron annihilation).
In empty space, the photon moves at c (the speed of light) and its energy and momentum are related by E = pc, where p is the magnitude of the momentum vector p. This derives from the following relativistic relation, with m = 0:

E² = p²c² + m²c⁴.

Since p points in the direction of the photon's propagation, the magnitude of the momentum is

p = ħk = hν/c = h/λ,

where ħ is the reduced Planck constant, k = 2π/λ is the wave number, ν is the frequency, and λ is the wavelength.
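As a concrete numerical sketch of these relations (the 532 nm wavelength is an arbitrary choice, typical of a green laser; constants are CODATA values hard-coded here):

```python
# Photon momentum and energy from wavelength, using p = h/lambda and E = pc.
h = 6.62607015e-34    # Planck constant, J*s
c = 2.99792458e8      # speed of light in vacuum, m/s
eV = 1.602176634e-19  # joules per electronvolt

wavelength = 532e-9          # assumed example wavelength, m
p = h / wavelength           # momentum magnitude, kg*m/s
E = p * c                    # energy via E = pc, J
nu = c / wavelength          # frequency, Hz

# E = pc and E = h*nu must agree, since p = h/lambda = h*nu/c.
assert abs(E - h * nu) < 1e-30
print(f"p = {p:.3e} kg m/s, E = {E:.3e} J = {E/eV:.2f} eV")
```

For visible light the resulting energy is a few electronvolts, which is why single optical photons can drive atomic transitions but not nuclear ones.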
The photon also carries a quantity called spin angular momentum that does not depend on its frequency. The magnitude of its spin is √2ħ and the component measured along its direction of motion, its helicity, must be ±ħ. These two possible helicities, called right-handed and left-handed, correspond to the two possible circular polarization states of the photon.
To illustrate the significance of these formulae, the annihilation of a particle with its antiparticle in free space must result in the creation of at least two photons, for the following reason: in the center-of-momentum frame, the colliding antiparticles have no net momentum, whereas a single photon always has momentum (since, as we have seen, it is determined by the photon's frequency or wavelength, which cannot be zero). Hence, conservation of momentum (or equivalently, translational invariance) requires that at least two photons are created, with zero net momentum. (However, if the system interacts with another particle or field, the annihilation can produce a single photon, as when a positron annihilates with a bound atomic electron; there the nuclear Coulomb field breaks translational symmetry.) The energy of the two photons, or, equivalently, their frequency, may be determined from conservation of four-momentum. Seen another way, the photon can be considered as its own antiparticle. The reverse process, pair production, is the dominant mechanism by which high-energy photons such as gamma rays lose energy while passing through matter. That process is the reverse of "annihilation to one photon" allowed in the electric field of an atomic nucleus.
The classical formulae for the energy and momentum of electromagnetic radiation can be re-expressed in terms of photon events. For example, the pressure of electromagnetic radiation on an object derives from the transfer of photon momentum per unit time and unit area to that object, since pressure is force per unit area and force is the change in momentum per unit time.
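The momentum-transfer argument can be made concrete with a small sketch. The solar intensity figure below is an assumed round value for sunlight at Earth's distance:

```python
# Radiation pressure as photon momentum delivered per unit time and area.
# Each photon of energy E carries momentum p = E/c, so an absorbed energy
# flux S (W/m^2) delivers a momentum flux S/c (N/m^2, i.e. pascals).
c = 2.99792458e8   # speed of light, m/s
S = 1361.0         # assumed solar intensity at Earth, W/m^2

P_absorb = S / c         # perfectly absorbing surface, Pa
P_reflect = 2 * S / c    # perfect mirror: photon momentum reverses, Pa

print(f"absorbing: {P_absorb:.2e} Pa, reflecting: {P_reflect:.2e} Pa")
```

The resulting pressures are of order microPascals, which is why radiation pressure matters for solar sails and dust grains but is negligible for everyday objects.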
Each photon carries two distinct and independent forms of angular momentum of light. The spin angular momentum of a particular photon is always either +ħ or −ħ. The orbital angular momentum of light of a particular photon can be Nħ for any integer N, including zero.
Current commonly accepted physical theories imply or assume the photon to be strictly massless. If the photon is not a strictly massless particle, it would not move at the exact speed of light, c, in vacuum. Its speed would be lower and depend on its frequency. Relativity would be unaffected by this; the so-called speed of light, c, would then not be the actual speed at which light moves, but a constant of nature which is the upper bound on speed that any object could theoretically attain in spacetime. Thus, it would still be the speed of spacetime ripples (gravitational waves and gravitons), but it would not be the speed of photons.
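The frequency dependence described above can be sketched numerically. Assuming a hypothetical rest energy mc² at the scale of the current upper limit, the fractional speed deficit follows from v = c·√(1 − (mc²/E)²):

```python
# Hypothetical sketch: fractional speed deficit (c - v)/c of a massive photon.
# For mc^2 << E this is well approximated by (mc^2/E)^2 / 2, which avoids
# floating-point cancellation for such tiny values.
h_eV = 4.135667696e-15   # Planck constant in eV*s
m_c2 = 1e-18             # assumed photon rest energy, eV (scale of current limit)

def fractional_deficit(nu_hz):
    """(c - v)/c for a photon of frequency nu_hz, to leading order in mc^2/E."""
    E = h_eV * nu_hz     # photon energy in eV
    return 0.5 * (m_c2 / E) ** 2

radio = fractional_deficit(1e6)      # 1 MHz radio photon
visible = fractional_deficit(5e14)   # visible-light photon
print(radio, visible)
```

Even for a 1 MHz radio photon the deficit would be a few parts in 10²⁰, and far smaller still for visible light, which is why direct time-of-flight tests constrain the photon mass only weakly compared with the field-based tests below.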
If a photon did have non-zero mass, there would be other effects as well. Coulomb's law would be modified and the electromagnetic field would have an extra physical degree of freedom. These effects yield more sensitive experimental probes of the photon mass than the frequency dependence of the speed of light. If Coulomb's law were not exactly valid, an external electric field applied to a hollow conductor would produce a nonzero electric field inside it, which allows Coulomb's law to be tested to very high precision. A null result of such an experiment has set a limit of m ≲ 10−14 eV/c2.
Sharper upper limits on the photon mass have been obtained in experiments designed to detect effects caused by the galactic vector potential. Although the galactic vector potential is very large because the galactic magnetic field exists on very great length scales, only the magnetic field would be observable if the photon is massless. In the case that the photon has mass, the mass term ½m²AμAμ would affect the galactic plasma. The fact that no such effects are seen implies an upper bound on the photon mass of m < 3×10−27 eV/c2. The galactic vector potential can also be probed directly by measuring the torque exerted on a magnetized ring. Such methods were used to obtain the sharper upper limit of 10−18 eV/c2 (the equivalent of 1.07×10−27 atomic mass units) given by the Particle Data Group.
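The quoted equivalence between the two mass units can be checked directly; the conversion factor below is the CODATA energy equivalent of the atomic mass unit:

```python
# Converting the Particle Data Group photon-mass limit from eV/c^2 to
# atomic mass units: 1 u = 931.49410242 MeV/c^2 (CODATA value).
u_in_eV = 931.49410242e6   # eV/c^2 per atomic mass unit
limit_eV = 1e-18           # quoted upper limit, eV/c^2

limit_u = limit_eV / u_in_eV
print(f"{limit_u:.3g} u")  # matches the quoted 1.07e-27 u
```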
These sharp limits from the non-observation of the effects caused by the galactic vector potential have been shown to be model-dependent. If the photon mass is generated via the Higgs mechanism then the upper limit of m ≲ 10−14 eV/c2 from the test of Coulomb's law is valid.
In most theories up to the eighteenth century, light was pictured as being made up of particles. Since particle models cannot easily account for the refraction, diffraction and birefringence of light, wave theories of light were proposed by René Descartes (1637), Robert Hooke (1665), and Christiaan Huygens (1678); however, particle models remained dominant, chiefly due to the influence of Isaac Newton. In the early nineteenth century, Thomas Young and Augustin-Jean Fresnel clearly demonstrated the interference and diffraction of light, and by 1850 wave models were generally accepted. In 1865, James Clerk Maxwell's prediction that light was an electromagnetic wave—which was confirmed experimentally in 1888 by Heinrich Hertz's detection of radio waves—seemed to be the final blow to particle models of light.
The Maxwell wave theory, however, does not account for all properties of light. The Maxwell theory predicts that the energy of a light wave depends only on its intensity, not on its frequency; nevertheless, several independent types of experiments show that the energy imparted by light to atoms depends only on the light's frequency, not on its intensity. For example, some chemical reactions are provoked only by light of frequency higher than a certain threshold; light of frequency lower than the threshold, no matter how intense, does not initiate the reaction. Similarly, electrons can be ejected from a metal plate by shining light of sufficiently high frequency on it (the photoelectric effect); the energy of the ejected electron is related only to the light's frequency, not to its intensity.[d]
At the same time, investigations of blackbody radiation carried out over four decades (1860–1900) by various researchers culminated in Max Planck's hypothesis that the energy of any system that absorbs or emits electromagnetic radiation of frequency ν is an integer multiple of an energy quantum E = hν. As shown by Albert Einstein, some form of energy quantization must be assumed to account for the thermal equilibrium observed between matter and electromagnetic radiation; for this explanation of the photoelectric effect, Einstein received the 1921 Nobel Prize in physics.
Since the Maxwell theory of light allows for all possible energies of electromagnetic radiation, most physicists assumed initially that the energy quantization resulted from some unknown constraint on the matter that absorbs or emits the radiation. In 1905, Einstein was the first to propose that energy quantization was a property of electromagnetic radiation itself. Although he accepted the validity of Maxwell's theory, Einstein pointed out that many anomalous experiments could be explained if the energy of a Maxwellian light wave were localized into point-like quanta that move independently of one another, even if the wave itself is spread continuously over space. In 1909 and 1916, Einstein showed that, if Planck's law of black-body radiation is accepted, the energy quanta must also carry momentum p = h/λ, making them full-fledged particles. This photon momentum was observed experimentally by Arthur Compton, for which he received the Nobel Prize in 1927. The pivotal question was then: how to unify Maxwell's wave theory of light with its experimentally observed particle nature? The answer to this question occupied Albert Einstein for the rest of his life, and was solved in quantum electrodynamics and its successor, the Standard Model (see § Second quantization and § The photon as a gauge boson, below).
Unlike Planck, Einstein entertained the possibility that there might be actual physical quanta of light—what we now call photons. He noticed that a light quantum with energy proportional to its frequency would explain a number of troubling puzzles and paradoxes, including an unpublished law by Stokes, the ultraviolet catastrophe, and the photoelectric effect. Stokes's law said simply that the frequency of fluorescent light cannot be greater than the frequency of the light (usually ultraviolet) inducing it. Einstein eliminated the ultraviolet catastrophe by imagining a gas of photons behaving like a gas of electrons that he had previously considered. He was advised by a colleague to be careful how he wrote up this paper, in order to not challenge Planck, a powerful figure in physics, too directly, and indeed the warning was justified, as Planck never forgave him for writing it.
Einstein's 1905 predictions were verified experimentally in several ways in the first two decades of the 20th century, as recounted in Robert Millikan's Nobel lecture. However, before Compton's experiment showed that photons carried momentum proportional to their wave number (1922), most physicists were reluctant to believe that electromagnetic radiation itself might be particulate. (See, for example, the Nobel lectures of Wien, Planck and Millikan.) Instead, there was a widespread belief that energy quantization resulted from some unknown constraint on the matter that absorbed or emitted radiation. Attitudes changed over time. In part, the change can be traced to experiments such as Compton scattering, where it was much more difficult not to ascribe quantization to light itself to explain the observed results.
Even after Compton's experiment, Niels Bohr, Hendrik Kramers and John Slater made one last attempt to preserve the Maxwellian continuous electromagnetic field model of light, the so-called BKS model. To account for the data then available, two drastic hypotheses had to be made:
However, refined Compton experiments showed that energy–momentum is conserved extraordinarily well in elementary processes; and also that the jolting of the electron and the generation of a new photon in Compton scattering obey causality to within 10 ps. Accordingly, Bohr and his co-workers gave their model "as honorable a funeral as possible". Nevertheless, the failures of the BKS model inspired Werner Heisenberg in his development of matrix mechanics.
A few physicists persisted in developing semiclassical models in which electromagnetic radiation is not quantized, but matter appears to obey the laws of quantum mechanics. Although the evidence from chemical and physical experiments for the existence of photons was overwhelming by the 1970s, this evidence could not be considered as absolutely definitive; since it relied on the interaction of light with matter, and a sufficiently complete theory of matter could in principle account for the evidence. Nevertheless, all semiclassical theories were refuted definitively in the 1970s and 1980s by photon-correlation experiments.[e] Hence, Einstein's hypothesis that quantization is a property of light itself is considered to be proven.
Photons, like all quantum objects, exhibit wave-like and particle-like properties. Their dual wave–particle nature can be difficult to visualize. The photon displays clearly wave-like phenomena such as diffraction and interference on the length scale of its wavelength. For example, a single photon passing through a double-slit experiment lands on the screen with a probability distribution given by its interference pattern determined by Maxwell's equations, but only if no measurement is made at the slits. However, experiments confirm that the photon is not a short pulse of electromagnetic radiation; it does not spread out as it propagates, nor does it divide when it encounters a beam splitter. Rather, the photon seems to be a point-like particle, since it is absorbed or emitted as a whole by arbitrarily small systems, systems much smaller than its wavelength, such as an atomic nucleus (≈10−15 m across) or even the point-like electron. Nevertheless, the photon is not a point-like particle whose trajectory is shaped probabilistically by the electromagnetic field, as conceived by Einstein and others; that hypothesis was also refuted by the photon-correlation experiments cited above. According to our present understanding, the electromagnetic field itself is produced by photons, which in turn result from a local gauge symmetry and the laws of quantum field theory (see § Second quantization and § The photon as a gauge boson below).
A key element of quantum mechanics is Heisenberg's uncertainty principle, which forbids the simultaneous measurement of the position and momentum of a particle along the same direction. Remarkably, the uncertainty principle for charged, material particles requires the quantization of light into photons, and even the frequency dependence of the photon's energy and momentum.
An elegant illustration of the uncertainty principle is Heisenberg's thought experiment for locating an electron with an ideal microscope. The position of the electron can be determined to within the resolving power of the microscope, which is given by a formula from classical optics,

Δx ~ λ / sin θ,

where θ is the aperture angle of the microscope and λ is the wavelength of the light used to observe the electron. Thus, the position uncertainty Δx can be made arbitrarily small by reducing the wavelength λ. Even if the momentum of the electron is initially known, the light impinging on the electron will give it a momentum "kick" of some unknown amount, rendering the momentum of the electron uncertain. If light were not quantized into photons, the uncertainty could be made arbitrarily small by reducing the light's intensity. In that case, since the wavelength and intensity of light can be varied independently, one could simultaneously determine the position and momentum to arbitrarily high accuracy, violating the uncertainty principle. By contrast, Einstein's formula for photon momentum preserves the uncertainty principle; since the photon is scattered anywhere within the aperture, the uncertainty of momentum transferred equals

Δp ~ (h/λ) sin θ,

giving the product Δx Δp ~ h, which is Heisenberg's uncertainty principle. Thus, the entire world is quantized; both matter and fields must obey a consistent set of quantum laws, if either one is to be quantized.
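The cancellation that makes the product independent of λ and θ can be checked numerically; the order-one proportionality constants are dropped in this sketch:

```python
import math

# Heisenberg microscope sketch: the position resolution dx ~ lambda/sin(theta)
# and the momentum kick dp ~ (h/lambda)*sin(theta) multiply to ~h, regardless
# of the wavelength or aperture chosen.
h = 6.62607015e-34  # Planck constant, J*s

def uncertainty_product(lam, theta):
    dx = lam / math.sin(theta)       # position uncertainty, m
    dp = (h / lam) * math.sin(theta)  # momentum uncertainty, kg*m/s
    return dx * dp

# The product stays at h across wildly different wavelengths and apertures:
for lam in (1e-9, 500e-9, 1e-3):
    for theta in (0.1, 0.5, 1.0):
        assert abs(uncertainty_product(lam, theta) - h) < 1e-45
```

Shrinking λ sharpens the position but boosts the kick in exact proportion, which is the quantitative heart of the argument above.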
The analogous uncertainty principle for photons forbids the simultaneous measurement of the number of photons (see Fock state and the Second quantization section below) in an electromagnetic wave and the phase of that wave: ΔN Δφ > 1.
Both photons and electrons create analogous interference patterns when passed through a double-slit experiment. For photons, this corresponds to the interference of a Maxwell light wave whereas, for material particles (electrons), this corresponds to the interference of the Schrödinger wave equation. Although this similarity might suggest that Maxwell's equations describing the photon's electromagnetic wave are simply Schrödinger's equation for photons, most physicists do not agree. For one thing, they are mathematically different; most obviously, Schrödinger's one equation for the electron solves for a complex field, whereas Maxwell's four equations solve for real fields. More generally, the normal concept of a Schrödinger probability wave function cannot be applied to photons. As photons are massless, they cannot be localized without being destroyed; technically, photons cannot have a position eigenstate |r⟩, and thus the normal Heisenberg uncertainty principle does not pertain to photons. A few substitute wave functions have been suggested for the photon, but they have not come into general use. Instead, physicists generally accept the second-quantized theory of photons described below, quantum electrodynamics, in which photons are quantized excitations of electromagnetic modes.
Another interpretation that avoids duality is the De Broglie–Bohm theory, also known as the pilot-wave model. In that theory, the photon is both wave and particle. "This idea seems to me so natural and simple, to resolve the wave-particle dilemma in such a clear and ordinary way, that it is a great mystery to me that it was so generally ignored", wrote J. S. Bell.
In 1924, Satyendra Nath Bose derived Planck's law of black-body radiation without using any electromagnetism, but rather by using a modification of coarse-grained counting of phase space. Einstein showed that this modification is equivalent to assuming that photons are rigorously identical and that it implied a "mysterious non-local interaction", now understood as the requirement for a symmetric quantum mechanical state. This work led to the concept of coherent states and the development of the laser. In the same papers, Einstein extended Bose's formalism to material particles (bosons) and predicted that they would condense into their lowest quantum state at low enough temperatures; this Bose–Einstein condensation was observed experimentally in 1995. It was later used by Lene Hau to slow, and then completely stop, light in 1999 and 2001.
The modern view on this is that photons are, by virtue of their integer spin, bosons (as opposed to fermions with half-integer spin). By the spin-statistics theorem, all bosons obey Bose–Einstein statistics (whereas all fermions obey Fermi–Dirac statistics).
In 1916, Albert Einstein showed that Planck's radiation law could be derived from a semi-classical, statistical treatment of photons and atoms, which implies a link between the rates at which atoms emit and absorb photons. The condition follows from the assumption that the emission and absorption of radiation by the atoms are independent of each other, and that thermal equilibrium is established through the radiation's interaction with the atoms. Consider a cavity in thermal equilibrium with all parts of itself, filled with electromagnetic radiation, whose atoms can emit and absorb that radiation. Thermal equilibrium requires that the energy density ρ(ν) of photons with frequency ν (which is proportional to their number density) is, on average, constant in time; hence, the rate at which photons of any particular frequency are emitted must equal the rate at which they are absorbed.
Einstein began by postulating simple proportionality relations for the different reaction rates involved. In his model, the rate Rji for a system to absorb a photon of frequency ν and transition from a lower energy Ej to a higher energy Ei is proportional to the number Nj of atoms with energy Ej and to the energy density ρ(ν) of ambient photons of that frequency,

Rji = Nj Bji ρ(ν),

where Bji is the rate constant for absorption. For the reverse process, there are two possibilities: spontaneous emission of a photon, or the emission of a photon initiated by the interaction of the atom with a passing photon and the return of the atom to the lower-energy state. Following Einstein's approach, the corresponding rate Rij for the emission of photons of frequency ν and transition from a higher energy Ei to a lower energy Ej is

Rij = Ni Aij + Ni Bij ρ(ν),

where Aij is the rate constant for emitting a photon spontaneously, and Bij is the rate constant for emissions in response to ambient photons (induced or stimulated emission). In thermodynamic equilibrium, the number of atoms in state i and those in state j must, on average, be constant; hence, the rates Rji and Rij must be equal. Also, by arguments analogous to the derivation of Boltzmann statistics, the ratio of Ni and Nj is (gi/gj) exp((Ej − Ei)/kT), where gi and gj are the degeneracies of states i and j, Ei and Ej their energies, k the Boltzmann constant and T the system's temperature. From this, it is readily derived that

gi Bij = gj Bji

and

Aij = (8πhν³/c³) Bij.
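The detailed-balance argument can be verified numerically. The sketch below assumes a simple two-level system with equal degeneracies and an arbitrary overall rate scale, and checks that Planck's spectral energy density makes absorption and emission balance exactly:

```python
import math

# Detailed balance for Einstein's rate equations at an arbitrary temperature
# and frequency, with equal degeneracies (g_i = g_j, so B_ij = B_ji).
h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K
T, nu = 300.0, 1e13  # assumed temperature (K) and frequency (Hz)

B = 1.0                                   # B coefficient, arbitrary scale
A = (8 * math.pi * h * nu**3 / c**3) * B  # A_ij = (8*pi*h*nu^3/c^3) * B_ij
# Planck's law for the spectral energy density:
rho = (8 * math.pi * h * nu**3 / c**3) / (math.exp(h * nu / (k * T)) - 1)

N_j = 1.0                                # lower-state population, arbitrary
N_i = N_j * math.exp(-h * nu / (k * T))  # Boltzmann ratio of populations

absorption = N_j * B * rho        # R_ji
emission = N_i * (A + B * rho)    # R_ij: spontaneous + stimulated
assert math.isclose(absorption, emission, rel_tol=1e-9)
```

Running the same check at any other T and ν also balances, which is the sense in which Planck's law is the unique equilibrium spectrum consistent with these rate equations.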
The A and B coefficients are collectively known as the Einstein coefficients.
Einstein could not fully justify his rate equations, but claimed that it should be possible to calculate the coefficients Aij, Bij and Bji once physicists had obtained "mechanics and electrodynamics modified to accommodate the quantum hypothesis". In fact, in 1926, Paul Dirac derived the Bij rate constants by using a semiclassical approach, and, in 1927, succeeded in deriving all the rate constants from first principles within the framework of quantum theory. Dirac's work was the foundation of quantum electrodynamics, i.e., the quantization of the electromagnetic field itself. Dirac's approach is also called second quantization or quantum field theory; earlier quantum mechanical treatments treated only material particles as quantum mechanical, not the electromagnetic field.
Einstein was troubled by the fact that his theory seemed incomplete, since it did not determine the direction of a spontaneously emitted photon. A probabilistic nature of light-particle motion was first considered by Newton in his treatment of birefringence and, more generally, of the splitting of light beams at interfaces into a transmitted beam and a reflected beam. Newton hypothesized that hidden variables in the light particle determined which of the two paths a single photon would take. Similarly, Einstein hoped for a more complete theory that would leave nothing to chance, beginning his separation from quantum mechanics. Ironically, Max Born's probabilistic interpretation of the wave function was inspired by Einstein's later work searching for a more complete theory.
In 1910, Peter Debye derived Planck's law of black-body radiation from a relatively simple assumption. He correctly decomposed the electromagnetic field in a cavity into its Fourier modes, and assumed that the energy in any mode was an integer multiple of hν, where ν is the frequency of the electromagnetic mode. Planck's law of black-body radiation follows immediately as a geometric sum. However, Debye's approach failed to give the correct formula for the energy fluctuations of blackbody radiation, which were derived by Einstein in 1909.
In 1925, Born, Heisenberg and Jordan reinterpreted Debye's concept in a key way. As may be shown classically, the Fourier modes of the electromagnetic field—a complete set of electromagnetic plane waves indexed by their wave vector k and polarization state—are equivalent to a set of uncoupled simple harmonic oscillators. Treated quantum mechanically, the energy levels of such oscillators are known to be E = nhν, where ν is the oscillator frequency. The key new step was to identify an electromagnetic mode with energy E = nhν as a state with n photons, each of energy hν. This approach gives the correct energy fluctuation formula.
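The "geometric sum" step can be checked arithmetically: summing the Boltzmann-weighted mode energies E = nhν reproduces the closed-form Planck mean energy per mode. The temperature and frequency below are arbitrary:

```python
import math

# Mean energy of one electromagnetic mode: a Boltzmann-weighted sum over the
# allowed energies E_n = n*h*nu closes (geometric series) to the Planck form
# <E> = h*nu / (exp(h*nu/kT) - 1).
h = 6.62607015e-34   # Planck constant, J*s
k = 1.380649e-23     # Boltzmann constant, J/K
T, nu = 300.0, 1e13  # assumed temperature (K) and mode frequency (Hz)
x = h * nu / (k * T)

# Truncated sums converge rapidly since exp(-x) < 1:
N = 200
Z = sum(math.exp(-n * x) for n in range(N))                      # partition sum
E_mean = sum(n * h * nu * math.exp(-n * x) for n in range(N)) / Z

closed_form = h * nu / (math.exp(x) - 1)
assert math.isclose(E_mean, closed_form, rel_tol=1e-9)
```

Multiplying this mean energy by the classical density of modes per unit frequency then yields Planck's spectral law, which is the route Debye took.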
Dirac took this one step further. He treated the interaction between a charge and an electromagnetic field as a small perturbation that induces transitions in the photon states, changing the numbers of photons in the modes, while conserving energy and momentum overall. Dirac was able to derive Einstein's Aij and Bij coefficients from first principles, and showed that the Bose–Einstein statistics of photons is a natural consequence of quantizing the electromagnetic field correctly (Bose's reasoning went in the opposite direction; he derived Planck's law of black-body radiation by assuming B–E statistics). In Dirac's time, it was not yet known that all bosons, including photons, must obey Bose–Einstein statistics.
Dirac's second-order perturbation theory can involve virtual photons, transient intermediate states of the electromagnetic field; the static electric and magnetic interactions are mediated by such virtual photons. In such quantum field theories, the probability amplitude of observable events is calculated by summing over all possible intermediate steps, even ones that are unphysical; hence, virtual photons are not constrained to satisfy E = pc, and may have extra polarization states; depending on the gauge used, virtual photons may have three or four polarization states, instead of the two states of real photons. Although these transient virtual photons can never be observed, they contribute measurably to the probabilities of observable events. Indeed, such second-order and higher-order perturbation calculations can give apparently infinite contributions to the sum. Such unphysical results are corrected for using the technique of renormalization.
Other virtual particles may contribute to the summation as well; for example, two photons may interact indirectly through virtual electron–positron pairs. In fact, such photon–photon scattering (see two-photon physics), as well as electron–photon scattering, is intended to be one of the modes of operation of the planned particle accelerator, the International Linear Collider.
In modern notation, the quantum state of the electromagnetic field is written as a Fock state, a tensor product of the states for each electromagnetic mode |n_k0⟩ ⊗ |n_k1⟩ ⊗ … ⊗ |n_ki⟩ ⊗ …, where |n_ki⟩ represents the state in which n_ki photons are in the mode k_i. In this notation, the creation of a new photon in mode k_i (e.g., emitted from an atomic transition) is written as |n_ki⟩ → |n_ki + 1⟩. This notation merely expresses the concept of Born, Heisenberg and Jordan described above, and does not add any physics.
Measurements of the interaction between energetic photons and hadrons show that the interaction is much more intense than expected from the interaction of photons with the hadron's electric charge alone. Furthermore, the interaction of energetic photons with protons is similar to the interaction of photons with neutrons in spite of the fact that the electric charge structures of protons and neutrons are substantially different. A theory called Vector Meson Dominance (VMD) was developed to explain this effect. According to VMD, the photon is a superposition of the pure electromagnetic photon, which interacts only with electric charges, and vector mesons. However, if experimentally probed at very short distances, the intrinsic structure of the photon is recognized as a flux of quark and gluon components, quasi-free according to asymptotic freedom in QCD and described by the photon structure function. A comprehensive comparison of data with theoretical predictions was presented in a review in 2000.
The electromagnetic field can be understood as a gauge field, i.e., as a field that results from requiring that a gauge symmetry holds independently at every position in spacetime. For the electromagnetic field, this gauge symmetry is the Abelian U(1) symmetry of complex numbers of absolute value 1, which reflects the ability to vary the phase of a complex field without affecting observables or real valued functions made from it, such as the energy or the Lagrangian.
The quanta of an Abelian gauge field must be massless, uncharged bosons, as long as the symmetry is not broken; hence, the photon is predicted to be massless, and to have zero electric charge and integer spin. The particular form of the electromagnetic interaction specifies that the photon must have spin ±1; thus, its helicity must be ±ℏ. These two spin components correspond to the classical concepts of right-handed and left-handed circularly polarized light. However, the transient virtual photons of quantum electrodynamics may also adopt unphysical polarization states.
In the prevailing Standard Model of physics, the photon is one of four gauge bosons in the electroweak interaction; the other three are denoted W+, W− and Z0 and are responsible for the weak interaction. Unlike the photon, these gauge bosons have mass, owing to a mechanism that breaks their SU(2) gauge symmetry. The unification of the photon with W and Z gauge bosons in the electroweak interaction was accomplished by Sheldon Glashow, Abdus Salam and Steven Weinberg, for which they were awarded the 1979 Nobel Prize in physics. Physicists continue to hypothesize grand unified theories that connect these four gauge bosons with the eight gluon gauge bosons of quantum chromodynamics; however, key predictions of these theories, such as proton decay, have not been observed experimentally.
The energy of a system that emits a photon is decreased by the energy of the photon as measured in the rest frame of the emitting system, which may result in a reduction in mass in the amount E/c². Similarly, the mass of a system that absorbs a photon is increased by a corresponding amount. As an application, the energy balance of nuclear reactions involving photons is commonly written in terms of the masses of the nuclei involved, and terms of the form E/c² for the gamma photons (and for other relevant energies, such as the recoil energy of nuclei).
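The E/c² bookkeeping can be illustrated with a concrete number. A sketch, assuming an illustrative 1 MeV gamma photon (the specific energy is not from the text):

```python
# Mass-energy bookkeeping for photon emission: a system emitting a photon
# of energy E loses mass E/c^2.
c = 299_792_458.0       # speed of light, m/s
eV = 1.602176634e-19    # joules per electronvolt

E_gamma = 1.0e6 * eV    # an assumed 1 MeV gamma photon
delta_m = E_gamma / c**2   # mass decrease of the emitting system, kg
print(f"{delta_m:.3e} kg")  # ~1.8e-30 kg, roughly twice the electron mass
```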
This concept is applied in key predictions of quantum electrodynamics (QED, see above). In that theory, the mass of electrons (or, more generally, leptons) is modified by including the mass contributions of virtual photons, in a technique known as renormalization. Such "radiative corrections" contribute to a number of predictions of QED, such as the magnetic dipole moment of leptons, the Lamb shift, and the hyperfine structure of bound lepton pairs, such as muonium and positronium.
Since photons contribute to the stress–energy tensor, they exert a gravitational attraction on other objects, according to the theory of general relativity. Conversely, photons are themselves affected by gravity; their normally straight trajectories may be bent by warped spacetime, as in gravitational lensing, and their frequencies may be lowered by moving to a higher gravitational potential, as in the Pound–Rebka experiment. However, these effects are not specific to photons; exactly the same effects would be predicted for classical electromagnetic waves.
Light that travels through transparent matter does so at a lower speed than c, the speed of light in a vacuum. For example, photons engage in so many collisions on the way from the core of the sun that radiant energy can take about a million years to reach the surface; however, once in open space, a photon takes only 8.3 minutes to reach Earth. The factor by which the speed is decreased is called the refractive index of the material. In a classical wave picture, the slowing can be explained by the light inducing electric polarization in the matter, the polarized matter radiating new light, and that new light interfering with the original light wave to form a delayed wave. In a particle picture, the slowing can instead be described as a blending of the photon with quantum excitations of the matter to produce a quasi-particle known as a polariton (other quasi-particles are phonons and excitons); this polariton has a nonzero effective mass, which means that it cannot travel at c. Light of different frequencies may travel through matter at different speeds; this is called dispersion (not to be confused with scattering). In some cases, it can result in extremely slow speeds of light in matter. The effects of photon interactions with other quasi-particles may be observed directly in Raman scattering and Brillouin scattering.
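The refractive-index relation v = c/n is simple enough to compute directly. A sketch using typical textbook index values (the specific materials and indices are illustrative, not taken from the text):

```python
# Speed of light in a medium: v = c / n, where n is the refractive index.
c = 299_792_458.0   # vacuum speed of light, m/s

def speed_in_medium(n):
    """Phase velocity of light in a medium with refractive index n."""
    return c / n

# Illustrative textbook indices (assumed values):
for name, n in [("water", 1.333), ("crown glass", 1.52), ("diamond", 2.417)]:
    print(f"{name}: {speed_in_medium(n) / 1e8:.3f} x 10^8 m/s")
```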
Photons can also be absorbed by nuclei, atoms or molecules, provoking transitions between their energy levels. A classic example is the molecular transition of retinal (C20H28O), which is responsible for vision, as discovered in 1958 by Nobel laureate biochemist George Wald and co-workers. The absorption provokes a cis-trans isomerization that, in combination with other such transitions, is transduced into nerve impulses. The absorption of photons can even break chemical bonds, as in the photodissociation of chlorine; this is the subject of photochemistry.
Photons have many applications in technology. These examples are chosen to illustrate applications of photons per se, rather than general optical devices such as lenses, etc. that could operate under a classical theory of light. The laser is an extremely important application and is discussed above under stimulated emission.
Individual photons can be detected by several methods. The classic photomultiplier tube exploits the photoelectric effect: a photon of sufficient energy strikes a metal plate and knocks free an electron, initiating an ever-amplifying avalanche of electrons. Semiconductor charge-coupled device chips use a similar effect: an incident photon generates a charge on a microscopic capacitor that can be detected. Other detectors such as Geiger counters use the ability of photons to ionize gas molecules contained in the device, causing a detectable change of conductivity of the gas.
Planck's energy formula is often used by engineers and chemists in design, both to compute the change in energy resulting from a photon absorption and to determine the frequency of the light emitted from a given photon emission. For example, the emission spectrum of a gas-discharge lamp can be altered by filling it with (mixtures of) gases with different electronic energy level configurations.
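This two-way use of Planck's relation can be sketched in a few lines. The sodium wavelength below is a standard textbook value used only as a sanity check, not a figure from the text:

```python
# Planck's relation E = h*nu used both ways: photon energy from frequency,
# and emitted frequency from a given transition energy.
h = 6.62607015e-34    # Planck constant, J*s
c = 299_792_458.0     # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

def photon_energy(nu):
    """Energy (J) of a photon of frequency nu (Hz)."""
    return h * nu

def emitted_frequency(delta_E):
    """Frequency (Hz) of light emitted by a transition of energy delta_E (J)."""
    return delta_E / h

# Sodium's familiar yellow light near 589 nm corresponds to a ~2.1 eV transition:
nu = c / 589e-9
print(f"{photon_energy(nu) / eV:.2f} eV")  # roughly 2.1 eV
```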
Under some conditions, an energy transition can be excited by two photons that individually would be insufficient. This allows for higher resolution microscopy, because the sample absorbs energy only in the spectrum where two beams of different colors overlap significantly, which can be made much smaller than the excitation volume of a single beam (see two-photon excitation microscopy). Moreover, these photons cause less damage to the sample, since they are of lower energy.
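Because photon energies (and hence frequencies) add in two-photon absorption, two near-infrared photons can together drive a transition that would otherwise require one photon of half the wavelength. A sketch with assumed 800 nm photons:

```python
# Two-photon excitation: the energies of the two absorbed photons add,
# so two 800 nm photons act like a single 400 nm photon.
h = 6.62607015e-34  # Planck constant, J*s
c = 299_792_458.0   # speed of light, m/s

def wavelength_of_combined(lambda1, lambda2):
    """Effective single-photon wavelength of a two-photon absorption."""
    E = h * c / lambda1 + h * c / lambda2   # energies add
    return h * c / E

print(f"{wavelength_of_combined(800e-9, 800e-9) * 1e9:.0f} nm")  # 400 nm
```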
In some cases, two energy transitions can be coupled so that, as one system absorbs a photon, another nearby system "steals" its energy and re-emits a photon of a different frequency. This is the basis of fluorescence resonance energy transfer, a technique that is used in molecular biology to study the interaction of suitable proteins.
Several different kinds of hardware random number generators involve the detection of single photons. In one example, for each bit in the random sequence that is to be produced, a photon is sent to a beam-splitter. In such a situation, there are two possible outcomes of equal probability. The actual outcome is used to determine whether the next bit in the sequence is "0" or "1".
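The beam-splitter scheme can be sketched in code. Note the hedge: a real device derives its randomness from the quantum measurement itself; here the 50/50 outcome is merely simulated with a pseudo-random generator for illustration.

```python
# Sketch of a beam-splitter random bit source: each photon exits one of two
# ports with probability 1/2, and whichever detector fires sets the bit.
# The quantum coin toss is *simulated* here with a PRNG.
import random

def beam_splitter_bits(n, rng=None):
    """Simulate n single-photon beam-splitter outcomes as a bit string."""
    rng = rng or random.Random()
    return "".join(str(rng.randint(0, 1)) for _ in range(n))

print(beam_splitter_bits(16))  # e.g. "0110100011010010"
```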
Much research has been devoted to applications of photons in the field of quantum optics. Photons seem well-suited to be elements of an extremely fast quantum computer, and the quantum entanglement of photons is a focus of research. Nonlinear optical processes are another active research area, with topics such as two-photon absorption, self-phase modulation, modulational instability and optical parametric oscillators. However, such processes generally do not require the assumption of photons per se; they may often be modeled by treating atoms as nonlinear oscillators. The nonlinear process of spontaneous parametric down conversion is often used to produce single-photon states. Finally, photons are essential in some aspects of optical communication, especially for quantum cryptography.[f]
p. 322: "The constants A and B could be calculated directly if we possessed an electrodynamics and mechanics modified in the sense of the quantum hypothesis."
3D optical data storage is any form of optical data storage in which information can be recorded or read with three-dimensional resolution (as opposed to the two-dimensional resolution afforded, for example, by CD). This innovation has the potential to provide petabyte-level mass storage on DVD-sized discs (120 mm). Data recording and readback are achieved by focusing lasers within the medium. However, because of the volumetric nature of the data structure, the laser light must travel through other data points before it reaches the point where reading or recording is desired. Therefore, some kind of nonlinearity is required to ensure that these other data points do not interfere with the addressing of the desired point.
No commercial product based on 3D optical data storage has yet arrived on the mass market, although several companies are actively developing the technology and claim that it may become available 'soon'.

Compton scattering
Compton scattering, discovered by Arthur Holly Compton, is the scattering of a photon by a charged particle, usually an electron. It results in a decrease in energy (increase in wavelength) of the photon (which may be an X-ray or gamma ray photon), called the Compton effect. Part of the energy of the photon is transferred to the recoiling electron. Inverse Compton scattering occurs when a charged particle transfers part of its energy to a photon.

Delayed-choice quantum eraser
A delayed-choice quantum eraser experiment, first performed by Yoon-Ho Kim, R. Yu, S. P. Kulik, Y. H. Shih and Marlan O. Scully, and reported in early 1999, is an elaboration on the quantum eraser experiment that incorporates concepts considered in Wheeler's delayed-choice experiment. The experiment was designed to investigate peculiar consequences of the well-known double-slit experiment in quantum mechanics, as well as the consequences of quantum entanglement.
The delayed-choice quantum eraser experiment investigates a paradox. If a photon manifests itself as though it had come by a single path to the detector, then "common sense" (which Wheeler and others challenge) says that it must have entered the double-slit device as a particle. If a photon manifests itself as though it had come by two indistinguishable paths, then it must have entered the double-slit device as a wave. If the experimental apparatus is changed while the photon is in mid‑flight, then the photon should reverse its original "decision" as to whether to be a wave or a particle. Wheeler pointed out that when these assumptions are applied to a device of interstellar dimensions, a last-minute decision made on Earth on how to observe a photon could alter a decision made millions or even billions of years ago.
While delayed-choice experiments have confirmed the seeming ability of measurements made on photons in the present to alter events occurring in the past, this requires a non-standard view of quantum mechanics. If a photon in flight is interpreted as being in a so-called "superposition of states", i.e. if it is interpreted as something that has the potentiality to manifest as a particle or wave, but during its time in flight is neither, then there is no time paradox. This is the standard view, and recent experiments have supported it.

Electromagnetic radiation
In physics, electromagnetic radiation (EM radiation or EMR) refers to the waves (or their quanta, photons) of the electromagnetic field, propagating (radiating) through space, carrying electromagnetic radiant energy. It includes radio waves, microwaves, infrared, (visible) light, ultraviolet, X-rays, and gamma rays.

Classically, electromagnetic radiation consists of electromagnetic waves, which are synchronized oscillations of electric and magnetic fields that propagate at the speed of light, which, in a vacuum, is commonly denoted c. In homogeneous, isotropic media, the oscillations of the two fields are perpendicular to each other and perpendicular to the direction of energy and wave propagation, forming a transverse wave. The wavefront of electromagnetic waves emitted from a point source (such as a light bulb) is a sphere. The position of an electromagnetic wave within the electromagnetic spectrum can be characterized by either its frequency of oscillation or its wavelength. Electromagnetic waves of different frequency are called by different names since they have different sources and effects on matter. In order of increasing frequency and decreasing wavelength these are: radio waves, microwaves, infrared radiation, visible light, ultraviolet radiation, X-rays and gamma rays.

Electromagnetic waves are emitted by electrically charged particles undergoing acceleration, and these waves can subsequently interact with other charged particles, exerting force on them. EM waves carry energy, momentum and angular momentum away from their source particle and can impart those quantities to matter with which they interact. Electromagnetic radiation is associated with those EM waves that are free to propagate themselves ("radiate") without the continuing influence of the moving charges that produced them, because they have achieved sufficient distance from those charges. Thus, EMR is sometimes referred to as the far field. In this language, the near field refers to EM fields near the charges and current that directly produced them, specifically electromagnetic induction and electrostatic induction phenomena.
In quantum mechanics, an alternate way of viewing EMR is that it consists of photons, uncharged elementary particles with zero rest mass which are the quanta of the electromagnetic force, responsible for all electromagnetic interactions. Quantum electrodynamics is the theory of how EMR interacts with matter on an atomic level. Quantum effects provide additional sources of EMR, such as the transition of electrons to lower energy levels in an atom and black-body radiation. The energy of an individual photon is quantized and is greater for photons of higher frequency. This relationship is given by Planck's equation E = hν, where E is the energy per photon, ν is the frequency of the photon, and h is Planck's constant. A single gamma ray photon, for example, might carry ~100,000 times the energy of a single photon of visible light.
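The "~100,000 times" comparison follows directly from E = hν. A sketch, assuming illustrative values of 500 nm for the visible photon and 250 keV for the gamma photon (neither figure is stated in the text):

```python
# Photon energy scales linearly with frequency (E = h*nu), so the ratio of
# two photon energies equals the ratio of their frequencies.
h = 6.62607015e-34    # Planck constant, J*s
c = 299_792_458.0     # speed of light, m/s
eV = 1.602176634e-19  # joules per electronvolt

E_visible = h * c / 500e-9   # a 500 nm green photon, about 2.5 eV (assumed)
E_gamma = 250e3 * eV         # a 250 keV gamma photon (assumed)
print(f"energy ratio: {E_gamma / E_visible:,.0f}")  # on the order of 100,000
```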
The effects of EMR upon chemical compounds and biological organisms depend both upon the radiation's power and its frequency. EMR of visible or lower frequencies (i.e., visible light, infrared, microwaves, and radio waves) is called non-ionizing radiation, because its photons do not individually have enough energy to ionize atoms or molecules or break chemical bonds. The effects of these radiations on chemical systems and living tissue are caused primarily by heating effects from the combined energy transfer of many photons. In contrast, high-frequency ultraviolet, X-rays and gamma rays are called ionizing radiation, since individual photons of such high frequency have enough energy to ionize molecules or break chemical bonds. These radiations have the ability to cause chemical reactions and damage living cells beyond that resulting from simple heating, and can be a health hazard.

Elitzur–Vaidman bomb tester
The Elitzur–Vaidman bomb tester is a quantum mechanics thought experiment that uses interaction-free measurements to verify that a bomb is functional without having to detonate it. It was conceived in 1993 by Avshalom Elitzur and Lev Vaidman. Since their publication, real-world experiments have confirmed that their theoretical method works as predicted. The bomb tester takes advantage of two characteristics of elementary particles, such as photons or electrons: nonlocality and wave–particle duality. By placing the particle in a quantum superposition, the experiment can verify that the bomb works without ever triggering its detonation, although there is a 50% chance that the bomb will explode in the effort.

Gamma ray
A gamma ray, or gamma radiation (symbol γ), is a penetrating form of electromagnetic radiation arising from the radioactive decay of atomic nuclei. It consists of the shortest-wavelength electromagnetic waves and so imparts the highest photon energy. Paul Villard, a French chemist and physicist, discovered gamma radiation in 1900 while studying radiation emitted by radium. In 1903, Ernest Rutherford named this radiation gamma rays based on their relatively strong penetration of matter; he had previously discovered two less penetrating types of decay radiation, which he named alpha rays and beta rays in ascending order of penetrating power.
Gamma rays from radioactive decay are in the energy range from a few keV to ~8 MeV, corresponding to the typical energy levels in nuclei with reasonably long lifetimes. The energy spectrum of gamma rays can be used to identify the decaying radionuclides using gamma spectroscopy. Very-high-energy gamma rays in the 100–1000 TeV range have been observed from sources such as the Cygnus X-3 microquasar.
Natural sources of gamma rays originating on Earth are mostly a result of radioactive decay and secondary radiation from atmospheric interactions with cosmic ray particles. However, there are other rare natural sources, such as terrestrial gamma-ray flashes, that produce gamma rays from electron action upon the nucleus. Notable artificial sources of gamma rays include fission, such as occurs in nuclear reactors, as well as high-energy physics experiments, such as neutral pion decay and nuclear fusion.
Gamma rays and X-rays are both electromagnetic radiation, and since they overlap in the electromagnetic spectrum, the terminology varies between scientific disciplines. In some fields of physics, they are distinguished by their origin: gamma rays are created by nuclear decay, while in the case of X-rays, the origin is outside the nucleus. In astrophysics, gamma rays are conventionally defined as having photon energies above 100 keV and are the subject of gamma-ray astronomy, while radiation below 100 keV is classified as X-rays and is the subject of X-ray astronomy. This convention stems from the early man-made X-rays, which had energies only up to 100 keV, whereas many gamma rays could go to higher energies. A large fraction of astronomical gamma rays are screened by Earth's atmosphere.
Gamma rays are ionizing radiation and are thus biologically hazardous. Due to their high penetration power, they can damage bone marrow and internal organs. Unlike alpha and beta rays, they pass easily through the body and thus pose a formidable radiation protection challenge, requiring shielding made from dense materials such as lead or concrete.

Monica Rambeau
Monica Rambeau is a fictional character and superhero appearing in American comic books published by Marvel Comics. Initially known as Captain Marvel, the character joined and eventually became leader of the Avengers for a time. She later used the codenames Photon, Pulsar, and beginning in 2013, Spectrum.
Akira Akbar portrays a young Monica Rambeau as a child in the film Captain Marvel, which is set in the Marvel Cinematic Universe. This was the cinematic debut of the character.

QNX
QNX is a commercial Unix-like real-time operating system, aimed primarily at the embedded systems market. The product was originally developed in the early 1980s by Canadian company Quantum Software Systems, later renamed QNX Software Systems and ultimately acquired by BlackBerry in 2010. QNX was one of the first commercially successful microkernel operating systems and is used in a variety of devices including cars and mobile phones.

Quantum key distribution
Quantum key distribution (QKD) is a secure communication method which implements a cryptographic protocol involving components of quantum mechanics. It enables two parties to produce a shared random secret key known only to them, which can then be used to encrypt and decrypt messages. It is often incorrectly called quantum cryptography, as it is the best-known example of a quantum cryptographic task.
An important and unique property of quantum key distribution is the ability of the two communicating users to detect the presence of any third party trying to gain knowledge of the key. This results from a fundamental aspect of quantum mechanics: the process of measuring a quantum system in general disturbs the system. A third party trying to eavesdrop on the key must in some way measure it, thus introducing detectable anomalies. By using quantum superpositions or quantum entanglement and transmitting information in quantum states, a communication system can be implemented that detects eavesdropping. If the level of eavesdropping is below a certain threshold, a key can be produced that is guaranteed to be secure (i.e. the eavesdropper has no information about it), otherwise no secure key is possible and communication is aborted.
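This eavesdropping-detection principle can be sketched with a classical simulation in the style of the BB84 protocol (the protocol name and all numeric details are assumptions for illustration, not claims from the text). Sender and receiver keep only the rounds where their randomly chosen bases match; an intercept-resend eavesdropper who must guess the basis introduces errors in about a quarter of those kept bits.

```python
# Minimal BB84-style sketch: measurement in the wrong basis randomizes the
# bit, so an intercept-resend eavesdropper leaves a ~25% error fingerprint.
import random

rng = random.Random(42)

def measure(bit, prep_basis, meas_basis):
    # Same basis: deterministic outcome.  Wrong basis: random outcome.
    return bit if prep_basis == meas_basis else rng.randint(0, 1)

def run(n, eavesdrop):
    errors = kept = 0
    for _ in range(n):
        bit, a_basis = rng.randint(0, 1), rng.randint(0, 1)
        if eavesdrop:  # Eve measures in a guessed basis and resends
            e_basis = rng.randint(0, 1)
            bit_sent, prep_basis = measure(bit, a_basis, e_basis), e_basis
        else:
            bit_sent, prep_basis = bit, a_basis
        b_basis = rng.randint(0, 1)
        if a_basis == b_basis:      # sifting: keep matching-basis rounds only
            kept += 1
            errors += measure(bit_sent, prep_basis, b_basis) != bit
    return errors / kept

print(f"no eavesdropper:   error rate {run(20000, False):.3f}")  # 0.000
print(f"with eavesdropper: error rate {run(20000, True):.3f}")   # ~0.25
```

The no-eavesdropper error rate is exactly zero in this idealized model; real systems abort key generation when the measured error rate exceeds a threshold.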
The security of encryption that uses quantum key distribution relies on the foundations of quantum mechanics, in contrast to traditional public key cryptography, which relies on the computational difficulty of certain mathematical functions, and cannot provide any mathematical proof as to the actual complexity of reversing the one-way functions used. QKD has provable security based on information theory, and forward secrecy.
Quantum key distribution is only used to produce and distribute a key, not to transmit any message data. This key can then be used with any chosen encryption algorithm to encrypt (and decrypt) a message, which can then be transmitted over a standard communication channel. The algorithm most commonly associated with QKD is the one-time pad, as it is provably secure when used with a secret, random key. In real-world situations, it is often also used with encryption using symmetric key algorithms such as the Advanced Encryption Standard.

Single-photon avalanche diode
A single-photon avalanche diode (SPAD) is a solid-state photodetector in which a photon-generated carrier (via the internal photoelectric effect) can trigger a short-duration but relatively large avalanche current. This avalanche is created through a mechanism called impact ionization, whereby carriers (electrons and/or holes) are accelerated to high kinetic energies through a large potential gradient (voltage). If the kinetic energy of a carrier is sufficient (as a function of the ionization energy of the bulk material), further carriers are liberated from the atomic lattice. The number of carriers thus increases exponentially, in some cases from as few as a single carrier. This mechanism was observed and modeled by John Townsend for trace-gas vacuum tubes, becoming known as a Townsend discharge, and later being attributed to solid-state breakdown by K. McAfee. This device is able to detect low-intensity ionizing radiation, including gamma, X-ray, beta, and alpha-particle radiation, along with electromagnetic signals in the UV, visible and IR (in the optical case, down to the single-photon level). SPADs are also able to distinguish the arrival times of events (photons) with a timing jitter of a few tens of picoseconds.
SPADs, like avalanche photodiodes (APDs), exploit the incident radiation triggered avalanche current of a p–n junction when reverse biased. The fundamental difference between SPADs and APDs is that SPADs are specifically designed to operate with a reverse-bias voltage well above the breakdown voltage. This kind of operation is also called Geiger-mode in the literature (as opposed to the linear-mode for the case of an APD). This is in analogy with the Geiger counter.
Since the 1970s, the applications of SPADs have increased significantly. Recent examples of their use include LIDAR, time-of-flight (ToF) 3D imaging, PET scanning, single-photon experimentation within physics, fluorescence lifetime microscopy and optical communications (particularly quantum key distribution). Notable companies that have commercialized SPAD technology include STMicroelectronics, TowerJazz, Philips and Micro Photon Devices (MPD). The related technologies of solid-state silicon photomultipliers (Si-PMs) and multi-pixel photon counters (MPPCs) have been commercialized and made available through companies such as SensL (currently part of ON Semiconductor) and Hamamatsu.

Single-photon emission computed tomography
Single-photon emission computed tomography (SPECT, or less commonly, SPET) is a nuclear medicine tomographic imaging technique using gamma rays. It is very similar to conventional nuclear medicine planar imaging using a gamma camera (that is, scintigraphy), but is able to provide true 3D information. This information is typically presented as cross-sectional slices through the patient, but can be freely reformatted or manipulated as required.
The technique requires delivery of a gamma-emitting radioisotope (a radionuclide) into the patient, normally through injection into the bloodstream. On occasion, the radioisotope is a simple soluble dissolved ion, such as an isotope of gallium(III). Most of the time, though, a marker radioisotope is attached to a specific ligand to create a radioligand, whose properties bind it to certain types of tissues. This combination allows the ligand and radiopharmaceutical to be carried and bound to a place of interest in the body, where the ligand concentration is seen by a gamma camera.

Solar sail
Solar sails (also called light sails or photon sails) are a proposed method of spacecraft propulsion using radiation pressure exerted by sunlight on large mirrors. A useful analogy may be a sailing boat; the light exerting a force on the mirrors is akin to a sail being blown by the wind. High-energy laser beams could be used as an alternative light source to exert much greater force than would be possible using sunlight, a concept known as beam sailing.
Solar sail craft offer the possibility of low-cost operations combined with long operating lifetimes. Since they have few moving parts and use no propellant, they can potentially be used numerous times for delivery of payloads.
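The thrust available from sunlight can be estimated directly from radiation pressure. A sketch, assuming a flat, perfectly reflecting sail facing the Sun at 1 AU, where the solar irradiance is roughly 1361 W/m² (both the reflectivity and the irradiance value are assumptions for illustration):

```python
# Order-of-magnitude thrust on a solar sail from radiation pressure.
c = 299_792_458.0   # speed of light, m/s
S = 1361.0          # approximate solar irradiance at 1 AU, W/m^2 (assumed)

def sail_thrust(area_m2, reflectivity=1.0):
    """Radiation-pressure force on a sun-facing flat sail, in newtons.

    Perfect reflection doubles the momentum transfer: F = 2*S*A/c.
    """
    return (1 + reflectivity) * S * area_m2 / c

print(f"{sail_thrust(800 * 800):.1f} N")  # a few newtons for an 800 m square sail
```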
Solar sails use a phenomenon that has a proven, measured effect on spacecraft. Solar pressure affects all spacecraft, whether in interplanetary space or in orbit around a planet or small body. A typical spacecraft going to Mars, for example, will be displaced thousands of kilometers by solar pressure, so the effects must be accounted for in trajectory planning, which has been done since the time of the earliest interplanetary spacecraft of the 1960s. Solar pressure also affects the orientation of a craft, a factor that must be included in spacecraft design.

The total force exerted on an 800 by 800 meter solar sail, for example, is about 5 newtons (1.1 lbf) at Earth's distance from the Sun, making it a low-thrust propulsion system, similar to spacecraft propelled by electric engines. But because it uses no propellant, that force is exerted almost constantly, and the collective effect over time is great enough to be considered a potential means of propelling spacecraft.

Spontaneous parametric down-conversion
Spontaneous parametric down-conversion (also known as SPDC, parametric fluorescence or parametric scattering) is a nonlinear optical process that instantaneously converts one photon of higher energy (a pump photon) into a pair of photons (a signal photon and an idler photon) of lower energy, in accordance with the laws of conservation of energy and conservation of momentum. It is an important process in quantum optics, for the generation of entangled photon pairs and of single photons.

Superconducting nanowire single-photon detector
The superconducting nanowire single-photon detector (SNSPD) is a type of near-infrared and optical single-photon detector based on a current-biased superconducting nanowire. It was first developed by scientists at Moscow State Pedagogical University and at the University of Rochester in 2001. As of 2018, the superconducting nanowire single-photon detector is the fastest single-photon detector (SPD) for photon counting.

Two-photon excitation microscopy
Two-photon excitation microscopy is a fluorescence imaging technique that allows imaging of living tissue up to about one millimeter in depth. It differs from traditional fluorescence microscopy, in which the excitation wavelength is shorter than the emission wavelength, as the wavelengths of the two exciting photons are longer than the wavelength of the resulting emitted light. Two-photon excitation microscopy typically uses near-infrared excitation light, which can also excite fluorescent dyes. However, for each excitation, two photons of infrared light are absorbed. Using infrared light minimizes scattering in the tissue. Due to the multiphoton absorption, the background signal is strongly suppressed. Both effects lead to an increased penetration depth for these microscopes. Two-photon excitation can be a superior alternative to confocal microscopy due to its deeper tissue penetration, efficient light detection, and reduced photobleaching.

Virtual particle
In physics, a virtual particle is a transient fluctuation that exhibits some of the characteristics of an ordinary particle, while having its existence limited by the uncertainty principle. The concept of virtual particles arises in the perturbation theory of quantum field theory, where interactions between ordinary particles are described in terms of exchanges of virtual particles. A process involving virtual particles can be described by a schematic representation known as a Feynman diagram, in which virtual particles are represented by internal lines.

Virtual particles do not necessarily carry the same mass as the corresponding real particle, although they always conserve energy and momentum. The longer the virtual particle exists, the closer its characteristics come to those of ordinary particles. They are important in the physics of many processes, including particle scattering and Casimir forces. In quantum field theory, even classical forces—such as the electromagnetic repulsion or attraction between two charges—can be thought of as due to the exchange of many virtual photons between the charges. Virtual photons are the exchange particle for the electromagnetic interaction.
The term is somewhat loose and vaguely defined, in that it refers to the view that the world is made up of "real particles". It is not: "real particles" are better understood as excitations of the underlying quantum fields. Virtual particles are also excitations of the underlying fields, but are "temporary" in the sense that they appear in calculations of interactions, but never as asymptotic states or indices of the scattering matrix. The accuracy and use of virtual particles in calculations are firmly established, but because they cannot be detected in experiments, how best to describe them precisely is a topic of debate.

Visible-light photon counter
A visible-light photon counter (VLPC) is a high-quantum-efficiency photon-counting detector that operates at visible wavelengths and can resolve the number of photons in a pulse. The ability to count the exact number of photons detected is extremely important for quantum key distribution (QKD). The device has been used extensively in the central tracking detector of the D0 experiment and in muon cooling studies for a muon collider (MICE).

Weapons in Star Trek
The Star Trek fictional universe contains a variety of weapons, ranging from missile weapons (the classic photon torpedo) to melee weapons (primarily used by the Klingons, an alien race in the Star Trek universe). The Star Trek franchise consists primarily of several multi-season television shows and a dozen movies, as well as various video games and inspired merchandise. Many aspects of the fictional universe have influenced modern popular culture in the late 20th and early 21st centuries, especially the lingo and the idea of a spacecraft launching space torpedoes and firing lasers. Star Trek is popular enough that its science fiction concepts have been studied by real scientists, and NASA described its science in relation to the real world as an "entertaining combination of real science, imaginary science gathered from lots of earlier stories, and stuff the writers make up week-by-week to give each new episode novelty." For example, NASA noted that the Star Trek "phasers" were a fictional extrapolation of real-life lasers, and compared them to real-life microwave-based weapons that have a stunning effect.

Zillion (TV series)
Zillion (Japanese: 赤い光弾ジリオン, Hepburn: Akai Kōdan Jirion, literally Red Photon Bullet Zillion, fully titled Red Photon Zillion) is a Japanese anime television series produced by Tatsunoko Production that ran from April 12, 1987 to December 13, 1987 on Nippon Television in Japan. After the anime's production, Tatsunoko Production and Mitsuhisa Ishikawa, the producer of Zillion, established IG Tatsunoko (which later became Production I.G) to keep together the skilled staff of the Tatsunoko branch that had handled the actual production. For this reason, Zillion is considered Production I.G's first work.

Five of the 31 episodes were dubbed into English and released on VHS in the United States by Streamline Pictures. The anime was featured in the music video for Michael and Janet Jackson's collaboration "Scream", and samples from its English dub were featured in Del the Funky Homosapien's single "Cyberpunks".
In October 2018, Funimation released the complete series and the OVA on a Blu-ray/DVD set with Japanese audio and English subtitles.