Wave–particle duality is the concept in quantum mechanics that every particle or quantum entity may be partly described in terms not only of particles, but also of waves. It expresses the inability of the classical concepts "particle" or "wave" to fully describe the behaviour of quantum-scale objects. As Albert Einstein wrote:
It seems as though we must use sometimes the one theory and sometimes the other, while at times we may use either. We are faced with a new kind of difficulty. We have two contradictory pictures of reality; separately neither of them fully explains the phenomena of light, but together they do.
Through the work of Max Planck, Albert Einstein, Louis de Broglie, Arthur Compton, Niels Bohr, and many others, current scientific theory holds that all particles exhibit a wave nature and vice versa. This phenomenon has been verified not only for elementary particles, but also for compound particles like atoms and even molecules. For macroscopic particles, because of their extremely short wavelengths, wave properties usually cannot be detected.
Although the use of the wave-particle duality has worked well in physics, the meaning or interpretation has not been satisfactorily resolved; see Interpretations of quantum mechanics.
Bohr regarded the "duality paradox" as a fundamental or metaphysical fact of nature. A given kind of quantum object will exhibit sometimes wave, sometimes particle, character, in respectively different physical settings. He saw such duality as one aspect of the concept of complementarity. Bohr regarded renunciation of the cause-effect relation, or complementarity, of the space-time picture, as essential to the quantum mechanical account.
Werner Heisenberg considered the question further. He saw the duality as present for all quantic entities, but not quite in the usual quantum mechanical account considered by Bohr. He saw it in what is called second quantization, which generates an entirely new concept of fields which exist in ordinary space-time, causality still being visualizable. Classical field values (e.g. the electric and magnetic field strengths of Maxwell) are replaced by an entirely new kind of field value, as considered in quantum field theory. Turning the reasoning around, ordinary quantum mechanics can be deduced as a specialized consequence of quantum field theory.
Democritus argued that all things in the universe, including light, are composed of indivisible sub-components. At the beginning of the 11th century, the Arabic scientist Ibn al-Haytham wrote the first comprehensive Book of Optics, describing reflection, refraction, and the operation of a pinhole lens via rays of light traveling from the point of emission to the eye. He asserted that these rays were composed of particles of light. In 1630, René Descartes popularized the opposing wave description in his treatise on light, The World, showing that the behavior of light could be re-created by modeling wave-like disturbances in a universal medium, i.e. the luminiferous aether. Beginning in 1670 and progressing over three decades, Isaac Newton developed and championed his corpuscular theory, arguing that the perfectly straight lines of reflection demonstrated light's particle nature; only particles could travel in such straight lines. He explained refraction by positing that particles of light accelerated laterally upon entering a denser medium. Around the same time, Newton's contemporaries Robert Hooke and Christiaan Huygens, and later Augustin-Jean Fresnel, mathematically refined the wave viewpoint, showing that if light traveled at different speeds in different media, refraction could be easily explained as the medium-dependent propagation of light waves. The resulting Huygens–Fresnel principle was extremely successful at reproducing light's behavior and was subsequently supported by Thomas Young's discovery of wave interference of light in his double-slit experiment of 1801. The wave view did not immediately displace the ray and particle view, but began to dominate scientific thinking about light in the mid-19th century, since it could explain polarization phenomena that the alternatives could not.
James Clerk Maxwell discovered that he could apply his previously discovered Maxwell's equations, along with a slight modification to describe self-propagating waves of oscillating electric and magnetic fields. It quickly became apparent that visible light, ultraviolet light, and infrared light were all electromagnetic waves of differing frequency.
In 1901, Max Planck published an analysis that succeeded in reproducing the observed spectrum of light emitted by a glowing object. To accomplish this, Planck had to make a mathematical assumption of quantized energy of the oscillators, i.e. the atoms of the black body that emit radiation. Einstein later proposed that electromagnetic radiation itself is quantized, not the energy of radiating atoms.
Black-body radiation, the emission of electromagnetic energy due to an object's heat, could not be explained from classical arguments alone. The equipartition theorem of classical mechanics, the basis of all classical thermodynamic theories, stated that an object's energy is partitioned equally among the object's vibrational modes. But applying the same reasoning to the electromagnetic emission of such a thermal object was not so successful. That thermal objects emit light had been long known. Since light was known to be waves of electromagnetism, physicists hoped to describe this emission via classical laws. This became known as the black body problem. Since the equipartition theorem worked so well in describing the vibrational modes of the thermal object itself, it was natural to assume that it would perform equally well in describing the radiative emission of such objects. But a problem quickly arose: if each mode received an equal partition of energy, the short-wavelength modes would consume all the energy. This became clear when plotting the Rayleigh–Jeans law which, while correctly predicting the intensity of long-wavelength emissions, predicted infinite total energy as the intensity diverges to infinity for short wavelengths. This became known as the ultraviolet catastrophe.
In 1900, Max Planck hypothesized that the frequency of light emitted by the black body depended on the frequency of the oscillator that emitted it, and that the energy of these oscillators increased linearly with frequency (according to E = hf, where h is Planck's constant and f is the frequency). This was not an unsound proposal, considering that macroscopic oscillators operate similarly: when studying five simple harmonic oscillators of equal amplitude but different frequency, the oscillator with the highest frequency possesses the highest energy (though this relationship is not linear like Planck's). By demanding that high-frequency light must be emitted by an oscillator of equal frequency, and further requiring that this oscillator occupy higher energy than one of a lesser frequency, Planck avoided any catastrophe: giving an equal partition to high-frequency oscillators produced successively fewer oscillators and less emitted light. And as in the Maxwell–Boltzmann distribution, the low-frequency, low-energy oscillators were suppressed by the onslaught of thermal jiggling from higher-energy oscillators, which necessarily increased their energy and frequency.
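Planck's suppression of the high-frequency modes can be checked numerically by comparing the classical Rayleigh–Jeans spectrum with Planck's law. A minimal sketch (constants rounded to four figures; the 5000 K temperature is an arbitrary illustrative choice):

```python
import math

# Physical constants (SI units, rounded)
h = 6.626e-34   # Planck's constant, J s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def rayleigh_jeans(wavelength, T):
    """Classical spectral radiance: diverges as wavelength -> 0 (the
    ultraviolet catastrophe)."""
    return 2.0 * c * k * T / wavelength**4

def planck(wavelength, T):
    """Planck's spectral radiance: exponentially suppressed at short
    wavelengths, in agreement with observation."""
    x = h * c / (wavelength * k * T)
    return (2.0 * h * c**2 / wavelength**5) / math.expm1(x)

T = 5000.0  # black-body temperature, K
for wl_nm in (100_000, 2000, 500, 100):
    wl = wl_nm * 1e-9
    print(wl_nm, "nm:", rayleigh_jeans(wl, T), planck(wl, T))
```

At long wavelengths the two laws agree, but toward the ultraviolet the classical formula grows without bound while Planck's curve turns over and falls.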
The most revolutionary aspect of Planck's treatment of the black body is that it inherently relies on an integer number of oscillators in thermal equilibrium with the electromagnetic field. These oscillators give their entire energy to the electromagnetic field, creating a quantum of light, just as often as they are excited by the electromagnetic field, absorbing a quantum of light and beginning to oscillate at the corresponding frequency. Planck had intentionally created an atomic theory of the black body, but had unintentionally generated an atomic theory of light, where the black body never generates quanta of light at a given frequency with an energy less than hν. However, upon realizing that he had quantized the electromagnetic field, he denounced particles of light as a limitation of his approximation, not a property of reality.
While Planck had solved the ultraviolet catastrophe by using atoms and a quantized electromagnetic field, most contemporary physicists agreed that Planck's "light quanta" represented only flaws in his model. A more-complete derivation of black body radiation would yield a fully continuous and "wave-like" electromagnetic field with no quantization. However, in 1905 Albert Einstein took Planck's black body model to produce his solution to another outstanding problem of the day: the photoelectric effect, wherein electrons are emitted from atoms when they absorb energy from light. Since the electron's existence had been theorized eight years previously, phenomena had been studied with the electron model in mind in physics laboratories worldwide.
In 1902 Philipp Lenard discovered that the energy of these ejected electrons did not depend on the intensity of the incoming light, but instead on its frequency. So if one shines a little low-frequency light upon a metal, a few low-energy electrons are ejected. If one now shines a very intense beam of low-frequency light upon the same metal, a whole slew of electrons are ejected; however, they possess the same low energy; there are merely more of them. The more light there is, the more electrons are ejected. To get high-energy electrons, one must instead illuminate the metal with high-frequency light. Like blackbody radiation, this was at odds with a theory invoking continuous transfer of energy between radiation and matter. However, it can still be explained using a fully classical description of light, as long as matter is quantum mechanical in nature.
If one used Planck's energy quanta, and demanded that electromagnetic radiation at a given frequency could only transfer energy to matter in integer multiples of an energy quantum hν, then the photoelectric effect could be explained very simply. Low-frequency light only ejects low-energy electrons because each electron is excited by the absorption of a single photon. Increasing the intensity of the low-frequency light (increasing the number of photons) only increases the number of excited electrons, not their energy, because the energy of each photon remains low. Only by increasing the frequency of the light, and thus increasing the energy of the photons, can one eject electrons with higher energy. Thus, using Planck's constant h to determine the energy of the photons based upon their frequency, the energy of ejected electrons should also increase linearly with frequency, the gradient of the line being Planck's constant. These results were not confirmed until 1915, when Robert Andrews Millikan produced experimental results in perfect accord with Einstein's predictions.
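Einstein's relation for the photoelectron energy, E_k = hf − φ, can be illustrated with a short calculation. The work function φ and the two frequencies below are illustrative choices (φ is roughly that of potassium), not data from the original experiments:

```python
h = 6.626e-34   # Planck's constant, J s
eV = 1.602e-19  # joules per electron-volt

def ejected_energy_eV(frequency_hz, work_function_eV):
    """Kinetic energy of a photoelectron in eV, or None if below threshold.

    One photon excites one electron, so intensity changes the number of
    ejected electrons, never their individual energy.
    """
    e_k = h * frequency_hz / eV - work_function_eV
    return e_k if e_k > 0 else None

phi = 2.3       # illustrative work function, eV (roughly potassium)
red = 4.3e14    # red light, Hz: photon energy ~1.8 eV, below threshold
blue = 6.9e14   # blue light, Hz: photon energy ~2.9 eV, above threshold

print(ejected_energy_eV(red, phi))   # None: no electrons, however bright the beam
print(ejected_energy_eV(blue, phi))  # ~0.55 eV per electron
```

Plotting E_k against f for frequencies above threshold gives a straight line whose gradient is h, which is exactly what Millikan measured.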
While the energy of ejected electrons reflected Planck's constant, the existence of photons was not explicitly proven until the discovery of the photon antibunching effect; a modern version of that experiment can be performed in undergraduate-level labs. This phenomenon could only be explained via photons. Einstein's "light quanta" would not be called photons until 1925, but even in 1905 they represented the quintessential example of wave-particle duality. Electromagnetic radiation propagates following linear wave equations, but can only be emitted or absorbed as discrete elements, thus acting as a wave and a particle simultaneously.
In 1905, Albert Einstein provided an explanation of the photoelectric effect, an experiment that the wave theory of light failed to explain. He did so by postulating the existence of photons, quanta of light energy with particulate qualities.
In the photoelectric effect, it was observed that shining a light on certain metals would lead to an electric current in a circuit. Presumably, the light was knocking electrons out of the metal, causing current to flow. However, using the case of potassium as an example, it was also observed that while a dim blue light was enough to cause a current, even the strongest, brightest red light available with the technology of the time caused no current at all. According to the classical theory of light and matter, the strength or amplitude of a light wave was in proportion to its brightness: a bright light should have been easily strong enough to create a large current. Yet, oddly, this was not so.
Einstein explained this enigma by postulating that the electrons can receive energy from the electromagnetic field only in discrete units (quanta or photons): an amount of energy E that was related to the frequency f of the light by E = hf,
where h is Planck's constant (6.626 × 10−34 Js). Only photons of a high enough frequency (above a certain threshold value) could knock an electron free. For example, photons of blue light had sufficient energy to free an electron from the metal, but photons of red light did not. One photon of light above the threshold frequency could release only one electron; the higher the frequency of a photon, the higher the kinetic energy of the emitted electron, but no amount of light below the threshold frequency could release an electron. To violate this law would require extremely high-intensity lasers which had not yet been invented. Intensity-dependent phenomena have now been studied in detail with such lasers.
Einstein was awarded the Nobel Prize in Physics in 1921 for his discovery of the law of the photoelectric effect.
This is a generalization of Einstein's equation above, since the momentum of a photon is given by p = E/c and the wavelength (in a vacuum) by λ = c/f, where c is the speed of light in vacuum.
De Broglie's formula was confirmed three years later for electrons with the observation of electron diffraction in two independent experiments. At the University of Aberdeen, George Paget Thomson passed a beam of electrons through a thin metal film and observed the predicted interference patterns. At Bell Labs, Clinton Joseph Davisson and Lester Halbert Germer guided an electron beam through a crystalline grid in what became popularly known as the Davisson–Germer experiment.
De Broglie was awarded the Nobel Prize for Physics in 1929 for his hypothesis. Thomson and Davisson shared the Nobel Prize for Physics in 1937 for their experimental work.
Heisenberg originally explained this as a consequence of the process of measuring: Measuring position accurately would disturb momentum and vice versa, offering an example (the "gamma-ray microscope") that depended crucially on the de Broglie hypothesis. The thought is now, however, that this only partly explains the phenomenon, but that the uncertainty also exists in the particle itself, even before the measurement is made.
In fact, the modern explanation of the uncertainty principle, extending the Copenhagen interpretation first put forward by Bohr and Heisenberg, depends even more centrally on the wave nature of a particle. Just as it is nonsensical to discuss the precise location of a wave on a string, particles do not have perfectly precise positions; likewise, just as it is nonsensical to discuss the wavelength of a "pulse" wave traveling down a string, particles do not have perfectly precise momenta, which correspond to the inverse of wavelength. Moreover, when position is relatively well defined, the wave is pulse-like and has a very ill-defined wavelength, and thus momentum. And conversely, when momentum, and thus wavelength, is relatively well defined, the wave looks long and sinusoidal, and therefore it has a very ill-defined position.
De Broglie himself had proposed a pilot wave construct to explain the observed wave-particle duality. In this view, each particle has a well-defined position and momentum, but is guided by a wave function derived from Schrödinger's equation. The pilot wave theory was initially rejected because it generated non-local effects when applied to systems involving more than one particle. Non-locality, however, soon became established as an integral feature of quantum theory and David Bohm extended de Broglie's model to explicitly include it.
In the resulting representation, also called the de Broglie–Bohm theory or Bohmian mechanics, the wave-particle duality vanishes: wave behaviour is explained as scattering with wave appearance, because the particle's motion is subject to a guiding equation or quantum potential.
Since the demonstrations of wave-like properties in photons and electrons, similar experiments have been conducted with neutrons and protons. Among the most famous experiments are those of Estermann and Otto Stern in 1929. Authors of similar recent experiments with atoms and molecules, described below, claim that these larger particles also act like waves.
A dramatic series of experiments emphasizing the action of gravity in relation to wave–particle duality was conducted in the 1970s using the neutron interferometer. Neutrons, one of the components of the atomic nucleus, provide much of the mass of a nucleus and thus of ordinary matter. In the neutron interferometer, they act as quantum-mechanical waves directly subject to the force of gravity. While the results were not surprising since gravity was known to act on everything, including light (see tests of general relativity and the Pound–Rebka falling photon experiment), the self-interference of the quantum mechanical wave of a massive fermion in a gravitational field had never been experimentally confirmed before.
In 1999, the diffraction of C60 fullerenes by researchers from the University of Vienna was reported. Fullerenes are comparatively large and massive objects, having an atomic mass of about 720 u. The de Broglie wavelength of the incident beam was about 2.5 pm, whereas the diameter of the molecule is about 1 nm, about 400 times larger. In 2012, these far-field diffraction experiments could be extended to phthalocyanine molecules and their heavier derivatives, which are composed of 58 and 114 atoms respectively. In these experiments the build-up of such interference patterns could be recorded in real time and with single molecule sensitivity.
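The numbers quoted for the fullerene experiment are mutually consistent: the beam velocity implied by a 2.5 pm de Broglie wavelength for a 720 u molecule comes out at a few hundred m/s, typical of a thermal molecular beam. A quick sanity check:

```python
h = 6.626e-34   # Planck's constant, J s
u = 1.6605e-27  # atomic mass unit, kg

m_c60 = 720 * u       # fullerene mass as quoted in the text
wavelength = 2.5e-12  # de Broglie wavelength as quoted in the text, m

# Velocity implied by lambda = h / (m v)
v = h / (m_c60 * wavelength)
print(v)  # ~220 m/s, consistent with a thermal molecular beam
```

Despite this tiny wavelength (400 times smaller than the molecule itself), the interference fringes were clearly resolved.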
In 2003, the Vienna group also demonstrated the wave nature of tetraphenylporphyrin—a flat biodye with an extension of about 2 nm and a mass of 614 u. For this demonstration they employed a near-field Talbot Lau interferometer. In the same interferometer they also found interference fringes for C60F48, a fluorinated buckyball with a mass of about 1600 u, composed of 108 atoms. Large molecules are already so complex that they give experimental access to some aspects of the quantum-classical interface, i.e., to certain decoherence mechanisms. In 2011, the interference of molecules as heavy as 6910 u could be demonstrated in a Kapitza–Dirac–Talbot–Lau interferometer. In 2013, the interference of molecules beyond 10,000 u was demonstrated.
Whether objects heavier than the Planck mass (about the mass of a large bacterium) have a de Broglie wavelength is theoretically unclear and experimentally unreachable; above the Planck mass a particle's Compton wavelength would be smaller than the Planck length and its own Schwarzschild radius, a scale at which current theories of physics may break down or need to be replaced by more general ones.
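The coincidence of scales at the Planck mass can be verified directly: the reduced Compton wavelength ħ/(mc) and the Schwarzschild radius 2Gm/c² cross near m = √(ħc/G). A short numerical check with rounded constants:

```python
import math

# Physical constants (SI, rounded)
hbar = 1.0546e-34  # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2

m_planck = math.sqrt(hbar * c / G)  # ~2.18e-8 kg

compton = hbar / (m_planck * c)          # reduced Compton wavelength
schwarzschild = 2 * G * m_planck / c**2  # Schwarzschild radius

# At the Planck mass the two length scales agree to within a factor of 2,
# both landing near the Planck length ~1.6e-35 m.
print(m_planck, compton, schwarzschild)
```

For any heavier mass the Schwarzschild radius exceeds the Compton wavelength, so the quantum description and the gravitational one can no longer be kept separate.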
Recently Couder, Fort, et al. showed that macroscopic oil droplets on a vibrating surface can be used as a model of wave–particle duality: a localized droplet creates periodic waves around itself, and its interaction with them leads to quantum-like phenomena, including interference in a double-slit experiment, unpredictable tunneling (depending in a complicated way on a practically hidden state of the field), orbit quantization (the particle has to 'find a resonance' with the field perturbations it creates: after one orbit, its internal phase has to return to the initial state), and the Zeeman effect.
Wave–particle duality is deeply embedded into the foundations of quantum mechanics. In the formalism of the theory, all the information about a particle is encoded in its wave function, a complex-valued function roughly analogous to the amplitude of a wave at each point in space. This function evolves according to the Schrödinger equation. For particles with mass this equation has solutions that follow the form of the wave equation. Propagation of such waves leads to wave-like phenomena such as interference and diffraction. Particles without mass, like photons, have no solutions of the Schrödinger equation; their wave behaviour is instead governed by other wave equations.
The particle-like behaviour is most evident due to phenomena associated with measurement in quantum mechanics. Upon measuring the location of the particle, the particle will be forced into a more localized state as given by the uncertainty principle. When viewed through this formalism, the measurement of the wave function will randomly lead to wave function collapse, or rather quantum decoherence, to a sharply peaked function at some location. For particles with mass, the likelihood of detecting the particle at any particular location is equal to the squared amplitude of the wave function there. The measurement will return a well-defined position, and is subject to Heisenberg's uncertainty principle. A measurement is only a particular type of interaction where some data is recorded and the measured quantity is forced into a particular quantum state. The act of measurement is therefore not fundamentally different from any other interaction.
Following the development of quantum field theory the ambiguity disappeared. The field permits solutions that follow the wave equation, which are referred to as the wave functions. The term particle is used to label the irreducible representations of the Lorentz group that are permitted by the field. An interaction as in a Feynman diagram is accepted as a calculationally convenient approximation where the outgoing legs are known to be simplifications of the propagation and the internal lines are for some order in an expansion of the field interaction. Since the field is non-local and quantized, the phenomena which previously were thought of as paradoxes are explained. Within the limits of the wave-particle duality the quantum field theory gives the same results.
There are two ways to visualize the wave-particle behaviour: by the standard model and by the de Broglie–Bohm theory.
Below is an illustration of wave–particle duality as it relates to de Broglie's hypothesis and Heisenberg's Uncertainty principle, in terms of the position and momentum space wavefunctions for one spinless particle with mass in one dimension. These wavefunctions are Fourier transforms of each other.
The more localized the position-space wavefunction, the more likely the particle is to be found with the position coordinates in that region, and correspondingly the momentum-space wavefunction is less localized so the possible momentum components the particle could have are more widespread.
Conversely, the more localized the momentum-space wavefunction, the more likely the particle is to be found with those values of momentum components in that region, and correspondingly the less localized the position-space wavefunction, so the position coordinates the particle could occupy are more widespread.
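For the Gaussian wavepackets usually used to illustrate this trade-off, the reciprocal relationship is exact: the Fourier transform of a Gaussian is again a Gaussian, and the widths satisfy σ_x σ_p = ħ/2, the minimum allowed by Heisenberg's uncertainty principle. A minimal sketch of the scaling:

```python
hbar = 1.0546e-34  # reduced Planck constant, J s

def momentum_spread(position_spread):
    """Width of the momentum-space wavefunction for a Gaussian wavepacket.

    The Fourier transform of a Gaussian of width sigma_x is a Gaussian of
    width sigma_p = hbar / (2 * sigma_x): squeezing one spreads the other.
    """
    return hbar / (2.0 * position_spread)

for sigma_x in (1e-9, 1e-10, 1e-11):  # progressively localize the particle
    sigma_p = momentum_spread(sigma_x)
    print(sigma_x, sigma_p, sigma_x * sigma_p)  # product is always hbar/2
```

Localizing the particle ten times more tightly in position makes its momentum spread ten times wider, while the product stays pinned at ħ/2.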
Wave–particle duality is an ongoing conundrum in modern physics. Most physicists accept wave-particle duality as the best explanation for a broad range of observed phenomena; however, it is not without controversy. Alternative views are also presented here. These views are not generally accepted by mainstream physics, but serve as a basis for valuable discussion within the community.
The pilot wave model, originally developed by Louis de Broglie and further developed by David Bohm into the hidden variable theory, proposes that there is no duality; rather, a system exhibits both particle properties and wave properties simultaneously, and particles are guided, in a deterministic fashion, by the pilot wave (or its "quantum potential"), which directs them to areas of constructive interference in preference to areas of destructive interference. This idea is held by a significant minority within the physics community.
At least one physicist considers the "wave-duality" as not being an incomprehensible mystery. L.E. Ballentine, Quantum Mechanics, A Modern Development, p. 4, explains:
When first discovered, particle diffraction was a source of great puzzlement. Are "particles" really "waves?" In the early experiments, the diffraction patterns were detected holistically by means of a photographic plate, which could not detect individual particles. As a result, the notion grew that particle and wave properties were mutually incompatible, or complementary, in the sense that different measurement apparatuses would be required to observe them. That idea, however, was only an unfortunate generalization from a technological limitation. Today it is possible to detect the arrival of individual electrons, and to see the diffraction pattern emerge as a statistical pattern made up of many small spots (Tonomura et al., 1989). Evidently, quantum particles are indeed particles, but whose behaviour is very different from what classical physics would have us expect.
The Afshar experiment (2007) may suggest that it is possible to simultaneously observe both wave and particle properties of photons. This claim is, however, disputed by other scientists.
Carver Mead, an American scientist and professor at Caltech, proposes that the duality can be replaced by a "wave-only" view. In his book Collective Electrodynamics: Quantum Foundations of Electromagnetism (2000), Mead purports to analyze the behavior of electrons and photons purely in terms of electron wave functions, and attributes the apparent particle-like behavior to quantization effects and eigenstates. According to reviewer David Haddon:
Mead has cut the Gordian knot of quantum complementarity. He claims that atoms, with their neutrons, protons, and electrons, are not particles at all but pure waves of matter. Mead cites as the gross evidence of the exclusively wave nature of both light and matter the discovery between 1933 and 1996 of ten examples of pure wave phenomena, including the ubiquitous laser of CD players, the self-propagating electrical currents of superconductors, and the Bose–Einstein condensate of atoms.
This double nature of radiation (and of material corpuscles) ... has been interpreted by quantum-mechanics in an ingenious and amazingly successful fashion. This interpretation ... appears to me as only a temporary way out...
The three wave hypothesis of R. Horodecki relates the particle to wave. The hypothesis implies that a massive particle is an intrinsically spatially, as well as temporally extended, wave phenomenon by a nonlinear law.
The deterministic collapse theory considers collapse and measurement as two independent physical processes. Collapse occurs when two wavepackets spatially overlap and satisfy a mathematical criterion, which depends on the parameters of both wavepackets. It is a contraction to the overlap volume. In a measurement apparatus one of the two wavepackets is one of the atomic clusters which constitute the apparatus, and the wavepackets collapse to at most the volume of such a cluster. This mimics the action of a point particle.
Still in the days of the old quantum theory, a pre-quantum-mechanical version of wave–particle duality was pioneered by William Duane, and developed by others including Alfred Landé. Duane explained diffraction of x-rays by a crystal in terms solely of their particle aspect. The deflection of the trajectory of each diffracted photon was explained as due to quantized momentum transfer from the spatially regular structure of the diffracting crystal.
It has been argued that there are never exact particles or waves, but only some compromise or intermediate between them. For this reason, in 1928 Arthur Eddington coined the name "wavicle" to describe the objects, although it is not regularly used today. One consideration is that zero-dimensional mathematical points cannot be observed. Another is that the formal representation of such points, the Dirac delta function, is unphysical, because it cannot be normalized. Parallel arguments apply to pure wave states. Roger Penrose states:
"Such 'position states' are idealized wavefunctions in the opposite sense from the momentum states. Whereas the momentum states are infinitely spread out, the position states are infinitely concentrated. Neither is normalizable [...]."
Relational quantum mechanics has been developed as a point of view that regards the event of particle detection as having established a relationship between the quantized field and the detector. The inherent ambiguity associated with applying Heisenberg’s uncertainty principle is consequently avoided; hence there is no wave-particle duality.
Although it is difficult to draw a line separating wave–particle duality from the rest of quantum mechanics, it is nevertheless possible to list some applications of this basic idea.
For both large and small wavelengths, both matter and radiation have both particle and wave aspects.... But the wave aspects of their motion become more difficult to observe as their wavelengths become shorter.... For ordinary macroscopic particles the mass is so large that the momentum is always sufficiently large to make the de Broglie wavelength small enough to be beyond the range of experimental detection, and classical mechanics reigns supreme.
The Davisson–Germer experiment was a 1923–1927 experiment by Clinton Davisson and Lester Germer at Western Electric (later Bell Labs), in which electrons, scattered by the surface of a crystal of nickel metal, displayed a diffraction pattern. This confirmed the hypothesis, advanced by Louis de Broglie in 1924, of wave-particle duality, and was an experimental milestone in the creation of quantum mechanics.

Double-slit experiment
In modern physics, the double-slit experiment is a demonstration that light and matter can display characteristics of both classically defined waves and particles; moreover, it displays the fundamentally probabilistic nature of quantum mechanical phenomena. The experiment was first performed with light by Thomas Young in 1801. In 1927, Davisson and Germer demonstrated that electrons show the same behavior, which was later extended to atoms and molecules. Thomas Young's experiment with light was part of classical physics well before quantum mechanics and the concept of wave-particle duality. He believed it demonstrated that the wave theory of light was correct, and his experiment is sometimes referred to as Young's experiment or Young's slits.
The experiment belongs to a general class of "double path" experiments, in which a wave is split into two separate waves that later combine into a single wave. Changes in the path lengths of both waves result in a phase shift, creating an interference pattern. Another version is the Mach–Zehnder interferometer, which splits the beam with a mirror. In the basic version of this experiment, a coherent light source, such as a laser beam, illuminates a plate pierced by two parallel slits, and the light passing through the slits is observed on a screen behind the plate. The wave nature of light causes the light waves passing through the two slits to interfere, producing bright and dark bands on the screen — a result that would not be expected if light consisted of classical particles. However, the light is always found to be absorbed at the screen at discrete points, as individual particles (not waves), the interference pattern appearing via the varying density of these particle hits on the screen. Furthermore, versions of the experiment that include detectors at the slits find that each detected photon passes through one slit (as would a classical particle), and not through both slits (as would a wave). However, such experiments demonstrate that particles do not form the interference pattern if one detects which slit they pass through. These results demonstrate the principle of wave–particle duality.Other atomic-scale entities, such as electrons, are found to exhibit the same behavior when fired towards a double slit. Additionally, the detection of individual discrete impacts is observed to be inherently probabilistic, which is inexplicable using classical mechanics.The experiment can be done with entities much larger than electrons and photons, although it becomes more difficult as size increases. 
The largest entities for which the double-slit experiment has been performed were molecules that each comprised 810 atoms (whose total mass was over 10,000 atomic mass units). The double-slit experiment (and its variations) has become a classic thought experiment, for its clarity in expressing the central puzzles of quantum mechanics. Because it demonstrates the fundamental limitation of the ability of the observer to predict experimental results, Richard Feynman called it "a phenomenon which is impossible […] to explain in any classical way, and which has in it the heart of quantum mechanics. In reality, it contains the only mystery [of quantum mechanics]."

Dualism
Dualism may refer to:
Mind–body dualism, a philosophical set of views about the relationship between mind and matter, which begins with the claim that mental phenomena are, in some respects, non-physical
Property dualism, a philosophy of mind and a subbranch of emergent materialism
Epistemological dualism, a philosophical concept also known as representative realism, indirect realism, and the veil of perception
Dualism (Indian philosophy), views in Hindu and Buddhist philosophy that are similar to but distinct from Western mind–body dualism
Dualistic cosmology, the moral, spiritual, or religious belief that two fundamental concepts exist, which often oppose each other
Soul dualism, the belief that a person has two (or more) kinds of souls
Ethical dualism, the attribution of good solely to one group of people and evil to another
Dualism (law), the principle that international and domestic law are distinct systems of law, and that international law applies only to the extent that it does not conflict with domestic law
Dualism (politics), the separation of the responsibilities of cabinet and parliament
Duality (physics), media with properties that can be associated with the mechanics of two different phenomena, such as wave-particle duality
Dualism (cybernetics), systems or problems in which an intelligent adversary attempts to exploit the weaknesses of the investigator

Duality principle
Duality principle or principle of duality may refer to:
Duality (projective geometry)
Duality (order theory)
Duality principle (Boolean algebra)
Duality principle for sets
Duality principle (optimization theory)
Duality principle in functional analysis, used in large sieve method of analytic number theory
Wave–particle duality

Elitzur–Vaidman bomb tester
The Elitzur–Vaidman bomb tester is a quantum mechanics thought experiment that uses interaction-free measurements to verify that a bomb is functional without having to detonate it. It was conceived in 1993 by Avshalom Elitzur and Lev Vaidman. Since their publication, real-world experiments have confirmed that their theoretical method works as predicted. The bomb tester takes advantage of two characteristics of elementary particles, such as photons or electrons: nonlocality and wave–particle duality. By placing the particle in a quantum superposition, the experiment can verify that the bomb works without ever triggering its detonation, although there is a 50% chance that the bomb will explode in the effort.

Femtosecond
A femtosecond is a unit of time in the International System of Units (SI) equal to 10⁻¹⁵ or 1/1,000,000,000,000,000 of a second; that is, one quadrillionth, or one millionth of one billionth, of a second. For context, a femtosecond is to a second as a second is to about 31.71 million years; a ray of light travels approximately 0.3 µm (micrometers) in 1 femtosecond, a distance comparable to the diameter of a virus.
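Both comparisons above are straightforward arithmetic and can be checked directly (using a rounded value for the speed of light):

```python
# Sanity-check the two scale comparisons for a femtosecond (1 fs = 1e-15 s).
c = 2.998e8            # speed of light, m/s (rounded)
fs = 1e-15             # one femtosecond, in seconds

# Distance light travels in one femtosecond, in micrometres.
distance_um = c * fs * 1e6
print(distance_um)     # ≈ 0.3 µm, comparable to the diameter of a virus

# A femtosecond is to a second as a second is to tens of millions of years;
# the exact figure depends on the length of year used.
seconds_per_year = 365.25 * 24 * 3600
years = (1.0 / fs) / seconds_per_year
print(years / 1e6)     # ≈ 31.7 million years
```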
The word femtosecond is formed by the SI prefix femto and the SI unit second. Its symbol is fs.
A femtosecond is equal to 1000 attoseconds, or 1/1000 of a picosecond. Because the next higher SI unit is 1000 times larger, times of 10⁻¹⁴ and 10⁻¹³ seconds are typically expressed as tens or hundreds of femtoseconds.

Joan Vaccaro
Joan Vaccaro is a physicist at Griffith University and a former student of the physicist David Pegg. Her work in quantum physics includes quantum phase, nonclassical states of light, coherent laser excitation of atomic gases, cold atomic gases, stochastic Schrödinger equations, quantum information theory, quantum references, wave–particle duality, quantum thermodynamics, and the physical nature of time.

Kapitza–Dirac effect
The Kapitza–Dirac effect is a quantum mechanical effect consisting of the diffraction of matter by a standing wave of light.
The effect was first predicted as the diffraction of electrons from a standing wave of light by Paul Dirac and Pyotr Kapitsa (or Peter Kapitza) in 1933. The effect relies on the wave–particle duality of matter as stated by the de Broglie hypothesis in 1924.

Lester Germer
Lester Halbert Germer (October 10, 1896 – October 3, 1971) was an American physicist. With Clinton Davisson, he demonstrated the wave–particle duality of matter in the Davisson–Germer experiment, which was important to the development of the electron microscope. These studies supported the theoretical work of de Broglie. He also studied thermionics, erosion of metals, and contact physics. He was awarded the Elliott Cresson Medal in 1931.
A former fighter pilot in World War I, Germer subsequently worked at Bell Labs in New Jersey.
In 1945 (at the age of 49), Germer launched a side career as a rock climber. He climbed widely around the Northeast United States, and especially at New York's Shawangunk Ridge. Although the Appalachian Mountain Club was dominant in the area at the time and strictly regulated rock climbing, Germer was never associated with the club, and found himself in conflict with the leading climber in the area, Hans Kraus, who was head of the AMC's Safety Committee. He was once turned down for climbing certification with the comment "Likes people too much and is too enthusiastic." Germer was known for being generous and friendly, and was once called "a one man climbing school."
In 1971, one week before his 75th birthday, Germer died of a massive heart attack while lead climbing a route at the Shawangunk Ridge (Eyebrow, 5.6). Until that moment, he had maintained a 26-year perfect safety record in rock climbing; he had never even taken a leader fall.

Louis de Broglie
Louis Victor Pierre Raymond de Broglie, duc de Broglie (French: [dəbʁɔj] or [dəbʁœj]; 15 August 1892 – 19 March 1987) was a French physicist who made groundbreaking contributions to quantum theory. In his 1924 PhD thesis, he postulated the wave nature of electrons and suggested that all matter has wave properties. This concept is known as the de Broglie hypothesis, an example of wave–particle duality, and forms a central part of the theory of quantum mechanics.
De Broglie won the Nobel Prize for Physics in 1929, after the wave-like behaviour of matter was first experimentally demonstrated in 1927.
The 1925 pilot-wave model and the wave-like behaviour of particles discovered by de Broglie were used by Erwin Schrödinger in his formulation of wave mechanics. The pilot-wave model and interpretation were then abandoned, in favor of the quantum formalism, until 1952, when they were rediscovered and enhanced by David Bohm. Louis de Broglie was the sixteenth member elected to occupy seat 1 of the Académie française in 1944, and served as Perpetual Secretary of the French Academy of Sciences. De Broglie became the first high-level scientist to call for the establishment of a multi-national laboratory, a proposal that led to the establishment of the European Organization for Nuclear Research (CERN).

Matter wave
Matter waves are a central part of the theory of quantum mechanics, being an example of wave–particle duality. All matter can exhibit wave-like behavior. For example, a beam of electrons can be diffracted just like a beam of light or a water wave. The concept that matter behaves like a wave was proposed by Louis de Broglie (/dəˈbrɔɪ/) in 1924. It is also referred to as the de Broglie hypothesis, and matter waves are accordingly referred to as de Broglie waves.
The de Broglie wavelength is the wavelength, λ, associated with a massive particle and is related to its momentum, p, through the Planck constant, h: λ = h/p.
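As a worked example of the relation λ = h/p, the sketch below computes the wavelength of an electron accelerated through a given voltage, assuming the non-relativistic momentum p = √(2mₑeV); the constants are standard CODATA values:

```python
import math

# de Broglie wavelength: lambda = h / p.
# Worked example for an electron accelerated through a voltage V, assuming
# the non-relativistic momentum p = sqrt(2 * m_e * e * V).
h = 6.62607015e-34      # Planck constant, J*s
m_e = 9.1093837015e-31  # electron rest mass, kg
e = 1.602176634e-19     # elementary charge, C

def de_broglie_wavelength(voltage):
    """Wavelength (metres) of an electron accelerated through `voltage` volts."""
    p = math.sqrt(2 * m_e * e * voltage)
    return h / p

# At about 100 V the wavelength is roughly 0.12 nm, comparable to atomic
# spacings in a crystal, which is why electron diffraction experiments could
# reveal the wave nature of electrons.
print(de_broglie_wavelength(100.0) * 1e9)  # wavelength in nanometres
```

Note that doubling the momentum halves the wavelength, which is why wave properties of macroscopic objects are unobservably small.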
Wave-like behavior of matter was first experimentally demonstrated by George Paget Thomson's thin metal diffraction experiment, and independently in the Davisson–Germer experiment, both using electrons; it has also been confirmed for other elementary particles, neutral atoms, and even molecules.

Particle radiation
Particle radiation is the radiation of energy by means of fast-moving subatomic particles. Particle radiation is referred to as a particle beam if the particles are all moving in the same direction, similar to a light beam.
Due to the wave–particle duality, all moving particles also have wave character. Higher-energy particles more easily exhibit particle characteristics, while lower-energy particles more easily exhibit wave characteristics: the de Broglie wavelength decreases as momentum increases.

Pilot wave theory
In theoretical physics, the pilot wave theory, also known as Bohmian mechanics, was the first known example of a hidden-variable theory, presented by Louis de Broglie in 1927.
Its more modern version, the de Broglie–Bohm theory, interprets quantum mechanics as a deterministic theory, avoiding troublesome notions such as wave–particle duality, instantaneous wave function collapse, and the paradox of Schrödinger's cat. The price of resolving these problems is that the theory is inherently nonlocal and non-relativistic.
The de Broglie–Bohm pilot wave theory is one of several interpretations of (non-relativistic) quantum mechanics.
An extension to the relativistic case has been developed since the 1990s.

Quantum healing
Quantum healing is a pseudoscientific mixture of ideas which purportedly draw on quantum mechanics, psychology, philosophy, and neurophysiology. Advocates of quantum healing assert that quantum phenomena govern health and wellbeing. There are a number of different versions, which allude to various quantum ideas including wave–particle duality and virtual particles, and more generally to "energy" and to vibrations. Quantum healing is a form of alternative medicine.
Deepak Chopra coined the term "quantum healing". His discussions of quantum healing have been characterised as technobabble: "incoherent babbling strewn with scientific terms" which drives those who actually understand physics "crazy", and as "redefining Wrong". Quantum healing has a number of vocal followers, but the scientific community widely regards it as nonsensical. The main criticism revolves around its systematic misinterpretation of modern physics, especially of the fact that macroscopic objects (such as the human body or individual cells) are much too large to exhibit inherently quantum properties like interference and wave function collapse. Most literature on quantum healing is almost entirely philosophical, omitting the rigorous mathematics that makes quantum electrodynamics possible. Physicist Brian Cox argues that misuse of the word "quantum", such as its use in the phrase "quantum healing", has a negative effect on society, as it undermines genuine science and discourages people from engaging with conventional medicine. He states that "for some scientists, the unfortunate distortion and misappropriation of scientific ideas that often accompanies their integration into popular culture is an unacceptable price to pay."

Quantum system
A quantum system is a portion of the whole Universe (environment or physical world) that is taken under consideration for analysis or study in quantum mechanics, for example with respect to the wave–particle duality in that system. Everything outside this system (i.e. the environment) is studied only to observe its effects on the system. A quantum system involves the wave function and its constituents, such as the momentum and wavelength of the wave for which the wave function is defined.

Russell Stannard
Russell Stannard is a retired high-energy particle physicist, who was born in London, England, on 24 December 1931. He currently holds the position of Professor Emeritus of Physics at the Open University. In 1986, he was awarded the Templeton UK Project Award for "significant contributions to the field of spiritual values; in particular for contributions to greater understanding of science and religion". He was awarded the OBE for "contributions to physics, the Open University, and the popularisation of science" (1998) and the Bragg Medal and Prize of the Institute of Physics for "distinguished contributions to the teaching of physics" (1999). He was admitted as a Fellow of University College London in 2000.
Stannard is also a sculptor; two of his pieces were until recently on display in the main quadrangle of the Open University site at Milton Keynes.
In 2010, he helmed a series of ten short programmes collectively entitled "Boundaries of the knowable", dealing with subjects from both scientific and philosophical perspectives, ranging from the nature of consciousness, the nature of matter, space and time, the wave–particle duality of matter, the (alleged) existence of extra-terrestrial life and the question of "What caused the Big Bang?".

Spin magnetic moment
In physics, mainly quantum mechanics and particle physics, a spin magnetic moment is the magnetic moment caused by the spin of elementary particles. For example, the electron is an elementary spin-1/2 fermion. Quantum electrodynamics gives the most accurate prediction of the anomalous magnetic moment of the electron.
"Spin" is a non-classical property of elementary particles, since classically the "spin angular momentum" of a material object is really just the total orbital angular momentum of the object's constituents about the rotation axis. Elementary particles are conceived as point-like objects, with no axis to "spin" around (see wave–particle duality).
In general, a magnetic moment can be defined in terms of an electric current and the area enclosed by the current loop. Since angular momentum corresponds to rotational motion, the magnetic moment can be related to the orbital angular momentum of the charge carriers in the constituting current. However, in magnetic materials, the atomic and molecular dipoles have magnetic moments not just because of their quantized orbital angular momentum, but also due to the spin of the elementary particles constituting them (electrons, and the quarks in the protons and neutrons of the atomic nuclei). A particle may have a spin magnetic moment without having an electric charge. For example, the neutron is electrically neutral but has a non-zero magnetic moment because of its internal quark structure.

Subatomic particle
In the physical sciences, subatomic particles are particles much smaller than atoms. The two types of subatomic particles are: elementary particles, which according to current theories are not made of other particles; and composite particles. Particle physics and nuclear physics study these particles and how they interact.
The idea of a particle underwent serious rethinking when experiments showed that light could behave like a stream of particles (called photons) as well as exhibiting wave-like properties. This led to the new concept of wave–particle duality to reflect that quantum-scale "particles" behave like both particles and waves (they are sometimes described as wavicles to reflect this). Another new concept, the uncertainty principle, states that some of their properties taken together, such as their simultaneous position and momentum, cannot be measured exactly. In more recent times, wave–particle duality has been shown to apply not only to photons but to increasingly massive particles as well. Interactions of particles in the framework of quantum field theory are understood as creation and annihilation of quanta of corresponding fundamental interactions. This blends particle physics with field theory.

Transformation theory (quantum mechanics)
The term transformation theory refers to a procedure and a "picture" used by P. A. M. Dirac in his early formulation of quantum theory, from around 1927. This "transformation" idea refers to the changes a quantum state undergoes in the course of time, whereby its vector "moves" between "positions" or "orientations" in its Hilbert space. Time evolution, quantum transitions, and symmetry transformations in quantum mechanics may thus be viewed as the systematic theory of abstract, generalized rotations in this space of quantum state vectors.
Remaining in full use today, it would be regarded as a topic in the mathematics of Hilbert space, although, technically speaking, it is somewhat more general in scope. While the terminology is reminiscent of rotations of vectors in ordinary space, the Hilbert space of a quantum object is more general, and holds its entire quantum state.
(The term further sometimes evokes the wave–particle duality, according to which a particle (a "small" physical object) may display either particle or wave aspects, depending on the observational situation. Or, indeed, a variety of intermediate aspects, as the situation demands.)