Quantum

In physics, a quantum (plural: quanta) is the minimum amount of any physical entity (physical property) involved in an interaction. The fundamental notion that a physical property may be "quantized" is referred to as "the hypothesis of quantization".[1] This means that the magnitude of the physical property can take on only discrete values consisting of integer multiples of one quantum.

For example, a photon is a single quantum of light (or of any other form of electromagnetic radiation). Similarly, the energy of an electron bound within an atom is quantized and can exist only in certain discrete values. (Indeed, atoms and matter in general are stable because electrons can exist only at discrete energy levels within an atom.) Quantization is one of the foundations of the much broader physics of quantum mechanics. Quantization of energy and its influence on how energy and matter interact (quantum electrodynamics) is part of the fundamental framework for understanding and describing nature.

Etymology and discovery

The word quantum comes from the Latin quantus, meaning "how great". "Quanta", short for "quanta of electricity" (electrons), was used in a 1902 article on the photoelectric effect by Philipp Lenard, who credited Hermann von Helmholtz for using the word in the area of electricity. However, the word quantum in general was well known before 1900.[2] It was often used by physicians, such as in the term quantum satis. Both Helmholtz and Julius von Mayer were physicians as well as physicists. Helmholtz used quantum with reference to heat in his article[3] on Mayer's work, and the word quantum can be found in the formulation of the first law of thermodynamics by Mayer in his letter[4] dated July 24, 1841.

In 1901, Max Planck used quanta to mean "quanta of matter and electricity",[5] gas, and heat.[6] In 1905, in response to Planck's work and the experimental work of Lenard (who explained his results by using the term quanta of electricity), Albert Einstein suggested that radiation existed in spatially localized packets which he called "quanta of light" ("Lichtquanta").[7]

The concept of quantization of radiation was discovered in 1900 by Max Planck, who had been trying to understand the emission of radiation from heated objects, known as black-body radiation. By assuming that energy can be absorbed or released only in tiny, differential, discrete packets (which he called "bundles", or "energy elements"),[8] Planck accounted for certain objects changing colour when heated.[9] On December 14, 1900, Planck reported his findings to the German Physical Society, introducing the idea of quantization for the first time as a part of his research on black-body radiation.[10] As a result of his experiments, Planck deduced the numerical value of h, known as the Planck constant, and also reported more precise values for the unit of electrical charge and the Avogadro–Loschmidt number, the number of real molecules in a mole. After his theory was validated, Planck was awarded the Nobel Prize in Physics for his discovery in 1918.

Beyond electromagnetic radiation

While quantization was first discovered in electromagnetic radiation, it describes a fundamental aspect of energy not just restricted to photons.[11] In the attempt to bring theory into agreement with experiment, Max Planck postulated that electromagnetic energy is absorbed or emitted in discrete packets, or quanta.[12]

References

  1. ^ Wiener, N. (1966). Differential Space, Quantum Systems, and Prediction. Cambridge: The Massachusetts Institute of Technology Press.
  2. ^ E. Cobham Brewer 1810–1897. Dictionary of Phrase and Fable. 1898.
  3. ^ H. von Helmholtz, Robert Mayer's Priorität (in German).
  4. ^ Herrmann, Armin (1991). "Heimatseite von Robert J. Mayer" (in German). Weltreich der Physik, GNT-Verlag. Archived from the original on 1998-02-09.
  5. ^ Planck, M. (1901). "Ueber die Elementarquanta der Materie und der Elektricität". Annalen der Physik (in German). 309 (3): 564–566. Bibcode:1901AnP...309..564P. doi:10.1002/andp.19013090311.
  6. ^ Planck, Max (1883). "Ueber das thermodynamische Gleichgewicht von Gasgemengen". Annalen der Physik (in German). 255 (6): 358. Bibcode:1883AnP...255..358P. doi:10.1002/andp.18832550612.
  7. ^ Einstein, A. (1905). "Über einen die Erzeugung und Verwandlung des Lichtes betreffenden heuristischen Gesichtspunkt" (PDF). Annalen der Physik (in German). 17 (6): 132–148. Bibcode:1905AnP...322..132E. doi:10.1002/andp.19053220607. A partial English translation is available from Wikisource.
  8. ^ Planck, Max (1901). "Ueber das Gesetz der Energieverteilung im Normalspectrum" ["On the Law of Distribution of Energy in the Normal Spectrum"]. Annalen der Physik. 309 (3): 553. Bibcode:1901AnP...309..553P. doi:10.1002/andp.19013090310. Archived from the original on 2008-04-18.
  9. ^ Brown, T., LeMay, H., Bursten, B. (2008). Chemistry: The Central Science. Upper Saddle River, NJ: Pearson Education. ISBN 0-13-600617-5.
  10. ^ Klein, Martin J. (1961). "Max Planck and the beginnings of the quantum theory". Archive for History of Exact Sciences. 1 (5): 459. doi:10.1007/BF00327765.
  11. ^ Melville, K. (2005, February 11). Real-World Quantum Effects Demonstrated.
  12. ^ Tippens, Modern Applied Physics, third edition. McGraw-Hill.

Further reading

  • B. Hoffmann, The Strange Story of the Quantum, Pelican 1963.
  • Lucretius, On the Nature of the Universe, transl. from the Latin by R.E. Latham, Penguin Books Ltd., Harmondsworth 1951.
  • J. Mehra and H. Rechenberg, The Historical Development of Quantum Theory, Vol.1, Part 1, Springer-Verlag New York Inc., New York 1982.
  • M. Planck, A Survey of Physical Theory, transl. by R. Jones and D.H. Williams, Methuen & Co., Ltd., London 1925 (Dover editions 1960 and 1993) including the Nobel lecture.
  • Brooks, Rodney (2011). Fields of Color: The theory that escaped Einstein. Allegra Print & Imaging.
Deepak Chopra

Deepak Chopra (Hindi: [d̪iːpək tʃoːpraː]; born October 22, 1946) is an Indian-born American author, public speaker, alternative medicine advocate, and a prominent figure in the New Age movement. Through his books and videos, he has become one of the best-known and wealthiest figures in alternative medicine.

Chopra studied medicine in India before emigrating to the United States in 1970, where he completed residencies in internal medicine and endocrinology. As a licensed physician, he became chief of staff at the New England Memorial Hospital (NEMH) in 1980. He met Maharishi Mahesh Yogi in 1985 and became involved with the Transcendental Meditation (TM) movement. He resigned his position at NEMH shortly thereafter to establish the Maharishi Ayurveda Health Center. Chopra gained a following in 1993 after he was interviewed on The Oprah Winfrey Show about his books. He then left the TM movement to become the executive director of Sharp HealthCare's Center for Mind-Body Medicine, and in 1996 he co-founded the Chopra Center for Wellbeing.

Chopra believes that a person may attain "perfect health", a condition "that is free from disease, that never feels pain", and "that cannot age or die". Seeing the human body as being undergirded by a "quantum mechanical body" composed not of matter but of energy and information, he believes that "human aging is fluid and changeable; it can speed up, slow down, stop for a time, and even reverse itself," as determined by one's state of mind. He claims that his practices can also treat chronic disease.

The ideas Chopra promotes have been regularly criticized by medical and scientific professionals as pseudoscience. This criticism has been described as ranging "from dismissive [to] damning". Philosopher Robert Carroll states that Chopra attempts to integrate Ayurveda with quantum mechanics to justify his teachings. Chopra argues that what he calls "quantum healing" cures any manner of ailments, including cancer, through effects that he claims are literally based on the same principles as quantum mechanics. This has led physicists to object to his use of the term quantum in reference to medical conditions and the human body. Evolutionary biologist Richard Dawkins has said that Chopra uses "quantum jargon as plausible-sounding hocus pocus". Chopra's treatments generally elicit nothing but a placebo response, and have drawn criticism that the unwarranted claims made for them may raise "false hope" and lure sick people away from legitimate medical treatments.

Many-worlds interpretation

The many-worlds interpretation is an interpretation of quantum mechanics that asserts the objective reality of the universal wavefunction and denies the actuality of wavefunction collapse. The existence of the other worlds makes it possible to remove randomness and action at a distance from quantum theory and thus from all physics. Many-worlds implies that all possible alternate histories and futures are real, each representing an actual "world" (or "universe"). In layman's terms, the hypothesis states there is a very large—perhaps infinite—number of universes, and everything that could possibly have happened in our past, but did not, has occurred in the past of some other universe or universes. The theory is also referred to as MWI, the relative state formulation, the Everett interpretation, the theory of the universal wavefunction, many-universes interpretation, multiverse theory or just many-worlds.

The original relative state formulation is due to Hugh Everett in 1957. Later, this formulation was popularized and renamed many-worlds by Bryce Seligman DeWitt in the 1960s and 1970s. The decoherence approaches to interpreting quantum theory have been further explored and developed, becoming quite popular. MWI is one of many multiverse hypotheses in physics and philosophy. It is currently considered a mainstream interpretation along with the other decoherence interpretations, collapse theories (including the historical Copenhagen interpretation), and hidden variable theories such as Bohmian mechanics.

Before many-worlds, reality had always been viewed as a single unfolding history. Many-worlds, however, views historical reality as a many-branched tree, wherein every possible quantum outcome is realised. Many-worlds reconciles the observation of non-deterministic events, such as random radioactive decay, with the fully deterministic equations of quantum physics.

In many-worlds, the subjective appearance of wavefunction collapse is explained by the mechanism of quantum decoherence, and this is supposed to resolve all of the correlation paradoxes of quantum theory, such as the EPR paradox and Schrödinger's cat, since every possible outcome of every event defines or exists in its own "history" or "world".

Multiverse

The multiverse is a hypothetical group of multiple universes including the universe in which we live. Together, these universes comprise everything that exists: the entirety of space, time, matter, energy, and the physical laws and constants that describe them. The different universes within the multiverse are called "parallel universes", "other universes", or "alternate universes".

Photon

The photon is a type of elementary particle, the quantum of the electromagnetic field including electromagnetic radiation such as light, and the force carrier for the electromagnetic force (even when static via virtual particles). The photon has zero rest mass and always moves at the speed of light within a vacuum.

Like all elementary particles, photons are currently best explained by quantum mechanics and exhibit wave–particle duality, displaying properties of both waves and particles. For example, a single photon may be refracted by a lens and exhibit wave interference with itself, and it can behave as a particle with definite and finite measurable position or momentum, though not both at the same time, as per Heisenberg's uncertainty principle. The photon's wave and quantum qualities are two observable aspects of a single phenomenon: they cannot be described by any mechanical model, and a representation of this dual property of light that assumes certain points on the wavefront to be the seat of the energy is not possible. The quanta in a light wave are not spatially localized.

The modern concept of the photon was developed gradually by Albert Einstein in the early 20th century to explain experimental observations that did not fit the classical wave model of light. The benefit of the photon model is that it accounts for the frequency dependence of light's energy, and explains the ability of matter and electromagnetic radiation to be in thermal equilibrium. The photon model accounts for anomalous observations, including the properties of black-body radiation, that others (notably Max Planck) had tried to explain using semiclassical models. In those models, light is described by Maxwell's equations, but material objects emit and absorb light in quantized amounts (i.e., they change energy only by certain particular discrete amounts). Although these semiclassical models contributed to the development of quantum mechanics, many further experiments, beginning with the phenomenon of Compton scattering of single photons by electrons, validated Einstein's hypothesis that light itself is quantized. In December 1926, American physical chemist Gilbert N. Lewis coined the widely adopted name "photon" for these particles in a letter to Nature. After Arthur H. Compton won the Nobel Prize in 1927 for his scattering studies, most scientists accepted that light quanta have an independent existence, and the term "photon" was accepted.

In the Standard Model of particle physics, photons and other elementary particles are described as a necessary consequence of physical laws having a certain symmetry at every point in spacetime. The intrinsic properties of particles, such as charge, mass, and spin, are determined by this gauge symmetry. The photon concept has led to momentous advances in experimental and theoretical physics, including lasers, Bose–Einstein condensation, quantum field theory, and the probabilistic interpretation of quantum mechanics. It has been applied to photochemistry, high-resolution microscopy, and measurements of molecular distances. Recently, photons have been studied as elements of quantum computers, and for applications in optical imaging and optical communication such as quantum cryptography.

Planck constant

The Planck constant (denoted h, also called Planck's constant) is a physical constant that is the quantum of electromagnetic action, which relates the energy carried by a photon to its frequency. A photon's energy is equal to its frequency multiplied by the Planck constant. The Planck constant is of fundamental importance in quantum mechanics, and in metrology it is the basis for the definition of the kilogram.
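
The relation E = hf (equivalently E = hc/λ for light of wavelength λ) can be made concrete with a short numerical sketch, here in Python; the 532 nm wavelength is an assumed example value, not taken from the text above:

    h = 6.62607015e-34   # Planck constant, J*s (exact in the 2019 SI)
    c = 299792458.0      # speed of light in vacuum, m/s (exact)
    wavelength = 532e-9  # assumed example: green laser light, 532 nm
    f = c / wavelength   # frequency of the light, Hz
    E = h * f            # energy carried by one photon, J
    print(f"f = {f:.3e} Hz, E = {E:.3e} J")  # ~5.64e14 Hz, ~3.73e-19 J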

At the end of the 19th century, physicists were unable to explain why the observed spectrum of black body radiation, which by then had been accurately measured, diverged significantly at higher frequencies from that predicted by existing theories. In 1900, Max Planck empirically derived a formula for the observed spectrum. He assumed that a hypothetical electrically charged oscillator in a cavity that contained black body radiation could only change its energy in a minimal increment, E, that was proportional to the frequency of its associated electromagnetic wave. He was able to calculate the proportionality constant, h, from the experimental measurements, and that constant is named in his honor. In 1905, the value E was associated by Albert Einstein with a "quantum" or minimal element of the energy of the electromagnetic wave itself. The light quantum behaved in some respects as an electrically neutral particle, as opposed to an electromagnetic wave. It was eventually called a photon. Max Planck received the 1918 Nobel Prize in Physics "in recognition of the services he rendered to the advancement of Physics by his discovery of energy quanta".

Since energy and mass are equivalent, the Planck constant also relates mass to frequency. By 2017, the Planck constant had been measured with sufficient accuracy in terms of the SI base units that it was central to replacing the metal cylinder, called the International Prototype of the Kilogram (IPK), that had defined the kilogram since 1889. The new definition was unanimously approved at the General Conference on Weights and Measures (CGPM) on 16 November 2018 as part of the 2019 redefinition of SI base units. For this new definition of the kilogram, the Planck constant was fixed at 6.62607015×10−34 J⋅s exactly. The kilogram was the last SI base unit to be redefined in terms of a fundamental physical constant rather than a physical artefact.

Quantum Leap

Quantum Leap is an American science-fiction television series that originally aired on NBC for five seasons, from March 1989 through May 1993. Created by Donald P. Bellisario, it starred Scott Bakula as Dr. Sam Beckett, a physicist who leaps through spacetime during an experiment in time travel, by temporarily taking the place of other people to correct historical mistakes. Dean Stockwell co-stars as Admiral Al Calavicci, Sam's womanizing, cigar-smoking companion and best friend, who appears to him as a hologram.

The series features a mix of humor, drama, romance, social commentary, and science fiction. The show was ranked #19 on TV Guide's "Top Cult Shows Ever".

Quantum computing

Quantum computing is the use of quantum-mechanical phenomena such as superposition and entanglement to perform computation. A quantum computer is a device that performs such computation; it can be studied theoretically or implemented physically. The field of quantum computing is a sub-field of quantum information science, which also includes quantum cryptography and quantum communication. Quantum computing began in the early 1980s, when Richard Feynman and Yuri Manin expressed the idea that a quantum computer had the potential to simulate things that a classical computer could not. In 1994, Peter Shor published an algorithm that, run on a quantum computer, could break the public-key encryption that secures most modern communications.

There are currently two main approaches to physically implementing a quantum computer: analog and digital. Analog approaches are further divided into quantum simulation, quantum annealing, and adiabatic quantum computation. Digital quantum computers use quantum logic gates to do computation. Both approaches use quantum bits, or qubits.

Qubits are fundamental to quantum computing and are somewhat analogous to bits in a classical computer. A qubit can be in the 1 or 0 quantum state, but it can also be in a superposition of the 1 and 0 states. However, when a qubit is measured, the result is always either 0 or 1, with probabilities determined by the quantum state it was in.
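
A minimal sketch of this measurement behaviour, assuming Python with NumPy (the equal superposition and the 10,000-shot sample size are illustrative choices):

    import numpy as np

    # A qubit in the equal superposition (|0> + |1>)/sqrt(2).
    amplitudes = np.array([1.0, 1.0]) / np.sqrt(2)
    probs = np.abs(amplitudes) ** 2           # Born rule: |amplitude|^2
    rng = np.random.default_rng(seed=0)
    outcomes = rng.choice([0, 1], size=10_000, p=probs)
    # Each shot yields a definite 0 or 1; frequencies approach 50/50.
    print(np.bincount(outcomes) / outcomes.size)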

Today's physical quantum computers are very noisy, and quantum error correction is a burgeoning field of research. Quantum supremacy, the demonstration of a computation beyond the practical reach of classical machines, is widely anticipated as the field's next milestone. While there is much hope, money, and research in the field of quantum computing, as of March 2019 no commercially useful algorithms had been published for today's noisy quantum computers.

Quantum entanglement

Quantum entanglement is a physical phenomenon that occurs when pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the other(s), even when the particles are separated by a large distance.

Measurements of physical properties such as position, momentum, spin, and polarization performed on entangled particles are found to be correlated. For example, if a pair of particles is generated in such a way that their total spin is known to be zero, and one particle is found to have clockwise spin on a certain axis, the spin of the other particle, measured on the same axis, will be found to be counterclockwise, as is to be expected due to their entanglement. However, this behavior gives rise to seemingly paradoxical effects: any measurement of a property of a particle performs an irreversible collapse on that particle and will change the original quantum state. In the case of entangled particles, such a measurement will be on the entangled system as a whole.
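
A toy sketch of the same-axis case just described (Python; this mimics only the perfect anticorrelation, not the full quantum statistics): each outcome is individually random, but the pair always disagrees.

    import numpy as np

    # Pairs prepared with total spin zero, both measured on the same axis.
    rng = np.random.default_rng(seed=1)
    for _ in range(5):
        a = rng.choice([+1, -1])  # particle A: up (+1) or down (-1), random
        b = -a                    # particle B: always the opposite result
        print(a, b)               # (+1, -1) or (-1, +1), never equal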

Such phenomena were the subject of a 1935 paper by Albert Einstein, Boris Podolsky, and Nathan Rosen, and several papers by Erwin Schrödinger shortly thereafter, describing what came to be known as the EPR paradox. Einstein and others considered such behavior to be impossible, as it violated the local realism view of causality (Einstein referring to it as "spooky action at a distance") and argued that the accepted formulation of quantum mechanics must therefore be incomplete.

Later, however, the counterintuitive predictions of quantum mechanics were verified experimentally in tests where the polarization or spin of entangled particles was measured at separate locations, statistically violating Bell's inequality. In earlier tests, it could not be absolutely ruled out that the result at one point could have been subtly transmitted to the remote point, affecting the outcome at the second location. However, so-called "loophole-free" Bell tests have since been performed in which the locations were separated such that communication at the speed of light would have taken longer (in one case, 10,000 times longer) than the interval between the measurements.

According to some interpretations of quantum mechanics, the effect of one measurement occurs instantly. Other interpretations, which do not recognize wavefunction collapse, dispute that there is any "effect" at all. However, all interpretations agree that entanglement produces correlation between the measurements, and that the mutual information between the entangled particles can be exploited, but that any transmission of information at faster-than-light speeds is impossible.

Quantum entanglement has been demonstrated experimentally with photons, neutrinos, electrons, molecules as large as buckyballs, and even small diamonds. The utilization of entanglement in communication and computation is a very active area of research.
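
The statistical violation of Bell's inequality can be sketched with the CHSH form of the inequality. For spin-singlet pairs, quantum mechanics predicts the correlation E(a, b) = -cos(a - b) between analyser angles a and b, while local hidden-variable theories require |S| ≤ 2; a short check in Python (the angles are the standard choice that maximizes the violation):

    import numpy as np

    def E(a, b):
        # Quantum correlation for a spin singlet at analyser angles a, b.
        return -np.cos(a - b)

    a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(abs(S))  # ~2.828 = 2*sqrt(2) > 2: the inequality is violated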

Quantum field theory

In theoretical physics, quantum field theory (QFT) is a theoretical framework that combines classical field theory, special relativity, and quantum mechanics and is used to construct physical models of subatomic particles (in particle physics) and quasiparticles (in condensed matter physics).

QFT treats particles as excited states (also called quanta) of their underlying fields, which are, in a sense, more fundamental than the basic particles. Interactions between particles are described by interaction terms in the Lagrangian involving their corresponding fields. Each interaction can be visually represented by Feynman diagrams, formal computational tools used in relativistic perturbation theory.

Quantum gravity

Quantum gravity (QG) is a field of theoretical physics that seeks to describe gravity according to the principles of quantum mechanics, in regimes where quantum effects cannot be ignored, such as near compact astrophysical objects where the effects of gravity are strong.

The current understanding of gravity is based on Albert Einstein's general theory of relativity, which is formulated within the framework of classical physics. On the other hand, the other three fundamental forces of physics are described within the framework of quantum mechanics and quantum field theory, radically different formalisms for describing physical phenomena. It is sometimes argued that a quantum mechanical description of gravity is necessary on the grounds that one cannot consistently couple a classical system to a quantum one.

While a quantum theory of gravity may be needed to reconcile general relativity with the principles of quantum mechanics, difficulties arise when applying the usual prescriptions of quantum field theory to the force of gravity via graviton bosons. The problem is that the theory one gets in this way is not renormalizable (it predicts infinite values for some observable properties such as the mass of particles) and therefore cannot be used to make meaningful physical predictions. As a result, theorists have taken up more radical approaches to the problem of quantum gravity, the most popular approaches being string theory and loop quantum gravity. Although some quantum gravity theories, such as string theory, try to unify gravity with the other fundamental forces, others, such as loop quantum gravity, make no such attempt; instead, they make an effort to quantize the gravitational field while it is kept separate from the other forces.

Strictly speaking, the aim of quantum gravity is only to describe the quantum behavior of the gravitational field; it should not be confused with the objective of unifying all fundamental interactions into a single mathematical framework. A quantum field theory of gravity that is unified with a grand unified theory is sometimes referred to as a theory of everything (TOE). While any substantial improvement in the present understanding of gravity would aid further work towards unification, the study of quantum gravity is a field in its own right, with various branches taking different approaches to unification.

One of the difficulties of formulating a quantum gravity theory is that quantum gravitational effects only appear at length scales near the Planck scale, around 10−35 meter, a scale far smaller (and equivalently far larger in energy) than those currently accessible by high energy particle accelerators. Physicists therefore lack experimental data that could distinguish between the competing theories that have been proposed, and thought-experiment ("gedanken") approaches have been suggested as a testing tool for these theories.
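
The Planck scale quoted above follows directly from the fundamental constants via l_P = sqrt(ħG/c³); a quick check in Python:

    import math

    hbar = 1.054571817e-34  # reduced Planck constant, J*s
    G = 6.67430e-11         # Newtonian gravitational constant, m^3/(kg*s^2)
    c = 299792458.0         # speed of light in vacuum, m/s
    l_P = math.sqrt(hbar * G / c**3)
    print(f"{l_P:.3e} m")   # ~1.616e-35 m, the Planck length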

Quantum mechanics

Quantum mechanics (QM; also known as quantum physics, quantum theory, the wave mechanical model, or matrix mechanics), including quantum field theory, is a fundamental theory in physics which describes nature at the smallest scales of energy levels of atoms and subatomic particles.

Classical physics, the physics existing before quantum mechanics, describes nature at ordinary (macroscopic) scale. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large (macroscopic) scale.

Quantum mechanics differs from classical physics in that energy, momentum, angular momentum and other quantities of a bound system are restricted to discrete values (quantization); objects have characteristics of both particles and waves (wave-particle duality); and there are limits to the precision with which quantities can be measured (uncertainty principle).

Quantum mechanics gradually arose from theories to explain observations which could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem, and from the correspondence between energy and frequency in Albert Einstein's 1905 paper which explained the photoelectric effect. Early quantum theory was profoundly re-conceived in the mid-1920s by Erwin Schrödinger, Werner Heisenberg, Max Born and others. The modern theory is formulated in various specially developed mathematical formalisms. In one of them, a mathematical function, the wave function, provides information about the probability amplitude of position, momentum, and other physical properties of a particle.
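
As a concrete instance of the quantization just described, the Bohr model of hydrogen (invoked here purely as an illustration) restricts the bound electron's energy to the discrete ladder E_n = -13.6 eV / n²; a two-line check in Python:

    # Discrete (quantized) energy levels of hydrogen in the Bohr model.
    for n in range(1, 5):
        print(n, round(-13.6 / n**2, 2), "eV")  # -13.6, -3.4, -1.51, -0.85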

Important applications of quantum theory include quantum chemistry, quantum optics, quantum computing, superconducting magnets, light-emitting diodes, and the laser, the transistor and semiconductors such as the microprocessor, medical and research imaging such as magnetic resonance imaging and electron microscopy. Explanations for many biological and physical phenomena are rooted in the nature of the chemical bond, most notably the macro-molecule DNA.

Quantum of Solace

Quantum of Solace is a 2008 spy film, the twenty-second in the James Bond series produced by Eon Productions, directed by Marc Forster and written by Paul Haggis, Neal Purvis and Robert Wade. It is the second film to star Daniel Craig as the fictional MI6 agent James Bond. The film also stars Olga Kurylenko, Mathieu Amalric, Gemma Arterton, Jeffrey Wright, and Judi Dench. In the film, Bond seeks revenge for the death of his lover, Vesper Lynd, and is assisted by Camille Montes, who is plotting revenge for the murder of her own family. The trail eventually leads them to wealthy businessman Dominic Greene, a member of the Quantum organisation, who intends to stage a coup d'état in Bolivia to seize control of their water supply.

Producer Michael G. Wilson developed the film's plot while the previous film in the series, Casino Royale, was being shot. Purvis, Wade, and Haggis contributed to the script. Craig and Forster had to write some sections themselves due to the Writers' Strike, though they were not given the screenwriter credit in the final cut. The title was chosen from a 1959 short story in Ian Fleming's For Your Eyes Only, though the film does not contain any elements of that story. Location filming took place in Mexico, Panama, Chile, Italy, Austria and Wales, while interior sets were built and filmed at Pinewood Studios. Forster aimed to make a modern film that also featured classic cinema motifs: a vintage Douglas DC-3 was used for a flight sequence, and Dennis Gassner's set designs are reminiscent of Ken Adam's work on several early Bond films. Taking a course away from the usual Bond villains, Forster rejected any grotesque appearance for the character Dominic Greene to emphasise the hidden and secret nature of the film's contemporary villains.

The film was also marked by its frequent depictions of violence, with a 2012 study by the University of Otago in New Zealand finding it to be the most violent film in the franchise. Whereas Dr. No featured 109 "trivial or severely violent" acts, Quantum of Solace had a count of 250, the most depictions of violence in any Bond film, a figure made all the more prominent by its also being the shortest film in the franchise. Quantum of Solace premiered at the Odeon Leicester Square on 29 October 2008 to mixed reviews, which mainly praised Craig's gritty performance and the film's action sequences but felt that the film was less impressive than its predecessor, Casino Royale. As of September 2016, it is the fourth-highest-grossing James Bond film, without adjusting for inflation, earning $586 million worldwide and becoming the seventh-highest-grossing film of 2008.

Qubit

In quantum computing, a qubit or quantum bit (sometimes qbit) is the basic unit of quantum information: the quantum version of the classical binary bit, physically realized with a two-state device. A qubit is a two-state (or two-level) quantum-mechanical system, one of the simplest quantum systems displaying the peculiarity of quantum mechanics. Examples include the spin of the electron, in which the two levels can be taken as spin up and spin down, or the polarization of a single photon, in which the two states can be taken to be the vertical polarization and the horizontal polarization. In a classical system, a bit would have to be in one state or the other. Quantum mechanics, however, allows the qubit to be in a coherent superposition of both states/levels simultaneously, a property which is fundamental to quantum mechanics and quantum computing.
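
A minimal sketch of the polarization example (Python; encoding vertical polarization as |0⟩ and horizontal as |1⟩, with the angle an assumed value): a general qubit state a|0⟩ + b|1⟩ must satisfy |a|² + |b|² = 1.

    import numpy as np

    # Photon polarized at angle theta to the vertical, written as a qubit:
    # |psi> = cos(theta)|0> + sin(theta)|1>
    theta = np.pi / 6
    a, b = np.cos(theta), np.sin(theta)
    print(abs(a)**2 + abs(b)**2)  # 1.0: the state is normalized
    print(abs(a)**2, abs(b)**2)   # measurement probabilities: 0.75, 0.25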

Richard Feynman

Richard Phillips Feynman (May 11, 1918 – February 15, 1988) was an American theoretical physicist, known for his work in the path integral formulation of quantum mechanics, the theory of quantum electrodynamics, and the physics of the superfluidity of supercooled liquid helium, as well as in particle physics, for which he proposed the parton model. For his contributions to the development of quantum electrodynamics, Feynman, jointly with Julian Schwinger and Shin'ichirō Tomonaga, received the Nobel Prize in Physics in 1965.

Feynman developed a widely used pictorial representation scheme for the mathematical expressions describing the behavior of subatomic particles, which later became known as Feynman diagrams. During his lifetime, Feynman became one of the best-known scientists in the world. In a 1999 poll of 130 leading physicists worldwide by the British journal Physics World, he was ranked as one of the ten greatest physicists of all time.

He assisted in the development of the atomic bomb during World War II and became known to a wide public in the 1980s as a member of the Rogers Commission, the panel that investigated the Space Shuttle Challenger disaster. Along with his work in theoretical physics, Feynman has been credited with pioneering the field of quantum computing and introducing the concept of nanotechnology. He held the Richard C. Tolman professorship in theoretical physics at the California Institute of Technology.

Feynman was a keen popularizer of physics through both books and lectures including a 1959 talk on top-down nanotechnology called There's Plenty of Room at the Bottom and the three-volume publication of his undergraduate lectures, The Feynman Lectures on Physics. Feynman also became known through his semi-autobiographical books Surely You're Joking, Mr. Feynman! and What Do You Care What Other People Think? and books written about him such as Tuva or Bust! by Ralph Leighton and the biography Genius: The Life and Science of Richard Feynman by James Gleick.

Schrödinger's cat

Schrödinger's cat is a thought experiment, sometimes described as a paradox, devised by Austrian physicist Erwin Schrödinger in 1935. It illustrates what he saw as the problem of the Copenhagen interpretation of quantum mechanics applied to everyday objects. The scenario presents a hypothetical cat that may be simultaneously both alive and dead, a state known as a quantum superposition, as a result of being linked to a random subatomic event that may or may not occur.

The thought experiment is also often featured in theoretical discussions of the interpretations of quantum mechanics. Schrödinger coined the term Verschränkung (entanglement) in the course of developing the thought experiment.

Schrödinger equation

The Schrödinger equation is a linear partial differential equation that describes the wave function or state function of a quantum-mechanical system. It is a key result in quantum mechanics, and its discovery was a significant landmark in the development of the subject. The equation is named after Erwin Schrödinger, who derived the equation in 1925, and published it in 1926, forming the basis for the work that resulted in his Nobel Prize in Physics in 1933.

In classical mechanics, Newton's second law (F = ma) is used to make a mathematical prediction as to what path a given physical system will take over time following a set of known initial conditions. Solving this equation gives the position and the momentum of the physical system as functions of time, given the external force on the system. Those two parameters are sufficient to describe its state at each time instant. In quantum mechanics, the analogue of Newton's law is Schrödinger's equation.

The concept of a wave function is a fundamental postulate of quantum mechanics; the wave function defines the state of the system at each spatial position, and time. Using these postulates, Schrödinger's equation can be derived from the fact that the time-evolution operator must be unitary, and must therefore be generated by the exponential of a self-adjoint operator, which is the quantum Hamiltonian.
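
The unitarity argument can be checked numerically for a two-level system, assuming Python with NumPy and SciPy, units where ħ = 1, and an arbitrary Hermitian example Hamiltonian:

    import numpy as np
    from scipy.linalg import expm

    H = np.array([[1.0, 0.5],
                  [0.5, -1.0]])       # Hermitian (self-adjoint) Hamiltonian
    t = 0.7
    U = expm(-1j * H * t)             # time-evolution operator exp(-iHt)
    print(np.allclose(U.conj().T @ U, np.eye(2)))  # True: U is unitary
    psi_t = U @ np.array([1.0, 0.0])  # evolve the initial state |0>
    print(np.linalg.norm(psi_t))      # 1.0: total probability is conserved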

In the Copenhagen interpretation of quantum mechanics, the wave function is the most complete description that can be given of a physical system. Solutions to Schrödinger's equation describe not only molecular, atomic, and subatomic systems, but also macroscopic systems, possibly even the whole universe. Schrödinger's equation is central to all applications of quantum mechanics including quantum field theory which combines special relativity with quantum mechanics. Theories of quantum gravity, such as string theory, also do not modify Schrödinger's equation.

The Schrödinger equation is not the only way to study quantum mechanical systems and make predictions. The other formulations of quantum mechanics include matrix mechanics, introduced by Werner Heisenberg, and the path integral formulation, developed chiefly by Richard Feynman. Paul Dirac incorporated matrix mechanics and the Schrödinger equation into a single formulation.

String theory

In physics, string theory is a theoretical framework in which the point-like particles of particle physics are replaced by one-dimensional objects called strings. It describes how these strings propagate through space and interact with each other. On distance scales larger than the string scale, a string looks just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In string theory, one of the many vibrational states of the string corresponds to the graviton, a quantum mechanical particle that carries gravitational force. Thus string theory is a theory of quantum gravity.

String theory is a broad and varied subject that attempts to address a number of deep questions of fundamental physics. String theory has been applied to a variety of problems in black hole physics, early universe cosmology, nuclear physics, and condensed matter physics, and it has stimulated a number of major developments in pure mathematics. Because string theory potentially provides a unified description of gravity and particle physics, it is a candidate for a theory of everything, a self-contained mathematical model that describes all fundamental forces and forms of matter. Despite much work on these problems, it is not known to what extent string theory describes the real world or how much freedom the theory allows in the choice of its details.

String theory was first studied in the late 1960s as a theory of the strong nuclear force, before being abandoned in favor of quantum chromodynamics. Subsequently, it was realized that the very properties that made string theory unsuitable as a theory of nuclear physics made it a promising candidate for a quantum theory of gravity. The earliest version of string theory, bosonic string theory, incorporated only the class of particles known as bosons. It later developed into superstring theory, which posits a connection called supersymmetry between bosons and the class of particles called fermions. Five consistent versions of superstring theory were developed before it was conjectured in the mid-1990s that they were all different limiting cases of a single theory in eleven dimensions known as M-theory. In late 1997, theorists discovered an important relationship called the AdS/CFT correspondence, which relates string theory to another type of physical theory called a quantum field theory.

One of the challenges of string theory is that the full theory does not have a satisfactory definition in all circumstances. Another issue is that the theory is thought to describe an enormous landscape of possible universes, and this has complicated efforts to develop theories of particle physics based on string theory. These issues have led some in the community to criticize these approaches to physics and question the value of continued research on string theory unification.

Uncertainty principle

In quantum mechanics, the uncertainty principle (also known as Heisenberg's uncertainty principle) is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which certain pairs of physical properties of a particle, known as complementary variables or canonically conjugate variables such as position x and momentum p, can be known.

Introduced first in 1927 by the German physicist Werner Heisenberg, it states that the more precisely the position of some particle is determined, the less precisely its momentum can be known, and vice versa. The formal inequality relating the standard deviation of position σx and the standard deviation of momentum σp was derived by Earle Hesse Kennard later that year and by Hermann Weyl in 1928:

    σx σp ≥ ħ/2,

where ħ is the reduced Planck constant, h/(2π).
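
A quick numerical consequence of the inequality (Python; the 1 nm position spread is an assumed example value):

    hbar = 1.054571817e-34              # reduced Planck constant, J*s
    sigma_x = 1e-9                      # assumed position uncertainty: 1 nm
    sigma_p_min = hbar / (2 * sigma_x)  # smallest spread the bound allows
    print(f"{sigma_p_min:.2e} kg*m/s")  # ~5.27e-26 kg*m/s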

Historically, the uncertainty principle has been confused with a related effect in physics, called the observer effect, which notes that measurements of certain systems cannot be made without affecting the systems, that is, without changing something in a system. Heisenberg utilized such an observer effect at the quantum level as a physical "explanation" of quantum uncertainty. It has since become clearer, however, that the uncertainty principle is inherent in the properties of all wave-like systems, and that it arises in quantum mechanics simply due to the matter wave nature of all quantum objects. Thus, the uncertainty principle actually states a fundamental property of quantum systems and is not a statement about the observational success of current technology. It must be emphasized that measurement does not mean only a process in which a physicist-observer takes part, but rather any interaction between classical and quantum objects, regardless of any observer.

Since the uncertainty principle is such a basic result in quantum mechanics, typical experiments in quantum mechanics routinely observe aspects of it. Certain experiments, however, may deliberately test a particular form of the uncertainty principle as part of their main research program. These include, for example, tests of number–phase uncertainty relations in superconducting or quantum optics systems. Applications dependent on the uncertainty principle for their operation include extremely low-noise technology such as that required in gravitational wave interferometers.

Wave interference

In physics, interference is a phenomenon in which two waves superpose to form a resultant wave of greater, lower, or the same amplitude. Constructive and destructive interference result from the interaction of waves that are correlated or coherent with each other, either because they come from the same source or because they have the same or nearly the same frequency. Interference effects can be observed with all types of waves, for example, light, radio, acoustic, surface water waves, gravity waves, or matter waves. The resulting images or graphs are called interferograms.
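
The amplitude behaviour can be verified with two equal-amplitude sine waves, here in Python with NumPy (the phase difference Δφ = π/3 is an example value): their superposition has amplitude 2A·cos(Δφ/2), maximal for Δφ = 0 (constructive) and zero for Δφ = π (destructive).

    import numpy as np

    A, dphi = 1.0, np.pi / 3          # amplitude and assumed phase difference
    x = np.linspace(0, 2 * np.pi, 100_000)
    resultant = A * np.sin(x) + A * np.sin(x + dphi)
    print(resultant.max())            # ~1.732
    print(2 * A * np.cos(dphi / 2))   # 1.732: matches 2*A*cos(dphi/2)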
