Introduction to Quantum Mechanics · Quantum Metropolis software
Software based on perturbation theory and the Monte Carlo method of calculation. A tool for laboratory work, research, industry, and education, with applications in physics, chemistry, and engineering.
[[Introduction to quantum mechanics]]
Quantum mechanics (or quantum theory) is a branch of physics that deals with the behavior of matter and energy at the scale of atoms and subatomic particles. Quantum mechanics is fundamental to our understanding of all the fundamental forces of nature except gravity.
Quantum mechanics underpins many branches of physics, including electromagnetism, particle physics, condensed matter physics, and even parts of cosmology. Quantum mechanics is also essential to the theory of chemical bonds (and therefore of all chemistry), structural biology, and technologies such as electronics, information technology, and nanotechnology. A century of experiments and work in applied physics has proven that quantum mechanics is correct and has practical uses.
Quantum mechanics began in the early 20th century with the pioneering work of Max Planck and Niels Bohr. Max Born coined the term "quantum mechanics" in 1924. The physics community soon accepted quantum mechanics because of its great accuracy in empirical predictions, especially in systems where classical mechanics fails. A great early success of quantum mechanics was the explanation of wave-particle duality: how, at subatomic scales, what we came to call particles exhibit wave properties, and what was considered a wave exhibits corpuscular properties. Quantum mechanics also applies to a much wider range of situations than general relativity, such as systems at atomic scale or smaller, at very low or very high energies, or at very low temperatures.
[[An elegant example]]
The most elegant character on the quantum stage is the double-slit experiment. It demonstrates wave-particle duality and highlights several features of quantum mechanics. Photons emitted from a source such as a laser behave differently depending on how many slits are in their path. When only one slit is present, the light observed on the screen appears as a narrow diffraction pattern.
However, things get stranger when a second slit is introduced. With two slits present, what arrives at a distant detection screen is a quantum superposition of two waves: one wave from the top slit and one from the bottom overlap on the detection screen, and so they are superimposed. The wave nature of light causes the waves passing through the two slits to interfere, creating an interference pattern of light and dark bands on the screen. Yet on the screen, light is always absorbed in discrete particles called photons. The same basic experiment can be done by firing electrons at a double slit.
Stranger still is what happens when the light source is dimmed until only one photon is emitted at a time. Ordinary intuition says that the photon will pass through one slit or the other as a particle and hit the screen as a particle. Instead, each lone photon passes through both slits as a wave and creates a wave pattern that interferes with itself. And, at yet another level of weirdness, the photon is then detected as a particle on the screen.
Where a photon or electron appears on the detection screen depends on probabilities calculated by adding the amplitudes of the two waves at each point and taking the squared magnitude of the sum. Which point an individual photon or electron actually hits, however, is a completely random process; only the statistics of many detections conform to the calculated probabilities. How nature manages to accomplish this feat is a mystery.
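The rule just described, add the complex amplitudes and take the squared magnitude of the sum, can be sketched numerically. The following Python fragment is an illustrative toy model, not the software's actual method; the wavelength, slit separation, and screen distance are arbitrary example values.

```python
import cmath
import math

def double_slit_intensity(x, wavelength=633e-9, slit_sep=50e-6, screen_dist=1.0):
    """Relative intensity at screen position x (metres from the center):
    superpose the two waves, then take |sum of amplitudes|^2."""
    # Extra path length travelled by light from one slit (small-angle approx.)
    delta = slit_sep * x / screen_dist
    phase = 2 * math.pi * delta / wavelength
    # Two unit-amplitude waves, one per slit, differing only in phase
    amplitude = cmath.exp(0j) + cmath.exp(1j * phase)
    return abs(amplitude) ** 2
```

At the center of the screen the two waves arrive in phase and the intensity is four times that of a single slit; where the path difference is half a wavelength the amplitudes cancel and a dark band appears.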
Photons behave like waves as they pass through the slits. When two slits are present, the "wave function" belonging to each photon passes through both slits. The wave functions are superimposed across the entire detection screen, yet on the screen only one particle, a photon, appears, and its position obeys strict probability rules. So both what we interpret as the wave nature of photons and their corpuscular nature appear in the final results.
In the late 19th century, classical physics seemed to some almost complete, but this perception was challenged by experimental findings that such physics could not explain. Physical theories that worked well for situations on the human scale of space and time failed for situations that were too small, too massive, or too fast. A view of the universe imposed by ordinary observation was being challenged by observations and theories that predicted correctly where classical mechanics had given impossible results. But the picture that emerged was that of a universe that refused to behave according to human common sense.
On large scales, the theory of relativity said that time does not pass at the same rate for all observers, that mass can be converted into energy and vice versa, that two objects each moving at more than half the speed of light cannot approach each other at a relative speed exceeding that of light, that time progresses at slower rates near massive bodies, and so on. Things did not work out the way experiments with rulers and clocks here on Earth had led humans to expect.
At small scales the wonders were even more plentiful. A photon or electron has neither a definite position nor a trajectory between the point where it is emitted and the point where it is detected. The points where such particles can be detected are not where everyday experience would lead one to expect; with a small probability, the detection point can even lie on the far side of a solid barrier. Probability is a salient factor in interactions at this scale. And the trajectory of any atomic-scale object is imprecise in the sense that any measurement making an object's position more precise reduces the precision with which its velocity can be known, and vice versa.
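The position-velocity trade-off in the last sentence is Heisenberg's uncertainty principle. In modern notation, with Δx the spread in position and Δp the spread in momentum, it reads:

```latex
\Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \qquad \hbar = \frac{h}{2\pi}
```

Shrinking Δx forces Δp to grow, and vice versa; the product can never fall below ħ/2.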
In the age of classical physics, Isaac Newton and his followers believed that light consisted of a beam of particles, while others believed that light consisted of waves propagating in some medium. Rather than one experiment settling the question, physicists found that an experiment designed to show the frequency of light or some other "wave characteristic" demonstrates its wave nature, while an experiment designed to show its linear momentum or some other "corpuscular feature" reveals its corpuscular nature. What is more, atom-sized objects, and even some molecules, have revealed their wave nature when properly observed.
The foundations of quantum mechanics go back to the first work on the properties of light in the 17th century and the discovery of the properties of electricity and magnetism in the early 19th century. In 1690, Christiaan Huygens used wave theory to explain the reflection and refraction of light. Isaac Newton believed that light consisted of infinitesimally small particles that he called "corpuscles". In the early 19th century, Thomas Young and Augustin Fresnel conducted experiments on the interference of light whose results were inconsistent with the corpuscular theory. Throughout the 19th century, theoretical and empirical results alike seemed inconsistent with Newton's corpuscular theory of light.
Later experiments identified phenomena, such as the photoelectric effect, that were consistent only with a packet, or quantum, model of light. When light strikes an electrical conductor, electrons move away from their original positions. In a photoelectric material, such as the light meter in a camera, light falling on the metal detector causes electrons to move. Increasing the intensity of light of a single frequency causes more electrons to move, but making the electrons move faster requires increasing the frequency of the light. Thus the intensity of the light controls the current through the circuit, while its frequency controls the voltage. These observations contradicted the wave theory of light derived from the study of sound waves and ocean waves, in which the intensity of the initial impulse was enough to predict the energy of the resulting wave. In the case of light, the energy of each electron was a function of frequency alone, a fact that needed an explanation. It was also necessary to reconcile experiments that showed the corpuscular nature of light with other experiments that revealed its wave nature.
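The frequency dependence described above is summarized by Einstein's photoelectric relation, E_k = h·f − φ, where φ is the work function of the metal. A minimal Python sketch, with photon energies expressed in electron-volts for convenience (the 2.28 eV work function used below is merely an illustrative value, roughly that of sodium):

```python
# h*c expressed in eV*nm, a convenient unit for photon energies
HC_EV_NM = 1239.84

def max_kinetic_energy(wavelength_nm, work_function_ev):
    """Einstein's photoelectric relation: E_k = h*f - phi.
    Returns 0.0 when the photon energy is below the work function:
    no electrons are ejected, no matter how intense the light."""
    photon_energy = HC_EV_NM / wavelength_nm  # E = h*f = h*c/lambda
    return max(0.0, photon_energy - work_function_ev)
```

Raising the intensity only increases the number of photons (more current); only raising the frequency increases each electron's energy (a higher stopping voltage).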
In 1874, George Johnstone Stoney was the first to propose that a physical quantity, the electric charge, could not vary by less than an irreducible value; electric charge was thus the first physical quantity to be theoretically quantized. In 1873, James Clerk Maxwell demonstrated theoretically that an oscillating electrical circuit should produce electromagnetic waves. Maxwell's equations made it possible to calculate the speed of electromagnetic radiation purely from electrical and magnetic measurements, and the calculated value corresponded very closely to the measured speed of light. In 1888, Heinrich Hertz built an electrical device that produced radiation of frequencies far lower than those of visible light, radiation that we now call radio waves. Early researchers differed in how they explained the fundamental nature of what came to be called electromagnetic radiation, some claiming it was composed of particles, others that it was a wave phenomenon. In classical physics these ideas are mutually exclusive.
Quantum mechanics began with Max Planck's pioneering paper of 1900 on blackbody radiation, marking the first appearance of the quantum hypothesis. In 1905, Albert Einstein extended Planck's theory to the photoelectric effect. In 1913, Niels Bohr published his atomic model, incorporating Planck's quantum theory in an essential way. These and other early 20th-century works form the old quantum theory.
In 1924, Louis de Broglie put forward the hypothesis of wave-particle duality. This hypothesis proved to be a turning point and quickly led to a more sophisticated and complete variant of quantum mechanics. Important contributors in the mid-1920s to what came to be called the "new quantum mechanics" or "new physics" were Max Born, Paul Dirac, Werner Heisenberg, Wolfgang Pauli, and Erwin Schrödinger. In the late 1940s and early 1950s, Julian Schwinger, Sin-Itiro Tomonaga, Richard Feynman, and Freeman Dyson developed quantum electrodynamics, which significantly advanced our understanding of the quantum theory of electromagnetism and the electron. Later, Murray Gell-Mann developed a related theory of the strong nuclear force called quantum chromodynamics.
[[Spectroscopy and beyond]]
When white light passes through a prism, or the edge of a mirror or a piece of glass, or raindrops to form a rainbow, the white light is broken down into a spectrum. This spectrum reveals that white light is composed of light of all colors and therefore of all frequencies.
When a sample of a pure chemical element emits light, through heating or some other excitation, the spectrum of the emitted light, called the emission spectrum, is characteristic of that element and of the temperature to which it is heated. Unlike the spectrum of white light, an emission spectrum is not a broad band of all colors from red to violet; it consists of narrow bands, each of one color and separated from the others by bands of darkness. Such a pattern is called a line spectrum. An emission spectrum may also contain lines outside the range of visible light, detectable only with special photographic film or electronic equipment.
It had been hypothesized that an atom emits electromagnetic radiation the way a violin string "radiates" sound: not only at a fundamental frequency but also at higher harmonics. No mathematical description of the line spectrum existed until 1885, when Johann Jakob Balmer proposed a formula describing the line spectrum of atomic hydrogen. Balmer's formula was later generalized to cover spectra beyond that of hydrogen; for now it is enough to note that this generalization is why the denominator of the first fraction is written as a square.
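Balmer's formula, in the Rydberg form alluded to above, reads 1/λ = R·(1/2² − 1/n²) for n = 3, 4, 5, …; replacing the 2² by 1², 3², and so on gives the other spectral series, which is the generalization mentioned. A short illustrative Python sketch:

```python
RYDBERG_H = 1.0968e7  # Rydberg constant for hydrogen, in 1/m

def balmer_wavelength_nm(n):
    """Wavelength (in nm) of the hydrogen Balmer line with upper level n >= 3:
    1/lambda = R * (1/2**2 - 1/n**2).  The squared 2 in the first term is
    what generalizes to the other spectral series."""
    inv_wavelength = RYDBERG_H * (1 / 2**2 - 1 / n**2)  # in 1/m
    return 1e9 / inv_wavelength  # convert metres to nanometres
```

Here `balmer_wavelength_nm(3)` gives the red H-alpha line near 656 nm and `balmer_wavelength_nm(4)` the blue-green H-beta line near 486 nm, matching the observed hydrogen spectrum.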
The next development was Pieter Zeeman's discovery of the Zeeman effect, whose physical explanation was worked out by Hendrik Antoon Lorentz. Lorentz hypothesized that the hydrogen line spectrum resulted from vibrating electrons. Because moving electrons generate a magnetic field, it is possible to obtain information about what happens inside the atom: an electron can be influenced by an external magnetic field, much as one magnet attracts or repels another.
The Zeeman effect could be interpreted to mean that the line spectrum results from electrons vibrating in their orbits, but classical physics could explain neither why an electron does not spiral into the nucleus nor why electron orbits have the properties required to produce the observed line spectrum, describable by the Balmer formula.
[[Interpretations of quantum mechanics]]
An interpretation of quantum mechanics is an attempt to answer the question: what exactly is quantum mechanics about? The question has its historical roots in the nature of quantum mechanics itself, which from the beginning was regarded as a theory radically different from all its predecessors, while nevertheless constituting one of the "most tested and most successful theories in the history of science".
Quantum mechanics, as a scientific theory, has been very successful in predicting experimental results. This means, first, that there is a well-defined correspondence between the elements of the formalism (mathematical and abstract) and experimental procedures and, second, that the results obtained in experiment agree extremely well with the formalism. However, the basic questions about what exactly quantum mechanics means are a problem in themselves, calling for additional philosophical analysis that clearly transcends the empirical limits of science.
[[The scientific controversy between the interpretations]]
With the rupture between classical and quantum physics that emerged from the quantum of action came the development of quantum mechanics, and with it a clash in the scientific community over the picture of the world implied by quantum theory when interpreting experimental results, the mathematical formalism, probabilities, uncertainty, and ultimately reality itself. The controversy arose during the initial construction of the theory and extends to the present day. Some interpretive positions were opposed: Einstein, for example, defended a realist position, while the Copenhagen interpretation defended a positivist one.
Understanding the mathematical structure of the theory went through several preliminary stages of development. At first Schrödinger did not understand the probabilistic nature of the wave function associated with the electron; it was Max Born who proposed its interpretation as a probability distribution in space for the position of the associated particle. Other leading scientists, such as Albert Einstein, had great difficulty accepting the basic implications of the theory. Even though some treat these issues as problems already solved, they remain of great importance for the work of interpretation, since the nature of the answers sometimes far transcends scientific limits and is usually not unambiguously determined.
It should not be assumed, however, that most physicists consider quantum mechanics to need an interpretation beyond the minimal instrumentalist one, even though the majority nowadays align with the Copenhagen interpretation, followed in popularity, though not closely, by the consistent-histories and many-worlds interpretations. Most physicists who do advocate more than the minimal interpretation nevertheless hold that non-instrumental questions, in particular ontological ones, are not ultimately relevant to physics, falling back, when pressed by purely philosophical questions, on the attitude summed up as "Shut up and calculate!" (a phrase due to N. David Mermin, though often misattributed to Richard Feynman).
Thus the different interpretations of quantum mechanics can be grouped into three schools: the realist, the orthodox, and the agnostic. The position of each school is determined by its answer to the following question: suppose we measure a particle at point A; where was it just before the measurement?
According to the realist school, it was at point A. This implies that quantum mechanics is an incomplete theory and that there are therefore other variables (hidden variables) that are needed to describe the behavior of the particle. Einstein defended this idea.
According to the orthodox school, it was the act of measurement that collapsed the wave function, forcing the particle to take on a definite position. This interpretation, known as the Copenhagen interpretation, was advocated by Bohr.
According to the agnostic school, one cannot speak about what one cannot measure. We would need to measure the position of the particle to know where it is; hence it is not known where the particle was before the measurement, and the proposed question is meaningless. As Pauli put it: "We shouldn't torture ourselves trying to solve a problem about something we don't know whether it exists or not, just as it is useless to try to solve the age-old problem of how many angels can fit sitting on the point of a needle."
[[Difficulties of a direct interpretation]]
Initially, the accepted mathematical framework of quantum mechanics was based heavily on mathematical abstractions such as Hilbert space and operators on Hilbert space. In classical mechanics and electromagnetism, on the other hand, the properties of a material point or of a field are described by real numbers or by functions defined on two or three dimensions, and for those theories it seems far less necessary to provide a special interpretation for the numbers and functions.
Furthermore, measurement processes play an apparently essential role in the theory. They relate abstract elements of the theory, such as the wave function, to operationally defined values, such as probabilities. And measurements interact with the state of the system in peculiar ways, as the double-slit experiment illustrates.
The mathematical formalism used to describe the temporal evolution of a non-relativistic system distinguishes, in a way, two types of transformations:
1) Reversible transformations, described by unitary operators on the state space. These transformations can be determined by solving the Schrödinger equation.
2) Non-reversible, non-deterministic transformations, described mathematically by more complicated transformations (see quantum operators). Examples are the transformations a system undergoes as a result of measurement.
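The two types of transformations can be contrasted on a toy two-level system in plain Python (a sketch for illustration, not the full Hilbert-space machinery): a unitary matrix evolves the state reversibly and deterministically, while a measurement picks an outcome at random with Born-rule probabilities and collapses the state irreversibly.

```python
import math
import random

def apply_unitary(u, state):
    """Type 1: reversible, deterministic evolution -- a unitary matrix
    acting on the state vector (what solving the Schrodinger equation yields)."""
    return [u[0][0] * state[0] + u[0][1] * state[1],
            u[1][0] * state[0] + u[1][1] * state[1]]

def measure(state):
    """Type 2: irreversible, non-deterministic -- the Born rule selects an
    outcome with probability |amplitude|^2 and the state collapses onto it."""
    p0 = abs(state[0]) ** 2
    outcome = 0 if random.random() < p0 else 1
    collapsed = [1 + 0j, 0j] if outcome == 0 else [0j, 1 + 0j]
    return outcome, collapsed

# A Hadamard-like unitary: takes a definite state into an equal superposition
H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]
```

Applying H twice returns the original state (reversibility), whereas no operation undoes a measurement.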
A restricted version of the interpretation problem of quantum mechanics consists in providing some plausible picture precisely for this second type of transformation. Some interpretations hold that it can be dealt with by purely mathematical means, as in the many-worlds or consistent-histories interpretations.
In addition to the non-deterministic and irreversible characteristics of the measurement process, there are other elements of quantum physics that profoundly distinguish it from classical physics and which cannot be represented by any classical figure. One of these is the phenomenon of entanglement, illustrated by the EPR paradox, which seems to violate the principle of local causality.
Another obstacle to direct interpretation is the phenomenon of complementarity, which seems to violate basic principles of propositional logic. Complementarity says that there is no logical picture (obeying classical propositional logic) that can simultaneously describe and justify all the properties of a quantum system S. This is often formulated by saying that there are sets of "complementary" propositions A and B, each of which can describe S, but not at the same time. Examples of A and B are propositions involving the description of S in wave form and in corpuscular form. The preceding statement is part of Niels Bohr's original formulation, which is often equated with the principle of complementarity itself.
Complementarity is not usually taken as proof of the failure of classical logic, although Hilary Putnam raised this point of view in his paper Is Logic Empirical?. Rather, complementarity means that the compositions of physical properties of S (such as position and momentum varying over certain ranges) obey propositional connectives that do not follow the laws of classical propositional logic. As is now well established (Omnès, 1999), "the origin of complementarity lies in the non-commutativity of the operators describing observables in quantum mechanics".
[[Problematic state of visions and interpretations]]
The precise ontological status of each of the interpretive views remains a topic of philosophical argument.
In other words, if we interpret a formal quantum-mechanical structure X through a structure Y (via a mathematical equivalence of the two structures), what is the status of Y? This is the old question of scientific formalism, seen from a new angle.
Some physicists, for example Asher Peres and Chris Fuchs, argue that an interpretation is nothing more than a formal equivalence between sets of rules for operating on experimental data, which would suggest that the whole exercise of interpretation is unnecessary.
Any modern scientific theory requires at least an instrumental description which can relate mathematical formalism to practical experiment. In the case of quantum mechanics, the most common instrumental description is an assertion of statistical regularity between the preparation process and the measurement process. To this is usually added the statement of statistical regularity of a measurement process performed on a system in a given state φ.
[[Properties of interpretations]]
An interpretation can be characterized by whether it satisfies certain properties, such as:
a) completeness and realism
b) determinism
c) local realism
To exemplify these properties, we must be more explicit about the kind of picture an interpretation provides. We consider an interpretation to be a correspondence between the elements of the mathematical formalism M and the elements of an interpretive structure I, where:
The mathematical formalism consists of the machinery of a Hilbert space of ket-vectors, self-adjoint operators acting on that space, the unitary time dependence of the ket-vectors, and measurement operations. In this context a measurement operation can be regarded as a transformation that takes a ket-vector into a probability distribution over ket-vectors. See also quantum operators for a formalization of this concept.
The interpretive structure includes states, transitions between states, measurement operations, and possibly information about the spatial extent of these elements. A measurement operation returns a value and may change the state of the system. Spatial information can be exhibited, for example, by states represented as functions on configuration space. The transitions may be non-deterministic or probabilistic, or there may be infinitely many states. In any case, the critical notion of an interpretation is that the elements of I are regarded as physically real.
In this sense, an interpretation can be regarded as a semantics for the mathematical formalism. In particular, the bare instrumentalist view of quantum mechanics outlined in the previous section is not a complete interpretation, since it makes no reference to physical reality.
The current use in physics of "completeness" and "realism" is often traced to the paper (Einstein, 1935) that proposed the EPR paradox. In that paper the authors proposed the concepts of "element of reality" and of the "completeness" of a physical theory. Although they did not define "element of reality", they gave a good characterization of it: a quantity whose value can be predicted with certainty before the measurement itself disturbs it in any way. EPR defined a "complete physical theory" as one in which every element of physical reality is accounted for by the theory. From the semantic point of view of interpretation, an interpretation is complete if every element of the interpretive structure is accounted for by the mathematical formalism. Realism is a property of each element of the mathematical formalism: an element is real if it corresponds to something in the interpretive structure. For example, in some interpretations of quantum mechanics (such as the many-worlds interpretation) the ket-vector associated with the system is taken to correspond to an element of physical reality, while in others it is not.
Determinism is a property characterizing the change of state with the passage of time: it means that the state at a given future instant is a function of the present state (see time evolution). It may not always be clear whether a particular interpretive structure is deterministic, precisely because it may lack a clear choice of time parameter. Moreover, a given theory can have two interpretations, one deterministic and the other not.
Local realism has two parts:
1) The value returned by a measurement corresponds to the value of some function on the state space; in other words, that value is an element of reality.
2) The effects of measurement have a propagation speed that does not exceed some universal bound (for example, the speed of light). For this to make sense, measurement operations must be spatially localized in the interpretive structure.
A precise formulation of local realism in terms of a local hidden-variable theory was proposed by John Bell. Bell's theorem, together with its experimental verification, constrains the kinds of properties a quantum theory can have; in particular, it implies that quantum mechanics cannot satisfy local realism.
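Bell's constraint can be made concrete with the CHSH quantity. For a spin singlet, quantum mechanics predicts a correlation E(a, b) = −cos(a − b) between measurements along directions a and b; any local hidden-variable theory must satisfy |S| ≤ 2, while the quantum prediction reaches 2√2 ≈ 2.83. A brief Python sketch of this standard textbook calculation:

```python
import math

def correlation(a, b):
    """Quantum prediction for the spin correlation of a singlet pair
    measured along directions a and b (angles in radians)."""
    return -math.cos(a - b)

def chsh(a, a2, b, b2):
    """CHSH combination S; local hidden-variable theories obey |S| <= 2."""
    return (correlation(a, b) - correlation(a, b2)
            + correlation(a2, b) + correlation(a2, b2))

# Measurement angles that maximize the quantum violation
S = chsh(0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
```

With these angles |S| = 2√2, exceeding the classical bound of 2, which is what the experiments confirm.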
R. Carnap, The Interpretation of Physics, in Foundations of Logic and Mathematics of the International Encyclopedia of Unified Science, University of Chicago Press, 1939.
D. Deutsch, The Fabric of Reality, Allen Lane, 1997. Though written for general audiences, in this book Deutsch argues forcefully against instrumentalism.
A. Einstein, B. Podolsky and N. Rosen, Can quantum-mechanical description of physical reality be considered complete? Phys. Rev. 47 777, 1935.
C. Fuchs and A. Peres, Quantum theory needs no 'interpretation', Physics Today, March 2000.
N. Herbert, Quantum Reality: Beyond the New Physics, New York: Doubleday, ISBN 0-385-23569-0.
R. Jackiw and D. Kleppner, One Hundred Years of Quantum Physics, Science, Vol. 289 Issue 5481, p893, August 2000.
M. Jammer, The Conceptual Development of Quantum Mechanics. New York: McGraw-Hill, 1966.
M. Jammer, The Philosophy of Quantum Mechanics. New York: Wiley, 1974.
W. M. de Muynck, Foundations of quantum mechanics, an empiricist approach, Dordrecht: Kluwer Academic Publishers, 2002, ISBN 1-4020-0932-1
R. Omnès, Understanding Quantum Mechanics, Princeton, 1999.
H. Reichenbach, Philosophic Foundations of Quantum Mechanics, Berkeley: University of California Press, 1944.
J. A. Wheeler and W. H. Zurek (eds), Quantum Theory and Measurement, Princeton: Princeton University Press, ISBN 0-691-08316-9.
Quantum Metropolis, S.M.C.
28034 Madrid, Spain · https://fisica-cuantica.com/
Member of MAPO: European network
Quantum simulation software. Applications in quantum computing, decoherence studies, quantum entanglement, networks, quantum information systems, algorithms, quantum tunneling, programming and quantum dots and wells.