Quanta and Fields

The Biggest Ideas in the Universe

Hardcover
$26.00 US
5.7"W x 8.5"H x 1"D   | 14 oz | 12 per carton
On sale May 14, 2024 | 304 Pages | 9780593186602

About

The instant New York Times bestseller

Quanta and Fields, the second book of Sean Carroll’s already internationally acclaimed series The Biggest Ideas in the Universe, is an adventure into the bare stuff of reality.

 
Sean Carroll is creating a profoundly new approach to sharing physics with a broad audience, one that goes beyond analogies to show how physicists really think. He cuts to the bare mathematical essence of our most profound theories, explaining every step in a uniquely accessible way.  

Quantum field theory is how modern physics describes nature at its most profound level. Starting with the basics of quantum mechanics itself, Sean Carroll explains measurement and entanglement before showing how the world is really made of fields. You will finally understand why matter is solid, why there is antimatter, where the sizes of atoms come from, and why the predictions of quantum field theory are so spectacularly successful. Fundamental ideas like spin, symmetry, Feynman diagrams, and the Higgs mechanism are explained for real, not just through amusing stories. Beyond Newton, beyond Einstein, and all the intuitive notions that have guided Homo sapiens for millennia, this book is a journey to a once unimaginable truth about what our universe is.

Praise

Praise for Quanta and Fields:

“Readers will be electrified by his discussion of wave functions, entanglement, fields, and so much more. From the most infinitesimal of subatomic particles to the seemingly vast infinities of the universe’s great expanse, Carroll’s latest inquiry illuminates, well, everything.” (Booklist)

Author

Author photo © Jennifer Ouellette


Sean Carroll is Homewood Professor of Natural Philosophy at Johns Hopkins University, and Fractal Faculty at the Santa Fe Institute. He is host of the Mindscape podcast, and author of From Eternity to Here, The Particle at the End of the Universe, The Big Picture, and Something Deeply Hidden. He has been awarded prizes and fellowships by the National Science Foundation, NASA, the American Institute of Physics, the Royal Society of London, and many others. He lives in Baltimore with his wife, writer Jennifer Ouellette.


Excerpt

ONE
Wave Functions

As the nineteenth century drew to a close, you would have forgiven physicists for hoping that they were on track to understand everything. The universe, according to this tentative picture, was made of particles that were pushed around by fields.

The idea of fields filling space had really taken off over the course of the 1800s. Earlier, Isaac Newton had presented a beautiful and compelling theory of motion and gravity, and Pierre-Simon Laplace had shown how we could reformulate that theory in terms of a gravitational field stretching between every object in the universe. A field is just something that has a value at each point in space. The value could be a simple number, or it could be a vector or something more complicated, but any field exists everywhere throughout space.
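To make that concrete, here is the standard modern form of the field picture of Newtonian gravity, a sketch in conventional textbook notation (the potential \Phi, the mass density \rho, and Newton's constant G) rather than anything defined in the excerpt itself:

\nabla^2 \Phi(\mathbf{x}) = 4\pi G\,\rho(\mathbf{x}), \qquad \mathbf{g}(\mathbf{x}) = -\nabla\Phi(\mathbf{x}).

The potential \Phi assigns a simple number to every point in space, while the gravitational field \mathbf{g}, obtained by taking its gradient, assigns a vector: a strength and a direction of pull at each point.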

But if all you cared about was gravity, the field seemed optional: a point of view you could choose to take or not, depending on your preferences. It was equally okay to think as Newton did, directly in terms of the force created on one object by the gravitational pull of others, without anything stretching between them.

That changed in the nineteenth century, as physicists came to grips with electricity and magnetism. Electrically charged objects exert forces on each other, which it is natural to attribute to the existence of an electric field stretching between them. Experiments by Michael Faraday showed that a moving magnet could induce electrical current in a wire without actually touching it, pointing to the existence of a separate magnetic field, and James Clerk Maxwell managed to combine these two kinds of fields into a single theory of electromagnetism, published in 1873. This was an enormous triumph of unification, explaining a diverse set of electrical and magnetic phenomena in a single compact theory. "Maxwell's equations" bedevil undergraduate physics students to this very day.

One of the triumphant implications of Maxwell's theory was an understanding of the nature of light. Rather than a distinct kind of substance, light is a propagating wave in the electric and magnetic fields, also known as electromagnetic radiation. We think of electromagnetism as a "force," and it is, but Maxwell taught us that fields carrying forces can vibrate, and in the case of electric and magnetic fields those vibrations are what we perceive as light. The quanta of light are particles called photons, so we will sometimes say "photons carry the electromagnetic force." But at the moment we're still thinking classically.

Take a single charged particle, like an electron. Left sitting by itself, it will have an electric field surrounding it, with lines of force pointing toward the electron. The force will fall off as an inverse-square law, just as in Newtonian gravity. If we move the electron, two things happen: First, a charge in motion creates a magnetic field as well as an electric one. Second, the existing electric field will adjust how it is oriented in space, so that it remains pointing toward the particle. And together, these two effects (small magnetic field, small deviation in the existing electric field) ripple outward, like waves from a pebble thrown into a pond. Maxwell found that the speed of these ripples is precisely the speed of light, because it is light. Light, of any wavelength from radio to x-rays and gamma rays, is a propagating vibration in the electric and magnetic fields. Almost all the light you see around you right now has its origin in a charged particle being jiggled somewhere, whether it's in the filament of a lightbulb or the surface of the sun.
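To put a formula behind the inverse-square statement, here is the usual textbook expression for the electric field of a point charge, with conventional symbols (the charge q, the distance r, and the vacuum permittivity \epsilon_0) that the excerpt does not itself introduce:

E(r) = \frac{1}{4\pi\epsilon_0}\,\frac{q}{r^2}.

Doubling the distance from the charge cuts the field to one quarter of its strength, exactly the behavior of Newton's gravitational force F = G\,m_1 m_2/r^2.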

Simultaneously in the nineteenth century, the role of particles was also becoming clear. Chemists, led by John Dalton, championed the idea that matter was made of individual atoms, with one specific kind of atom associated with each chemical element. Physicists belatedly caught on, once they realized that thinking of gases as collections of bouncing atoms could explain things like temperature, pressure, and entropy.

But the term "atom," borrowed from the ancient Greek idea of an indivisible elementary unit of matter, turned out to be a bit premature. Though they are the building blocks of chemical elements, modern-day atoms are not indivisible. A quick-and-dirty overview, with details to be filled in later: atoms consist of a nucleus made of protons and neutrons, surrounded by orbiting electrons. Protons have positive electrical charge, neutrons have zero charge, and electrons have negative charge. We can make a neutral atom if we have equal numbers of protons and electrons, since their electrical charges will cancel each other out. Protons and neutrons have roughly the same mass, with neutrons being just a bit heavier, but electrons are much lighter, about 1/1,800th the mass of a proton. So most of the mass in a person or another macroscopic object comes from the protons and neutrons. The lightweight electrons are more able to move around and are therefore responsible for chemical reactions as well as the flow of electricity. These days we know that protons and neutrons are themselves made of smaller particles called quarks, which are held together by gluons, but there was no hint of that in the early 1900s.

This picture of atoms was put together gradually. Electrons were discovered in 1897 by British physicist J. J. Thomson, who measured their charge-to-mass ratio and established that they were much lighter than atoms. So somehow there must be two components in an atom: the lightweight, negatively charged electrons, and a heavier, positively charged piece. A few years later Thomson suggested a picture in which tiny electrons floated within a larger, positively charged volume. This came to be called the plum pudding model, with electrons playing the role of the plums.

The plum pudding model didn't flourish for long. A famous experiment by Ernest Rutherford, Hans Geiger, and Ernest Marsden shot alpha particles (now known to be nuclei of helium atoms) at a thin sheet of gold foil. The expectation was that they would mostly pass right through, with their trajectories slightly deflected if they happened to pass through an atom and interact with the electrons (the plums) or the diffuse positively charged blob (the pudding). Electrons are too light to disturb the alpha particles' trajectories, and a spread-out positive charge would be too diffuse to have much effect. But what happened was, while most of the particles did indeed zip through unaffected, some bounced off at wild angles, even straight back. That could only happen if there was something heavy and substantial for the particles to carom off of. In 1911 Rutherford correctly explained this result by positing that the positive charge was concentrated in a massive central nucleus. When an incoming alpha particle was lucky enough to score a direct hit on the small but heavy nucleus, it would be deflected at a sharp angle, which is what was observed. In 1920 Rutherford proposed the existence of protons (which were just hydrogen nuclei, so had already been discovered), and in 1921 he theorized the existence of neutrons (which were eventually discovered in 1932).

So far, so good, thinks our imagined fin de siècle physicist. Matter is made of particles, the particles interact via forces, and those forces are carried by fields. The entire mechanism would run according to rules established by the framework of classical physics. For particles this is pretty familiar: we specify the positions and the momenta of all the particles, then use one of our classical techniques (Newton's laws or their equivalent) to describe their dynamics. Fields work in essentially the same way, except that the "position" of a field is its value at every point in space, and its "momentum" is how fast it's changing at every point. The overall classical picture applies in either case.
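A schematic way to see the parallel, again in standard notation rather than the book's own: a classical particle is specified by its position and momentum, while a classical field is specified by its value everywhere and how fast that value is changing.

\text{particle: } (x, p), \quad p = m\,\frac{dx}{dt}, \quad \frac{dp}{dt} = F; \qquad \text{field: } \big(\phi(\mathbf{x},t),\, \dot\phi(\mathbf{x},t)\big), \quad \text{e.g. } \frac{\partial^2\phi}{\partial t^2} = c^2\,\nabla^2\phi.

In both cases, the state at one moment plus the dynamical rule (Newton's second law for particles, a wave-type equation for fields) determines everything that happens next.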

The suspicion that physics was close to being all figured out was tempting. Albert Michelson, at the dedication of a new physics laboratory at the University of Chicago in 1894, proclaimed, "It seems probable that most of the grand underlying principles [of physics] have been firmly established."

He was quite wrong.

But he was also in the minority. Other physicists, starting with Maxwell himself, recognized that the known behavior of collections of particles and waves didn't always accord with our classical expectations. William Thomson, Lord Kelvin, is often the victim of a misattributed quote: "There is nothing new to be discovered in physics now. All that remains is more and more precise measurement." His real view was the opposite. In a lecture in 1900, Thomson highlighted the presence of two "clouds" looming over physics, one of which was eventually to be dispersed by the formulation of the theory of relativity, the other by the theory of quantum mechanics.

Blackbody Radiation

The history of science is subtle and complicated, and progress rarely takes the straight path we remember in retrospect. Quantum mechanics in particular had a painful and messy development. We're going to skip over many of the historical twists and turns to focus on two puzzling phenomena that kicked off the quantum revolution: waves exhibiting particle-like properties, and particles exhibiting wave-like properties.

The particle-like properties of light came first. The idea arose from studying blackbody (or "thermal") radiation, which is the radiation emitted by an object that absorbs any incident light but nevertheless radiates just because it has a nonzero temperature. To physicists, the temperature of an object characterizes the random jiggling of its constituent particles, and randomly jiggling particles are going to emit radiation depending on how fast they are moving. When you look at a painting, you see an intricate configuration of shape and color that reflects the light that is shining on it. A blackbody, by contrast, is what you get when you turn off all the ambient light and just let objects glow because of their temperature; the glow from a heating element on an electric stove is a good example. Everything with a nonzero temperature gives off some thermal radiation, but pure blackbody radiation depends only on the temperature, unspoiled by color or reflectivity or other properties of the object. A low-temperature blackbody will primarily radiate at infrared or even radio wavelengths, and as we increase the temperature we see more visible light, ultraviolet, and ultimately x-rays.

So blackbody radiation represents a seemingly simple physics problem (a spherical cow, one might say). It has a temperature, and none of its other properties matter. Temperature measures the kinetic energy of atoms in the body jiggling back and forth, and those atoms contain charged particles, so this jiggling leads to the emission of electromagnetic radiation. Our physics problem is, how much radiation is given off at each wavelength?
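The link between temperature and jiggling can be made quantitative with the standard equipartition result, quoted here as background (k_B is Boltzmann's constant, a symbol the excerpt does not use):

\langle E_{\text{kinetic}} \rangle = \tfrac{3}{2}\,k_B T \text{ of translational kinetic energy per atom},

so a hotter body means faster-jiggling charged particles, which in turn radiate more strongly and at shorter wavelengths.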

Physicists in the nineteenth century set about both measuring the radiation as a function of wavelength (the spectrum of the blackbody) and calculating it theoretically. The measured curve is a thing of beauty, climbing up from zero at short wavelengths to a peak that depends on the temperature, then decaying back down to zero at long wavelengths.

The theoretical situation, however, was a mess. One proposed theory, by Wilhelm Wien in 1896, seemed to fit well at short wavelengths but diverged from experimental data at longer wavelengths. Another, by John Strutt (Lord Rayleigh) in 1900, worked the other way around: his formula fit well at long wavelengths but not at short ones. Indeed, it predicted an infinite amount of radiation at short wavelengths.
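For reference, the two competing formulas can be written out explicitly, a sketch in standard notation (spectral radiance B_\lambda at wavelength \lambda and temperature T, Boltzmann's constant k_B, the speed of light c, and empirical constants a and b in Wien's law) that goes beyond what the excerpt states. Rayleigh's result, in its modern Rayleigh-Jeans form, is

B_\lambda(T) = \frac{2\,c\,k_B T}{\lambda^4},

which matches the data at long wavelengths but grows without bound as \lambda \to 0, which is why it predicts an infinite total amount of radiation. Wien's law,

B_\lambda(T) \approx \frac{a}{\lambda^5}\,e^{-b/(\lambda T)},

does the opposite: it fits the short-wavelength side of the curve but falls below the measurements at long wavelengths.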