Particle and wave?

My friend Allen Downey (whose blog, Probably Overthinking It, has a bunch of good stuff on how to think about statistics and data) sent me a mini-rant a while back about the way people often write and talk about quantum physics. He asked me what I thought and suggested it’d be a good topic to write about here. I agree that it’s a good topic. I’ll give a bit of introduction, and then just quote Allen (with his permission).

The topic is the grandiose way in which people often play up the mysteriousness of quantum physics. For about as long as there’s been quantum physics, people have been saying pseudo-profound things about What It All Means. I tend to agree with Allen that the more woo-woo descriptions are annoying at best and misleading at worst. (I’ve grumbled about some before.)

Allen specifically refers to this quote from David Brooks:

A few of the physicists mention the concept of duality, the idea that it is possible to describe the same phenomenon truthfully from two different perspectives. The most famous duality in physics is the wave-particle duality. This one states that matter has both wave-like and particle-like properties. Stephon Alexander of Haverford says that these sorts of dualities are more common than you think, beyond, say the world of quantum physics.

Brooks is writing about the answers to the Edge’s 2011 Question, “What scientific concept would improve everybody’s cognitive toolkit?”  (This is not The Edge from U2; it’s the website edge.org, which annually asks a broad question like this to a bunch of scientists.)

With that context, here’s Allen:

Physicists who wax philosophic about wave-particle duality are a pet peeve of mine, especially when they say nonsensical things like “light is both a particle and a wave,” as if that were (a) true, (b) a useful way to describe the situation, and (c) somehow profound, mystical and inspiring.  But none of those are true.

It seems to me to be (a) clearer, (b) not profound, and (c) true to say instead that light is neither a (classical) particle nor a wave.  It’s something else.  If you model it as a particle, you will get things approximately right in some circumstances, and completely wrong in others.  If you model it as a wave, same thing, different circumstances.  And if you model it as (modern) particle, you get good answers in almost all circumstances, but it seems likely that we will find circumstances where that model fails too (if we haven’t already).

So none of that is an example of “describ[ing] the same phenomenon truthfully from two different perspectives.”  It’s just plain old model selection.

This is quite right. An electron is not a particle (as that term is generally understood), nor is it a wave. It’s an excitation of a quantum field. Practically nobody has much intuition for what “an excitation of a quantum field” means, so it’s useful to have alternative descriptions that are more intuitive, albeit imperfect. Sometimes an excitation of a quantum field behaves like a particle, and sometimes it behaves like a wave.

By the way, when I say it’s useful to have such descriptions, I don’t just mean when talking to laypeople — although that’s part of what I mean. Actual physicists use these descriptions in our own minds all the time. That’s perfectly fine, as long as we know the limits of how far we can push them.

But to go from “electrons sometimes behave like particles, and sometimes like waves” to “an electron is both a particle and a wave” is to mistake the maps for the territory. Here’s an analogy. Sometimes we model the economy as a machine (“economic engines”, “driving the economy”, “tinkering”). Sometimes we model it as a living thing (“green shoots”). But nobody makes the mistake of thinking that the economy is some kind of cyborg.

Stephon Alexander, the physicist Brooks refers to, isn’t particularly guilty of this sin. He actually does a pretty good job explaining what he means by duality in his answer (scroll down on that page to get to him):

In physics one of the most beautiful yet underappreciated ideas is that of duality. A duality allows us to describe a physical phenomenon from two different perspectives; often a flash of creative insight is needed to find both. However the power of the duality goes beyond the apparent redundancy of description. After all, why do I need more than one way to describe the same thing? There are examples in physics where either description of the phenomena fails to capture its entirety. Properties of the system ‘beyond’ the individual descriptions ‘emerge’ …

Most of us know about the famous wave-particle duality in quantum mechanics, which allows the photon (and the electron) to attain their magical properties to explain all of the wonders of atomic physics and chemical bonding. The duality states that matter (such as the electron) has both wave-like and particle-like properties depending on the context. What’s weird is how quantum mechanics manifests the wave-particle duality. According to the traditional Copenhagen interpretation, the wave is a travelling oscillation of possibility that the electron can be realized somewhere as a particle.

There’s one major sin in this description: the word “magical.” Other than that, his description of what wave-particle duality actually means is just fine. One of the other respondents, Amanda Gefter, also writes about duality. She starts out OK, but then uses the notion as an egregiously silly metaphor for people disagreeing with each other. (Sadly, you can’t expect much from someone who works for New Scientist, which was once a good pop-science magazine but isn’t anymore.)

Baseball bats

It’s spring, when a young man’s fancy turns to baseball.

The NCAA (and high schools, I think) changed their standards for baseball bats this year, apparently in response to safety concerns about people (especially pitchers) being hit by fast-moving balls. The change took effect back in January and was decided on long before that, but I just heard about it.

Descriptions of the change are very confusing, at least to a naive physicist. An example:

So they adopted a standard called the Ball-Bat Coefficient of Restitution (BBCOR), which provides a more accurate measure of bats in lab tests than the old standard, the Ball Exit Speed Ratio (BESR). Rather than measure the ball’s speed off the bat, BBCOR testing measures the collision between the bat and the ball to see how lively the bat is.

That distinction is way too subtle for me! What does “how lively the bat is” mean, if it doesn’t mean how fast the ball leaves the bat?

To be more specific, the coefficient of restitution, by definition, is a measure of what fraction of the mechanical energy that was present before a collision remains after the collision. Having a standard that restricts the speed of the ball (following a collision under controlled circumstances) is precisely the same thing as having a standard that restricts the energy after the collision (i.e., the coefficient of restitution).
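To see why the two quantities are tied together, here’s a toy one-dimensional collision model — my own sketch, not the actual NCAA test protocol. The ball mass is a baseball’s; the “effective bat mass” at the impact point is just an illustrative number:

```python
def exit_speed(v_in, e, m_ball=0.145, m_bat_eff=1.0):
    """Ball exit speed in a toy 1-D ball-bat collision.

    Sketch: a ball of mass m_ball (kg) hits an effectively free bat
    of effective mass m_bat_eff (kg) at rest, with ball-bat
    coefficient of restitution e.  Momentum conservation plus the
    definition of e (separation speed / approach speed) fixes the
    ball's post-collision velocity.
    """
    # post-collision ball velocity (negative means rebounding)
    v_ball = v_in * (m_ball - e * m_bat_eff) / (m_ball + m_bat_eff)
    return abs(v_ball)

# Exit speed is monotonic in e, so a cap on one is a cap on the other.
speeds = [exit_speed(30.0, e) for e in (0.45, 0.50, 0.55)]
assert speeds[0] < speeds[1] < speeds[2]
```

In this model, holding everything else fixed, dialing the COR up or down moves the exit speed up or down with it — which is the point: the two standards constrain the same physics.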

Where’s the Physicist to the National League when you need him?

Of course, even if the two standards are essentially equivalent, changing from one to the other might be a way to tighten up the standard, without making it explicitly obvious that that’s what you’re doing. Maybe that’s all that’s going on here.

You can actually read the old and new standards, equations and all, if you feel like it. It turns out that, as far as I can tell, the headline change, from “exit speed” to “coefficient of restitution,” really is a bit of a red herring. The COR is a cleaner quantity to measure, because the old standard had to be evaluated on a sliding scale for different bat sizes and the new one doesn’t, but fundamentally they’re measuring essentially the same thing.

The more important point is that they’ve also added an accelerated break-in procedure to the protocol. Apparently composite bats get springier over use (I guess as the materials get compressed). The old procedure tested them new; the new procedure breaks them in first, so that you can’t buy a standards-compliant bat and later end up with one that’s too springy for the standard.

3D movies and the human eye

On Roger Ebert’s blog, the acclaimed film editor Walter Murch explains what he sees as insurmountable problems with 3D movies:

The biggest problem with 3D, though, is the “convergence/focus” issue. A couple of the other issues — darkness and “smallness” — are at least theoretically solvable. But the deeper problem is that the audience must focus their eyes at the plane of the screen — say it is 80 feet away. This is constant no matter what.

But their eyes must converge at perhaps 10 feet away, then 60 feet, then 120 feet, and so on, depending on what the illusion is. So 3D films require us to focus at one distance and converge at another. And 600 million years of evolution has never presented this problem before. All living things with eyes have always focussed and converged at the same point.

That’s an interesting idea. It’s true that convergence and focus are two separate processes: when you look at something close to you, your eyes tilt in towards each other (convergence), and each eye shifts its focus. The latter process is known as accommodation and involves flexing muscles in the eye to change the power of the lens.

It’s certainly true that 3D movies involve one but not the other, and it’s possible in principle that this has an effect on how we perceive them, but I wouldn’t have thought it was a significant effect in practice. This is way outside of my expertise, but here’s how it seems to me anyway.

The eye is set up so that, when the muscles are completely relaxed, you’re focusing on points that are extremely far away — “at infinity”, as they usually say. The closer you want to focus, the more you have to strain the muscles. The amount of strain is very small for a wide range of distances, shooting up sharply as the distances get small. Here’s a graph I mocked up:

[Figure: accommodation.gif — accommodation strain vs. viewing distance]

The horizontal axis is the distance to the object you’re looking at, and the vertical axis is the amount of strain the muscles have to provide — to be specific, it’s the fractional change in power of the lens, Δf/f. In case you’re wondering, I assumed the diameter of the eye is 25 mm and the person’s near point is 25 cm. The graph starts at the near point, so the highest point on the graph is the maximum accommodation the person’s eye is capable of.
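For the curious, here’s roughly the calculation behind the graph, in a crude thin-lens sketch of the eye (using the 25 mm eye depth from above; the function name is just mine):

```python
def accommodation_strain(d, eye_depth=0.025):
    """Fractional change in lens power needed to focus at distance d (m).

    Thin-lens sketch: the image distance is fixed at the eye's depth
    (25 mm, as in the text), so the lens power needed to focus at
    distance d is P(d) = 1/d + 1/eye_depth.  The relaxed eye focuses
    at infinity, with P_inf = 1/eye_depth, so the strain is the
    fractional increase over that.  (For small changes the fractional
    changes in power and in focal length agree in magnitude.)
    """
    p_inf = 1.0 / eye_depth
    return (1.0 / d) / p_inf

strain_near = accommodation_strain(0.25)  # ~0.10 at the 25 cm near point
strain_3m = accommodation_strain(3.0)     # under 0.01 at 3 m
```

The strain falls off like 1/d, so everything beyond a few meters is crowded down near zero — that’s the flat part of the graph.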

The point is that the things you’re looking at in a 3D movie are pretty much always “far away”, as far as accommodation is concerned. The examples Murch gives, from 10 feet (about 3 meters) on up, all lie in this range.  Note that the graph is very flat there.

If 3D movies routinely involved closeups of a book someone was reading, or the construction of a ship in a bottle, that’d be different. But they don’t. Most of the time, the point you’re looking at is far enough away to be practically at infinity, so your visual system should be expecting not to have to do any accommodation. And of course it doesn’t have to, because the screen is really quite far away (essentially at infinity).

So it seems implausible to me that the accommodation / convergence problem really matters. But this is very, very far from any area of expertise of mine, so maybe I’m wrong.

Booooooooooooogus

The guys at Car Talk got a math-physics based puzzler wrong this week.  You can hear or read the question and their answer at their Web site.

The puzzler in question is labeled week of Jan. 17 as of now, but it should really be the week of Jan. 15 — in fact, the original question aired here in Richmond on the 14th. Anyway, it’s the one about how to tell when a cylindrical gas tank is 1/4 full.

Here’s what I wrote to them:

Sorry, but you got the answer to last week’s puzzler (i.e., the puzzler from Jan. 14, 2011) wrong. You assumed that the center of mass of an object has the property that there are equal amounts of mass on both sides, but that’s not true.

To convince yourself of this, think of the following example: take a 100-pound weight, and a 200-pound weight, and join them together with a long bar. The weight of the bar is small, say 1 pound. Now where’s the center of mass of this funny asymmetrical dumbbell? It’s somewhere between the two masses, about 2/3 of the way along the bar. But there’s certainly not the same amount of weight on the two sides of that point: there’s over 200 pounds on one side, and just over 100 pounds on the other.

In the case of a semicircle, the difference isn’t as dramatic as that, but there still is a difference. The center of mass is about 42.4% of the way out from the center (4/(3π), if you must know). The correct answer (i.e., the point where half the mass is on one side and half the mass is on the other) is only 40.3% of the way out.

Unfortunately, I don’t know a good way to answer the original question: I can’t think of a way to get the 40.3% value without some annoying calculus.
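For what it’s worth, once the calculus is done the problem boils down to a single transcendental equation, which a few lines of numerics can solve — this is a sketch of my own, not anything from the show:

```python
import math

def quarter_full_depth_fraction():
    """Distance (as a fraction of R) from the center of a horizontal
    cylindrical tank down to the fuel line when the tank is 1/4 full.

    The segment below a chord at distance x*R from the center has
    area R^2 * (acos(x) - x*sqrt(1 - x^2)); setting that equal to a
    quarter of the circle's area gives
        acos(x) - x*sqrt(1 - x^2) = pi/4,
    which we solve by bisection on [0, 1].
    """
    f = lambda x: math.acos(x) - x * math.sqrt(1.0 - x * x) - math.pi / 4
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:   # f decreases with x, so the root is above mid
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

x = quarter_full_depth_fraction()
com = 4.0 / (3.0 * math.pi)   # center of mass of a half-disk
# x   comes out ~0.404 — the equal-mass point quoted above
# com comes out ~0.424 — Car Talk's (incorrect) answer
```

Of course, that’s just a numerical check of the two numbers, not the clever no-calculus trick the puzzler wanted.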

Update: Ray admits he got it wrong.

Physics Q&A web site

I learned (from the Cosmic Variance blog) about a Web site with interesting discussions of a wide variety of questions in physics. It reminded me of some parts of my misspent youth.

Back in the 1990s, I spent a lot of time reading and posting articles on the various physics and astronomy Usenet newsgroups. For those who don’t know, newsgroups are forums in which people can discuss a huge variety of topics. They allowed freewheeling electronic communication back in the days before blogs, and even before the invention of the Web. Last time I checked, Usenet newsgroups still existed, but with all the other options out there nowadays, they don’t play the same role they used to.

Newsgroups had participants with a wide variety of backgrounds and interests. Participants in the physics newsgroups included graduate and undergraduate students, professional scientists, and tons and tons of interested laypeople. The freedom of anybody to participate was both a blessing and a curse. It made for a lot of variety, of course, but it also meant that discussions could easily be hijacked by crackpots. Some of the crackpots were highly entertaining, but eventually most serious people would get frustrated with all the noise and go away.

One solution to this problem was moderated newsgroups. I was one of the moderators of the group sci.physics.research for quite a while. One of us moderators would have to approve each post before it could appear in the newsgroup. The big problem with this, of course, was that it was labor-intensive.

OK, that’s it for the history lesson. What reminded me of all this is the Web site Physics Stack Exchange, which aims to produce a similar sort of forum for discussion of physics questions. The site has moderators, but they don’t approve each post manually as we did. Rather, the participants in the group vote answers up or down, so that the ones that are deemed most useful rise to the top. There’s a complicated set of rules whereby only people who have earned the right (through useful participation in the past) are allowed to vote.

Once I started looking at it, I couldn’t resist posting some answers of my own.

There’s a lot of good stuff there. Check it out, and participate if you’re interested!

My new year’s resolution: More curmudgeonliness

Benjamin Bederson, physics professor emeritus at NYU, has a letter in today’s NY Times magazine:

As a physicist myself, I read with great interest Jonah Lehrer's article about Geoffrey West, who is interested in developing a general theory of cities from first principles. Physicists, myself included, are intrigued by the idea that application of rigorous laws in the world of natural science can inspire similar applications in other areas, including social ones.

It may be true that methods from the natural sciences can be fruitfully applied in the social sciences, but as a matter of diplomacy I’d rather physicists refrained from “we’re rigorous and you’re not” language.  Whether or not it’s true, I don’t think it’s helpful.

But that’s not what bothers me about this letter.  He goes on to say

This quest has been going on for a long time: for example, the efforts to apply the Heisenberg uncertainty principle to human behavior.

This would be fine, if he went on to clarify that “efforts to apply the Heisenberg uncertainty principle to human behavior” are utterly stupid and pointless.  In fact, he seems to think precisely the opposite, namely that such efforts are a good example of the use of natural-science ideas in the social sciences.

The Heisenberg uncertainty principle is often used as a metaphor, sometimes to convey the banal notion that it’s often hard to measure stuff, and sometimes to convey the slightly more interesting idea that measurements affect the system being measured.  I suppose that what Bederson means by “apply[ing] the Heisenberg uncertainty principle to human behavior” is simply that when you survey people the act of surveying them has an effect on them.  That’s true, but it’s much less interesting than the actual Heisenberg uncertainty principle.   It’s also such an obvious idea that it’s downright insulting to suggest that social scientists needed physicists’ help to figure it out.

In which I earn the contempt of my peers

Edward Tufte is auctioning off a bunch of rare old manuscripts from his library at Christie’s in New York.  There’s a Galileo first edition going for a mere $5000-$7000, which sounds like a steal to me (not that I know anything about this stuff).  I love old manuscripts like this.  If I were in the area, I’d love to go.

But here comes my shameful confession: I’ve tried to like Edward Tufte’s sensibility, and I just can’t do it.  I look at the Minard map of Napoleon’s Russian campaign, which Tufte says “may well be the best statistical graphic ever drawn,” and I just find it confusing and cluttered.  Just because you can layer seven (or however many it is) dimensions of data on one graphic doesn’t mean you should.

Even when he aims at the low-hanging fruit of anti-PowerPoint ranting, I can’t really get behind him.  Yes, lots of PowerPoint presentations (including some I’ve perpetrated) are deadly, but I don’t think it’s because of the “cognitive style of PowerPoint.”  It’s because, no matter what tools you use, creating a bad presentation is much easier than creating a good presentation. Take it from me, young folks: even before PowerPoint, most talks were bad.

QUBIC paper submitted

For quite a while now, I’ve been part of a collaboration working on the development of QUBIC, a new kind of instrument for measuring the polarization of the microwave background.  For those who want details, we’ve written a paper describing the current status and prospects of the project.  It’s posted on the arXiv, and it’s been submitted for publication in the journal Astronomy and Astrophysics.  Enjoy.