Meeting at Ohio State

I just got back from a great workshop at Ohio State University, organized by my collaborator Paul Sutter, on Innovative Techniques in 21-centimeter Analysis. This was a very small (~25 people), tightly focused meeting. I like these a lot more than the gigantic meetings I sometimes go to. The topic was an area in which I haven't done any real work yet, but I hope to soon, so it was extremely helpful to get up to speed on the state of the art.

The meeting was about measuring the 21-centimeter radio waves from hydrogen at very high redshifts, in order to map out the distribution of matter at times much earlier than the present (but much later than the cosmic microwave background radiation, which is the main thing I study). Cold hydrogen atoms like to emit and absorb radiation at the specific wavelength of 21 centimeters. This radiation, like all radiation, is shifted to longer wavelengths by the expansion of the universe. This redshift is greater at greater distances, so by observing this radiation at different wavelengths, you can map out the distribution of stuff in the universe in three dimensions.
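The bookkeeping here is simple enough to sketch in a few lines of code. This is just my illustration of the stretching relation described above (observed wavelength = rest wavelength times 1 + z), not anything from the workshop; the numbers are round values for the hydrogen line.

```python
# Sketch of the wavelength-to-redshift bookkeeping described above.
# The rest-frame wavelength of the hydrogen hyperfine line is about 21.1 cm;
# cosmic expansion stretches it by a factor of (1 + z).

LAMBDA_REST_CM = 21.106  # rest wavelength of the 21 cm line, in centimeters

def observed_wavelength(z):
    """Wavelength (cm) at which radiation emitted at redshift z arrives today."""
    return LAMBDA_REST_CM * (1.0 + z)

def redshift_from_wavelength(lam_obs_cm):
    """Invert the relation: which redshift does a given observed wavelength probe?"""
    return lam_obs_cm / LAMBDA_REST_CM - 1.0

# Each observing wavelength picks out one redshift slice of the universe,
# which is how a radio survey builds up a map in three dimensions.
for z in (1.0, 6.0, 10.0):
    print(f"z = {z}: observed at {observed_wavelength(z):.1f} cm")
```

So tuning the telescope across a range of wavelengths is effectively scanning through cosmic time.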

At least, that’s the idea. The measurements are extremely hard, because the signal is incredibly faint: it’s 1000 or more times fainter than other, more local sources of radio waves, so separating the signal you want to see from all the other stuff is a big challenge.

One of the speakers was Jeff Zheng, University of Richmond class of 2011, who did some great work in my research group when he was an undergraduate. He’s now a second-year graduate student at MIT. Here’s what he talked about:

The Omniscope: developing scalable technology for precision cosmology

Jeff Zheng

I describe the design and current status of the Omniscope, a 21 cm interferometer architecture optimized for scalability to very large (10^4-10^6) numbers of antennas N. By exploiting a hierarchical antenna grid layout, the correlator cost scales as N log N rather than N^2, and massive baseline redundancy enables automatic calibration and identification of bad data and failed components.
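To get a feel for why the N log N scaling in the abstract matters at these antenna counts, here is a back-of-the-envelope comparison. The specific cost functions below are my own simplification (counting one unit per multiply), not numbers from the talk: a brute-force correlator multiplies every pair of antennas, costing roughly N² operations per time sample, while an FFT over a regular antenna grid costs roughly N log₂ N.

```python
import math

# Rough operation counts per time sample (illustrative assumption, not
# taken from the talk): pairwise correlation of N antennas scales as N^2,
# while an FFT-based correlator over a regular grid scales as N log2 N.

def pairwise_cost(n):
    """Brute-force correlator: one multiply per antenna pair (ordered), ~N^2."""
    return n * n

def fft_cost(n):
    """FFT-based correlator over a hierarchical grid, ~N log2 N."""
    return n * math.log2(n)

for n in (10**4, 10**5, 10**6):
    speedup = pairwise_cost(n) / fft_cost(n)
    print(f"N = {n:>9,}: N^2 / (N log N) ratio ~ {speedup:,.0f}")
```

At a million antennas the ratio is tens of thousands, which is why the hierarchical layout is what makes instruments that size even thinkable.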

I’m pretty sure he was the most inexperienced speaker there, but you’d never know it from his talk, which was excellent. It’s great to see one of our graduates out there doing such top-notch work.
