Chasing the Origins of the Universe…

Right down to the first trillionth of a trillionth of a second

By Karen Blum

Astrophysicist Charles “Chuck” Bennett has spent more than 25 years studying faint afterglow radiation from the earliest moments of the universe.

Bennett, the Alumni Centennial Professor of Physics and Astronomy who came to Johns Hopkins in 2005 from the National Aeronautics and Space Administration (NASA), says there are many unanswered questions about the universe. His is, “What happened at the very beginning?”

“You might think that this is not answerable by humans,” he says, “and it may not be, but we’re going to try.”

With a $5 million grant from the National Science Foundation, Bennett is leading a team developing a ground-based telescope to study the origin of the universe. The instrument will probe the cosmic microwave background radiation—faint traces of the universe’s very beginning, 13.7 billion years ago.

Called the Cosmology Large-Angular Scale Surveyor (CLASS), the telescope will search large regions of the sky for a specific polarization pattern generated by gravitational waves formed during the first moments of the universe. Bennett hopes it will help provide a definitive test of the theory of inflation, which suggests that tiny quantum mechanical energy fluctuations suddenly grew at an enormous rate to an astronomical size to start the universe.

Several members of the CLASS team. From left: Assistant Professor Tobias Marriage, Graduate Research Assistant Joseph Eimer, Professor Charles Bennett, and Undergraduate Research Assistant Tiffany Wei. Between Eimer and Bennett rests a helium refrigerator system and dewar, which are used in conjunction to cool CLASS detectors to 100 millikelvin—just one-tenth of a degree above absolute zero.

“This inflation theory, in my mind, is like a theory that adds on to the beginning of the Big Bang theory,” he says. “Inflation theory actually is the theory of how things got started.” While most cosmologists agree that the universe started very hot and dense and has been expanding and cooling ever since, the Big Bang theory is often misconstrued as an account of the universe’s origin, Bennett says. In his view, the Big Bang theory clarifies how the universe has evolved, but it doesn’t tell us how it began.

Looking at the cosmic microwave background is looking straight into the past, due to the finite speed of light, Bennett explains. “We are directly looking at what the universe looked like 13.7 billion years ago. It isn’t an idea, or a concept; we are looking at the light from 13.7 billion years ago, and that’s what’s so cool about it. It’s incredibly exciting to try to use this data to figure out what happened in the first trillionth of a trillionth of a second of the universe.” (It’s worth noting that Bennett was the principal investigator for the 2001 Wilkinson Microwave Anisotropy Probe space mission, which improved the precision of cosmological parameters by a factor of 30,000, fixing the age of the universe to within 1 percent accuracy.)


The instrument will work somewhat like a satellite television dish. A reflector and circuitry within the instrument will intercept microwaves coming from space and direct the signal onto an array of tiny devices (superconducting transition edge sensors, or bolometers), where that energy will be absorbed. Each sensor heats up slightly as it absorbs energy, allowing the team to record the energy arriving from different directions of the sky. For this to be possible, the team has to cool the detectors within the instrument to a tenth of a degree above absolute zero. A position-cycled grid of wires in front of the instrument will rapidly modulate the polarization state reaching the detectors, so the instrument’s own effects can be separated from the sky’s polarization.
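The detection principle described above, absorbed microwave power producing a tiny temperature rise in a very cold sensor, can be sketched as a back-of-the-envelope model. This is not the CLASS analysis code; the power and conductance values below are invented, order-of-magnitude placeholders:

```python
# Illustrative bolometer model: in steady state, the temperature rise above
# the cold bath is dT = P / G, where P is the absorbed microwave power and
# G is the thermal conductance linking the sensor to the bath.

def bolometer_temp_rise(absorbed_power_w, thermal_conductance_w_per_k):
    """Steady-state temperature rise (kelvin) of a bolometer."""
    return absorbed_power_w / thermal_conductance_w_per_k

BATH_TEMP_K = 0.1   # CLASS detectors are cooled to about 100 millikelvin
P_SKY_W = 1e-12     # assumed absorbed power: 1 picowatt (hypothetical)
G_W_PER_K = 1e-10   # assumed conductance: 100 picowatts per kelvin (hypothetical)

dT = bolometer_temp_rise(P_SKY_W, G_W_PER_K)
print(f"{dT * 1000:.0f} mK rise on a {BATH_TEMP_K * 1000:.0f} mK bath")
```

With these placeholder numbers the rise is about 10 millikelvin, which shows why the sensors must start so close to absolute zero: the signal is a minute fraction of a degree.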

CLASS will be built at Johns Hopkins and transported to the Atacama Desert of northern Chile—one of the highest, driest places on earth. Bennett has been working with the Chilean government for permission to situate the instrument 17,000 feet above sea level in an evolving astronomical park scattered with other telescopes and observatories. Last year, he visited the area to select a specific location for the telescope, in view of the town of San Pedro de Atacama, some 9,000 feet below. Because the desert’s extreme temperatures and little oxygen make for an inhospitable work environment, he plans to have an antenna transmit the data from the telescope to town so team members can access the information on computers at a more moderate altitude, limiting the amount of time spent on-site each day.

Now a year into the five-year grant, the team is well on its way to constructing the telescope. A lab across from Bennett’s office houses part of the instrument’s core circuitry, as well as cryogenic cooling equipment. An adjacent room holds their polarization filter, consisting of hundreds of thin gold wires, stretched taut.

A key member of the team is Assistant Professor Tobias Marriage, who came to Johns Hopkins last year from Princeton University. Marriage has also been studying the cosmic microwave background, and at Princeton he worked on the Atacama Cosmology Telescope (ACT), located on the same Chilean plateau where CLASS will reside. Marriage studies clusters of galaxies that were found by their effect on the cosmic microwave background, and uses this information to deduce additional properties of the universe.

Marriage has been working closely with Johns Hopkins physics and astronomy students designing state-of-the-art technology for CLASS, including the optical instrumentation and a shield to protect the detectors from the Earth’s magnetic field.

“One of the most rewarding things is seeing undergraduate and graduate students really dig into the project,” Marriage says. “It’s a great combination of enthusiasm and transfer of expertise.

“This is a real watershed moment,” he adds. “This project hopefully will be the gateway to many, many more cosmological experiments over the next decades.” If CLASS detects the anticipated signal, it would be the first-ever detection of the gravitational waves that signify the origin of the universe.

Discovery on a Grand Scale

Hopkins physicists play important role with Large Hadron Collider

By Jim Schnabel

Hundreds of feet beneath the countryside west of Geneva, Switzerland, the Large Hadron Collider (LHC) has captured the attention of particle physicists worldwide. The 17-mile circular synchrotron is the newest and largest particle collider on the planet—capable of boosting clusters of protons to 99.9999991 percent of the speed of light, and steering them against protons coming at the same speed from the opposite direction. Its proton-on-proton collision energies reach 7 TeV and are expected to attain energies as high as 14 TeV by 2014, seven times higher than the collision energies achieved at Fermilab’s Tevatron in Batavia, Illinois—the world’s largest collider before the LHC.
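The quoted speed and energy figures are consistent with a quick special-relativity check, sketched below. The only inputs are the speed from the paragraph above and the standard proton rest energy of about 0.938 GeV:

```python
import math

# At speed beta (as a fraction of the speed of light), a proton's Lorentz
# factor is gamma = 1 / sqrt(1 - beta^2), and its total energy is
# gamma times its rest energy.

BETA = 0.999999991                 # the speed quoted for LHC protons
PROTON_REST_ENERGY_GEV = 0.938272  # proton rest energy, ~0.938 GeV

gamma = 1.0 / math.sqrt(1.0 - BETA ** 2)
energy_tev = gamma * PROTON_REST_ENERGY_GEV / 1000.0

# gamma comes out near 7,450, i.e. roughly 7 TeV per proton, matching the
# LHC's design energy per beam (14 TeV for a head-on proton-proton collision)
print(f"gamma = {gamma:.0f}, energy per proton = {energy_tev:.2f} TeV")
```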

“The LHC is essentially a machine for detecting new physics,” says Department of Physics and Astronomy Professor Bruce Barnett.

Run by the European physics consortium CERN, with some U.S. funding and plenty of U.S. participation, the LHC has been moving into normal working mode over the past few months. More than 6,000 particle physicists around the world are now trying to map its huge flow of data onto the Standard Model of particles and forces, which has held up well until now, but still contains terra incognita that only multi-TeV energies can illuminate.

“You can think about it as almost like a voyage to discover the Americas,” says Associate Professor Kirill Melnikov. “We’re still in the middle of the journey; we don’t know that we’re going to discover anything, but if we do, it will be a discovery on a grand scale.”

Johns Hopkins has been a part of this journey from its beginning. Even before the LHC’s construction was completed in 2008, Johns Hopkins experimental particle physicists—Professors Bruce Barnett, Barry Blumenfeld, Andrei Gritsan, Petar Maksimovic, Morris Swartz, and their postdocs and students—helped assemble one of LHC’s two main detectors, the Compact Muon Solenoid (CMS). The 14,000-ton device enables physicists to detect and analyze the collision-products that explode outward from these proton-on-proton impacts.

Installation of the CMS silicon tracking detector that Johns Hopkins researchers helped design, test, and install. (Photo courtesy of CERN)

“We took our experience working with detectors at the Tevatron and brought that to the CMS,” says Barnett. “We collaborated with other institutions to design the silicon detectors inside the CMS, carefully assembled them, tested them, shipped them off to CERN, and played a part in the delicate process of installing them.”

“We’ve also developed some advanced software for these detectors, and have been designing some next-generation tracking instruments,” says Swartz, the principal investigator for the NSF grant that funds most of Johns Hopkins’ research at the LHC.

The work has been challenging.


Useful collision-product detection requires hardware and software that is sensitive enough to distinguish the fleeting tracks of infinitesimally spaced subatomic particles; sophisticated enough to flag the noteworthy events amid the billions of proton-proton collisions that take place in a typical day of operation; and robust enough to stay useful even after months and years within a maelstrom of radiation. “To continue to get value out of the detectors as they’re being degraded by radiation is tricky,” says Jonathan Bagger, a Krieger-Eisenhower Professor and theoretical physicist at Johns Hopkins.

But now the CMS detectors are in place and yielding vast amounts of data—with which physicists hope to soon discern the answers to some important questions. Atop their list is the question of whether the hypothetical Higgs boson exists. Discovery of the Higgs boson would explain why the W and Z bosons—which mediate the weak nuclear force—have mass, while the photon—which mediates the electromagnetic force—does not.

“The Higgs mechanism is the simplest hypothetical mechanism that would give masses to these particles without breaking the Standard Model,” says Melnikov. “But being the simplest proposed mechanism doesn’t mean that it is the correct one.”

Particle physicists worldwide are now scrutinizing data from LHC detectors for statistical hints of the Higgs—which theoretically was unlikely to manifest itself at the Tevatron. Gritsan, who has worked to install and align key CMS detectors, designed special software to pick up decay products that could occur if the Higgs emerges briefly into existence. “If you were to create a Higgs boson, it would live so briefly that the only way to see it would be to detect the channels of products from the two photons, or two W bosons, or two Z bosons into which it can disintegrate,” he says.

An event with the production of four muons in the proton-proton collisions at the LHC, observed in the CMS detector. Such events are important in the search for the Higgs boson, as the production of four muons is a potential decay mode of the Higgs. (Courtesy of CERN)

By early August, data from the LHC had ruled out the existence of a Higgs-like particle over much of its allowed mass range. By the end of 2012, says Gritsan, Hopkins researchers expect the Standard Model Higgs boson to be either observed or completely ruled out. While finding the Higgs would shore up the Standard Model, not finding it wouldn’t be considered a failure: “If we don’t find it, we’ll probably find something else that will be even more interesting and exciting,” Gritsan says.

There are other, less popularly discussed puzzles that the LHC could help solve. One of these has to do with the top quark, discovered at the Tevatron in 1995.

“A huge mystery is why the top quark weighs 175 times the mass of a proton—about as much as an atom of gold,” says Bagger. “Since top quarks have so much mass, they must be intimately involved with whatever gives quarks and leptons their mass; what you really want to do is to study them every way you can.”

Johns Hopkins professors (theorist David Kaplan and experimentalist Maksimovic) have been busy determining how top quarks should appear in LHC collisions and integrating their techniques into CMS detector software algorithms.

JHU physicists are also hoping the LHC might help solve some of the questions surrounding dark matter, whose existence astronomers infer from their observation that the visible matter in the universe cannot account for all of the universe’s gravity. “If dark matter reacts even weakly to the usual collision-measuring instruments, then there’s a reasonable chance that it can be detected at LHC,” says Melnikov. “Should that happen, it would be a totally different game for physics.”

A New Spin on Quantum Matter

Physics and Astronomy Professor Collin Broholm explores the unusual spin-liquid state.

By Jim Schnabel

Collin Broholm, the Gerhard H. Dieke Professor of Physics and Astronomy, is on the trail of a new and exotic state of matter known as a “spin-liquid” state.


“Our motivation is almost entirely to learn new, fundamental physics, but we also have an eye out for applications,” says Broholm. “In condensed-matter physics there can be a beautiful confluence of basic phenomena and practical application.” In the case of spin-liquids, the exotic quasiparticles they host could one day become the qubits of a super-fast quantum computer.

The spin-liquid state is closely related to the superfluid state of helium near absolute zero: as in liquid helium, quantum fluctuations preclude a periodically ordered state. “A material in the spin-liquid state is one that should be magnetically ordered from a classical viewpoint but remains a fluid,” says Broholm. “It’s the magnetic equivalent of superfluid 4He.”

Just as superfluid 4He hosts unique roton excitations, electrons in spin-liquids can fractionalize into spinons and holons, quasi-particles that separately carry the electron’s spin and charge. Broholm and his colleagues are on a quest to expose new forms of collective quasi-particles within crystalline solids and to understand their interactions, their impact on materials properties, and their potential for technical applications.

A quantum computer would replace the usual binary 1-or-0 bits with the magnetic states of so-called qubits, which exist in an unresolved, maybe-1-maybe-0 superposition during calculations. As a result, many calculations proceed in parallel, and the result of interest can be projected out upon completion. For certain types of computation, this makes a qubit-driven computer dramatically faster than an ordinary binary computer. The challenge is to realize a qubit that operates coherently long enough to complete a calculation. “A spin-liquid might do the job because the collective nature of spinons can protect them from defects and noise,” says Broholm.
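The maybe-1-maybe-0 behavior is easy to see in a toy state-vector calculation. This sketch is generic textbook quantum mechanics, not a model of spinon-based hardware:

```python
import math

# A qubit's state is a pair of complex amplitudes (a0, a1) for |0> and |1>.
# A Hadamard gate maps a definite |0> = (1, 0) into an equal superposition;
# the Born rule (squared amplitude magnitudes) gives measurement probabilities.

def hadamard(a0, a1):
    """Apply a Hadamard gate to the state (a0, a1)."""
    s = 1.0 / math.sqrt(2.0)
    return (s * (a0 + a1), s * (a0 - a1))

a0, a1 = hadamard(1.0, 0.0)          # start in |0>, like a classical 0 bit
p0, p1 = abs(a0) ** 2, abs(a1) ** 2  # probabilities of measuring 0 or 1
print(p0, p1)                        # 0.5 each: the maybe-1-maybe-0 state
```

Until it is measured, the qubit carries both possibilities at once; decoherence from defects and noise destroys exactly this superposition, which is why a noise-protected qubit is the prize.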

Originally from Copenhagen, Denmark, Broholm leads a team of scientists with similar interests and complementary expertise in the Johns Hopkins-Princeton Institute for Quantum Matter (IQM). Johns Hopkins professors Zlatko Tesanovic and Oleg Tchernyshyov focus on the theory of quantum-correlated materials; Tyrel McQueen and Robert Cava of the Johns Hopkins and Princeton departments of chemistry, respectively, synthesize spin-liquid-prone materials; and Broholm and Hopkins physicist N. Peter Armitage probe the collective properties of new materials. Specifically, they use neutrons or photons to detect novel quasi-particles and determine whether the new materials match the theory.


Some of this work requires large and expensive equipment that universities cannot afford to operate, so Broholm uses facilities run by the federal government. The National Institute of Standards and Technology (NIST) Center for Neutron Research is conveniently located in Gaithersburg, Md., and Broholm was the principal investigator for a novel instrument that has just been completed there. The Multi-Axis Crystal Spectrometer (MACS) was jointly funded by NIST, the National Science Foundation, and Johns Hopkins, and 20 percent of its beam time is available for IQM research.

By probing the distribution of neutrons scattered from the materials of interest, MACS provides an atomic scale view of structure and motion in materials. “Neutrons are perfect for what we’re trying to do,” says Broholm. “They don’t have charge, so they enter materials without disturbing things too much. Yet they do carry a magnetic dipole moment and can deliver energy and momentum to spinons through quantum collisions, witnessed by the pattern of neutron scattering.”

By knowing the energy and momentum of neutrons as they enter a material and measuring how much these change in the course of interacting with the material, Broholm and his colleagues can infer the amount of energy and momentum delivered to spinons—and thus infer properties of the spinons themselves. With 50 times more neutrons per unit time hitting the sample and 20 detectors collecting the scattered neutrons, MACS is far more efficient than conventional instrumentation and is providing a whole new view of magnetism and electronic correlations at the atomic scale.
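The inference described above rests on simple kinematics: a neutron with wavevector k carries momentum ħk and kinetic energy ħ²k²/2m, so comparing the incoming and outgoing values gives the energy handed to the sample. A minimal sketch, with illustrative wavevectors that are not actual MACS settings:

```python
# Kinematics behind inelastic neutron scattering: E = (hbar * k)^2 / (2 * m_n).
# The energy the neutron loses is the energy delivered to excitations in the
# sample, such as spinons. The wavevector values below are hypothetical.

HBAR = 1.054571817e-34          # reduced Planck constant, J*s
M_NEUTRON = 1.67492749804e-27   # neutron mass, kg
MEV_IN_J = 1.602176634e-22      # one milli-electronvolt in joules

def neutron_energy_mev(k_per_m):
    """Kinetic energy in meV of a neutron with wavevector k (in 1/m)."""
    return (HBAR * k_per_m) ** 2 / (2 * M_NEUTRON) / MEV_IN_J

# Hypothetical incident and scattered wavevectors (in 1/m):
k_in, k_out = 2.662e10, 2.0e10
delta_e = neutron_energy_mev(k_in) - neutron_energy_mev(k_out)
print(f"energy delivered to the sample: {delta_e:.2f} meV")
```

A handy consequence of these constants is the standard rule of thumb that a neutron with k = 1 inverse angstrom carries about 2.07 meV.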

Left: Close-up photo of single crystals of geometrically frustrated antiferromagnets, such as strontium holmium oxide. The crystals pictured here were grown in the crystal-growth facility of the Department of Physics and Astronomy’s Institute for Quantum Matter by research scientist Seyed Koohpayeh. Conflicting magnetic interactions in these materials produce a rich variety of emergent phenomena, including the exotic spin-liquid state of matter.

Right: The crystals at approximately 2.5 times their actual size.

Broholm and his colleagues say they are also excited about the capabilities of the completely new Spallation Neutron Source (SNS) at the Department of Energy’s facility in Oak Ridge, Tennessee (Broholm has a joint-faculty appointment there). The SNS produces neutrons by “spallation” of nuclei when energetic protons from a ~1 GeV accelerator strike a dense target. The pulsed incident proton beam creates a pulsed neutron beam with peak intensities much greater than is possible at a reactor-based facility. “SNS is on a different scale,” says Broholm. “The neutron-detecting systems on the output side are the size of houses.” He should know, having served as the chair of the SNS’s Experimental Facilities Advisory Committee from 2002 to 2006. For his scientific research and his work to develop neutron scattering instrumentation, the Neutron Scattering Society of America (NSSA) awarded him its 2010 Sustained Research Prize.

Broholm’s work on the spin-liquid state most recently led him to Japan, where he spent several months visiting with physicists at the University of Tokyo’s Institute for Solid State Physics. “Over the last 20 years or so, they’ve developed a lot of expertise in the discovery and characterization of novel electronic materials, so I’ve been visiting there to strengthen our collaboration,” he says. “They bring expertise in materials production, and on our side we’re helping to disentangle atomic scale properties of strange quantum magnets through neutron scattering.”

Not Your Average Server Room

A former mission control center has been transformed into the new home for Johns Hopkins data-intensive computing.

There’s a large, brightly lit, 3,100-square-foot room on the first floor of the Bloomberg Center that was once a critical base for the Far Ultraviolet Spectroscopic Explorer (FUSE). It served as the control center for FUSE, which was launched into space in 1999 and operated until late 2007. Four years later, the room has been repurposed, dedicated to exploring a realm that is quickly becoming nearly as vast and difficult to navigate as space itself: data.

Funded in part by the National Science Foundation through an American Recovery and Reinvestment Act grant, the new data center serves both the Krieger School of Arts and Sciences and the Whiting School of Engineering, and several major computing clusters in the room are dedicated to projects led by physics and astronomy faculty members. By the end of 2011, the data center will house two data-intensive computing clusters, dubbed GrayWulf and Data-Scope; the entire Homewood High-Performance Cluster (HHPC); and several other state-of-the-art computing projects. Each will be used to create, store, organize, and eventually interpret the vast amounts of data generated by disciplines such as astrophysics, genetics, condensed matter physics, environmental science, and bioinformatics.


The data center is only partially completed as of this fall, but here is a sampling of the computing power already available:

  • Two towers of computers powered by graphics processing units (GPUs), called the 100-Teraflop Graphics Processor Laboratory. One teraflop denotes the ability to perform one trillion operations per second.
  • 10 gigabit-per-second connectivity between clusters in the data center and computers elsewhere on the Johns Hopkins network.
  • Over 3,500 compute cores in the HHPC (each comparable to those in a fast desktop PC) that can be harnessed to work in parallel on complex problems.
  • A direct connection to Internet2—an advanced networking consortium linking academic and research institutions, and clusters like the HHPC, across the U.S. More than 350 member institutions will soon share 100 gigabit-per-second connectivity and 8.8 terabits of capacity, granting research institutions like Johns Hopkins unprecedented shared-computing capability.
  • GrayWulf, the largest database at any university, has a roughly 1.5-petabyte storage capacity. That’s equivalent to 6,000 250-gigabyte hard drives; 250 gigabytes is a typical amount of storage on a consumer laptop. When completed, Data-Scope will have a 10-petabyte capacity.
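The storage figures in the last item check out with straightforward arithmetic, assuming decimal units (1 petabyte = 1,000,000 gigabytes):

```python
# Sanity-check the quoted storage numbers: GrayWulf's 1.5 PB expressed as
# 250 GB consumer laptop drives, and Data-Scope's planned 10 PB the same way.

GB_PER_PB = 1_000_000          # decimal units: 1 petabyte = 1,000,000 GB
DRIVE_GB = 250                 # capacity of one typical consumer laptop drive

graywulf_drives = 1.5 * GB_PER_PB / DRIVE_GB
datascope_drives = 10 * GB_PER_PB / DRIVE_GB

print(int(graywulf_drives))    # 6000 drives, matching the article's figure
print(int(datascope_drives))   # 40000 drives for the completed Data-Scope
```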