Pulling QCD Predictions Out of a (Black) Hat
The Large Hadron Collider will start taking data at the end of this year, opening the door to particle collisions at energies in the trillions of electron volts (tera-electron volts, or TeV). Exploring the behavior of elementary particles at these energies, LHC researchers expect to uncover the secret of how particles get their masses. Many theoretical models have been proposed and are waiting to be tested. The LHC is, first of all, a proton-proton collider. Since protons are composed of quarks and gluons, most of the events the LHC produces will be governed by the strong interaction, the force that binds quarks and gluons inside protons and neutrons. The theory of this interaction is known as quantum chromodynamics (QCD). To test the theories, and even to be persuaded that new physical processes are being observed, theorists need to understand QCD very precisely. Many of the computations needed are so difficult that, for a long time, they were considered intractable. But now, working with Lance Dixon and collaborators at SLAC and elsewhere, I am writing a computer program called BlackHat that we hope will do these computations automatically.
The predictions of QCD are derived from Feynman diagrams, pictorial representations of particle interactions named for their inventor, Richard Feynman. At the lowest level of approximation, only the simplest, so-called "tree" diagrams are needed. For more precise predictions, though, one must also analyze more complicated diagrams in which virtual particles travel around closed loops.
There are (at least) two ways to make theorists and their computers sweat. The first is to require them to compute loop diagrams. Computing diagrams with more closed loops increases the precision but also makes the work much more difficult. The second is to require answers for processes with many particles, since the complexity of the computation increases dramatically with the number of particles produced. Unfortunately for the theorist, reactions that produce four or more energetic quarks and gluons will be common at the LHC.
The traditional approach to QCD is to compute the contribution of each Feynman diagram, one by one. This works very well as long as the number of particles in the process under consideration is small. But the number of Feynman diagrams needed grows exponentially with the number of particles produced. For example, computing the one-loop corrections to six-gluon production at a hadron collider such as the LHC requires about three million Feynman diagrams. Each diagram yields a mathematical expression with between 1,000 and 10,000 terms, and the whole procedure produces a gigabyte-long formula. Beyond the obvious technical problems of having a computer evaluate such a formula, there is a worse one: there are large cancellations among the diagrams, such that the value of the final answer is approximately that of a single diagram. Without exquisite care, numerical rounding errors easily accumulate, making the final answer worthless.
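The danger of large cancellations can be seen in a toy example. The sketch below is not BlackHat's actual arithmetic; it simply shows, in ordinary double-precision floating point, how a small physically meaningful answer can be destroyed when it rides on top of huge terms that cancel:

```python
# Toy illustration (not BlackHat's arithmetic) of catastrophic cancellation:
# when huge terms cancel down to a small answer, the answer itself can be
# lost entirely to rounding.

big = 1.0e16   # stands in for a single large diagram's contribution
small = 1.0    # stands in for the physically meaningful remainder

# Double precision cannot represent 1.0e16 + 1.0 exactly: the "+ small"
# is rounded away *before* the large terms cancel.
naive = (big + small) - big
print(naive)    # 0.0 -- the true answer, 1.0, has vanished

# Rearranging so the large terms cancel first preserves the answer.
careful = small + (big - big)
print(careful)  # 1.0
```

In a real amplitude computation the cancellations happen across millions of terms rather than two, which is why the ordering and organization of the calculation matters so much.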
For a long time, these difficulties were accepted as the price one must pay for the improved precision that one-loop corrections provide. However, in some simpler problems that could be solved exactly, the final answer turned out to be much simpler than the huge expression generated by the diagram-by-diagram calculation. This was a strong hint that there ought to be a more direct way to derive these results.
Now, a new approach based on unitarity and complex analysis has changed the picture for one-loop computations. In 1994, Lance Dixon, Zvi Bern (UCLA), David Dunbar (now at Swansea University) and David Kosower (CEA-Saclay) showed that, in supersymmetric versions of QCD, the loop diagrams could be computed by slicing the loops open to obtain tree diagrams. The tree diagrams can be computed very efficiently and have simple values. Dixon and his collaborators showed that the loop diagrams could be reconstructed by using these trees as building blocks. Very recently, it has been shown that, by extending the particle momenta to complex values and making the right multiple slices, loop diagrams in QCD can also be constructed in this way. It might seem daring to use momentum vectors with complex, and therefore unphysical, values. But scattering amplitudes for complex momenta have properties very similar to those with real momenta. When the momenta can become complex, the full power of complex analysis can be harnessed to glue the trees back together into loops.
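To make the idea of complex momenta concrete, here is one standard construction (the Britto-Cachazo-Feng-Witten shift, which the text above does not spell out): two external momenta are deformed by a complex parameter $z$,

```latex
p_i^\mu \;\longrightarrow\; p_i^\mu + z\, q^\mu ,
\qquad
p_j^\mu \;\longrightarrow\; p_j^\mu - z\, q^\mu ,
\qquad\text{with}\qquad
q^2 \;=\; q \cdot p_i \;=\; q \cdot p_j \;=\; 0 .
```

The three conditions on $q$ have no nonzero solution for real momenta, but are easily satisfied by a complex $q$. The shift then leaves the total momentum unchanged for every $z$, and both shifted particles stay on shell, since $(p_i + z\,q)^2 = p_i^2 + 2z\, q\cdot p_i = p_i^2$. It is exactly this freedom that lets complex analysis glue tree amplitudes back together into loops.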
Probably the biggest advantage of the unitarity approach is that it makes it easier to automate the computation of one-loop corrections. This matters because precise predictions are needed for a large number of different processes that will be studied at the LHC. I am now working with Lance Dixon and Darren Forde here at SLAC, Carola Berger, formerly at SLAC, now at MIT, Zvi Bern and his colleagues Ferdinand Febres Cordero and Harold Ita at UCLA and David Kosower at CEA-Saclay to develop an automated computer program—BlackHat—to perform these calculations. At its present stage, BlackHat has computed QCD amplitudes at the one-loop level for the production of up to six gluons, and for the production of a W boson along with up to three jets.
Our goal is a completely automated program that computes QCD predictions with one-loop accuracy. Although the prospect of a completely automated program that computes the loop amplitude for any process on a single machine during a coffee break still sounds like science fiction (at least with the current typical length of a coffee break), unitarity-based methods have brought it much closer to reality.