Contact
sgupta+tool [at] theory.tifr.res.in
(Yes, really, the name contains a plus sign)
Currently maintained by Sourendu Gupta

Uncertainties in collider predictions

Working group on collider tools

Physics motivation

QCD allows us to transfer information from one cross section to another through the parton distribution functions (PDFs). The processes of interest include standard model (SM) ones such as the Higgs production rate, new-physics signals such as the production of superpartners of SM particles, and the SM backgrounds to these signals, such as W+jets.

There are two kinds of uncertainties that plague this transfer:

1. Since QCD predictions are computed as a power series in αs (to different orders for different processes), there may be some incompatibility in using these computations together. Complications of this kind should be termed irreducible theory uncertainties (IT?).
2. PDFs are extracted from data which have statistical and systematic uncertainties. As a result, PDF sets have inherent uncertainties, which are often also called QCD uncertainties but should really be termed theory-filtered experimental uncertainties (TFE?).
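When IT? and TFE? are treated as independent, the common (admittedly crude) convention is to combine them in quadrature. A minimal sketch of that convention; the helper name is hypothetical:

```python
import math

def combined_uncertainty(it, tfe):
    """Combine an irreducible-theory (IT?) uncertainty and a
    theory-filtered experimental (TFE?) uncertainty in quadrature,
    assuming the two are independent."""
    return math.sqrt(it ** 2 + tfe ** 2)

# Toy numbers only: a 3% IT? and a 4% TFE? uncertainty.
total = combined_uncertainty(3.0, 4.0)  # -> 5.0
```

Whether quadrature is even meaningful here is part of what the toolset should let users investigate, since the two kinds of uncertainty need not be independent.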

Normally one takes a certain set of input cross sections measured in experiments (the DATA box), fits common parton density sets to them using QCD predictions at a certain available order (the QCD box), and extracts the parameters with certain statistical methods (the STAT box). The output of this phase of the analysis is a published PDF set (which may or may not include statistical uncertainties). Users interested in other processes then write codes for those processes (the NEW-PROC box) and use the PDF sets as inputs to investigate their cross sections. The uncertainties in the results are usually estimated in fairly crude ways.
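One standard recipe for the last step is the symmetric Hessian master formula, ΔX = ½ [Σ_k (X_k⁺ − X_k⁻)²]^(1/2), which propagates the paired eigenvector members of a published PDF set into any derived cross section. A minimal sketch, assuming a hypothetical member layout (central member first, then plus/minus eigenvector pairs; real PDF sets document their own conventions):

```python
import math

def hessian_pdf_uncertainty(xsecs):
    """Symmetric Hessian PDF uncertainty for a derived cross section.

    xsecs: the cross section computed member by member; xsecs[0] is
    the central PDF member, followed by (plus, minus) eigenvector
    pairs.  Returns (central value, symmetric uncertainty).
    """
    central = xsecs[0]
    pairs = zip(xsecs[1::2], xsecs[2::2])  # (X_k^+, X_k^-) pairs
    delta = 0.5 * math.sqrt(sum((plus - minus) ** 2 for plus, minus in pairs))
    return central, delta

# Toy numbers only: one central member plus two eigenvector pairs.
central, delta = hessian_pdf_uncertainty([10.0, 10.4, 9.8, 10.1, 9.9])
# central = 10.0, delta ≈ 0.316
```

A toolset with a well-defined STAT interface could offer this prescription alongside Monte Carlo replica methods, instead of leaving the choice implicit.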

This proposal is about building a toolset for colliders which does the following:

• allows users to systematically investigate either IT? or TFE? uncertainties, or the two together
• lets them reuse written code
• lets them add new code to the repertoire of available tools
• smoothly joins new data to the analysis as they become available

The main physics job is to carefully define the interfaces between the DATA, QCD, STAT and NEW-PROC boxes, and to replace the published PDF sets by a pipeline (which, obviously, we might call the PDF pipe).

It was pointed out that different groups use different methods of statistical analysis, so the interface of the STAT box has to be sufficiently general.

The PDF pipe does not have to be human-readable, and hence can contain much more information about the statistical properties of the PDFs than is customarily published (the specification should also allow for less information).
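One way such a pipe could carry richer statistics than a published grid is to bundle the central fit with an optional covariance matrix, Monte Carlo replicas, and provenance. A purely illustrative layout, not a format proposal:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class PDFPipe:
    """Hypothetical payload flowing from the STAT box to users.

    A published PDF set typically carries central values and perhaps
    error members; a machine-readable pipe can also carry the full
    parameter covariance or replicas -- or nothing beyond the central
    fit, since the spec should allow for less information too.
    """
    central_params: List[float]
    covariance: Optional[List[List[float]]] = None   # full covariance, if known
    replicas: List[List[float]] = field(default_factory=list)  # MC replicas, optional
    provenance: Dict[str, str] = field(default_factory=dict)   # datasets, orders used

# Minimal pipe: central fit only, with provenance.
pipe = PDFPipe(central_params=[0.5, 1.2], provenance={"order": "NLO"})
```

Keeping the optional fields genuinely optional is what lets the same pipe serve both detailed and minimal publications.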

One has to take care that the interfaces to the modules inside the QCD box and the NEW-PROC box are good enough to "plug and play". If someone writes a NEW-PROC module now (say for $pp \to Z + b\bar{b}$), then it should be possible to move it to the QCD box when data on this process become available.

At the same time, the DATA box should have an interface general enough that new data can be included and old data selectively removed in a given computation.

The next step

It is proposed that a working group be set up for this, meeting in March to produce the next level of detail in the problem specification. People who join this working group will take responsibility for finalizing a detailed report (around four A4 pages) setting out the state of the art for the following processes:

1. $\alpha_s$: D. Indumathi
2. Evolution equations
3. Deep-inelastic scattering
4. Inclusive Drell-Yan (including vector boson production)
5. Heavy-flavour production
6. Fragmentation functions
7. Higgs
8. Jets
9. Vector bosons plus jets
10. Photons