The Epistemology of the
Large Hadron Collider (LHC)

LHC experiments between theory-ladenness and exploration: the problem of data selection

The project addresses the question of how exactly the LHC copes with the tension between its high selectivity for certain data and its aim of exploring a new empirical domain. The experiments planned at the LHC are expected not only to test the theoretical expectation of a further particle within the standard model of elementary particle physics – the so-called Higgs boson – up to energies of 1000 GeV, but also to discover fundamentally new phenomena in this domain and hence to open a pathway to ‘new physics’ beyond the standard model. Indeed, a growing number of physicists consider this second aim the more important one. It is therefore a serious challenge that, for technological reasons, the LHC can process only a minute fraction of its immense data input: the data are selected automatically, in several successive steps and within microseconds, while the bulk of the primary data is irretrievably lost. The criteria for this data selection are necessarily shaped by theoretical assumptions drawn from the standard model – precisely the theory the LHC has been built to test or to extend.

The aim of the project is to describe this delicate constellation in as much detail as possible and to assess how much room it leaves for possible ‘new physics’. It centers on an investigation of the mechanisms and criteria of data selection and of their effects on the explorative search for ‘new physics’. Focusing on the ATLAS detector, the project will not only systematically analyze and evaluate the selection criteria and their justifications, but will also, during the operation of the LHC, investigate possible modifications of those criteria and of the detector in general. The project is based on a close collaboration between physics and the contemporary history of science.
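To make the structure of such a staged selection concrete, the following minimal sketch in Python illustrates a two-stage trigger cascade. It is emphatically not ATLAS code: the event fields, thresholds, and function names (Event, level1_accept, high_level_accept) are hypothetical stand-ins chosen for illustration. What the sketch does show is the epistemic point at issue: theory-motivated thresholds, applied in succession, irreversibly discard every event that does not match the expectation built into them.

    import random
    from dataclasses import dataclass

    @dataclass
    class Event:
        """A crude stand-in for one recorded proton-proton collision."""
        pt: float   # GeV: transverse momentum of the hardest object
        met: float  # GeV: missing transverse energy

    def level1_accept(event, pt_threshold=25.0):
        """First, fast selection stage: keep only events with a
        sufficiently energetic object. The threshold encodes a
        theoretical expectation of what 'interesting' physics
        looks like."""
        return event.pt > pt_threshold

    def high_level_accept(event, met_threshold=50.0):
        """Second, slower selection stage applied to the survivors,
        again shaped by theory-motivated signatures."""
        return event.met > met_threshold

    def run_trigger(events):
        """Cascade both stages; rejected events are lost for good."""
        return [e for e in events
                if level1_accept(e) and high_level_accept(e)]

    if __name__ == "__main__":
        random.seed(0)
        # Toy sample: most collisions are soft (low pt), as at the LHC.
        sample = [Event(pt=random.expovariate(1 / 15.0),
                        met=random.expovariate(1 / 20.0))
                  for _ in range(100_000)]
        kept = run_trigger(sample)
        print(f"kept {len(kept)} of {len(sample)} events "
              f"({100.0 * len(kept) / len(sample):.2f}%)")

Lowering or reweighting the hypothetical thresholds changes which events survive at all, which is precisely why the project treats the selection criteria themselves, and their possible modification during operation, as the central object of investigation.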


Principal Investigators:
Friedrich Steinle
Christian Zeitnitz

Principal Collaborator:
Koray Karaca