The impact of computer simulations and machine learning
on the epistemic status of LHC Data
Current notes: Machine Learning workshop
Computer simulations (CSs) and machine learning (ML) are important tools for experimental data generation and analysis in contemporary high-energy physics (HEP). In this project, we address epistemic issues raised by this use of CSs and ML in HEP. We focus on their use in the ATLAS experiment, which has been operating at CERN’s LHC since 2008. Our research draws on a rich background in the philosophy of experiment (e.g. Schiemann, 2008) as well as detailed knowledge of the relevant computational methods (e.g. Zeitnitz & Gabriel, 1994).
The project’s future objectives are: to address the possibility of concealed uncertainties induced by the use of CSs and ML in HEP; to define the precise sense of robustness that CSs and ML enjoy in HEP; and to understand the epistemological challenges arising from the purported opacity of CSs and ML, and how that opacity is managed in HEP.
An important concept for our research is that of an epistemic risk (e.g. Hillerbrand, 2012a; 2014), broadly construed. We will use this concept to investigate the impact of CSs and ML on the discovery potential of the experiment. Among other things, this will include an assessment of the scope and epistemological underpinnings of the management of uncertainties in HEP, when these are induced by the use of CSs and ML.
Our past research has yielded a detailed picture of the intricate relations obtaining between the different simulation models used by ATLAS, which in turn raises challenges for traditional tenets in the epistemology of simulation (unpublished). Other results include a classification of the kinds of CSs relevant to HEP (Hillerbrand, 2012b), detailed studies of the relation between theory, experiment, and simulation (Boge, 2019, forthcoming) and of the (non-)necessity of CSs for HEP experiments (Krämer, Schiemann & Zeitnitz, in prep.), as well as first results on questions of opacity (Boge & Grünke, forthcoming).
Marianne van Panhuys
Boge, F. J. and Zeitnitz, C. (forthcoming). Polycratic hierarchies and networks: What simulation-modeling at the LHC can teach us about the epistemology of simulation. Synthese. doi.org/10.1007/s11229-020-02667-3
Boge, F. J. (forthcoming). How to infer explanations from computer simulations. Studies in History and Philosophy of Science. doi.org/10.1016/j.shpsa.2019.12.003
Boge, F. J. (2019). Why computer simulations are not inferences, and in what sense they are experiments. European Journal for Philosophy of Science, 9(13), 30 pp. doi.org/10.1007/s13194-018-0239-z
Boge, F. J. and Grünke, P. (forthcoming). Computer simulations, machine learning and the Laplacean demon: Opacity in the case of high energy physics. In Resch, Kaminski, and Gehring (eds.), The Science and Art of Simulation II. Springer.
Hillerbrand, R. (2012a). The risk of climate change. In Roeser, S., Hillerbrand, R., Sandin, P., and Peterson, M. (eds.), Handbook of Risk Theory. Springer.
Hillerbrand, R. (2012b). Order out of chaos? A case study in high energy physics. Studia Philosophica Estonica, 5(2):61–78.
Hillerbrand, R. (2014). Climate simulations: Uncertain projections for an uncertain world. Journal for General Philosophy of Science, 54(1):17–32.
Krämer, M., Schiemann, G., and Zeitnitz, C. (in preparation). Could we have done without computer simulations in the Higgs discovery?
Schiemann, G. (2008). Experimental knowledge and the theory of producing it. In Feest, U., Hon, G., Rheinberger, H.-J., Schickore, J., and Steinle, F. (eds.), Generating Experimental Knowledge. Preprints of the Max Planck Institute for the History of Science Berlin, pp. 109–120.
Zeitnitz, C. and Gabriel, T. A. (1994). The GEANT-CALOR interface and benchmark calculations for ZEUS calorimeters. Nuclear Instruments and Methods in Physics Research Section A, 349:106–111.