View inside a StorageTek tape server in the CERN Computer Centre.
(Image: CERN)


Collisions in the LHC generate particles that often decay in complex ways into even more particles. Electronic circuits record the passage of each particle through a detector as a series of electronic signals, and send the data to the CERN Data Centre for digital reconstruction. The digitised summary is recorded as a 'collision event'. Up to about 1 billion particle collisions can take place every second inside the LHC experiments' detectors. It is not possible to read out all of these events. A 'trigger' system is therefore used to filter the data and select only those events that are potentially interesting for further analysis.
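The trigger idea described above can be sketched as a staged filter: a fast first-level selection discards most events immediately, and a slower software stage refines what survives. This is a minimal illustrative sketch, not the real ATLAS/CMS trigger menus; the event fields, thresholds, and rates below are invented for demonstration.

```python
import random

# Hypothetical event record: a total transverse energy (GeV) and a flag
# for whether a high-momentum muon candidate was seen. Both the fields
# and the thresholds below are illustrative assumptions, not real values.
def make_event(rng):
    return {
        "sum_et": rng.expovariate(1 / 50.0),   # most events are soft
        "has_muon": rng.random() < 0.01,       # rare hard-muon signature
    }

def level1_trigger(event, et_threshold=200.0):
    """Fast hardware-style cut: keep events with a large energy deposit
    or a muon candidate; everything else is discarded immediately."""
    return event["sum_et"] > et_threshold or event["has_muon"]

def high_level_trigger(event):
    """Software stage: a placeholder refined selection on L1 survivors."""
    return event["sum_et"] > 250.0 or event["has_muon"]

rng = random.Random(0)
events = [make_event(rng) for _ in range(100_000)]
after_l1 = [e for e in events if level1_trigger(e)]
selected = [e for e in after_l1 if high_level_trigger(e)]
print(f"{len(events)} generated -> {len(after_l1)} after L1 "
      f"-> {len(selected)} stored")
```

Even with these toy thresholds, only a few percent of events survive, which is the point of the design: the vast majority of collisions are rejected before they ever reach storage.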

Even after the drastic data reduction performed by the experiments, the CERN Data Centre processes on average one petabyte (one million gigabytes) of data per day. The LHC experiments produce over 50 petabytes of data per year, and other (non-LHC) experiments at CERN produce an additional 25 petabytes per year. Archiving these vast quantities of data is an essential function at CERN. Magnetic tape is used as the main long-term storage medium, and data in the archive is continuously migrated to newer-technology, higher-density tapes.
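The figures quoted above can be cross-checked with simple arithmetic. The snippet below uses only the numbers stated in the text (the derived per-day average is a rough figure, ignoring the fact that the accelerators do not run all year):

```python
# Values taken from the text; the per-day figure is a derived average.
PB_IN_GB = 1_000_000          # the text defines 1 PB = one million GB

lhc_per_year_pb = 50          # LHC experiments
other_per_year_pb = 25        # non-LHC experiments at CERN
total_per_year_pb = lhc_per_year_pb + other_per_year_pb

avg_per_day_pb = total_per_year_pb / 365
print(f"Archive growth: {total_per_year_pb} PB/year, "
      f"~{avg_per_day_pb:.2f} PB/day averaged over a calendar year")
```

Note that the averaged archive growth (~0.2 PB/day) is smaller than the ~1 PB/day the Data Centre processes, since much of the processed data is intermediate and not all of it is archived.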

The CERN storage system, EOS, was created for the extreme computing requirements of the LHC. EOS instances at CERN are approaching two billion files (as of 2018), matching the exceptional performance of the LHC machine and experiments. EOS is now expanding to data storage needs beyond high-energy physics: AARNET, the Australian Academic and Research Network, and the EU Joint Research Centre for Digital Earth and Reference Data have adopted it for their big-data systems.

The trigger: how collision events in ATLAS are selected for storage (Video: Daniel Dominguez/CERN)