Scientific data production and processing

IN2P3 researchers need very large computing power to process data from major scientific instruments (accelerators, telescopes or satellites) and to build models and validate scientific theories. To do this they use the thousands of CC-IN2P3 servers, across which calculations are distributed by a job scheduling system. Calculations may be sequential or parallel. The computing servers run Linux and mostly feature Intel x86_64 processors, complemented by a smaller number of graphics processing units (GPUs). These resources may be used locally or remotely via a computing grid infrastructure.
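As a rough illustration of how work reaches such a batch system, the sketch below submits a simple sequential job from Python. It assumes a Slurm-like scheduler with an `sbatch` command; the scheduler actually used at CC-IN2P3, as well as the job name, resource requests and analysis executable shown here, are illustrative assumptions rather than the centre's real configuration.

```python
# Hypothetical sketch: submitting a sequential job to a Slurm-like batch
# scheduler from Python. Scheduler, resource values and executable name
# are illustrative assumptions, not CC-IN2P3's actual setup.
import subprocess
import tempfile

JOB_SCRIPT = """#!/bin/bash
#SBATCH --job-name=analysis_demo     # illustrative job name
#SBATCH --cpus-per-task=1            # sequential job: a single core
#SBATCH --mem=4G                     # assumed memory request
#SBATCH --time=02:00:00              # assumed wall-clock limit

./process_events input.dat output.dat   # hypothetical analysis executable
"""

def submit(script_text: str) -> str:
    """Write the job script to a temporary file and hand it to the scheduler."""
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
        f.write(script_text)
        path = f.name
    # 'sbatch' queues the script; the scheduler decides where and when it runs.
    result = subprocess.run(["sbatch", path], capture_output=True, text=True, check=True)
    return result.stdout.strip()   # e.g. "Submitted batch job 123456"

if __name__ == "__main__":
    print(submit(JOB_SCRIPT))
```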

Massive data storage

Scientific data is stored in a mass storage system that combines various magnetic tape and conventional hard disk technologies. The type of storage used depends on the access performance required and on how long the data must be archived. The CC-IN2P3 has a storage capacity of 20 petabytes on disk and 340 petabytes on magnetic tape.
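The kind of placement policy described above can be pictured as a simple rule mapping access frequency and retention time to a tier. The sketch below is a hypothetical illustration only; the thresholds and the `Dataset` fields are assumptions, not CC-IN2P3's actual rules.

```python
# Hypothetical sketch of a disk/tape placement policy: frequently read data
# stays on disk, data kept mainly for long-term archiving goes to tape.
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    DISK = "disk"        # fast random access, smaller total capacity
    TAPE = "tape"        # slower sequential access, very large capacity

@dataclass
class Dataset:
    name: str
    reads_per_month: int     # how often the data is accessed
    retention_years: int     # how long it must be kept

def choose_tier(ds: Dataset) -> Tier:
    """Pick a storage tier from access frequency and archive duration."""
    if ds.reads_per_month >= 10:          # assumed "hot data" threshold
        return Tier.DISK
    if ds.retention_years >= 5:           # assumed long-term archive threshold
        return Tier.TAPE
    return Tier.DISK

if __name__ == "__main__":
    for ds in (Dataset("raw_run_2016", 1, 15), Dataset("calib_tables", 50, 2)):
        print(ds.name, "->", choose_tier(ds).value)
```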

Network

The CC-IN2P3 is responsible for the Internet connectivity of all the institute’s laboratories and of several experiment sites where data is produced. To fulfil this role, the CC-IN2P3 relies on RENATER, the national electronic communication network for technology, education and research. The CC-IN2P3 thus has a dedicated 20 Gbit/s link to CERN, a 30 Gbit/s link to LHCONE, the international private network devoted to the LHC, a 20 Gbit/s link to the NCSA at the University of Illinois for the LSST project, and a 20 Gbit/s link to the general RENATER and GEANT networks. The CC-IN2P3 is working with RENATER to increase its external links to up to 100 Gbit/s in 2017. The CC-IN2P3 also has a unique internal network with a 400 Gbit/s backbone and more than 1,000 10 Gbit/s connections, making it one of the fastest in the Auvergne-Rhône-Alpes region.
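To give a sense of what these link capacities mean in practice, the short sketch below computes the ideal transfer time for a dataset over each external link. The capacities come from the figures above; the 100 TB dataset size is an arbitrary example, and protocol overhead and competing traffic are ignored.

```python
# Back-of-the-envelope sketch: ideal transfer times over the external links
# listed above. Link capacities are taken from the text; the dataset size is
# an arbitrary example, and real throughput would be lower.
LINKS_GBIT_PER_S = {
    "CERN (dedicated)": 20,
    "LHCONE": 30,
    "NCSA (LSST)": 20,
    "RENATER / GEANT": 20,
}

def transfer_time_hours(dataset_tb: float, link_gbit_s: float) -> float:
    """Ideal transfer time in hours for dataset_tb terabytes at link_gbit_s Gbit/s."""
    bits = dataset_tb * 1e12 * 8           # terabytes -> bits
    return bits / (link_gbit_s * 1e9) / 3600

if __name__ == "__main__":
    size_tb = 100                          # example: a 100 TB dataset
    for name, capacity in LINKS_GBIT_PER_S.items():
        print(f"{name}: {transfer_time_hours(size_tb, capacity):.1f} h")
```

At 20 Gbit/s, for instance, 100 TB takes roughly 11 hours under these ideal assumptions.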
