The IT team is composed of:

  • Project Manager and Infrastructure Expert: Anne-Sophie Ledoux (IR CNRS)
  • Systems and Networks Administrator: Vincent Massy (IE University of Lille)


and is responsible for:

  • Administration and maintenance of IT resources (servers, storage spaces, databases, software) in operational condition
  • Scaling up computing capacity, in terms of both processing power and data storage
  • Management of day-to-day services (email, webmail, web, file hosting, collaborative platform, …)
  • Technical choices and installation of services for EGID – European Genomic Institute for Diabetes – and for PreciDIAB – National Center of Precision Medicine for Diabetes
  • Technical support and user assistance for the entire unit (70 people)
  • IT security in line with the Information Systems Security Policy of the host institutions and the recommendations of ANSSI – the French National Agency for Information Systems Security



The laboratory operates 26 physical servers (Linux and Windows) hosted in a dedicated machine room equipped with the main security systems required by the Datacenter Tier III classification (fire alarm and fire-suppression systems, redundant air conditioning and power supply, remote site for data replication), along with about a hundred workstations and some twenty scientific cockpits; it has been certified as an Automated Information Processing Center (AIPC) since 2008. The laboratory also has access to the hybrid cluster integrated and managed by the Mesocenter for Intensive Scientific Computing at the University of Lille, as well as to the computing cluster of the GenOuest bioinformatics platform, both available through the EGI grid.




  • Bioinformatics: 10 DELL servers running Debian 10 (5xC6420/2xR740/3xR930), with a total of 12 NVIDIA T4 GPU cards (NVIDIA Clara Parabricks), 1,098 CPU cores and 10 TB of RAM
  • Biostatistics: 4 DELL servers running Debian 10 (1xC6420/2xR910/1xR930), with a total of 458 CPU cores and 2.5 TB of RAM
  • Database management/schedulers: 6 DELL servers (2xR430/1xR910/3xR920) running Debian 10, Proxmox 6.5, MongoDB, Nextflow and Slurm
  • User management, computer management, security and monitoring: 2 DELL R440 servers running VMware 6.7u3
  • DMZ service management: 3 DELL R640 servers running Proxmox 6.5 with Debian 10 virtual machines, plus 1 NFS storage system (Dell Isilon H500)
  • Working storage and dedicated sequencing storage: 1 Dell Isilon cluster consisting of 4 H500 nodes and 8 A2000 nodes, interconnected over 10 GbE (25 GbE for sequencing), with a total usable capacity of 2.11 PB
  • Storage dedicated to HPC computations: 1 Dell PowerScale cluster composed of 6 F200 nodes interconnected over 25 GbE, with a total usable capacity of 41 TB
  • Storage dedicated to the Disaster Recovery Plan (DRP) and data archiving: 1 Dell Isilon cluster composed of 4 A2000 nodes interconnected over 10 GbE, with a total usable capacity of 865 TB. This cluster is hosted at the University of Lille
  • Backup: 1 DELL R740XD server running VMware ESXi 6.7u3 and hosting 3 virtual servers (Data Domain Virtual Edition (DDVE), Avamar Virtual Edition (AVE) and an Avamar accelerator), interconnected over 25 GbE with 1 Dell MD1400 array for a total usable capacity of 50 TB
  • Palo Alto site firewall: 2 x PA-3250 appliances in high availability (HA)
  • Public Key Infrastructure: encryption of exchanges to and from the research unit’s internal network (management of SSL security certificates, SHA fingerprints and public keys)
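As an illustration of the certificate handling the PKI item involves, the sketch below generates a throwaway self-signed certificate with OpenSSL and prints its SHA-256 fingerprint; the subject name and file paths are placeholders invented for the example, not details of the unit's actual infrastructure.

```shell
# Generate a short-lived self-signed certificate (subject and paths are
# illustrative placeholders, not from the unit's infrastructure).
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=demo.internal" \
  -keyout /tmp/demo.key -out /tmp/demo.crt -days 1 2>/dev/null

# Print the certificate's SHA-256 fingerprint
openssl x509 -in /tmp/demo.crt -noout -fingerprint -sha256
```

Comparing such a fingerprint against a previously pinned value is a simple way to detect an unexpected certificate change on an internal service.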




  1. GNU/Linux: Debian, CentOS, SUSE, Ubuntu, Raspberry Pi, IoT
  2. Microsoft Windows: Windows Server 2019 and 2022, Windows 10
  3. Apple: macOS

Virtualization/containerization: VMware, Proxmox, Docker

Antivirus: WithSecure 15.30, ClamAV

Computer management: GLPI 10, FusionInventory, DokuWiki

Backup: BackupPC, Dell EMC Avamar Client 19.3

Remote Site Replication/Synchronization: EyeGlass Superna 2.5.8

Supervision and monitoring: Zabbix, Grafana, Prometheus, InsightIQ

Audit/Hardening Systems: Lynis, PurpleKnight Community
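The audit/hardening tools listed above automate large batteries of system checks; the minimal sketch below illustrates one such check, flagging world-writable files in a directory. The directory and file names are invented for the example, and this is only an illustration of the kind of test Lynis runs, not a substitute for it.

```shell
# Minimal illustration of a hardening check of the kind Lynis automates:
# find world-writable files under a directory (names are invented).
mkdir -p /tmp/audit_demo
touch /tmp/audit_demo/ok.conf /tmp/audit_demo/bad.conf
chmod 644 /tmp/audit_demo/ok.conf
chmod 666 /tmp/audit_demo/bad.conf   # world-writable: should be flagged

# -perm -0002 matches any file whose "other write" permission bit is set
find /tmp/audit_demo -type f -perm -0002
```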



For more information, please contact: