
The national HPC landscape

DeiC Interactive HPC is part of the national HPC landscape, which consists of different types of supercomputers available to all researchers affiliated with the Danish universities. The purpose of this landscape is to ensure the necessary and sufficient computing power for Danish research now and in the future.

The universities handle operations and development, while DeiC has a coordinating role. The national HPC landscape is financed by the Danish universities and the Ministry of Higher Education and Science.


DeiC Interactive HPC 

DeiC Interactive HPC is an interactive digital research environment accessed through UCloud. 

Characteristics of Interactive HPC: 

  • Interactive – you can interact with your jobs while they are running.
  • Persistent storage.
  • Access management at the project and subproject level. A project PI can create access groups for their project data, specified at the group or individual level.
  • Preinstalled applications that can be customized as needed for individual app instantiations.
  • Persistent software and library stack for e.g. Python in Jupyter, RStudio or Julia (see the sketch after this list).
  • Both students and staff have access to Interactive HPC.
  • Requires only a minimal level of technical expertise, making it suitable for new users and students as well as experienced users.
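
Because the software stack is persistent, packages installed into project storage remain available across app instantiations. Below is a minimal sketch of one way to achieve this in a Python session on UCloud; the storage path is hypothetical and depends on how your project drive is mounted.

```python
import subprocess
import sys

# Hypothetical mount point of persistent project storage on UCloud.
PERSISTENT_LIBS = "/work/MyProject/python-libs"

# Install a package into the persistent directory so it survives
# the end of this Jupyter session.
subprocess.run(
    [sys.executable, "-m", "pip", "install",
     "--target", PERSISTENT_LIBS, "numpy"],
    check=True,
)

# Make the persistent directory importable in this (and any future) session.
sys.path.insert(0, PERSISTENT_LIBS)
import numpy as np
print(np.__version__)
```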

DeiC Large Memory HPC 

DeiC Large Memory HPC is a small cluster intended for calculations that require a large amount of memory, configured as a traditional HPC system.

Large Memory HPC targets problems whose structure cannot be easily or efficiently distributed across many compute nodes. Such systems are typically characterized by relatively few cores with access to a large, globally addressable memory area.

The system is particularly well suited to certain areas of classical chemistry, physics and signal processing, for example the handling of large matrix problems, as well as quantum chemistry.
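
To see why a large, globally addressable memory matters, consider a dense matrix problem: an n × n double-precision matrix occupies n² × 8 bytes, so at n = 100,000 the matrix alone takes about 80 GB, before any solver workspace. The sketch below illustrates the arithmetic and, at a scaled-down size, the kind of computation involved.

```python
import numpy as np

# A dense float64 matrix of size n x n occupies n * n * 8 bytes.
n = 100_000
print(f"Matrix alone: {n * n * 8 / 1e9:.0f} GB")  # ~80 GB

# Scaled-down demo of the same computation: solving A x = b.
# On a large-memory node the same call could run at a much larger n,
# where the matrix no longer fits in an ordinary node's RAM.
n_demo = 2_000
rng = np.random.default_rng(0)
A = rng.random((n_demo, n_demo))
b = rng.random(n_demo)
x = np.linalg.solve(A, b)
print("Residual OK:", np.allclose(A @ x, b))
```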

Characteristics of Large Memory HPC: 

  • Mostly unattended batch jobs 
  • Large in-memory datasets 
  • Large workflows 
  • Requires some level of technical expertise to use 

DeiC Throughput HPC 

The DeiC Throughput HPC system is represented in the national HPC landscape by compute resources delivered by the Sophia HPC cluster, the GenomeDK cluster and the Computerome2 cluster.

Throughput HPC is typically used for calculations on big data within areas such as health science, technical simulations, chemistry, physics and bioinformatics. 

Characteristics of Throughput HPC: 

  • Mostly unattended batch jobs (see the sketch after this list)
  • Many jobs/many cores 
  • Large workflows (inhomogeneous jobs) 
  • Requires some level of technical expertise to use 
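
Unattended batch jobs are submitted to a scheduler and run without further user interaction. As a minimal sketch, assuming a Slurm-managed cluster (scheduler and script details vary between Sophia, GenomeDK and Computerome2; the account name, resource limits and analyse.py script below are hypothetical placeholders), a job array covering many similar jobs could be generated and submitted like this:

```python
import subprocess
from pathlib import Path

# Hypothetical Slurm batch script: a job array of 100 similar jobs,
# each processing one input file. Account and resource values are
# placeholders.
script = """#!/bin/bash
#SBATCH --job-name=throughput-demo
#SBATCH --account=my_project
#SBATCH --cpus-per-task=8
#SBATCH --mem=32G
#SBATCH --time=12:00:00
#SBATCH --array=1-100

python analyse.py input_${SLURM_ARRAY_TASK_ID}.dat
"""

Path("job.sh").write_text(script)

# sbatch queues the array; each task then runs unattended and writes
# its output to slurm-<jobid>_<taskid>.out by default.
subprocess.run(["sbatch", "job.sh"], check=True)
```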

LUMI Capability HPC 

Capability HPC is the European pre-exascale supercomputer LUMI, located at the CSC data centre in Finland. LUMI is financed 50% by the EuroHPC Joint Undertaking and 50% by a consortium of countries, of which Denmark funds 3%.

LUMI offers a setup similar to that of Throughput HPC, but with state-of-the-art hardware that enables LUMI to handle calculations exceeding the capacity of Throughput HPC, whether the limiting factor is latency or memory.

Characteristics of Capability HPC: 

  • Mostly unattended batch jobs 
  • Many jobs/many cores 
  • Large workflows 
  • Requires some level of technical expertise to use 

DeiC Integration Portal

It is now possible for researchers to access the DeiC HPC systems via a common national portal. The majority of DeiC HPC services are already part of the portal: DeiC Interactive HPC, DeiC Large Memory HPC, and part of DeiC Throughput HPC (Sophia). The remaining parts of DeiC Throughput HPC will be added in the future.  

For LUMI, the integration is for the time being limited to project management and requests for resources; it is not yet possible to run jobs on LUMI directly from the DeiC Integration Portal.