NIHR Biomedical Research Centre for Mental Health: Computational Biology


This website documents the NIHR Biomedical Research Centre for Mental Health (BRC-MH) Linux Cluster. Much of the documentation format has been taken from the SGDP Cluster website, which is recommended as a great resource for a Linux cluster introduction.


Hardware

Incorporated into the cluster are two HP C7000 enclosures, each containing 15 x HP BL460c G6 blades. Each blade is configured with 2 x 4-core Intel® Xeon® Processor X5550. Of the 30 blades, 15 contain 78 GB of RAM and the remaining 15 contain 54 GB of RAM.

In each HP C7000 blade enclosure, the first blade is configured as a head node and contains two 146 GB 15k SAS drives in a RAID 1 mirror configuration. The head node of the first enclosure serves a CentOS 5.4 operating system to the other 28 diskless compute nodes across the two enclosures. The calculated performance of the blade systems is 2.272 TFLOPS using HPL 2.0.
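As a rough sanity check on that figure (assuming the X5550's nominal 2.66 GHz clock and 4 double-precision FLOPs per core per cycle, both standard figures for this processor generation rather than values from this cluster's documentation), the theoretical peak of the 30 blades works out to:

```
R_peak = 30 blades x 2 sockets x 4 cores x 2.66 GHz x 4 FLOPs/cycle
       = 2553.6 GFLOPS ~= 2.55 TFLOPS
```

Against this estimated peak, the measured 2.272 TFLOPS corresponds to roughly 89% HPL efficiency, which is in the range typically reported for hardware of this generation.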

All nodes of the cluster are also connected via InfiniBand, providing a separate network path for MPI applications.

In addition to the blade cluster, an HP DL580 G5 system containing 4 x 6-core Intel® Xeon® Processor X7460 and 144 GB of memory is included. This system can be used to run jobs which require large amounts of memory, as well as shared-memory MPI jobs. The calculated performance of the system is 142 GFLOPS using HPL 2.0.

Storage is provided by three Panasas ActiveStor shelves offering 120 TB of disk, connected to all nodes of the cluster via 10 Gb Ethernet (SFP+).

Operating Systems

All nodes in the cluster use x86_64 Linux as their operating system. The head nodes of the two C7000 blade enclosures and the large-memory server run Red Hat Enterprise Linux 5.4, while the 28 diskless compute nodes run CentOS 5.4.

Sun Grid Engine is used to schedule jobs across the cluster, and information on using this system can be found in the wiki.
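As a quick illustration of how a job reaches the scheduler, the script below is a minimal Sun Grid Engine batch script. It is a sketch only: the parallel environment name "mpi" and the resource limits are site-specific assumptions, not values taken from this cluster's configuration — check the wiki for the queues and parallel environments actually defined here.

```shell
#!/bin/bash
# Minimal SGE batch script (sketch; PE name and limits are assumptions).
#$ -N example_job        # job name shown in qstat
#$ -cwd                  # run in the directory the job was submitted from
#$ -pe mpi 8             # request 8 slots in a hypothetical "mpi" PE
#$ -l h_vmem=2G          # per-slot virtual memory limit

# SGE sets $NSLOTS to the number of slots granted to the job.
echo "Running on $(hostname) with ${NSLOTS:-unknown} slots"
```

The script would be submitted with `qsub example_job.sh` and monitored with `qstat`; the `#$` lines are comments to the shell but are read as options by SGE.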


Software

A wide range of state-of-the-art bioinformatics tools has been installed on the cluster, providing systems for machine learning, next-generation sequencing and genome-wide association studies, to mention only a few. In addition, MathWorks MATLAB is available with a total of 64 licenses. A comprehensive list of software can be viewed here.

Performance Benchmark

Using the High-Performance Linpack Benchmark 2.0 (HPL) and the GotoBLAS libraries, a combined performance of 2.4 TFLOPS was calculated. All components were compiled using GCC 4.1.2.

Blade cluster (2.272 TFLOPS):
NB: 160
P: 5
Q: 6
Threshold: 16.0
A 5 x 6 process grid (30 processes) was run over the 30 blades, with 8 threads on each node.

Large-memory server (142 GFLOPS):
NB: 160
P: 2
Q: 2
Threshold: 16.0
A 2 x 2 process grid (4 processes) was run on the single server, with 6 threads on each CPU.
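For reference, these parameters correspond to lines in HPL's input file, HPL.dat. The fragment below sketches how the blade-cluster run's values map onto that file; the problem size N shown is purely illustrative (the actual N used is not recorded above), and the surrounding lines of a full HPL.dat are omitted.

```
1            # of problems sizes (N)
100000       Ns          (illustrative only; actual N not documented here)
1            # of NBs
160          NBs
1            # of process grids (P x Q)
5            Ps
6            Qs
16.0         threshold
```

P x Q gives the number of MPI processes (here 5 x 6 = 30, one per blade), while the threshold is the residual check HPL uses to decide whether a run passed.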