NIH Blueprint: The Human Connectome Project

HCP Data Releases

Over a 3-year span (2012-2015), the Human Connectome Project (HCP) scanned 1,200 healthy adult subjects. Our goal is to map region-to-region structural and functional connections of the human brain: the "human connectome."

1200 Subjects Release | Mar 2017

Summary: The 1200 Subjects release (final release of new subjects) includes behavioral and 3T MR imaging data from 1206 healthy young adult participants collected in 2012-2015. 3T structural scans are available for 1113 subjects. 46 subjects have 3T HCP protocol Retest data available. 184 subjects have multimodal 7T MR imaging data available. 3T and 7T Diffusion data for all subjects have been re-preprocessed with an updated diffusion pipeline that removes noise caused by subject movement. 3T diffusion bedpostX analysis datasets have been added.

Full 1200 Subjects Release Documentation


Extensively Processed fMRI Data

Netmats Megatrawl (820 Subjects): An analysis of the relationships between imaging and non-imaging measures in the HCP, using multivariate prediction and univariate regression.

Documentation (PDF) | View

HCP_S900_GroupAvg_v1 Dataset (881, 820, and 787 Subjects): This Connectome Workbench-compatible dataset (1.4 GB zip file) includes group-average structural and functional MRI data and an associated tutorial for the HCP S900 data release (December 2015). Composite files containing MSMAll-registered maps of folding, ‘sulc’, myelin, and thickness also enable efficient navigation of individual subject data.

Download in ConnectomeDB (Login Required)

Group Average Functional Connectivity (820 Subjects): Two group average dense functional connectomes have been generated from 820 subjects in the S900 release, one based on MSMAll (recommended for most analyses) and the other based on MSMSulc (less well aligned, available for comparison purposes).

Documentation (PDF) | View in ConnectomeDB (Login Required)

HCP900 Parcellation + Timeseries + Netmats (820 Subjects): Analysis based on data from all 820 subjects in the S900 data release having four complete rfMRI runs (with 100% of collected timepoints), yielding the following outputs at 6 ICA dimensionalities: Group-average parcellations yielded by group-ICA, subject-specific parcellations, subject-specific node timeseries, and a set of subject-specific parcellated connectomes.

Documentation (PDF) | Download in ConnectomeDB (Login Required)

HCP Lifespan Pilot Project Release | Aug 2014

Summary: The WU-Minn HCP consortium is acquiring and sharing pilot multimodal imaging data acquired across the lifespan, in 6 age groups (4-6, 8-9, 14-15, 25-35, 45-55, 65-75) and using scanners that differ in field strength (3T, 7T) and maximum gradient strength (70-100 mT/m).

The scanning protocols are similar to those for the WU-Minn Young Adult HCP, except shorter in duration. The objectives are (i) to enable estimates of effect sizes for identifying group differences across the lifespan and (ii) to enable comparisons across scanner platforms, including data from the MGH Lifespan Pilot. 

Full HCP Lifespan Pilot Project Documentation

MGH Adult Diffusion Dataset Added | Aug 2014

Summary: The MGH-USC HCP team has acquired and shared diffusion imaging data from 35 healthy adults, between the ages of 20 and 59, scanned on the customized Siemens 3T Connectom scanner. This scanner is a modified 3T Skyra system (MAGNETOM Skyra Siemens Healthcare), housed at the MGH/HST Athinoula A. Martinos Center for Biomedical Imaging (see Setsompop et al., 2013 for details of the scanner design and implementation). A 64-channel, tight-fitting brain array coil (Keil et al., 2013) was used for data acquisition. 

MGH Adult Diffusion Documentation | Separate data agreement required

HCP Subjects

Our target number is 1,200 healthy adults, ages 22-35 (~100 scanned/quarter), whose race/ethnicity is representative of the US population. We are recruiting twins and their non-twin siblings in order to enable research on the variability and heritability of brain structure and connectivity.

HCP Protocols

MR datasets are collected from each participant during a 2-day visit to Washington University that includes a demographic survey, behavioral tests, and up to 5 MR scan sessions that include structural, functional (resting state and task) and diffusion imaging.

MEG datasets are collected from a subset of MR participants during a 1-day visit to Saint Louis University. Each visit includes a series of recordings, including task and resting-state MEG.

We follow a Standard Operating Procedure for data collection and use the same Imaging Protocols for every participant.

Several types of data are available in this release:

MRI Data Types

  • Unprocessed NIFTI images for structural MRI, fMRI, dMRI
  • Minimally Preprocessed NIFTI images for structural MRI, fMRI, dMRI
  • ICA-FIX Denoised rfMRI data
  • Group Average Functional MRI Data on 40 unrelated subjects and 120 subjects, some of whom are related
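The unprocessed and minimally preprocessed images above are distributed as NIfTI files. As an illustration of what that container looks like, the sketch below builds and parses a minimal NIfTI-1 header using only the Python standard library. In practice you would read HCP files with a dedicated library such as nibabel; the 91×109×91 shape here is just an illustrative MNI-space grid, not a claim about any particular HCP file.

```python
import struct

def make_minimal_nifti1_header(shape, datatype=16):
    """Build an illustrative 348-byte NIfTI-1 header (not a complete file)."""
    hdr = bytearray(348)
    struct.pack_into('<i', hdr, 0, 348)                 # sizeof_hdr must be 348
    dim = [len(shape)] + list(shape) + [1] * (7 - len(shape))
    struct.pack_into('<8h', hdr, 40, *dim)              # dim[0] = ndim, dim[1..] = extents
    struct.pack_into('<h', hdr, 70, datatype)           # 16 = 32-bit float
    struct.pack_into('<h', hdr, 72, 32)                 # bitpix
    hdr[344:348] = b'n+1\x00'                           # magic for single-file NIfTI
    return bytes(hdr)

def read_nifti1_dims(header_bytes):
    """Parse image dimensions and datatype code from a NIfTI-1 header."""
    assert struct.unpack_from('<i', header_bytes, 0)[0] == 348
    dim = struct.unpack_from('<8h', header_bytes, 40)
    datatype = struct.unpack_from('<h', header_bytes, 70)[0]
    ndim = dim[0]
    return dim[1:1 + ndim], datatype

hdr = make_minimal_nifti1_header((91, 109, 91))
shape, dtype_code = read_nifti1_dims(hdr)
print(shape, dtype_code)   # (91, 109, 91) 16
```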

MEG Data Types

  • Unprocessed MEG data in 4D Neuroimaging format
  • Co-registration information that allows coordinate transformations between individual subject MEG coordinate systems and the MNI coordinate system
  • Volume conduction model of the head and regular 3-D source models in MATLAB format
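The co-registration information amounts to an affine transformation between coordinate systems. A minimal numpy sketch of applying such a transform, with a made-up 4×4 matrix standing in for the real co-registration output:

```python
import numpy as np

# Hypothetical 4x4 homogeneous affine mapping subject MEG (head) coordinates
# to MNI coordinates; the real matrix comes from the released co-registration
# files, and these values are purely illustrative.
meg_to_mni = np.array([
    [ 0.99,  0.01, 0.00,  2.0],
    [-0.01,  0.98, 0.05, -3.5],
    [ 0.00, -0.05, 0.99,  1.2],
    [ 0.00,  0.00, 0.00,  1.0],
])

def transform_points(affine, points):
    """Apply a homogeneous 4x4 affine to an (N, 3) array of coordinates."""
    pts = np.asarray(points, dtype=float)
    homo = np.hstack([pts, np.ones((pts.shape[0], 1))])   # (N, 3) -> (N, 4)
    return (homo @ affine.T)[:, :3]

sensors_head = np.array([[10.0, 0.0, 50.0], [-10.0, 5.0, 48.0]])
sensors_mni = transform_points(meg_to_mni, sensors_head)  # (2, 3) MNI coords
```

The inverse mapping (MNI back to head coordinates) is simply the same operation with `np.linalg.inv(meg_to_mni)`.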

Scan and Subject Metadata

  • Task data scores in CSV format, and task stimuli in AVI format
  • Behavioral data scores in CSV format
  • E-Prime log files and physiological data in tab-delimited spreadsheets
  • Additional modality-specific log and reference data
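Reading these CSV score files requires nothing beyond the standard library. A sketch using assumed column names and fabricated score values (the actual columns are defined in the HCP data dictionary):

```python
import csv
import io

# Hypothetical excerpt of a behavioral-scores CSV; subject IDs, column names,
# and values here are placeholders, not real released data.
sample = """Subject,Gender,PMAT24_A_CR,Flanker_Unadj
100307,F,17,112.56
100408,M,21,121.18
"""

with io.StringIO(sample) as f:
    rows = list(csv.DictReader(f))

# Index scores by subject ID for easy joining with imaging data.
scores = {row["Subject"]: row for row in rows}
print(scores["100307"]["Flanker_Unadj"])   # 112.56
```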

Warning: Big Data

The high-resolution image data collected on each HCP subject results in file sizes that are quite large. The full set of released data, including 3T, 7T, and MEG scan data, may take 100TB or more of disk space. Therefore, if you choose to obtain even a subset of HCP data via download, you should be prepared to experience long download times and to devote an appropriately large amount of disk space to housing the data.

We recommend that groups of users at the same institution organize themselves to obtain a single copy of HCP data for local distribution (via joint purchase of "Connectome in a Box" or a single download).

As an alternative, HCP data is also accessible via Amazon Web Services, where processing can be performed without transferring the data from us to you. Learn more about how to access HCP data via AWS on our wiki.

Avoid Data Version Conflicts

HCP data release versions can contain appreciable differences that can affect your research.

We are releasing data as it is deemed to be "production-ready," as we want to be responsive to demand from the scientific community. However, there are still processes ongoing inside the WU-Minn HCP Consortium to improve our data processing. As such, it is imperative that users keep track of which data version they are using and avoid mixing old data with new in their research.

Users can track data improvements and ongoing known issues on the public HCP Data Wiki.

As a reminder, HCP strongly advises against mixing data processed with different whole number versions of the processing pipelines. Check the release notes in the {SubjectID}/releasenotes directory distributed with each subject dataset to ensure all data used in your analyses have been processed with the same whole number pipeline versions. For example, data released at the 500, 900, and 1200 Subjects Releases were processed using v3 of the HCP pipelines and should not be used in analyses with HCP data processed with v1 or v2 versions of the HCP pipelines.
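One way to follow this advice programmatically is to scan each subject's releasenotes directory for pipeline version strings before pooling data. A sketch, assuming the notes are plain-text files that mention a version of the form v<major>.<minor> (the actual file names and wording may differ, so treat the glob pattern and regex as assumptions):

```python
import re
import tempfile
from pathlib import Path

def whole_pipeline_versions(data_root):
    """Collect whole-number pipeline versions mentioned in each subject's
    releasenotes files (assumed layout: <root>/<SubjectID>/releasenotes/*.txt)."""
    versions = {}
    for notes in Path(data_root).glob("*/releasenotes/*.txt"):
        text = notes.read_text(errors="ignore")
        found = {int(m.group(1)) for m in re.finditer(r"\bv(\d+)\.\d+", text)}
        if found:
            versions.setdefault(notes.parts[-3], set()).update(found)
    return versions

def check_consistent(versions):
    """Return True when all subjects share one whole-number pipeline version."""
    all_majors = set().union(*versions.values()) if versions else set()
    return len(all_majors) <= 1

# Demo on a throwaway directory tree with placeholder subject IDs.
root = tempfile.mkdtemp()
for subj, ver in [("100307", "v3.4.0"), ("100408", "v3.19.0")]:
    d = Path(root, subj, "releasenotes")
    d.mkdir(parents=True)
    (d / "notes.txt").write_text(f"Processed with HCP Pipelines {ver}\n")

print(check_consistent(whole_pipeline_versions(root)))   # True: both are v3
```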

Diffusion Data Warning

The updated diffusion pipeline (v3.19.0) produced cleaner diffusion data than the earlier v3 pipelines used for the S900 and S500 releases. Due to this upgrade, we recommend that users only use S1200 release data to compare preprocessed diffusion data between subjects.

Get Updates on Data Releases

Sign up to receive announcements and/or join the HCP Data Users email discussion group on our Contact Us page.