NIH Blueprint: The Human Connectome Project

HCP Disease-related Connectome Research FAQ

Use this FAQ for questions about HCP protocols and data sharing as they relate to applications for the NIH FOA Connectomes Related to Human Disease (U01) (https://grants.nih.gov/grants/guide/pa-files/PAR-14-281.html).

What protocols are used for HCP multimodal data acquisition?

What MRI hardware is needed for HCP data acquisition protocols?

  • Multiband pulse sequences used in HCP fMRI and dMRI are available on multiple Siemens 3T platforms (Prisma, Skyra, and Trio) using 32-channel head coils and on the Siemens 7T scanner; they are currently being implemented for GE and Philips platforms. Email Essa Yacoub (yaco0006@umn.edu) to obtain these pulse sequences or for additional guidance.
  • Diffusion MRI (dMRI) benefits from high maximal gradient strength, as is available on the Siemens Prisma (80 mT/m) as well as the customized Siemens MAGNETOM Connectom scanners at WashU (100 mT/m; see Sotiropoulos et al., 2013) and MGH (300 mT/m; see Setsompop et al., 2013).

What are the HCP processing pipelines for the HCP protocol?

  • See Glasser et al. (2013) for an overview of the HCP Minimal PreProcessing (MPP) pipelines, whose goals include distortion correction, motion correction, cross-modal alignment, generation of surfaces and subcortical segmentation, and cross-subject alignment to a standard CIFTI grayordinates space, and which encompass:
    • Structural preprocessing (PreFreeSurfer, FreeSurfer, PostFreeSurfer)
    • fMRI preprocessing
    • dMRI preprocessing
  • Additional HCP processing pipelines include:
    • FIX denoising for rfMRI
    • Level 2 (cross-run) and Level 3 (cross-subject) tfMRI processing
    • Other HCP pipelines for advanced processing of rfMRI data and better surface-based alignment across subjects are currently under development.
  • The pipeline scripts used for the HCP 500 subject data release are available with associated documentation at https://github.com/Washington-University/Pipelines.
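The stage order described above can be sketched in a short shell dry run. This is an illustrative sketch only: the clone step is shown as a comment, and the per-stage script names are assumptions based on the repository's layout (PreFreeSurfer, FreeSurfer, PostFreeSurfer directories), not an official install or run procedure.

```shell
# Fetch the pipeline scripts (shown as a comment; requires network access):
#   git clone https://github.com/Washington-University/Pipelines.git

# Dry run: print the structural minimal-preprocessing stages in the
# order they are applied (Glasser et al., 2013). Script names below are
# assumptions based on the repository layout.
for stage in PreFreeSurfer FreeSurfer PostFreeSurfer; do
    echo "would run: ${stage}/${stage}Pipeline.sh"
done
```

Actual invocations require subject-specific arguments and an environment setup script; consult the repository's documentation before running the pipelines on real data.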

What are the HCP Quality Assurance procedures?

What are the HCP behavioral measures?

What if the population we propose to study cannot handle the full HCP protocol (4 hours of imaging, several hours of behavioral testing)?

  • The HCP is currently undertaking a Lifespan Pilot Project whose scanning protocols are similar to those of the WU-Minn Young Adult HCP but shorter in duration and amenable to children and older adults.
  • Detailed information about these modified behavioral and imaging protocols is available at http://lifespan.humanconnectome.org/data/phase1b-pilot-parameters.html.

What is the Connectome Coordination Facility (CCF) and how can I plan to make use of it?

  • A Connectome Coordination Facility (CCF), centered at Washington University (WashU) and also involving the University of Minnesota (UMinn), is expected to be established in 2015.
    • The UMinn HCP team will provide advice and guidance regarding HCP data acquisition protocols;
    • The WashU HCP team will host data sharing as part of an expanded ConnectomeDB database currently under development.
  • Imaging data from the Lifespan Pilot Project is being made available through ConnectomeDB.
    • This will include data obtained from 3 scanners used by the WU-Minn HCP consortium (Siemens 3T MAGNETOM Connectom at WashU; Siemens 3T Prisma and 7T at UMinn) and one scanner used by the MGH team (Siemens 3T MAGNETOM Connectom).

Will the CCF help manage and process the data collected in projects funded by the Connectomes Related to Human Disease FOA?

  • While all awardees will be required to submit their data to the CCF, the methods they use internally to manage, process, and analyze their data will be at their own discretion and expense.
  • The CCF plans to provide documentation and example datasets that illustrate how to set up HCP pipelines, test them with example data, and evaluate that they are working properly.
  • Awardees who wish to work directly with the CCF to manage their study data as part of the study’s internal operations should contact Dan Marcus (dmarcus@wustl.edu) to discuss logistical and budgetary considerations.

Do I need to include a budget in my proposal for CCF data hosting?

  • It depends.
    • If you plan to implement the HCP data acquisition and preprocessing pipelines at your institution and then send the data to the CCF, no additional budget request is needed.
    • If you plan to collaborate with the CCF to implement additions to or changes from the existing HCP data collection or data analysis process, you should request an appropriate subcontract budget to work with the CCF.
    • If you propose to use data analysis pipelines in addition to those that currently exist, it is expected that those pipelines will be made available to the research community via the CCF.

How will access to CCF data be regulated?

  • NIH expects that data hosted on CCF will be made broadly available: “Applicants must use appropriate consent forms to allow the data to be deposited in and distributed from the Connectome Coordination Facility.  Those consent forms must make the data broadly available to qualified researchers”.
  • ConnectomeDB currently provides access to all HCP imaging data and most behavioral data after agreement with HCP Open Access Data Use Terms; sensitive data are handled through Restricted Data Use Terms.
  • The CCF will provide information and advice to interested investigators regarding consent forms used by the HCP (contact Sandra Curtiss, scurtiss@brainvis.wustl.edu).

If I have additional questions, whom should I contact?

  • Contact NIH program staff from the participating Institutes and Centers for questions regarding applications and appropriateness of research plans for this FOA.
  • The hcp-users mailing list is recommended for questions that may be of interest to others in the connectome community.
  • Email info@humanconnectome.org for questions not intended for a wider audience.
  • For questions related to the to-be-established CCF, email Dan Marcus (dmarcus@wustl.edu) or Eileen Cler (clere@mir.wustl.edu) if they pertain to data processing or data sharing, or Essa Yacoub (yaco0006@umn.edu) if they pertain to data acquisition.