Description of project: The purpose of this project is to use magnetoencephalography (MEG) to examine the spatiotemporal dynamics of neural representations of individual face identities. We recorded brain activity with MEG in four adult human participants while they viewed face images from a large, carefully controlled set (91 face identities, with two facial expressions per identity), with a sufficiently large number of trials per face identity to evaluate the representation of individual face identities in each participant. Each participant also completed an independent localizer task (block design, five categories: faces, houses, objects, scrambled objects, words) in MEG. For each participant, we also collected structural MRI data for use in source localization of MEG signals. Finally, we collected pairwise behavioral similarity ratings of a subset of the face identities presented in the main MEG experiment from a group of 7 participants who did not take part in the main MEG experiment. The following provides information about the files included in the data set and about data acquisition.

Participant identity codes: DR, AT, JG, and JK each refer to one of the four participants in the MEG face identity experiment.

MEG data files: AT.zip, DR.zip, JG.zip, and JK.zip contain the raw MEG recordings for each of the four participants. Within each .zip file, there is a separate folder for each session. All MEG data files are stored in standard .fif format, which can be read in MNE (see http://martinos.org/mne/stable/manual/cookbook.html). Files for the independent localizer task include the identifier "loc" in the file name, and the block number for this task is given by a number from 1 to 6. Names of files containing empty room data include the identifier "er" or "emptyroom". Names of files containing data from the main face identity task include the identifier "fst" (except for participant DR, sessions 3-5, which have no task identifier) and a block number.

Behavioral data for MEG face identity task: behav_fst_forarchive.zip contains raw behavioral data for the MEG face identity task. Within the main .zip file, there is a separate folder for each participant. Each of these folders contains a data file for each block. The data files are .mat files, which can be opened in MATLAB, R (R.matlab package), or Python (scipy.io.loadmat function). Each .mat file contains a variable called data, which stores a 364 x 8 matrix. Each row in the matrix represents a trial, and the following columns contain relevant data:
column 1 - facial identity presented; identities are numbered from 1 to 91.
column 2 - facial expression presented; 1 is neutral and 2 is happy.
column 8 - response (the task is to press a button with the right index finger if the current face identity is the same as the previous one); if no response was registered, the value is -1. If a response was registered, the response time (relative to the onset of the current face identity) in seconds is given here.

MEG category localizer event files: meg_loc_eve.zip contains event files (in MNE format, http://martinos.org/mne/stable/manual/cookbook.html) that give the trial structure of the category localizer for each participant and block. There are separate folders for each participant, and each folder contains 6 event files numbered 1-6, which give the trial structure for each of the 6 runs completed by each participant. In each file, the second column gives the event time, and the fourth column gives a numerical code for the stimulus category. The association between the codes and categories is as follows:
1 - faces
2 - objects
3 - houses
4 - scrambled objects
5 - words
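As a quick orientation to these formats, the sketch below shows how the raw MEG recordings, the behavioral .mat files, and the localizer event files might be loaded in Python with MNE and SciPy. The file and folder names used here are hypothetical placeholders, not the exact names in the archive; only the .fif/.mat formats and the column layout described above are taken from this document.

# Minimal sketch (Python, MNE + SciPy) for reading the archived files.
# All file paths below are hypothetical placeholders; only the formats and
# column meanings follow the descriptions in this document.
import mne
import numpy as np
from scipy.io import loadmat

# --- Raw MEG data (.fif, readable with MNE) ---
raw = mne.io.read_raw_fif("AT/session1/AT_fst_block1_raw.fif",  # placeholder path
                          preload=False, verbose="error")
print(raw.info)  # channel layout, sampling rate, etc.

# --- Behavioral data for the face identity task (364 x 8 matrix per block) ---
mat = loadmat("behav_fst_forarchive/AT/block1.mat")   # placeholder path
data = mat["data"]                     # 364 trials x 8 columns
identity   = data[:, 0].astype(int)    # column 1: face identity, 1-91
expression = data[:, 1].astype(int)    # column 2: 1 = neutral, 2 = happy
response   = data[:, 7]                # column 8: -1 = no response, else RT in s
print("mean RT on response trials:", response[response > 0].mean())

# --- Category localizer event files (MNE event format) ---
# mne.read_events returns an (n_events, 3) array: [sample, previous id, event id];
# the event id corresponds to the category codes listed above.
events = mne.read_events("meg_loc_eve/AT/1.eve")       # placeholder file name
categories = {1: "faces", 2: "objects", 3: "houses",
              4: "scrambled objects", 5: "words"}
counts = {categories[c]: int((events[:, 2] == c).sum())
          for c in np.unique(events[:, 2]) if c in categories}
print(counts)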
Structural MRI data: MRI_forarchive.zip contains structural MRI data for each participant. There are separate folders for each participant. Each folder contains raw MRI images (see ~/mri), as well as FreeSurfer and MNE output formatted for use in source localization of MEG signals on the reconstructed cortical surface (see the MNE and FreeSurfer documentation for full details: http://martinos.org/mne/stable/manual/cookbook.html, http://surfer.nmr.mgh.harvard.edu).

Behavioral pairwise similarity ratings of stimuli: behav_pairwise_forarchive.zip contains similarity ratings for a subset of the face identities presented during the MEG face identity experiment. This data set contains ratings from each of 7 participants who did not take part in the main MEG face identity experiment. Each participant provided ratings across two sessions. There is a separate .mat file for each participant and session, with each participant identified by a unique two-letter code appearing after "FST_b_" in the file name. The session number is identified by "s1" for session 1 or "s2" for session 2. Each .mat file contains a variable named data, which is a 764 x 4 matrix. Each row represents a single trial, and the following columns contain relevant data:
column 1 - one member of the pair of face identities to be compared; always presented with a neutral facial expression.
column 2 - the second member of the pair of face identities to be compared; always presented with a happy facial expression. Note that the order in which the two face identities within a pair were presented was selected randomly on each trial and is unrelated to the column in which an identity appears.
column 3 - similarity rating on a scale from 1 to 8 (coded here as 0-7), with 1 being the lowest similarity and 8 the highest.

Acquisition of MEG and MRI data:

MEG data: MEG data were acquired at the University of Pittsburgh Medical Center (UPMC) Brain Mapping Center with a 306-channel Neuromag (Elekta AB, Sweden) system. A Panasonic PT-D7700U projector (1024 x 768 resolution, 60 Hz refresh rate) presented the stimuli at the center of a back-projection screen placed 120 cm from the participant. Face images were 6.87° high and 5° wide at this viewing distance. To track stimulus timing, we used a photodiode that emitted a continuous signal while the stimulus was on the screen. In addition, the experimental software sent a signal to the MEG acquisition computer whenever a stimulus was presented. Participants entered behavioral responses by pressing a button with their right index finger.

MRI data: For each participant, we acquired a T1-weighted MPRAGE anatomical MRI scan on a Siemens Verio 3T scanner (voxel size = 1 mm³, flip angle = 9°, TE = 1.97 ms, TR = 2300 ms, FOV = 256 x 256 x 176 mm). All scans were carried out at the Scientific Imaging and Brain Research Center (SIBR) at Carnegie Mellon University.
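Assuming the same .mat conventions as above, the brief sketch below shows how the pairwise similarity ratings might be read and averaged into a symmetric identity-by-identity similarity matrix. The file name pattern is an illustrative assumption; the 764 x 4 layout, the 0-7 coding of the 1-8 rating scale, and the randomized presentation order within a pair are as described above.

# Minimal sketch: average the pairwise similarity ratings into a symmetric
# identity-by-identity matrix. File names are hypothetical placeholders.
import glob
import numpy as np
from scipy.io import loadmat

n_identities = 91                        # identities are numbered 1-91
sums   = np.zeros((n_identities + 1, n_identities + 1))
counts = np.zeros_like(sums)

for fname in glob.glob("behav_pairwise_forarchive/FST_b_*_s*.mat"):  # placeholder pattern
    data = loadmat(fname)["data"]        # 764 trials x 4 columns
    id1 = data[:, 0].astype(int)         # column 1: identity shown with neutral expression
    id2 = data[:, 1].astype(int)         # column 2: identity shown with happy expression
    rating = data[:, 2] + 1              # column 3: coded 0-7, i.e. ratings on a 1-8 scale
    for a, b, r in zip(id1, id2, rating):
        # Presentation order within a pair was random, so treat pairs as unordered.
        sums[a, b] += r; counts[a, b] += 1
        sums[b, a] += r; counts[b, a] += 1

with np.errstate(invalid="ignore"):
    mean_similarity = sums / counts      # NaN where a pair was never rated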