figshare
3 files

Estimation of Emotional Arousal Changes of a Group of Individuals during Movie Screening Using Steady-State Visual-Evoked Potential

Dataset (52.61 MB); modified on 2022-05-02, 16:12

Neurocinematics is an emerging discipline in neuroscience that aims to inform filmmaking techniques by analyzing the brain activity of groups of viewers. Several neurocinematics studies have attempted to track temporal changes in mental states during movie screening; however, efficient and robust electroencephalography (EEG) features that can track brain states precisely over long periods are still needed. This study proposes a novel method for estimating emotional arousal changes in a group of individuals during movie screening using the steady-state visual evoked potential (SSVEP), a widely used EEG response elicited by the presentation of periodic visual stimuli. Previous studies have reported that an individual's emotional arousal modulates the strength of SSVEP responses. Based on this phenomenon, movie clips were superimposed on a background flickering at a specific frequency to elicit an SSVEP response. Two emotionally arousing movie clips were presented to six healthy male participants while EEG signals were recorded from occipital channels. We then investigated whether the movie scenes that elicited stronger SSVEP responses coincided with the scenes rated as most impressive by 37 viewers in a separate experimental session. Our results showed that the SSVEP response averaged across the six participants accurately predicted the overall impressiveness of each movie as evaluated by the much larger group of individuals.
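The core measurement described above (tracking SSVEP strength over time as spectral power at the flicker frequency in occipital EEG) can be sketched as follows. This is a minimal illustration only, not the authors' exact pipeline: the sampling rate, flicker frequency, window length, and the `ssvep_power` helper are all hypothetical choices for demonstration.

```python
import numpy as np

def ssvep_power(eeg, fs, stim_freq, win_sec=2.0, step_sec=0.5):
    """Estimate SSVEP strength over time as the power at stim_freq
    in sliding Hann-windowed FFT segments (illustrative parameters)."""
    win = int(win_sec * fs)
    step = int(step_sec * fs)
    taper = np.hanning(win)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - stim_freq))  # bin nearest the flicker frequency
    powers = []
    for start in range(0, len(eeg) - win + 1, step):
        seg = eeg[start:start + win] * taper
        spec = np.abs(np.fft.rfft(seg)) ** 2
        powers.append(spec[idx])
    return np.asarray(powers)

# Synthetic demo: a 15 Hz flicker response plus noise, sampled at 250 Hz.
np.random.seed(0)
fs, f_stim = 250, 15.0
t = np.arange(0, 60, 1.0 / fs)
eeg = np.sin(2 * np.pi * f_stim * t) + 0.5 * np.random.randn(t.size)
trace = ssvep_power(eeg, fs, f_stim)  # one power value per 0.5 s step
```

In a study like this one, a trace of this kind would be computed per participant from occipital channels and then averaged across the group before comparison with scene-by-scene impressiveness ratings.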


Funding

This work was supported in part by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2017-0-00432, Development of non-invasive integrated BCI SW platform to control home appliances and external devices by user's thought via AR/VR interface) and in part by the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIT) (No. NRF-2019R1A2C2086593).