
MAMEM Phase I Dataset - A dataset for multimodal human-computer interaction using biosignals and eye tracking information

posted on 2017-07-26, 07:33 authored by Spiros Nikolopoulos, Kostas Georgiadis, Fotis Kalaganis, Georgios Liaros, Ioulietta Lazarou, Katerina Adam, Anastasios Papazoglou-Chalikias, Elisavet Chatzilari, Vangelis P. Oikonomou, Panagiotis C. Petrantonakis, Ioannis Kompatsiaris, Chandan Kumar, Raphael Menges, Steffen Staab, Daniel Müller, Korok Sengupta, Sevasti Bostantjopoulou, Zoe Katsarou, Gabi Zeilig, Meir Plotnik, Amihai Gottlieb, Sofia Fountoukidou, Jaap Ham, Dimitrios Athanasiou, Agnes Mariakaki, Dario Comanducci, Edoardo Sabatini, Walter Nistico, Markus Plank
This dataset combines multimodal biosignals and eye tracking information gathered under a human-computer interaction framework. The dataset was developed within the scope of the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart rate) signals, along with demographic, clinical, and behavioral data collected from 36 individuals (18 able-bodied and 18 motor-impaired). Data were collected during interaction with a specifically designed interface for web browsing and multimedia content manipulation, and during imaginary movement tasks. Alongside these data, we also include evaluation reports from both the subjects and the experimenters concerning the experimental procedure and the collected dataset. We believe that the presented dataset will contribute to the development and evaluation of modern human-computer interaction systems that foster the reintegration of people with severe motor impairments into society.

Please use the following citation:
Nikolopoulos, Spiros, Georgiadis, Kostas, Kalaganis, Fotis, Liaros, Georgios, Lazarou, Ioulietta, Adam, Katerina, Papazoglou-Chalikias, Anastasios, Chatzilari, Elisavet, Oikonomou, Vangelis P., Petrantonakis, Panagiotis C., Kompatsiaris, Ioannis, Kumar, Chandan, Menges, Raphael, Staab, Steffen, Müller, Daniel, Sengupta, Korok, Bostantjopoulou, Sevasti, Katsarou, Zoe, Zeilig, Gabi, Plotnik, Meir, Gottlieb, Amihai, Fountoukidou, Sofia, Ham, Jaap, Athanasiou, Dimitrios, Mariakaki, Agnes, Comanducci, Dario, Sabatini, Edoardo, Nistico, Walter & Plank, Markus. (2017). The MAMEM Project - A dataset for multimodal human-computer interaction using biosignals and eye tracking information. Zenodo. http://doi.org/10.5281/zenodo.834154

Read/analyze using the following software:
https://github.com/MAMEM/eeg-processing-toolbox
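The toolbox linked above is MATLAB-based. For readers who prefer to explore the EEG recordings in Python, the following is a minimal sketch of a common first analysis step, spectral band power, applied to a synthetic signal. The sampling rate, channel count, and the synthetic channel itself are illustrative assumptions, not properties of this dataset.

```python
import numpy as np

# Assumed parameters for illustration only -- the actual sampling rate and
# channel layout depend on the recording device used in the dataset.
FS = 250  # assumed sampling rate in Hz

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` within the [low, high] Hz band."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

# Synthetic stand-in for one EEG channel: a 10 Hz (alpha-band) tone plus noise.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1.0 / FS)
channel = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)

alpha = band_power(channel, FS, 8, 13)   # alpha band (8-13 Hz)
beta = band_power(channel, FS, 14, 30)   # beta band (14-30 Hz)
print(alpha > beta)  # the synthetic alpha tone dominates
```

Band-power features of this kind are a typical input to the SSVEP and motor-imagery classifiers that datasets like this one are used to train.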

Funding

This work is part of the MAMEM project, which has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement number 644780.
