Data Descriptor

A Complementary Dataset of Scalp EEG Recordings Featuring Participants with Alzheimer’s Disease, Frontotemporal Dementia, and Healthy Controls, Obtained from Photostimulation EEG

by Aimilia Ntetska 1, Andreas Miltiadous 2, Markos G. Tsipouras 1, Katerina D. Tzimourta 1, Theodora Afrantou 3, Panagiotis Ioannidis 3, Dimitrios G. Tsalikakis 1, Konstantinos Sakkas 2, Emmanouil D. Oikonomou 2, Nikolaos Grigoriadis 3, Pantelis Angelidis 1, Nikolaos Giannakeas 2 and Alexandros T. Tzallas 2,*
1 Department of Electrical and Computer Engineering, University of Western Macedonia, 50100 Kozani, Greece
2 Department of Informatics and Telecommunications, University of Ioannina, 47100 Arta, Greece
3 2nd Department of Neurology, AHEPA University Hospital, Aristotle University of Thessaloniki, 54636 Thessaloniki, Greece
* Author to whom correspondence should be addressed.
Data 2025, 10(5), 64; https://doi.org/10.3390/data10050064
Submission received: 14 April 2025 / Revised: 27 April 2025 / Accepted: 28 April 2025 / Published: 29 April 2025
(This article belongs to the Special Issue Benchmarking Datasets in Bioinformatics, 2nd Edition)

Abstract

Research interest in the application of electroencephalogram (EEG) as a non-invasive diagnostic tool for the automated detection of neurodegenerative diseases is growing. Open-access datasets have become crucial for researchers developing such methodologies. Our previously published open-access dataset of resting-state (eyes-closed) EEG recordings from patients with Alzheimer’s disease (AD), frontotemporal dementia (FTD), and cognitively normal (CN) controls has attracted significant attention. In this paper, we present a complementary dataset consisting of eyes-open photic stimulation recordings from the same cohort. The dataset includes recordings from 88 participants (36 AD, 23 FTD, and 29 CN) and is provided in Brain Imaging Data Structure (BIDS) format, promoting consistency and ease of use across research groups. Additionally, a fully preprocessed version is included, using EEGLAB-based pipelines that involve filtering, artifact removal, and Independent Component Analysis, preparing the data for machine learning applications. This new dataset enables the study of brain responses to visual stimulation across different cognitive states and supports the development and validation of automated classification algorithms for dementia detection. It offers a valuable benchmark for both methodological comparisons and biological investigations, and it is expected to significantly contribute to the fields of neurodegenerative disease research, biomarker discovery, and EEG-based diagnostics.
Dataset: 10.18112/openneuro.ds006036.v1.0.2.
Dataset License: CC0

1. Summary

Alzheimer’s disease (AD) and frontotemporal dementia (FTD) are two progressive neurodegenerative disorders that mainly affect older adults, often leading to significant memory loss and cognitive decline [1]. According to recent studies, over 55 million people worldwide are living with dementia, with Alzheimer’s patients accounting for 60% to 80% of them [2,3]. In contrast, frontotemporal dementia is a rarer form of dementia, accounting for approximately 5–10% of these cases [1]. In Europe alone, an estimated one million people (982,000) were living with dementia in 2024, a figure projected to reach 1.4 million by 2040 [4]. Both neurological diseases manifest as cognitive impairment and behavioral change; they affect the brain in different ways but cause similar, and partly overlapping, symptoms [5]. Alzheimer’s disease is characterized by progressive memory loss, language impairment, and impaired visuospatial function. FTD, in contrast, is characterized by a mixture of early behavioral changes, such as disinhibition, apathy, compulsivity, and language impairment, while memory is relatively preserved in the early stages [6]. It is important to note that, as of now, both conditions are incurable, with existing treatments providing only limited symptomatic relief [7].
Diagnosis of frontotemporal dementia (FTD) and Alzheimer’s disease (AD) is a comprehensive process that includes clinical examination, neuropsychological testing, and advanced imaging techniques [8]. Clinical evaluation typically includes patient history and cognitive testing to identify characteristic patterns of impairment. Neuropsychological tests further detail cognitive deficits and help to differentiate AD from FTD [9]. Neuroimaging is also crucial in the diagnostic process: magnetic resonance imaging (MRI) and fluorodeoxyglucose positron emission tomography (FDG-PET) are the most common modalities used for diagnosing dementia, as they can detect morphological brain alterations, such as atrophy patterns specific to each disorder [10]. Finally, emerging biomarkers offer additional diagnostic data; for example, cerebrospinal fluid (CSF) analysis measuring amyloid-beta and tau protein levels can be useful in the diagnosis of AD [11]. However, the early symptoms of AD and FTD can be subtle and frequently overlap with those of other neurodegenerative disorders, complicating timely and accurate diagnosis [12]. This diagnostic challenge highlights the necessity of ongoing research. Early and accurate diagnosis enables early intervention, implementation of safety measures, legal and financial planning, and access to support services, improving outcomes and quality of life for both patients and their families.
Electroencephalography (EEG) is increasingly being explored as a supportive tool for investigating Alzheimer’s disease and frontotemporal dementia, particularly when used alongside clinical evaluation and neuroimaging techniques [13]. EEG records the brain’s electrical activity and can assist in identifying abnormalities associated with these disorders. By applying machine or deep learning techniques to EEG data, it is possible to identify spatial and temporal patterns that may indicate the presence of neurodegenerative disorders [14]. For example, abnormalities such as slowing of brain rhythms and disruptions in functional connectivity have been observed in both Alzheimer’s disease and frontotemporal dementia, albeit with varying characteristics across conditions [15]. The use of machine learning with EEG data for the automatic identification of these conditions is still in its early stages, and further studies and validation are essential. Nonetheless, EEG-based machine learning techniques present an encouraging, non-invasive, cost-effective, and widely accessible method for diagnosing these syndromes, and further research in the field may lead to earlier and more accurate diagnosis of Alzheimer’s disease and frontotemporal dementia.
This study provides a detailed description of a publicly available dataset of EEG signals from elderly patients with AD and FTD and their age-matched controls during photo-stimulation. This dataset is complementary to a previously published dataset of EEG recordings from the same patients with their eyes closed [14,16,17]. The data are presented in Brain Imaging Data Structure (BIDS) format [18], a standardized method for classifying and arranging neuroimaging data. BIDS was created to increase the uniformity, interoperability, and usability of neuroimaging data across research teams and organizations. The publicly available collection of EEG recordings from individuals with AD, healthy controls (CN), and FTD is a vital resource for researchers studying these neurodegenerative diseases. The collection will enable researchers to investigate the underlying causes of these diseases and to identify potential biomarkers for early detection. Such EEG datasets are essential for developing and evaluating machine learning pipelines aimed at the automatic detection and classification of neurodegenerative diseases. This dataset, in particular, can advance the field by enabling researchers to test, compare, and refine their algorithms on real-world, clinically relevant data. Overall, this dataset could greatly improve our knowledge of frontotemporal dementia, Alzheimer’s disease, and the role of EEG in diagnosing these conditions.

2. Data Description

This dataset includes EEG recordings from a total of 88 subjects with their eyes open in a multiple photic stimulation setting. Among the subjects, 36 were diagnosed with Alzheimer’s disease (AD group), 23 were affected by frontotemporal dementia (FTD group), and 29 were cognitively normal (CN group). The cognitive and neuropsychological functioning of all subjects was assessed by means of the widely used Mini-Mental State Examination (MMSE). The MMSE is a 30-point cognitive screening tool in which lower scores indicate greater cognitive impairment: scores above 24 are generally taken to indicate normal cognition, scores of 18–23 can indicate mild impairment, and scores below 18 typically indicate more severe cognitive decline [19]. This dataset is hosted on OpenNeuro.org, a free and open platform for sharing neuroimaging data, which supports the BIDS standard and facilitates broad accessibility and reuse by the neuroscience community [20].
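The MMSE banding described above can be expressed as a small helper function. The cut-offs follow the thresholds cited in the text; how a borderline score of exactly 24 is assigned (the text leaves it between bands) is our assumption:

```python
def mmse_category(score: int) -> str:
    """Map an MMSE score (0-30) to the coarse bands described in the text.

    Scores above 24 suggest normal cognition, 18-23 mild impairment, and
    below 18 more severe decline. A score of exactly 24 is not assigned a
    band by the text; folding it into "mild impairment" is our assumption.
    """
    if not 0 <= score <= 30:
        raise ValueError("MMSE scores range from 0 to 30")
    if score > 24:
        return "normal cognition"
    if score >= 18:
        return "mild impairment"
    return "more severe decline"
```

This is a screening-scale illustration only, not a clinical decision rule.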

2.1. EEG Recordings

As mentioned above, this dataset is complementary to the previously published dataset “A dataset of EEG recordings from: Alzheimer’s disease, Frontotemporal dementia and Healthy subjects” [16], in which the same cohort underwent eyes-closed EEG recordings. Its recordings follow the clinical protocol applied by the Second Department of Neurology at AHEPA University Hospital of Thessaloniki, Greece, consisting of eyes-open EEG recordings under various photic stimulation conditions. Participants were exposed to photic stimulation while sitting with their eyes open. Stimulation was delivered in 5 Hz increments, beginning at 5 Hz and progressing through 10 Hz and 15 Hz, and in certain cases reaching 30 Hz. EEG was recorded using 19 scalp electrodes (Fp1, Fp2, F7, F3, Fz, F4, F8, T3, C3, Cz, C4, T4, T5, P3, Pz, P4, T6, O1, and O2) and two reference electrodes (A1 and A2) positioned according to the 10–20 international system. The resolution was 10 µV/mm, and the sampling rate was 500 Hz. The amplifier parameters were as follows: a time constant of 0.3 s, a high-frequency filter at 70 Hz, and a sensitivity of 10 µV/mm. Recordings lasted on average 4.86 min for the AD group (min = 1.30 min, max = 8.77 min), 4.42 min for the FTD group (min = 1.25 min, max = 10.05 min), and 6.43 min for the CN group (min = 3.17 min, max = 9.17 min). In total, 174.94 min of AD, 101.56 min of FTD, and 186.50 min of CN recordings were collected and are included in the dataset.

2.2. Participants

The participants, and the order in which they appear, are identical to those in the previously published dataset this one complements. In total, the dataset comprises 44 male and 44 female participants. According to the clinical protocol, a routine EEG consisted of an eyes-closed recording followed by eyes-open hyperventilation and photo-stimulation recordings, meaning that the recordings in the previous dataset were immediately followed by those presented in this dataset. Cognitive function in the 36 AD patients, 23 FTD patients, and 29 cognitively normal (CN) controls was measured with the Mini-Mental State Examination (MMSE), with lower scores reflecting greater cognitive impairment. The AD group’s mean MMSE score was 17.75 (±4.5), the FTD group’s was 22.17 (±8.22), and the CN group’s was 30. The mean ages were 66.4 years (±7.9) for the AD group, 63.6 years (±8.2) for the FTD group, and 67.9 years (±5.4) for the CN group. The median duration of illness was 25 months, with an interquartile range of 24 to 28.5 months. Notably, there were no dementia-related comorbid conditions in the AD group. Table A1 (Appendix A) presents a detailed description of each participant. Participants have been anonymized, and no personal information has been disclosed, in accordance with GDPR requirements.

2.3. Dataset Structure

The dataset is organized according to the Brain Imaging Data Structure (BIDS) standard, ensuring consistency, interoperability, and ease of use within the neuroimaging research community. According to the BIDS format, all neuroimaging data, including structural and functional MRI and EEG, should follow the specified file organization structure and naming pattern. Additionally, BIDS specifies metadata in JSON format that describes the data, including task details, acquisition settings, and subject and session identifiers. Making the dataset BIDS-compatible guarantees that other researchers will find it easy to use, as open-source software (such as EEGLAB) offers tools for processing and interpreting neuroimaging data from BIDS-compliant datasets. The dataset structure is illustrated in Figure 1.
As shown in Figure 1, the dataset includes the following. (1) The dataset_description.json file, which includes details about the dataset’s authors, the DOI, the BIDS version of the dataset, the license under which it is released, the ethical approval statement, and the acknowledgement of the research project that made this work possible. (2) The participants.json file, which defines the participant characteristics listed in Table A1; software such as EEGLAB uses this file to automatically classify and organize the participants’ EEG recordings. (3) The tab-separated participants.tsv file, which contains the data from Table A1. (4) One sub-0XX folder per participant, each linked to a participant ID from participants.tsv.
Four files are included in each folder: (A) a sub-0XX_task-photomark_eeg.json file that includes all the information required for the EEG recording, including the reference (A1 and A2), the placement scheme (10–20), the amplifier and device model, the channel count, the sampling frequency, the recording length, and more; (B) a sub-0XX_task-photomark_channels.tsv file that contains electrode position data; (C) a sub-0XX_task-photomark_eeg.set file containing the participant’s EEG recordings in the EEGLAB .set format, one of the four BIDS-allowed EEG formats (the others being the Biosemi .bdf, the European Data Format .edf, and the BrainVision Core Data Format .vhdr/.eeg); and (D) a sub-0XX_task-photomark_events.tsv file containing all the information about the events occurring in each recording.
It is worth noting that all .set files can be accessed and used outside of a BIDS-compatible environment, as they contain all necessary recording information. Moreover, users do not need to inspect each individual sub-0XX_task-photomark_channels.tsv and sub-0XX_task-photomark_eeg.json file, since they are identical across participants (except for the recording duration, which naturally varies). Additionally, the derivatives folder includes preprocessed EEG data, organized in the same structure as the raw dataset, with one subfolder per participant.
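As a quick orientation for users, the per-participant naming pattern described above can be generated programmatically. This is a minimal sketch under assumptions: the eeg/ subfolder and entity order follow the general BIDS convention for EEG data, and the exact layout should be verified against the dataset on OpenNeuro:

```python
from pathlib import Path

def expected_files(subject_id: str, root: str = ".") -> list[Path]:
    """Return the four per-participant files this dataset provides for one
    subject, following the BIDS naming pattern described in the text.

    The `root` default and the eeg/ subfolder are assumptions based on the
    BIDS convention; consult the dataset on OpenNeuro for the exact layout.
    """
    sub = f"sub-{subject_id}"
    folder = Path(root) / sub / "eeg"  # BIDS nests EEG files under eeg/
    stem = f"{sub}_task-photomark"
    return [
        folder / f"{stem}_eeg.json",      # acquisition metadata
        folder / f"{stem}_channels.tsv",  # electrode/channel descriptions
        folder / f"{stem}_eeg.set",       # EEGLAB-format recording
        folder / f"{stem}_events.tsv",    # photic-stimulation event annotations
    ]
```

For example, `expected_files("001")` lists the four files belonging to sub-001.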

3. Methods

This paper is a direct continuation of an earlier dataset study of resting-state (eyes-closed) EEG activity, recorded under routine clinical conditions from the same set of subjects with Alzheimer’s disease (AD), frontotemporal dementia (FTD), and age-matched cognitively normal controls (CN) [16]. Using the same recording setup, that earlier work provided insight into intrinsic brain activity and reported benchmark results with classifiers including LightGBM, SVM, MLP, kNN, and Random Forests, achieving accuracies above 70% with relative band power (RBP) features and Leave-One-Subject-Out (LOSO) validation.
The current study complements and extends that foundation with EEG recordings from the same subjects under a controlled photic stimulation protocol of increasing frequency, recorded with their eyes open.

3.1. Recordings

The recordings presented in this dataset were collected and published to investigate functional differences in the EEG activity of AD versus CN, FTD versus CN, and even AD versus FTD. They were made in a regular clinical environment: a skilled group of neurologists obtained the recordings at the Second Department of Neurology of the AHEPA General Hospital in Thessaloniki. The EEG signals were recorded using a clinical EEG device (Nihon Kohden 2100) with 19 scalp electrodes placed according to the 10–20 international system. The electrodes used were Fp1, Fp2, F7, F3, Fz, F4, F8, T3, C3, Cz, C4, T4, T5, P3, Pz, P4, T6, O1, and O2, plus two additional electrodes (A1 and A2) applied to the mastoids as reference electrodes and for an impedance check. Electrode Cz was used for common mode rejection, and the recorded montage was referential. The resolution was 10 µV/mm, and the sampling rate was 500 Hz. The dataset supplements the previously published one, in which the same cohort underwent eyes-closed EEG recordings.
Based on this protocol, the same group of participants was exposed to intermittent photic stimulation at a series of increasing frequencies, beginning at 5 Hz and advancing in increments of 5 Hz: 5 Hz, 10 Hz, 15 Hz, 20 Hz, and, where tolerated, as high as 30 Hz for some subjects. Each stimulation frequency was presented for a standard interval, allowing sufficient time for the brain’s electrophysiological response to stabilize and be recorded. Although participants were instructed to keep their eyes open during the photic stimulation sessions, due to the cognitive impairment associated with their conditions, some participants occasionally closed their eyes or moved without being instructed to do so. These spontaneous behaviors were carefully annotated in the event files accompanying the dataset, ensuring accurate documentation of the actual participant state during each recording segment. The design allowed assessment of photic driving responses at multiple frequencies, while also capturing EEG segments with variations in eye state. The structured stimulation sequence was identical for all subjects, supporting comparability across groups, although natural fluctuations in attention and alertness may have introduced minimal variability in the EEG recordings.
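The stimulation schedule above is a simple frequency ladder; the sketch below generates it for a given tolerated maximum (20, 25, or 30 Hz in Table A1):

```python
def stimulation_ladder(max_hz: int, step_hz: int = 5) -> list[int]:
    """Photic stimulation frequencies in ascending order: 5 Hz steps from
    5 Hz up to the highest frequency a given participant tolerated."""
    return list(range(step_hz, max_hz + 1, step_hz))

# stimulation_ladder(20) -> [5, 10, 15, 20]
# stimulation_ladder(30) -> [5, 10, 15, 20, 25, 30]
```

The per-participant maxima are listed in the last column of Table A1; the precise presentation timing of each step is documented in the events files, not here.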
Recordings were performed in a standard clinical EEG environment using a medical-grade EEG system, under routine hospital electrical grounding, with the device’s built-in shielding minimizing typical ambient electrical noise. Ambient lighting was kept dim to reduce glare and improve participant comfort during the photic stimulation protocol. Potential sources of variability, such as spontaneous eye closures, fluctuations in attention, minor movements, and fatigue—particularly among participants with cognitive impairment—were carefully annotated whenever observed to ensure accurate documentation of the experimental conditions.

3.2. Preprocessing

The preprocessed data are stored in the derivatives folder, in keeping with the BIDS format employed in this study. The recordings were preprocessed in MATLAB’s EEGLAB toolbox [21]. The data underwent a preprocessing pipeline consisting of (a) re-referencing to the average value of the A1–A2 electrodes, (b) a FIR band-pass filter, (c) the Artifact Subspace Reconstruction (ASR) routine [22], and (d) Independent Component Analysis [21], as shown in Figure 2.
More specifically, the signals were re-referenced to the average value of A1–A2. A FIR band-pass filter was then applied to cut frequencies below 0.5 Hz and above 45 Hz. Afterwards, the signals were subjected to the ASR routine, an automatic artifact rejection technique that eliminates persistent or large-amplitude artifacts; bad data periods whose 0.5 s window standard deviation exceeded the maximum acceptable value of 15 were removed. The 19 EEG signals were then decomposed into 19 ICA components using the ICA method (RunICA algorithm) [23]. Finally, the EEGLAB ‘ICLabel’ automatic classification routine was used to remove components classified as ‘eye artifacts’ or ‘jaw artifacts’ with a probability greater than 90%. A snapshot of the same signal in both raw and preprocessed form is shown in Figure 3; it is evident that baseline correction has been applied and that extreme high-frequency artifacts have been eliminated.
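For readers working in Python rather than EEGLAB, the band-pass step of the pipeline can be approximated with SciPy. This is a sketch under assumptions: the tap count (2001) is our choice, not the order EEGLAB derives from its transition bandwidth, and the re-referencing, ASR, and ICA stages are not reproduced here:

```python
import numpy as np
from scipy.signal import firwin, filtfilt

FS = 500          # sampling rate of the recordings (Hz)
LO, HI = 0.5, 45  # band-pass edges used in the preprocessing pipeline

# Linear-phase FIR band-pass; 2001 taps is an assumption, chosen to give a
# narrow transition band at this sampling rate.
taps = firwin(2001, [LO, HI], pass_zero=False, fs=FS)

def bandpass(eeg: np.ndarray) -> np.ndarray:
    """Zero-phase 0.5-45 Hz filtering of a (channels, samples) array."""
    return filtfilt(taps, 1.0, eeg, axis=-1)
```

Applying `filtfilt` gives zero-phase filtering, as EEGLAB's default FIR filtering does; the exact passband ripple and transition widths will differ from the authors' pipeline.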

3.3. Classification Benchmark

Several relatively simple feature extraction and classification methods, easily replicable and extensible by other researchers, were used to benchmark the dataset’s classification performance on the AD vs. CN and FTD vs. CN tasks. The objective was to create a fundamental benchmark for the dataset that could be readily verified and replicated, even though more sophisticated algorithms (such as deep learning) and feature extraction methods would likely perform better.

3.3.1. Feature Extraction

The Relative Band Power (RBP) of the five frequency bands of interest is among the most commonly extracted features for EEG classification tasks. Following previous works [24,25], the EEG frequency bands are defined as follows: delta ranges from 0.5 to 4 Hz, theta from 4 to 8 Hz, alpha from 8 to 13 Hz, beta from 13 to 25 Hz, and gamma from 25 to 45 Hz. Furthermore, the literature indicates that AD patients have altered RBPs, including decreased alpha power and increased theta power.
In this work, 4 s windows with 50% overlap were extracted from the EEG recordings. Each epoch was labeled as A, F, or C in accordance with the participant list provided with the dataset.
The preprocessed data were used to compute the Relative Band Power (RBP) across five standard frequency bands. The Power Spectral Density (PSD) for each epoch and channel was estimated using Welch’s method. The Welch method estimates the power spectral density of a signal by dividing it into overlapping segments, applying a window function to each, computing their Fourier transforms, and averaging the resulting periodograms to reduce variance [26]. Afterwards, the RBP is calculated by dividing the power within each band by the total power in the 0.5–45 Hz range. Finally, these features are grouped according to the patient’s diagnosis (AD, FTD, or CN) and visualized through scalp topographic maps to provide further information about the spatial distribution of brain activity across diagnostic groups. This visualization can help to highlight potential changes in neural oscillatory patterns in neurodegenerative disorders.
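The RBP computation described above can be sketched with SciPy's Welch estimator. The band limits and the 0.5–45 Hz normalization range follow the text; the Welch window length (nperseg) is our assumption, not necessarily the authors' choice:

```python
import numpy as np
from scipy.signal import welch

# Band limits (Hz) as defined in the text.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 25), "gamma": (25, 45)}

def relative_band_power(x: np.ndarray, fs: float = 500.0) -> dict:
    """Relative band power of a single-channel epoch via Welch's method.

    A 4 s epoch at 500 Hz gives 2000 samples; the nperseg value below is a
    plausible choice, not the one used by the authors.
    """
    f, psd = welch(x, fs=fs, nperseg=min(len(x), 1000))
    total = psd[(f >= 0.5) & (f <= 45)].sum()  # power in the 0.5-45 Hz range
    rbp = {}
    for name, (lo, hi) in BANDS.items():
        rbp[name] = psd[(f >= lo) & (f <= hi)].sum() / total
    return rbp
```

For a pure 10 Hz sinusoid, essentially all relative power lands in the alpha band, which makes the function easy to sanity-check.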
Figure 4, which includes heatmaps depicting the PSD over the scalp and averaged across the AD, FTD, and CN groups, is presented to show the variation in the PSD of each group for each frequency band.

3.3.2. Classification

To establish a stable classification baseline for the dataset, several machine learning algorithms were evaluated on benchmark performance metrics: accuracy, precision, recall (sensitivity), F1 score, and specificity. The Leave-One-Subject-Out (LOSO) validation method was selected to evaluate the performance of these algorithms. Table 1 presents results for four models classifying AD patients and healthy subjects: LightGBM, Support Vector Machine (SVM), Multi-Layer Perceptron (MLP), and a k-Nearest Neighbors (kNN) classifier preceded by principal component analysis (PCA). For every algorithm, the Hyperopt library [27] was used for hyperparameter optimization and fine-tuning.
Among these models, SVM with hyperparameter optimization performed best in terms of accuracy (0.625) and recall (0.796), indicating good detection of clinical cases but poorer specificity (0.472). LightGBM performed slightly better after optimization. MLP and PCA + kNN performed similarly, with PCA + kNN achieving the highest specificity (0.616), indicating a more conservative assignment of positive labels. Table 2 presents the results for the FTD/CN problem, where the highest accuracy achieved is 71% (by LightGBM) and the highest F1 score is 83.3% (also by LightGBM).
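A minimal LOSO evaluation loop can be sketched with scikit-learn's LeaveOneGroupOut, grouping epoch-level features by subject ID so that no subject contributes to both training and test folds. The SVM pipeline here is illustrative, not the authors' Hyperopt-tuned configuration:

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def loso_accuracy(X: np.ndarray, y: np.ndarray, subject_ids: np.ndarray) -> float:
    """Leave-One-Subject-Out accuracy: each fold holds out every epoch of
    one subject, preventing subject-level leakage between train and test."""
    logo = LeaveOneGroupOut()
    clf = make_pipeline(StandardScaler(), SVC())  # untuned, for illustration
    scores = cross_val_score(clf, X, y, groups=subject_ids, cv=logo)
    return float(scores.mean())
```

Epoch-wise splitting without grouping by subject would inflate the reported metrics, which is why LOSO is the appropriate protocol for this dataset.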
To evaluate the synergistic capabilities of this dataset with the previously reported one, the combined datasets were used to evaluate the classification performance of the same algorithms. An extra feature column indicates from which dataset each row originated (0 for the eyes-closed dataset and 1 for the eyes-open dataset). To avoid redundancy, only the performance metrics of the best-performing algorithm for each classification task are reported in Table 3, as including all results would offer limited additional information.
It should be noted that the reported performance metrics do not represent an upper bound of what is achievable with this dataset. The feature extraction approach employed, based on PSD and RBP, was intentionally kept simple, and the classification models were limited to traditional machine learning algorithms, without deep learning techniques. These results are intended as a baseline benchmark: a minimum expected performance level for future studies. More capable models, such as deep neural networks or advanced feature extraction techniques, may achieve significantly better classification scores, and researchers using this dataset are encouraged to develop improved methods, treating the reported results as a floor rather than a ceiling.

4. User Notes

The derivatives folder contains preprocessed data that we urge researchers to use. Additionally, please refer to the “How to Acknowledge” section of the online dataset page and reference the relevant article when publishing a work based on this dataset. To facilitate dataset use, we note that the data are fully compatible with standard BIDS loading tools. In Python (3.8 or higher), the MNE-BIDS library allows users to read and load the EEG data easily via the mne_bids.read_raw_bids function, supporting direct access to both raw and preprocessed files. In MATLAB, the dataset can be opened using the EEGLAB BIDS plugin, where the BIDS importer automatically organizes sessions and metadata for analysis. Example scripts for reading, loading, and basic plotting using MNE and EEGLAB are available on the OpenNeuro platform and associated community resources.

Author Contributions

Conceptualization, A.T.T., K.D.T. and A.M.; methodology, P.I., T.A. and N.G. (Nikolaos Grigoriadis); software, A.N., A.M. and N.G. (Nikolaos Giannakeas); validation, A.M., A.N., K.S. and E.D.O.; formal analysis, M.G.T. and A.M.; investigation, T.A., P.I. and D.G.T.; resources, M.G.T., P.A. and A.T.T.; data curation, K.S., A.N. and E.D.O.; writing—original draft preparation, A.N.; writing-review and editing, A.M., K.D.T. and M.G.T.; visualization, A.N.; supervision, A.M. and A.T.T.; project administration, M.G.T.; funding acquisition, N.G. (Nikolaos Giannakeas). All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki and approved by the Scientific and Ethics Committee of AHEPA University Hospital, Aristotle University of Thessaloniki, under protocol number 142/12-04-2023.

Informed Consent Statement

Informed consent was obtained from all subjects involved in this study.

Data Availability Statement

The dataset is available at https://openneuro.org/datasets/ds006036/versions/1.0.2 (accessed on 27 April 2025).

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Participant Information

Table A1. Participant description. In the Group column, A indicates AD patient, F indicates FTD patient, and C indicates a healthy subject. In the gender column, F indicates female and M indicates male. The last column mentions the highest frequency of photo-stimulation each participant was subjected to.
Participant ID   Gender   Age   Group   MMSE   Highest Photo-Stimulation Freq.
sub-001          F        57    A       16     20 Hz
sub-002          F        78    A       22     20 Hz
sub-003          M        70    A       14     20 Hz
sub-004          F        67    A       20     20 Hz
sub-005          M        70    A       22     20 Hz
sub-006          F        61    A       14     20 Hz
sub-007          F        79    A       20     20 Hz
sub-008          M        62    A       16     20 Hz
sub-009          F        77    A       23     20 Hz
sub-010          M        69    A       20     20 Hz
sub-011          M        71    A       22     20 Hz
sub-012          M        63    A       18     20 Hz
sub-013          F        64    A       20     20 Hz
sub-014          M        77    A       14     20 Hz
sub-015          M        61    A       18     20 Hz
sub-016          F        68    A       14     20 Hz
sub-017          F        61    A       6      20 Hz
sub-018          F        73    A       23     20 Hz
sub-019          F        62    A       14     25 Hz
sub-020          M        71    A       4      25 Hz
sub-021          M        79    A       22     20 Hz
sub-022          F        68    A       20     20 Hz
sub-023          M        60    A       16     20 Hz
sub-024          F        69    A       20     20 Hz
sub-025          F        79    A       20     20 Hz
sub-026          F        61    A       18     20 Hz
sub-027          F        67    A       16     20 Hz
sub-028          M        49    A       20     20 Hz
sub-029          F        53    A       16     20 Hz
sub-030          F        56    A       20     20 Hz
sub-031          F        67    A       22     20 Hz
sub-032          F        59    A       20     20 Hz
sub-033          F        72    A       20     20 Hz
sub-034          F        75    A       18     20 Hz
sub-035          F        57    A       22     20 Hz
sub-036          F        58    A       9      20 Hz
sub-037          M        57    C       30     20 Hz
sub-038          M        62    C       30     20 Hz
sub-039          M        70    C       30     20 Hz
sub-040          M        61    C       30     20 Hz
sub-041          F        77    C       30     25 Hz
sub-042          M        74    C       30     20 Hz
sub-043          M        72    C       30     20 Hz
sub-044          F        64    C       30     20 Hz
sub-045          F        70    C       30     5 Hz
sub-046          M        63    C       30     5 Hz
sub-047          F        70    C       30     20 Hz
sub-048          M        65    C       30     30 Hz
sub-049          F        62    C       30     20 Hz
sub-050          M        68    C       30     20 Hz
sub-051          F        75    C       30     20 Hz
sub-052          F        73    C       30     30 Hz
sub-053          M        70    C       30     20 Hz
sub-054          M        78    C       30     20 Hz
sub-055          M        67    C       30     20 Hz
sub-056          F        64    C       30     20 Hz
sub-057          M        64    C       30     20 Hz
sub-058          M        62    C       30     20 Hz
sub-059          M        77    C       30     20 Hz
sub-060          F        71    C       30     20 Hz
sub-061          F        63    C       30     20 Hz
sub-062          M        67    C       30     20 Hz
sub-063          M        66    C       30     20 Hz
sub-064          M        66    C       30     20 Hz
sub-065          F        71    C       30     20 Hz
sub-066          M        73    F       20     20 Hz
sub-067          M        66    F       24     20 Hz
sub-068          M        78    F       25     20 Hz
sub-069          M        70    F       22     20 Hz
sub-070          F        67    F       22     20 Hz
sub-071          M        62    F       20     20 Hz
sub-072          M        65    F       18     20 Hz
sub-073          F        57    F       22     20 Hz
sub-074          F        53    F       20     20 Hz
sub-075          F        71    F       22     20 Hz
sub-076          M        44    F       24     20 Hz
sub-077          M        61    F       22     20 Hz
sub-078          M        62    F       22     20 Hz
sub-079          F        60    F       18     20 Hz
sub-080          F        71    F       20     20 Hz
sub-081          F        61    F       18     20 Hz
sub-082          M        63    F       27     20 Hz
sub-083          F        68    F       20     20 Hz
sub-084          F        71    F       24     20 Hz
sub-085          M        64    F       26     20 Hz
sub-086          M        49    F       26     20 Hz
sub-087          M        73    F       24     20 Hz
sub-088          M        55    F       24     20 Hz

Figure 1. File structure of the dataset in BIDS format.
Figure 2. Pipeline procedure.
Figure 3. Raw EEG data and their counterparts from the derivatives folder.
Figure 4. Five-band PSD heatmaps averaged over Groups A (Alzheimer’s disease), C (controls), and F (frontotemporal dementia). Power is expressed in μV²/Hz.
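The band-power heatmaps in Figure 4 are derived from power spectral density estimates. A minimal NumPy sketch of Welch-style band-power computation, using conventional band edges; the exact band ranges, window length, and PSD parameters used by the authors may differ:

```python
import numpy as np

def welch_psd(x, fs, nperseg=256):
    """Welch PSD estimate: average of Hann-windowed, 50%-overlapping
    modified periodograms (a minimal re-implementation for illustration)."""
    step = nperseg // 2
    win = np.hanning(nperseg)
    scale = fs * (win ** 2).sum()
    segs = [x[i:i + nperseg] for i in range(0, len(x) - nperseg + 1, step)]
    p = np.mean([np.abs(np.fft.rfft(s * win)) ** 2 for s in segs], axis=0) / scale
    p[1:-1] *= 2  # fold into a one-sided spectrum
    f = np.fft.rfftfreq(nperseg, 1 / fs)
    return f, p

# Conventional EEG band edges in Hz (illustrative, not the paper's exact values).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 25), "gamma": (25, 45)}

def band_powers(x, fs):
    """Integrate the PSD over each band; with x in uV this yields uV^2."""
    f, p = welch_psd(x, fs)
    df = f[1] - f[0]
    return {name: p[(f >= lo) & (f < hi)].sum() * df
            for name, (lo, hi) in BANDS.items()}

# Synthetic 10 Hz (alpha-band) oscillation sampled at 500 Hz.
fs = 500
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t)
bp = band_powers(x, fs)
print(max(bp, key=bp.get))
```

Averaging such per-channel band powers within each group is one way to arrive at heatmaps like those shown in the figure.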
Table 1. Comparison of classification model performance on the proposed EEG dataset with Leave-One-Subject-Out validation for the AD-CN problem.

| Model | Accuracy | Precision | Recall | F1 Score | Specificity |
|---|---|---|---|---|---|
| LightGBM | 0.604 | 0.579 | 0.599 | 0.589 | 0.608 |
| SVM | 0.625 | 0.576 | 0.796 | 0.668 | 0.472 |
| MLP | 0.604 | 0.580 | 0.599 | 0.589 | 0.609 |
| PCA + kNN (k = 19) | 0.603 | 0.580 | 0.588 | 0.584 | 0.616 |
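Leave-One-Subject-Out validation, as used in Tables 1 and 2, holds out every epoch of one participant per fold, so no subject contributes to both training and testing. A self-contained sketch on synthetic data, with a toy nearest-centroid classifier standing in for the paper's models (LightGBM, SVM, MLP, kNN):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 6 subjects x 20 epochs x 4 features, one label per subject
# (an illustrative stand-in for per-epoch EEG band-power features).
subjects = np.repeat(np.arange(6), 20)
labels = np.repeat([0, 1, 0, 1, 0, 1], 20)
X = rng.normal(size=(120, 4)) + labels[:, None] * 1.5  # class-separated

def nearest_centroid_predict(X_train, y_train, X_test):
    """Minimal nearest-centroid classifier used only for illustration."""
    cents = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
    d = ((X_test[:, None, :] - cents[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

# Leave-One-Subject-Out: one fold per subject, all of that subject's
# epochs held out together to avoid subject-level leakage.
accs = []
for s in np.unique(subjects):
    test = subjects == s
    pred = nearest_centroid_predict(X[~test], labels[~test], X[test])
    accs.append((pred == labels[test]).mean())

loso_accuracy = float(np.mean(accs))
print(f"LOSO accuracy: {loso_accuracy:.3f}")
```

With real per-epoch features, splitting at the epoch level instead would inflate the scores, which is why subject-level folds are the appropriate benchmark here.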
Table 2. Comparison of classification model performance on the proposed EEG dataset with Leave-One-Subject-Out validation for the FTD-CN problem.

| Model | Accuracy | Precision | Recall | F1 Score | Specificity |
|---|---|---|---|---|---|
| LightGBM | 0.710 | 0.605 | 0.680 | 0.673 | 0.833 |
| SVM | 0.702 | 0.584 | 0.735 | 0.709 | 0.794 |
| MLP | 0.650 | 0.580 | 0.721 | 0.669 | 0.735 |
| PCA + kNN (k = 14) | 0.685 | 0.571 | 0.651 | 0.624 | 0.721 |
Table 3. Performance metrics on the classification of the combined dataset (eyes-closed + eyes-open) of the best-performing algorithms examined.

| Combined Datasets | Accuracy | Precision | Recall | F1 Score | Specificity |
|---|---|---|---|---|---|
| AD/CN, SVM | 0.687 | 0.669 | 0.803 | 0.730 | 0.557 |
| FTD/CN, LightGBM | 0.764 | 0.632 | 0.776 | 0.722 | 0.841 |
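The five metrics reported in Tables 1–3 all follow from the binary confusion matrix. A small helper showing the standard definitions; the counts in the example call are hypothetical, not taken from the paper:

```python
def classification_metrics(tp, fp, tn, fn):
    """Metrics of the kind reported in Tables 1-3, from confusion counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # a.k.a. sensitivity
    specificity = tn / (tn + fp)     # true-negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1, "specificity": specificity}

# Hypothetical counts for illustration only:
m = classification_metrics(tp=80, fp=20, tn=70, fn=30)
print({k: round(v, 3) for k, v in m.items()})
```

Note that specificity treats the control class as the negative class, which is why a recall-heavy model such as the AD/CN SVM can pair a high recall (0.803) with a low specificity (0.557).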
