For an analysis of the paper, see the author's homepage; the paper and source code are available at:
https://download.csdn.net/download/qq_45874683/87667147
Overall structure:
1. Dataset folder (dataset processing): scripts for all preprocessing steps detailed in Section 3 of the paper
1.1 prepare_deap.py: Prepares the DEAP dataset
1.2 prepare_mahnob.py: Prepares the MAHNOB dataset
1.3 reduce-dim.py: A utility function for dimensionality reduction of the input EEG data, used by prepare_deap.py and prepare_mahnob.py
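The README does not say which reduction method reduce-dim.py uses, but the idea of shrinking the EEG input dimension can be sketched with simple window averaging (a hypothetical stand-in, not the repo's actual implementation):

```python
# Hypothetical sketch of dimensionality reduction for a 1-D EEG signal:
# average each non-overlapping window of `win` samples, shrinking the
# input length by a factor of `win`. reduce-dim.py may use a different
# technique; this only illustrates the general idea.
def reduce_dim(signal, win):
    """Average non-overlapping windows of length `win` over a 1-D signal."""
    n = len(signal) // win  # number of full windows (remainder is dropped)
    return [sum(signal[i * win:(i + 1) * win]) / win for i in range(n)]

print(reduce_dim([1, 2, 3, 4, 5, 6, 7, 8], 4))  # -> [2.5, 6.5]
```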
2. nn folder: scripts and configuration for the datasets, models, and training routines
datasets.py: PyTorch Dataset classes for reading examples from DEAP and MAHNOB.
models.py: PyTorch DNN and CNN model architectures; see Section 4 of the paper for details.
train-utils.py: Helper code for the training routine.
train.py: Entry point that starts training.
utils.py: Utility functions used by the other scripts.
configs: YAML configuration files for models, hyperparameters, and training routines.
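The actual training code in train.py and train-utils.py uses PyTorch models driven by the YAML files in configs, but the basic shape of a training loop can be sketched in plain Python (a minimal illustration, not the repo's code):

```python
# Minimal stand-in for the kind of loop train-utils.py wraps: one epoch of
# SGD on a 1-parameter least-squares problem w*x ~ y. The real code trains
# PyTorch DNN/CNN models with hyperparameters read from configs/.
def sgd_epoch(w, data, lr):
    """One pass of SGD minimising (w*x - y)^2 over (x, y) pairs."""
    for x, y in data:
        grad = 2 * (w * x - y) * x  # d/dw of the squared error
        w -= lr * grad
    return w

w = 0.0
for _ in range(100):  # epochs
    w = sgd_epoch(w, [(1.0, 2.0), (2.0, 4.0)], lr=0.05)
print(round(w, 3))  # converges toward the true slope 2.0
```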
3. Statistical analysis folder: model testing and result analysis; scripts for the statistical tests used in the paper
5x2cv test.py: Performs the 5x2 cross-validation paired t-test (see Section 5.2.2 of the paper).
confidence intervals.py: Computes the confidence intervals for the results in Section 5.
kfold cross validation.py: Performs k-fold cross-validation. Used by the 5x2cv test, but can also be run on its own.
mcnemar test.py: Performs McNemar's test on the pre-trained DNN and CNN models (see Section 5.2.1 of the paper).
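McNemar's test compares two classifiers using only the examples on which they disagree. A sketch of the standard chi-squared form with continuity correction (the repo's mcnemar test.py may instead use the exact binomial variant):

```python
# Hypothetical sketch of McNemar's test statistic for comparing two
# classifiers on the same test set, as in Section 5.2.1. Only the two
# discordant counts matter: b (A right, B wrong) and c (A wrong, B right).
def mcnemar_statistic(y_true, pred_a, pred_b):
    """Chi-squared statistic with continuity correction, 1 dof."""
    b = sum(t == a and t != p for t, a, p in zip(y_true, pred_a, pred_b))
    c = sum(t != a and t == p for t, a, p in zip(y_true, pred_a, pred_b))
    return (abs(b - c) - 1) ** 2 / (b + c)

# Toy example: model A errs once, model B errs three times (b=3, c=1).
stat = mcnemar_statistic([1] * 8,
                         [1, 1, 1, 1, 0, 1, 1, 1],  # predictions of model A
                         [0, 0, 0, 1, 1, 1, 1, 1])  # predictions of model B
print(stat)  # 0.25 -- well below the ~3.84 threshold for p < 0.05
```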
4. pretrained models folder: saved pre-trained models
Contains 4 pre-trained models (DNN on DEAP, CNN on DEAP, DNN on MAHNOB, CNN on MAHNOB); their results appear in the first part of Section 5.