Self-Attention Long-Term Dependency Modelling in Electroencephalography Sleep Stage Prediction

Georg Brandmayr, Manfred Hartmann, Franz Fürbass, Georg Dorffner

Publication: Chapter in book or conference proceedings › Contribution to conference proceedings without presentation › Peer-reviewed


Complex sleep stage transition rules pose a challenge for learning inter-epoch context with Deep Neural Networks (DNNs) in ElectroEncephaloGraphy (EEG) based sleep scoring. While DNNs have overcome the limits of expert systems, the dominant bidirectional Long Short-Term Memory (LSTM) still inherits limitations of Recurrent Neural Networks. We propose a sleep Self-Attention Model (SAM) that replaces LSTMs for inter-epoch context modelling in a sleep scoring DNN. By accessing distant EEG as easily as adjacent EEG, we aim to improve long-term dependency learning for critical sleep stages such as Rapid Eye Movement (REM). Restricting attention to a local scope reduces computational complexity to linear in recording duration. We evaluate SAM on two public sleep EEG datasets, MASS-SS3 and SEDF-78, and compare it to the literature and to an LSTM baseline model via a paired t-test. On MASS-SS3, SAM achieves κ = 0.80, equivalent to the best reported result, with no significant difference to the baseline. On SEDF-78, SAM achieves κ = 0.78, surpassing previous best results with statistical significance and a +4% F1-score improvement in REM. Strikingly, SAM achieves these results with a model size at least 50 times smaller than the baseline.
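The abstract's key complexity claim can be illustrated with a minimal sketch of local (windowed) self-attention: each epoch attends only to neighbours within a fixed half-window `w`, so the cost is O(T·w) rather than O(T²) in the number of epochs T. This is an illustrative sketch, not the authors' exact SAM architecture; the function name and window parameter are assumptions for demonstration.

```python
import numpy as np

def local_self_attention(x, w=3):
    """Windowed self-attention over per-epoch feature vectors.

    x: (T, d) array, one embedding per 30-s sleep epoch.
    w: half-window size; epoch t attends only to epochs t-w..t+w,
       so total cost is O(T * w * d) -- linear in recording length T.
    """
    T, d = x.shape
    out = np.zeros_like(x)
    for t in range(T):
        lo, hi = max(0, t - w), min(T, t + w + 1)
        ctx = x[lo:hi]                       # local context window
        scores = ctx @ x[t] / np.sqrt(d)     # scaled dot-product scores
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()             # softmax over the window
        out[t] = weights @ ctx               # weighted sum of window values
    return out
```

Because the window size is fixed, doubling the recording length only doubles the work, which is what makes whole-night context modelling tractable without recurrence.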
Title: Neural Information Processing - 28th International Conference, ICONIP 2021, Sanur, Bali, Indonesia, December 8–12, 2021, Proceedings, Part III
Publication status: Published - 2021

Research Field

  • Exploration of Digital Health

Keywords

  • Attention
  • Sleep scoring
  • Inter-epoch context


