An Analysis of State-of-the-art Activation Functions For Supervised Deep Neural Network

Anh Nguyen (Speaker, Invited), Lam Pham (Author, Invited), Khoa Pham (Author, Invited), Dat Ngo (Author, Invited), Thanh Ngo (Author, Invited)

Research output: Chapter in Book or Conference Proceedings › Conference Proceedings with Oral Presentation › peer-review

Abstract

This paper provides an analysis of state-of-the-art activation functions for supervised classification with deep neural networks. The activation functions considered comprise the Rectified Linear Unit (ReLU), Exponential Linear Unit (ELU), Scaled Exponential Linear Unit (SELU), Gaussian Error Linear Unit (GELU), and Inverse Square Root Linear Unit (ISRLU). For evaluation, experiments are conducted over two deep learning architectures that integrate these activation functions. The first model, based on a Multilayer Perceptron (MLP), is evaluated on the MNIST dataset. The second model, a VGGish-based architecture, is applied to Acoustic Scene Classification (ASC) Task 1A of the DCASE 2018 challenge, evaluating whether these activation functions perform well across different datasets and network architectures.
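For reference, the five activation functions compared in the abstract can be sketched as follows. This is an illustrative NumPy implementation, not the paper's code; the SELU constants are the standard self-normalizing values, and GELU uses the common tanh approximation.

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: max(0, x)
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # Exponential Linear Unit: smooth exponential branch for x <= 0
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, lam=1.0507, alpha=1.67326):
    # Scaled ELU with the standard self-normalizing constants
    return lam * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def gelu(x):
    # Gaussian Error Linear Unit, tanh approximation of x * Phi(x)
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def isrlu(x, alpha=1.0):
    # Inverse Square Root Linear Unit: identity for x >= 0,
    # smooth bounded branch x / sqrt(1 + alpha * x^2) for x < 0
    return np.where(x >= 0, x, x / np.sqrt(1.0 + alpha * x * x))
```

All five are identity-like for large positive inputs and differ mainly in how they treat negative inputs, which is what the paper's classification experiments probe.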
Original language: English
Title of host publication: 2021 International Conference on System Science and Engineering (ICSSE)
Pages: 215-220
ISBN (Electronic): 978-1-6654-4848-2
Publication status: Published - Aug 2021
Event: 2021 International Conference on System Science and Engineering (ICSSE)
Duration: 26 Aug 2021 - 28 Aug 2021


Research Field

  • Former Research Field - Data Science
