Revista de Engenharia Biomédica e Dispositivos Médicos

Open Access

ISSN: 2475-7586

Abstract

The Effects of Modified ReLU Activation Functions in Image Classification

Charles Chinedu Nworu*, Emmanuel John Ekpenyong, John Chisimkwuo, Godwin Okwara, Christian Nduka Onyeukwu, Onyekachi Joy Agwu

The choice of activation function is very important in deep learning because activation functions capture non-linear patterns in data. The most popular activation function is the Rectified Linear Unit (ReLU), but it suffers from the vanishing gradient problem. We therefore examined modifications of the ReLU activation function to determine their effectiveness (accuracy) and efficiency (time complexity). Effectiveness and efficiency were verified through an empirical experiment using X-ray images containing pneumonia and normal samples. Our experiments show that the modified ReLU variant ReLU6 performed best in terms of low generalization error (97.05% training accuracy and 78.21% test accuracy). The sensitivity analysis also suggests that the ELU correctly predicts more than half of the positive cases, with 52.14% probability. For efficiency, the GELU showed the lowest training time compared with the other activation functions. These results allow practitioners in this field to choose activation functions based on both effectiveness and efficiency.
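For readers unfamiliar with the activation functions named above, a minimal sketch of their standard definitions follows. This is illustrative only and is not taken from the paper's implementation; the GELU uses the common tanh approximation, and the ELU `alpha` parameter defaults to 1.0 by convention.

```python
import numpy as np

def relu(x):
    # Standard ReLU: max(0, x); the gradient is zero for all x < 0
    return np.maximum(0.0, x)

def relu6(x):
    # ReLU6: ReLU with outputs capped at 6, limiting activation magnitude
    return np.minimum(np.maximum(0.0, x), 6.0)

def elu(x, alpha=1.0):
    # ELU: identity for x > 0, smooth negative saturation otherwise,
    # which keeps gradients non-zero for negative inputs
    return np.where(x > 0, x, alpha * np.expm1(x))

def gelu(x):
    # GELU via the common tanh approximation
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi)
                                    * (x + 0.044715 * x**3)))
```

For example, `relu6` and `relu` agree on inputs below 6 but diverge above it: `relu(8.0)` is `8.0` while `relu6(8.0)` is `6.0`.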
