Show simple item record

Author: Kiranyaz, Serkan
Author: Ince, Turker
Author: Iosifidis, Alexandros
Author: Gabbouj, Moncef
Available date: 2020-08-20T11:44:17Z
Publication Date: 2017
Publication Name: Proceedings of the International Joint Conference on Neural Networks
Resource: Scopus
URI: http://dx.doi.org/10.1109/IJCNN.2017.7966157
URI: http://hdl.handle.net/10576/15728
Abstract: Traditional Artificial Neural Networks (ANNs), such as Multi-Layer Perceptrons (MLPs) and Radial Basis Functions (RBFs), were designed to simulate biological neural networks; however, they are only loosely based on biology and provide only a crude model. This in turn yields well-known limitations and drawbacks in performance and robustness. In this paper we address them by introducing a novel feed-forward ANN model, Generalized Operational Perceptrons (GOPs), which consist of neurons with distinct (non-)linear operators to achieve a generalized model of biological neurons and, ultimately, superior diversity. We modified conventional back-propagation (BP) to train GOPs and, furthermore, proposed Progressive Operational Perceptrons (POPs) to achieve self-organized and depth-adaptive GOPs according to the learning problem. The most crucial property of POPs is their ability to simultaneously search for the optimal operator set and train each layer individually. The final POP is therefore formed layer by layer, and this ability enables POPs with minimal network depth to attack challenging learning problems that cannot be learned by conventional ANNs even with a deeper and significantly more complex configuration.
Language: en
Publisher: Institute of Electrical and Electronics Engineers Inc.
Title: Generalized model of biological neural networks: Progressive operational perceptrons
Type: Conference Paper
Pagination: 2477-2485
Volume Number: 2017-May
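The abstract's core idea, a neuron whose fixed weighted-sum-plus-activation is generalized into a choice of nodal, pool, and activation operators, can be illustrated with a minimal sketch. The operator names below (`mult`, `exp`, `sin`, `sum`, `max`, `median`) are illustrative assumptions, not the paper's exact operator library:

```python
import numpy as np

# Hedged sketch of a single GOP neuron. Each neuron applies a nodal
# operator element-wise, a pool operator to aggregate, and an
# activation operator; the specific operator sets are assumptions.
NODAL = {
    "mult": lambda w, x: w * x,
    "exp":  lambda w, x: np.exp(w * x) - 1.0,
    "sin":  lambda w, x: np.sin(w * x),
}
POOL = {
    "sum":    lambda z: z.sum(),
    "max":    lambda z: z.max(),
    "median": lambda z: np.median(z),
}
ACT = {
    "tanh":   np.tanh,
    "linear": lambda y: y,
}

def gop_neuron(x, w, b, nodal="mult", pool="sum", act="tanh"):
    """Compute act(pool(nodal(w, x)) + b) for one GOP neuron.

    With nodal="mult", pool="sum", act="tanh" this reduces to a
    conventional MLP perceptron, which is the sense in which GOPs
    generalize traditional ANNs.
    """
    z = NODAL[nodal](w, x)   # element-wise nodal operator
    y = POOL[pool](z) + b    # pool operator plus bias
    return ACT[act](y)       # activation operator
```

A POP would then, per the abstract, search this operator space layer by layer while training each layer individually; that progressive search is not shown here.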

