Hyperparameter optimisation for Capsule Networks

B, Gagana and Natarajan, S (2019) Hyperparameter optimisation for Capsule Networks. EAI Endorsed Transactions on Cloud Systems, 5 (14): e2. p. 158416. ISSN 2410-6895

eai.13-7-2018.158416.pdf - Published Version
Available under License Creative Commons Attribution No Derivatives.



Convolutional Neural Networks and their contemporary variants have proven to be ruling benchmarks for most image-processing tasks, but they rely on pooling techniques that sacrifice classification accuracy and discard spatial relationships between the data points involved. Hence, Hinton et al. proposed a layered architecture called Capsule Networks (CapsNets), which outperforms traditional systems by replacing pooling with a dynamic routing mechanism. CapsNets are thus en route to becoming prospective future benchmarks in visual imagery tasks, having surpassed existing state-of-the-art results on the MNIST dataset. This paper inspects two novel aspects: enhancing this performance on CIFAR-10 through regularization and hyperparameter optimization, and thereby extending applicability to stochastic numeric healthcare data, helping uncover new challenges for predictive neural networks.
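The dynamic routing mechanism the abstract refers to can be sketched in a few lines. The following is a minimal NumPy illustration of routing-by-agreement as described by Sabour, Frosst and Hinton (2017) — not the authors' implementation; the tensor layout, iteration count and helper names are assumptions for the sake of the example:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # Squashing non-linearity: preserves vector orientation while
    # mapping vector length into [0, 1).
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, n_iters=3):
    # u_hat: prediction vectors, assumed shape
    # (n_input_caps, n_output_caps, dim_output).
    n_in, n_out, _ = u_hat.shape
    b = np.zeros((n_in, n_out))  # routing logits, start uniform
    for _ in range(n_iters):
        # Coupling coefficients: softmax of logits over output capsules.
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        s = (c[..., None] * u_hat).sum(axis=0)   # weighted sum of predictions
        v = squash(s)                            # candidate output capsules
        # Agreement update: raise logits where prediction and output align.
        b += np.einsum('ijk,jk->ij', u_hat, v)
    return v

rng = np.random.default_rng(0)
u_hat = rng.normal(size=(6, 3, 8))  # 6 input capsules, 3 outputs, dim 8
v = dynamic_routing(u_hat)
```

Unlike max-pooling, which keeps only the strongest activation, the coupling coefficients route each lower-level capsule's output toward the higher-level capsule that agrees with it, so pose information is retained rather than discarded.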

Item Type: Article
Uncontrolled Keywords: hyperparameter optimisation, Stochastic numeric healthcare data, Capsule Networks, ReLU, performance benchmarks
Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Depositing User: EAI Editor II.
Date Deposited: 10 Sep 2020 12:35
Last Modified: 10 Sep 2020 12:35
URI: https://eprints.eudl.eu/id/eprint/172
