Inner Ensemble Networks: Average Ensemble as an Effective Regularizer
Type
Preprint
Authors
Mohamed, Abduallah
Sadiq, Muhammed Mohaimin
AlBadawy, Ehab
Elhoseiny, Mohamed
Claudel, Christian
Date
2020-06-15
Permanent link to this record
http://hdl.handle.net/10754/666017
Abstract
We introduce Inner Ensemble Networks (IENs), which reduce the variance within the neural network itself without increasing the model complexity. IENs utilize ensemble parameters during the training phase to reduce the network variance. In the testing phase, these parameters are removed without affecting the enhanced performance. IENs reduce the variance of an ordinary deep model by a factor of $1/m^{L-1}$, where $m$ is the number of inner ensembles and $L$ is the depth of the model. We also show, empirically and theoretically, that IENs lead to a greater variance reduction than similar approaches such as dropout and maxout. Our results show a decrease in error rates of between 1.7\% and 17.3\% compared with an ordinary deep model. We also show that IEN was preferred by Neural Architecture Search (NAS) methods over prior approaches. Code is available at https://github.com/abduallahmohamed/inner_ensemble_nets.
Publisher
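The core mechanism described in the abstract — training with $m$ ensemble parameter sets, then removing them at test time with no change in behavior — can be sketched for a single linear layer. This is a hypothetical illustration, not the authors' implementation: the class name `InnerEnsembleLinear` and its methods are assumptions for demonstration. Because the layer is linear, averaging the $m$ outputs is exactly equivalent to a single layer whose weights are the average of the $m$ weight sets, which is why the extra parameters can be collapsed away after training.

```python
import numpy as np

# Hypothetical sketch of an inner-ensemble linear layer (illustrative only;
# not the code from the paper's repository).
class InnerEnsembleLinear:
    def __init__(self, in_dim, out_dim, m, seed=0):
        rng = np.random.default_rng(seed)
        self.m = m
        # m independently initialized weight sets, used only during training
        self.weights = [rng.standard_normal((in_dim, out_dim)) * 0.1
                        for _ in range(m)]

    def forward_train(self, x):
        # Training phase: average the m ensemble outputs,
        # which reduces the variance of the layer's output.
        return sum(x @ w for w in self.weights) / self.m

    def collapse(self):
        # Testing phase: by linearity, the average of m linear maps equals
        # one linear map with the averaged weights, so the ensemble
        # parameters can be removed without changing the output.
        return sum(self.weights) / self.m


layer = InnerEnsembleLinear(4, 3, m=5)
x = np.ones((2, 4))
y_train = layer.forward_train(x)   # m weight sets active
y_test = x @ layer.collapse()      # single collapsed weight matrix
assert np.allclose(y_train, y_test)
```

The assertion makes the key point concrete: the collapsed test-time layer reproduces the training-time ensemble output exactly while carrying only $1/m$ of the layer's parameters.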
arXiv
arXiv
2006.08305
Additional Links
https://arxiv.org/pdf/2006.08305
Relations
Is Supplemented By:
- [Software]
  Title: abduallahmohamed/inner_ensemble_nets: Code for "Inner Ensemble Nets"
  Publication Date: 2020-06-12
  github: abduallahmohamed/inner_ensemble_nets
  Handle: 10754/667970