Show simple item record

dc.contributor.author: Zhang, Haoling
dc.contributor.author: Yang, Chao-Han Huck
dc.contributor.author: Zenil, Hector
dc.contributor.author: Kiani, Narsis A.
dc.contributor.author: Shen, Yue
dc.contributor.author: Tegner, Jesper
dc.date.accessioned: 2020-09-17T11:53:56Z
dc.date.available: 2020-02-26T08:54:37Z
dc.date.available: 2020-09-17T11:53:56Z
dc.date.issued: 2020
dc.identifier.citation: Zhang, H., Yang, C.-H. H., Zenil, H., Kiani, N. A., Shen, Y., & Tegner, J. N. (2020). Evolving Neural Networks through a Reverse Encoding Tree. 2020 IEEE Congress on Evolutionary Computation (CEC). doi:10.1109/cec48606.2020.9185648
dc.identifier.isbn: 978-1-7281-6930-9
dc.identifier.doi: 10.1109/CEC48606.2020.9185648
dc.identifier.uri: http://hdl.handle.net/10754/661708
dc.description.abstract: NeuroEvolution is one of the most competitive evolutionary learning strategies for designing novel neural networks for use in specific tasks, such as logic circuit design and digital gaming. However, the application of benchmark methods such as NeuroEvolution of Augmenting Topologies (NEAT) remains a challenge in terms of their computational cost and search-time inefficiency. This paper advances a method that incorporates a type of topological edge coding, named Reverse Encoding Tree (RET), for evolving scalable neural networks efficiently. Using RET, two approaches – NEAT with Binary search encoding (Bi-NEAT) and NEAT with Golden-Section search encoding (GS-NEAT) – have been designed to solve problems in benchmark continuous learning environments such as logic gates, CartPole, and Lunar Lander, and tested against classical NEAT and FS-NEAT as baselines. Additionally, we conduct a robustness test to evaluate the resilience of the proposed NEAT approaches. The results show that the two proposed approaches deliver improved performance, characterized by (1) a higher accumulated reward within a finite number of time steps; (2) fewer episodes needed to solve problems in targeted environments; and (3) adaptive robustness under noisy perturbations; they outperform the baselines in all tested cases. Our analysis also demonstrates that RET expands potential future research directions in dynamic environments. Code is available from https://github.com/HaolingZHANG/ReverseEncodingTree.
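The golden-section search that the abstract's GS-NEAT builds on is the classical 1-D minimization routine. The sketch below shows that generic routine only; it is not the authors' RET implementation, which applies the search principle to topological edge encoding rather than to a scalar interval.

```python
import math

def golden_section_search(f, a, b, tol=1e-6):
    """Minimize a unimodal function f on [a, b] by shrinking the
    bracket with the golden ratio at each step (classical algorithm)."""
    inv_phi = (math.sqrt(5) - 1) / 2  # 1/phi, about 0.618
    c = b - inv_phi * (b - a)         # interior probe points
    d = a + inv_phi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            # Minimum lies in [a, d]; reuse c as the new right probe.
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:
            # Minimum lies in [c, b]; reuse d as the new left probe.
            a, c = c, d
            d = a + inv_phi * (b - a)
    return (a + b) / 2

# Example: minimum of (x - 2)^2 on [0, 5] is at x = 2.
x_min = golden_section_search(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

Each iteration shrinks the bracket by a constant factor of about 0.618 while reusing one previous function evaluation, which is the efficiency property that motivates using it as a search strategy.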
dc.description.sponsorship: This work was initiated by the Living Systems Laboratory at King Abdullah University of Science and Technology (KAUST), led by Prof. Jesper Tegner, and supported by funds from KAUST. Chao-Han Huck Yang was supported by the Visiting Student Research Program (VSRP) from KAUST.
dc.publisher: IEEE
dc.relation.url: https://ieeexplore.ieee.org/document/9185648/
dc.relation.url: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9185648
dc.rights: Archived with thanks to IEEE
dc.subject: NeuroEvolution
dc.subject: Evolutionary Strategy
dc.subject: Continuous Learning
dc.subject: Edge Encoding
dc.title: Evolving Neural Networks through a Reverse Encoding Tree
dc.type: Conference Paper
dc.contributor.department: Biological and Environmental Sciences and Engineering (BESE) Division
dc.contributor.department: Bioscience Program
dc.conference.date: 19-24 July 2020
dc.conference.name: 2020 IEEE Congress on Evolutionary Computation (CEC)
dc.conference.location: Glasgow, United Kingdom
dc.eprint.version: Post-print
dc.contributor.institution: Institute of Biochemistry, BGI-Shenzhen, Shenzhen, Guangdong, China
dc.contributor.institution: Georgia Institute of Technology, School of ECE, Atlanta, GA, USA
dc.contributor.institution: Algorithmic Dynamics Lab & Oxford Immune Algorithmics, U.K. & Sweden
dc.contributor.institution: Karolinska Institute, Algorithmic Dynamics Lab, Stockholm, Sweden
dc.identifier.arxivid: arXiv:2002.00539
kaust.person: Tegner, Jesper
refterms.dateFOA: 2020-02-26T08:55:18Z
kaust.acknowledged.supportUnit: VSRP


Files in this item

Name: 2002.00539.pdf
Size: 2.640 MB
Format: PDF
Description: Accepted Manuscript

This item appears in the following Collection(s)

