
Optimizing Neural-network Learning Rate by Using a Genetic Algorithm with Per-epoch Mutations

Kanada, Y., 2016 International Joint Conference on Neural Networks (IJCNN 2016), July 2016.
[ Japanese page ]
[ Paper PDF file ]
[ Slides PDF file ]

Recently, the performance of deep neural networks, especially convolutional neural networks (CNNs), has been drastically improved by elaborate network architectures, by new learning methods, and by GPU-based high-performance computation. However, several difficult problems concerning back propagation remain, including the scheduling of the learning rate and the control of search locality (i.e., avoidance of bad local minima). A learning method called "learning-rate-optimizing genetic back-propagation" (LOG-BP), which combines back propagation with a genetic algorithm in a new manner, is proposed. This method solves the above-mentioned two problems by optimizing the learning process, especially the learning rate, through genetic mutations and locality-controlled parallel search. Initial experimental results show that LOG-BP performs well; that is, when required, the learning rate decreases exponentially, and the distances between chromosomes, which indicate the locality of a search, also decrease exponentially.
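The core idea described in the abstract can be illustrated with a minimal sketch. The following toy example (not the paper's implementation; the objective, population size, and mutation range are all illustrative assumptions) shows a population of "chromosomes", each carrying its own learning rate, where every epoch each individual takes a gradient step, learning rates are mutated, and selection keeps the fittest individuals:

```python
import random

def loss(w):
    # Toy 1-D quadratic objective standing in for a network's training loss.
    return (w - 3.0) ** 2

def grad(w):
    # Analytic gradient of the toy objective.
    return 2.0 * (w - 3.0)

def log_bp_sketch(pop_size=8, epochs=50, seed=0):
    rng = random.Random(seed)
    # Each individual is a (weight, learning_rate) pair; learning rates
    # are initialized log-uniformly between 1e-3 and 1.
    pop = [(rng.uniform(-5.0, 5.0), 10 ** rng.uniform(-3, 0))
           for _ in range(pop_size)]
    for _ in range(epochs):
        # Back-propagation step: each individual descends with its own rate.
        stepped = [(w - lr * grad(w), lr) for w, lr in pop]
        # Per-epoch mutation: perturb each learning rate multiplicatively.
        mutated = [(w, lr * 10 ** rng.uniform(-0.2, 0.2))
                   for w, lr in stepped]
        # Selection: keep the half with the lowest loss and duplicate it.
        mutated.sort(key=lambda ind: loss(ind[0]))
        survivors = mutated[: pop_size // 2]
        pop = survivors + survivors
    return min(pop, key=lambda ind: loss(ind[0]))

best_w, best_lr = log_bp_sketch()
```

In this sketch, poorly chosen learning rates (too large, causing oscillation, or too small, causing slow progress) are selected away, so the surviving rates adapt to the loss landscape over epochs, which is the behavior the abstract attributes to LOG-BP.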

Keywords: Back propagation, Learning rate, Genetic algorithm, Multi-layer perceptron, Convolutional neural network (CNN), Deep learning, Search-locality control, Non-local search.



This page contains a single entry from the blog posted on July 27, 2016 3:33 AM.


(C) Copyright 2007 by Yasusi Kanada