
Kanada, Y., 2016 International Joint Conference on Neural Networks (IJCNN 2016), July 2016.
[ Japanese page ]
[ Paper PDF file ]
[ Slides PDF file ]

Abstract:
Recently, the performance of deep neural networks, especially convolutional neural networks (CNNs), has been drastically improved by elaborate network architectures, new learning methods, and GPU-based high-performance computation. However, several difficult problems concerning back propagation remain, including scheduling of the learning rate and controlling the locality of search (i.e., avoidance of bad local minima). A learning method called “learning-rate-optimizing genetic back-propagation” (LOG-BP), which combines back propagation with a genetic algorithm in a new manner, is proposed. This method solves the above two problems by optimizing the learning process, especially the learning rate, through genetic mutations and locality-controlled parallel search. Initial experimental results show that LOG-BP performs well; that is, when required, the learning rate decreases exponentially, and the distances between chromosomes, which indicate the locality of a search, also decrease exponentially.
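
To make the idea concrete, below is a minimal, hypothetical sketch of the general scheme the abstract describes: a population of networks is trained by back propagation, each individual carries its own learning rate, the best individuals are selected each epoch, and offspring copy a survivor's weights while mutating its learning rate. The toy task, network size, hyper-parameters, and function names (init_net, bp_epoch) are illustrative assumptions only; this is not the paper's actual LOG-BP implementation, which in addition controls search locality.

```python
# Hedged sketch: back propagation combined with per-epoch genetic mutation of
# the learning rate. All names and hyper-parameters are illustrative
# assumptions, not the LOG-BP implementation from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: fit y = sin(x) with a one-hidden-layer MLP.
X = rng.uniform(-3, 3, size=(256, 1))
Y = np.sin(X)

def init_net():
    return {"W1": rng.normal(0, 0.5, (1, 16)), "b1": np.zeros(16),
            "W2": rng.normal(0, 0.5, (16, 1)), "b2": np.zeros(1)}

def forward(net, x):
    h = np.tanh(x @ net["W1"] + net["b1"])
    return h, h @ net["W2"] + net["b2"]

def bp_epoch(net, lr):
    """One epoch of plain gradient descent (back propagation) at rate lr."""
    h, out = forward(net, X)
    err = out - Y                                   # dLoss/dout for mean squared error
    grads = {"W2": h.T @ err, "b2": err.sum(0)}
    dh = (err @ net["W2"].T) * (1 - h ** 2)         # back-prop through tanh
    grads["W1"] = X.T @ dh
    grads["b1"] = dh.sum(0)
    for k in net:
        net[k] -= lr * grads[k] / len(X)
    return float(np.mean(err ** 2))

# Population of (network, learning-rate) chromosomes.
POP, EPOCHS = 8, 200
population = [(init_net(), 10 ** rng.uniform(-3, -1)) for _ in range(POP)]

for epoch in range(EPOCHS):
    scored = []
    for net, lr in population:
        loss = bp_epoch(net, lr)
        scored.append((loss, net, lr))
    scored.sort(key=lambda t: t[0])
    survivors = scored[: POP // 2]                  # selection by training loss
    # Offspring copy a survivor's weights and mutate its learning rate.
    population = [(net, lr) for _, net, lr in survivors]
    for _, net, lr in survivors:
        child = {k: v.copy() for k, v in net.items()}
        population.append((child, lr * float(np.exp(rng.normal(0, 0.3)))))

best_loss, _, best_lr = min(scored, key=lambda t: t[0])
print(f"final loss {best_loss:.4f}, best learning rate {best_lr:.4f}")
```

Because each mutation multiplies the rate by a log-normal factor, selection can drive the learning rate down exponentially over epochs when smaller steps pay off, which is the kind of behaviour the abstract reports for LOG-BP.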

Keywords: Back propagation, Learning rate, Genetic algorithm, Multi-layer perceptron, Convolutional neural network (CNN), Deep learning, Search-locality control, Non-local search.