
Optimizing Neural-network Learning Rate by Using a Genetic Algorithm with Per-epoch Mutations

Kanada, Y., 2016 International Joint Conference on Neural Networks (IJCNN 2016), 2016-7.
[ English page ]
[ Paper PDF file ]
[ Presentation slides PDF file ]

Recently, the performance of deep neural networks, especially convolutional neural networks (CNNs), has been drastically improved by elaborate network architectures, by new learning methods, and by GPU-based high-performance computation. However, several difficult problems concerning back propagation remain, including the scheduling of the learning rate and the control of search locality (i.e., the avoidance of bad local minima). A learning method called "learning-rate-optimizing genetic back-propagation" (LOG-BP), which combines back propagation with a genetic algorithm in a new manner, is proposed. This method solves the two problems above by optimizing the learning process, especially the learning rate, through genetic mutations and locality-controlled parallel search. Initial experimental results show that LOG-BP performs better; that is, when required, the learning rate decreases exponentially, and the distances between chromosomes, which indicate the locality of a search, also decrease exponentially.

Keywords: Back propagation, Learning rate, Genetic algorithm, Multi-layer perceptron, Convolutional neural network (CNN), Deep learning, Search-locality control, Non-local search.
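The core idea from the abstract — each chromosome carries its own learning rate, which is mutated every epoch, with selection pruning divergent searches — can be sketched on a toy one-dimensional problem. Everything below (names, parameter ranges, the halve-and-duplicate selection scheme, the clamping of rates) is an illustrative assumption, not the paper's actual LOG-BP algorithm:

```python
import random

# Illustrative sketch of the LOG-BP idea on a toy 1-D problem.
# All names, ranges, and the selection scheme are assumptions made
# for illustration; they are not taken from the paper.

def loss(w):
    """Toy objective standing in for a network's training loss."""
    return (w - 3.0) ** 2

def grad(w):
    """Gradient of the toy objective (the back-propagation stand-in)."""
    return 2.0 * (w - 3.0)

def log_bp_sketch(pop_size=8, epochs=30, seed=0):
    rng = random.Random(seed)
    # A chromosome pairs a weight with its own learning rate.
    pop = [(rng.uniform(-5.0, 5.0), 10.0 ** rng.uniform(-3.0, -0.3))
           for _ in range(pop_size)]
    history = []
    for _ in range(epochs):
        # One gradient step per chromosome, each with its own rate.
        pop = [(w - lr * grad(w), lr) for w, lr in pop]
        # Per-epoch mutation: perturb each rate multiplicatively,
        # clamped so the toy dynamics stay stable (lr < 1).
        pop = [(w, min(0.9, max(1e-4, lr * 2.0 ** rng.uniform(-1.0, 1.0))))
               for w, lr in pop]
        # Selection: keep the fitter half and duplicate it to refill.
        pop.sort(key=lambda c: loss(c[0]))
        pop = pop[:pop_size // 2] * 2
        history.append(loss(pop[0][0]))
    return pop[0], history

(best_w, best_lr), history = log_bp_sketch()
print(best_w, best_lr, history[-1])
```

Because the fittest half survives each epoch, chromosomes whose mutated rates would cause divergence are pruned, so the best loss in the population shrinks monotonically — a toy analogue of the exponential learning-rate decrease the abstract reports.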



This entry was posted on 2016-07-27 04:53.


(C) 2008 by Yasusi Kanada