Adjusting Learning Rate of Memristor-Based Multilayer Neural Networks via Fuzzy Method

Abstract:

Back propagation (BP) based on stochastic gradient descent is the prevailing method to train multilayer neural networks (MNNs) with hidden layers. However, the physical separation between memory arrays and the arithmetic module makes it inefficient and ineffective to implement BP in conventional digital hardware. Although CMOS may alleviate some problems of the hardware implementation of MNNs, CMOS-based synapses consume too much power and area in very-large-scale integrated circuits. As a novel device, the memristor shows promise to overcome this shortcoming due to its ability to closely integrate processing and memory. This paper proposes a novel circuit for implementing a synapse based on a memristor and two MOSFET transistors (p-type and n-type). Compared with a CMOS-only circuit, the proposed one reduces area consumption by 92%-98%. In addition, we develop a fuzzy method for adjusting the learning rates of MNNs, which increases the learning accuracy by 2%-3% compared with a constant learning rate. Meanwhile, the fuzzy adjustment method is robust and insensitive to parameter changes due to its approximate reasoning. Furthermore, the proposed methods can be extended to memristor-based multilayer convolutional neural networks for complex tasks. The novel architecture behaves in a human-like thinking process.
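The abstract describes fuzzy adjustment of the learning rate only at a high level. A minimal sketch of the general idea, assuming a tiny Mamdani-style rule base with triangular membership functions over the training error and its change (the function names, membership breakpoints, and rules below are illustrative assumptions, not the paper's actual rule base):

```python
# Hypothetical sketch of fuzzy learning-rate adjustment.
# Inputs: current training error and its recent change; output: a scaled learning rate.

def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_lr(error, d_error, base_lr=0.1):
    """Scale base_lr via an assumed, illustrative rule base:
       - error is large               -> increase the learning rate
       - error is small and shrinking -> decrease it (fine-tuning phase)
       - otherwise                    -> keep it unchanged."""
    small = tri(error, -0.1, 0.0, 0.3)       # membership: "error is small"
    large = tri(error, 0.2, 1.0, 1.8)        # membership: "error is large"
    shrinking = tri(d_error, -1.0, -0.5, 0.0)  # membership: "error is shrinking"

    # Rule firing strengths (min acts as fuzzy AND).
    w_up = large
    w_down = min(small, shrinking)
    w_keep = max(0.0, 1.0 - max(w_up, w_down))

    # Weighted-average defuzzification over per-rule scale factors.
    num = w_up * 1.5 + w_keep * 1.0 + w_down * 0.5
    den = w_up + w_keep + w_down
    return base_lr * (num / den if den > 0 else 1.0)
```

Because the output is a smooth blend of overlapping rules, small changes to the membership breakpoints change the rate only gradually, which is the kind of robustness to parameter changes the abstract attributes to approximate reasoning.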
Page(s): 1084 - 1094
Date of Publication: 09 May 2018

