Abstract:
A (weighted) Θ-fuzzy associative memory can be viewed as a single-hidden-layer neural network whose inputs are drawn from an arbitrary bounded lattice L. The ξth hidden node applies a function Θξ: L → [0, 1] to an input A ∈ L. In this paper, we present a new algorithm for designing Θ-FAMs. Roughly speaking, this algorithm consists of the following two stages: 1) construction of a set of functions Θξ: L → [0, 1], where ξ = 1, ..., p; 2) optimization of the weights. Our new algorithm differs from the previous algorithm for tunable equivalence fuzzy associative memories. Instead of merely extracting a subset of functions Θξ: L → [0, 1] from a given set, we generate a new set of functions on the basis of the partial ordering of L. The paper includes some experimental results on a set of benchmark classification problems. We also apply a combination of the resulting Θ-FAM and a deep convolutional neural network to a problem of image texture classification and compare the classification performance of our approach with that of some state-of-the-art methods from the literature.
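To make the model concrete, the following is a minimal sketch of weighted Θ-FAM recall, not the paper's algorithm: it assumes the lattice L = [0, 1]^n, takes Θξ to be a simple fuzzy equivalence between the input and a stored pattern aξ (an illustrative choice), and classifies by the hidden node with the largest weighted activation wξ·Θξ(A). The pattern set, labels, and weights below are hypothetical.

```python
def theta(a, x):
    """Illustrative Θ_ξ: componentwise fuzzy equivalence 1 - |a_i - x_i|,
    aggregated by the lattice meet (minimum) on L = [0, 1]^n."""
    return min(1.0 - abs(ai - xi) for ai, xi in zip(a, x))

def theta_fam_recall(patterns, labels, weights, x):
    """Weighted Θ-FAM recall: return the label of the hidden node ξ
    maximizing the weighted activation w_ξ * Θ_ξ(x)."""
    activations = [w * theta(a, x) for a, w in zip(patterns, weights)]
    best = max(range(len(activations)), key=lambda i: activations[i])
    return labels[best]

# Hypothetical two-pattern example on L = [0, 1]^2.
patterns = [[0.1, 0.9], [0.8, 0.2]]
labels = ["class0", "class1"]
weights = [1.0, 1.0]
print(theta_fam_recall(patterns, labels, weights, [0.15, 0.85]))
```

In this toy setting the input [0.15, 0.85] is closest to the first stored pattern, so its hidden node attains the largest weighted activation; the two stages described in the abstract would correspond to choosing the functions Θξ and then tuning the weights wξ on training data.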
Date of Conference: 23-26 June 2019
Date Added to IEEE Xplore: 11 October 2019