
A novel high-speed neural model for fast pattern recognition


Abstract

Neural networks have shown good results for detecting a given pattern in an image. In this paper, faster neural networks for pattern detection are presented. These processors compute the cross-correlation between the input matrix and the input weights of the neural network in the frequency domain. To reduce the number of computation steps these networks require during the search process, the divide-and-conquer principle is applied through matrix decomposition: each matrix is divided into smaller sub-matrices, and each sub-matrix is then tested separately by a single faster neural processor. Detection is accelerated further by using parallel processing to test the resulting sub-matrices simultaneously with the same number of faster neural networks. In contrast to faster neural networks alone, the speed-up ratio grows with the size of the input matrix when faster neural networks are combined with matrix decomposition. Moreover, the problem of local sub-matrix normalization in the frequency domain is solved, and the effect of matrix normalization on the speed-up ratio of pattern detection is discussed. Simulation results show that local sub-matrix normalization through weight normalization is faster than sub-matrix normalization in the spatial domain, and the overall speed-up ratio of the detection process increases further because the weights are normalized off line.
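The central operation the paper accelerates is a sliding cross-correlation between the input image and the neuron weights, evaluated in the frequency domain. As a rough illustration only, not the author's implementation, the following Python/NumPy sketch computes that operation; the function name and padding choices are illustrative assumptions:

```python
import numpy as np

def xcorr2_fft(image, weights):
    """Full 2-D cross-correlation evaluated in the frequency domain.

    A minimal sketch of the idea the paper accelerates; a real detector
    would also normalize the weights (off line, as the paper proposes)
    and apply the neuron bias and activation function.
    """
    # zero-pad both operands to the full linear-correlation output size
    s = (image.shape[0] + weights.shape[0] - 1,
         image.shape[1] + weights.shape[1] - 1)
    # cross-correlation equals convolution with the 180-degree-rotated
    # kernel, so one pointwise product of FFTs gives the whole map
    F_img = np.fft.fft2(image, s)
    F_wts = np.fft.fft2(weights[::-1, ::-1], s)
    return np.fft.ifft2(F_img * F_wts).real
```

Matrix decomposition then amounts to applying such a routine to each sub-image independently, which is what allows the sub-images to be tested in parallel.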



Author information

Correspondence to Hazem M. El-Bakry.

Appendices

Appendix 1

1.1 An example showing that the cross-correlation of two matrices is not commutative

$$ {\text{Let}}\quad X = \left[ {\begin{array}{*{20}c} 5 & 1 \\ 3 & 7 \\ \end{array} } \right],\quad {\text{and}}\quad W = \left[ {\begin{array}{*{20}c} 6 & 5 \\ 9 & 8 \\ \end{array} } \right] $$

Then, the cross-correlation between X and W can be obtained as follows:

$$ \begin{gathered} W \otimes X = \left[ {\begin{array}{*{20}c} 6 &\quad 5 \\ 9 &\quad 8 \\ \end{array} } \right] \otimes \left[ {\begin{array}{*{20}c} 5 &\quad 1 \\ 3 &\quad 7 \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {8 \times 5} & {8 \times 1 + 9 \times 5} & {9 \times 1} \\ {5 \times 5 + 8 \times 3} &\quad {6 \times 5 + 5 \times 1 + 9 \times 3 + 8 \times 7} &\quad {6 \times 1 + 9 \times 7} \\ {5 \times 3} & {6 \times 3 + 5 \times 7} & {6 \times 7} \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {40} &\quad {53} &\quad 9 \\ {49} &\quad {118} &\quad {69} \\ {15} &\quad {53} &\quad {42} \\ \end{array} } \right] \hfill \\ \end{gathered} $$

On the other hand, the cross-correlation between W and X can be computed as follows:

$$ \begin{gathered} X \otimes W = \left[ {\begin{array}{*{20}c} 5 &\quad 1 \\ 3 &\quad 7 \\ \end{array} } \right] \otimes \left[ {\begin{array}{*{20}c} 6 &\quad 5 \\ 9 &\quad 8 \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {7 \times 6} & {3 \times 6 + 7 \times 5} & {3 \times 5} \\ {1 \times 6 + 7 \times 9} &\quad {5 \times 6 + 1 \times 5 + 3 \times 9 + 7 \times 8} &\quad {5 \times 5 + 3 \times 8} \\ {1 \times 9} & {5 \times 9 + 1 \times 8} & {5 \times 8} \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {42} &\quad {53} &\quad {15} \\ {69} &\quad {118} &\quad {49} \\ 9 &\quad {53} &\quad {40} \\ \end{array} } \right] \hfill \\ \end{gathered} $$

which proves that X ⊗ W ≠ W ⊗ X.
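This non-commutativity is easy to confirm numerically. Here is a quick check, assuming SciPy's correlate2d as the reference full cross-correlation (the operand order shown matches the notation above):

```python
import numpy as np
from scipy.signal import correlate2d

X = np.array([[5, 1], [3, 7]])
W = np.array([[6, 5], [9, 8]])

WX = correlate2d(X, W, mode='full')  # slides W over X: "W ⊗ X" above
XW = correlate2d(W, X, mode='full')  # slides X over W: "X ⊗ W" above
print(WX)                      # [[40 53 9], [49 118 69], [15 53 42]]
print(np.array_equal(WX, XW))  # False: each result is the 180° rotation of the other
```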

Even when one of the two matrices is symmetric, the cross-correlation remains non-commutative, as the following example shows:

$$ {\text{Let}}\quad X = \left[ {\begin{array}{*{20}c} 5 & 3 \\ 3 & 5 \\ \end{array} } \right],\quad {\text{and}}\quad W = \left[ {\begin{array}{*{20}c} 6 & 5 \\ 9 & 8 \\ \end{array} } \right] $$

Then, the cross-correlation between X and W can be obtained as follows:

$$ \begin{gathered} X \otimes W = \left[ {\begin{array}{*{20}c} 5 &\quad 3 \\ 3 &\quad 5 \\ \end{array} } \right] \otimes \left[ {\begin{array}{*{20}c} 6 &\quad 5 \\ 9 &\quad 8 \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {5 \times 6} & {3 \times 6 + 5 \times 5} & {3 \times 5} \\ {3 \times 6 + 5 \times 9} &\quad {5 \times 6 + 3 \times 5 + 3 \times 9 + 5 \times 8} &\quad {5 \times 5 + 3 \times 8} \\ {3 \times 9} & {5 \times 9 + 3 \times 8} & {5 \times 8} \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {30} &\quad {43} &\quad {15} \\ {63} &\quad {112} &\quad {49} \\ {27} &\quad {69} &\quad {40} \\ \end{array} } \right] \hfill \\ \end{gathered} $$

On the other hand, the cross-correlation between W and X can be computed as follows:

$$ \begin{gathered} W \otimes X = \left[ {\begin{array}{*{20}c} 6 &\quad 5 \\ 9 &\quad 8 \\ \end{array} } \right] \otimes \left[ {\begin{array}{*{20}c} 5 &\quad 3 \\ 3 &\quad 5 \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {8 \times 5} & {8 \times 3 + 9 \times 5} & {9 \times 3} \\ {5 \times 5 + 8 \times 3} &\quad {6 \times 5 + 5 \times 3 + 9 \times 3 + 8 \times 5} &\quad {6 \times 3 + 9 \times 5} \\ {5 \times 3} & {6 \times 3 + 5 \times 5} & {6 \times 5} \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {40} &\quad {69} &\quad {27} \\ {49} &\quad {112} &\quad {63} \\ {15} &\quad {43} &\quad {30} \\ \end{array} } \right] \hfill \\ \end{gathered} $$

which proves that X ⊗ W ≠ W ⊗ X.

The cross-correlation of two matrices is commutative only when both matrices are symmetric, as the following example shows.

$$ {\text{Let}}\quad X = \left[ {\begin{array}{*{20}c} 5 & 3 \\ 3 & 5 \\ \end{array} } \right],\quad {\text{and}}\quad W = \left[ {\begin{array}{*{20}c} 8 & 9 \\ 9 & 8 \\ \end{array} } \right] $$

Then, the cross-correlation between X and W can be obtained as follows:

$$ \begin{gathered} X \otimes W = \left[ {\begin{array}{*{20}c} 5 &\quad 3 \\ 3 &\quad 5 \\ \end{array} } \right] \otimes \left[ {\begin{array}{*{20}c} 8 &\quad 9 \\ 9 &\quad 8 \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {5 \times 8} & {9 \times 5 + 8 \times 3} & {9 \times 3} \\ {9 \times 5 + 8 \times 3} &\quad {8 \times 5 + 9 \times 3 + 9 \times 3 + 8 \times 5} &\quad {8 \times 3 + 9 \times 5} \\ {9 \times 3} & {8 \times 3 + 9 \times 5} & {8 \times 5} \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {40} &\quad {69} &\quad {27} \\ {69} &\quad {134} &\quad {69} \\ {27} &\quad {69} &\quad {40} \\ \end{array} } \right] \hfill \\ \end{gathered} $$

On the other hand, the cross-correlation between W and X can be computed as follows:

$$ \begin{gathered} W \otimes X = \left[ {\begin{array}{*{20}c} 8 &\quad 9 \\ 9 &\quad 8 \\ \end{array} } \right] \otimes \left[ {\begin{array}{*{20}c} 5 &\quad 3 \\ 3 &\quad 5 \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {8 \times 5} & {9 \times 5 + 8 \times 3} & {9 \times 3} \\ {9 \times 5 + 8 \times 3} &\quad {8 \times 5 + 9 \times 3 + 9 \times 3 + 8 \times 5} &\quad {9 \times 5 + 8 \times 3} \\ {9 \times 3} & {5 \times 9 + 3 \times 8} & {8 \times 5} \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {40} &\quad {69} &\quad {27} \\ {69} &\quad {134} &\quad {69} \\ {27} &\quad {69} &\quad {40} \\ \end{array} } \right] \hfill \\ \end{gathered} $$

which shows that the cross-correlation is commutative (X ⊗ W = W ⊗ X) when both matrices X and W are symmetric.
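The same numerical check confirms the symmetric case (continuing the snippet above):

```python
Xs = np.array([[5, 3], [3, 5]])   # symmetric
Ws = np.array([[8, 9], [9, 8]])   # symmetric
# both operand orders now yield the same (180°-symmetric) result
print(np.array_equal(correlate2d(Xs, Ws), correlate2d(Ws, Xs)))  # True
```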

Appendix 2

2.1 An example showing that the cross-correlation of two matrices differs from their convolution

$$ {\text{Let}}\quad X = \left[ {\begin{array}{*{20}c} 5 & 1 \\ 3 & 7 \\ \end{array} } \right],\quad {\text{and}}\quad W = \left[ {\begin{array}{*{20}c} 6 & 5 \\ 9 & 8 \\ \end{array} } \right], $$

the result of their cross-correlation is the first result computed in Appendix 1. The convolution between W and X slides the 180°-rotated version of W over X (note that the first operand written below is this rotated W), and can be obtained as follows:

$$ \begin{gathered} W\diamondsuit X = \left[ {\begin{array}{*{20}c} 8 &\quad 9 \\ 5 &\quad 6 \\ \end{array} } \right]\diamondsuit \left[ {\begin{array}{*{20}c} 5 &\quad 1 \\ 3 &\quad 7 \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {6 \times 5} & {5 \times 5 + 6 \times 1} & {5 \times 1} \\ {9 \times 5 + 6 \times 3} &\quad {8 \times 5 + 9 \times 1 + 5 \times 3 + 6 \times 7} &\quad {8 \times 1 + 5 \times 7} \\ {9 \times 3} & {8 \times 3 + 9 \times 7} & {8 \times 7} \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {30} &\quad {31} &\quad 5 \\ {63} &\quad {106} &\quad {43} \\ {27} &\quad {87} &\quad {56} \\ \end{array} } \right] \hfill \\ \end{gathered} $$

which proves that W ⊗ X ≠ W ⋄ X.
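The distinction can likewise be verified numerically; a short check, assuming SciPy's convolve2d and correlate2d as references:

```python
import numpy as np
from scipy.signal import convolve2d, correlate2d

X = np.array([[5, 1], [3, 7]])
W = np.array([[6, 5], [9, 8]])
print(correlate2d(X, W, mode='full'))  # "W ⊗ X": [[40 53 9], [49 118 69], [15 53 42]]
print(convolve2d(X, W, mode='full'))   # "W ⋄ X": [[30 31 5], [63 106 43], [27 87 56]]
```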

When the second matrix W is symmetric, the cross-correlation between W and X can be computed as follows:

$$ \begin{gathered} W \otimes X = \left[ {\begin{array}{*{20}c} 8 &\quad 9 \\ 9 &\quad 8 \\ \end{array} } \right] \otimes \left[ {\begin{array}{*{20}c} 5 &\quad 1 \\ 3 &\quad 7 \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {8 \times 5} & {9 \times 5 + 8 \times 1} & {9 \times 1} \\ {9 \times 5 + 8 \times 3} &\quad {8 \times 5 + 9 \times 3 + 9 \times 1 + 8 \times 7} &\quad {8 \times 1 + 7 \times 9} \\ {9 \times 3} & {8 \times 3 + 9 \times 7} & {8 \times 7} \\ \end{array} } \right] \hfill \\ = \left[ {\begin{array}{*{20}c} {40} &\quad {53} &\quad 9 \\ {69} &\quad {132} &\quad {71} \\ {27} &\quad {87} &\quad {56} \\ \end{array} } \right] \hfill \\ \end{gathered} $$

while the convolution between W and X can be obtained as follows:

$$ \begin{aligned} W\diamondsuit X & = \left[ {\begin{array}{*{20}c} 8 &\quad 9 \\ 9 &\quad 8 \\ \end{array} } \right]\diamondsuit \left[ {\begin{array}{*{20}c} 5 &\quad 1 \\ 3 &\quad 7 \\ \end{array} } \right] \\ & = \left[ {\begin{array}{*{20}c} {8 \times 5} & {9 \times 5 + 8 \times 1} & {9 \times 1} \\ {9 \times 5 + 8 \times 3} &\quad {8 \times 5 + 9 \times 3 + 9 \times 1 + 8 \times 7} &\quad {8 \times 1 + 7 \times 9} \\ {9 \times 3} & {8 \times 3 + 9 \times 7} & {8 \times 7} \\ \end{array} } \right] \\ & = \left[ {\begin{array}{*{20}c} {40} &\quad {53} &\quad 9 \\ {69} &\quad {132} &\quad {71} \\ {27} &\quad {87} &\quad {56} \\ \end{array} } \right] \\ \end{aligned} $$

which proves that when the second matrix is symmetric (or both matrices are symmetric), the cross-correlation between the two matrices equals their convolution.
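Continuing the snippet above, replacing W with its symmetric counterpart makes the two results coincide:

```python
W_sym = np.array([[8, 9], [9, 8]])  # invariant under a 180° rotation
print(np.array_equal(correlate2d(X, W_sym), convolve2d(X, W_sym)))  # True
```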

Appendix 3

3.1 A cross-correlation example between a normalized matrix and a non-normalized one, and vice versa

$$ {\text{Let}}\quad X = \left[ {\begin{array}{*{20}c} 5 & 1 \\ 3 & 7 \\ \end{array} } \right],\quad {\text{and}}\quad W = \left[ {\begin{array}{*{20}c} 6 & 5 \\ 9 & 8 \\ \end{array} } \right] $$

Then the normalized matrices \( \overline{X} \) and \( \overline{W} \), obtained by subtracting each matrix's mean from its elements, can be computed as:

$$ \overline{X} = \left[ {\begin{array}{*{20}c} 1 & { - 3} \\ { - 1} & 3 \\ \end{array} } \right],\quad {\rm and}\quad \overline{W} = \left[ {\begin{array}{*{20}c} { - 1} & { - 2} \\ 2 & 1 \\ \end{array} } \right]. $$

Now, the cross-correlation between a normalized matrix and the other non-normalized one can be computed as follows:

$$ \overline{X} \otimes W = \left[ {\begin{array}{*{20}c} 1 & { - 3} \\ { - 1} & 3 \\ \end{array} } \right] \otimes \left[ {\begin{array}{*{20}c} 6 & 5 \\ 9 & 8 \\ \end{array} } \right] = \left[ {\begin{array}{*{20}c} {18} & 9 & { - 5} \\ 9 & 6 & { - 3} \\ { - 27} & { - 15} & 8 \\ \end{array} } \right] $$
$$ X \otimes \overline{W} = \left[ {\begin{array}{*{20}c} 5 & 1 \\ 3 & 7 \\ \end{array} } \right] \otimes \left[ {\begin{array}{*{20}c} { - 1} & { - 2} \\ 2 & 1 \\ \end{array} } \right] = \left[ {\begin{array}{*{20}c} { - 7} & { - 17} & { - 6} \\ {13} & 6 & { - 7} \\ 2 & {11} & 5 \\ \end{array} } \right] $$

which means that \( \overline{X} \otimes W \ne X \otimes \overline{W} \).

However, the two results are equal only at the center element, which equals the dot product between the two matrices; its value at position (2, 2) is 6, as shown above and in Appendix 4.
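A short numerical confirmation of this, under the same SciPy assumption as in the earlier snippets (the operand order mirrors the notation above):

```python
import numpy as np
from scipy.signal import correlate2d

X = np.array([[5, 1], [3, 7]], dtype=float)
W = np.array([[6, 5], [9, 8]], dtype=float)
Xn = X - X.mean()  # the normalized (zero-mean) X
Wn = W - W.mean()  # the normalized (zero-mean) W

A = correlate2d(W, Xn, mode='full')  # "X̄ ⊗ W" above
B = correlate2d(Wn, X, mode='full')  # "X ⊗ W̄" above
print(np.array_equal(A, B))  # False: the full correlation maps differ
print(A[1, 1], B[1, 1])      # 6.0 6.0 — only the center element agrees
```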

Appendix 4

4.1 A dot-product example between a normalized matrix and a non-normalized one, and vice versa

This validates the correctness of Eq. 51. The left-hand side of Eq. 51 can be expanded as follows:

$$ \overline{X} \cdot W = \left[ {\begin{array}{*{20}c} {X_{1,1} - \overline{X} } & \cdots & {X_{1,n} - \overline{X} } \\ \vdots & \ddots & \vdots \\ {X_{n,1} - \overline{X} } & \cdots & {X_{n,n} - \overline{X} } \\ \end{array} } \right] \cdot \left[ {\begin{array}{*{20}c} {W_{1,1} } & \cdots & {W_{1,n} } \\ \vdots & \ddots & \vdots \\ {W_{n,1} } & \cdots & {W_{n,n} } \\ \end{array} } \right] $$
(58)

and the right-hand side of the same equation can be represented as:

$$ X \cdot \overline{W} = \left[ {\begin{array}{*{20}c} {X_{1,1} } & \cdots & {X_{1,n} } \\ \vdots & \ddots & \vdots \\ {X_{n,1} } & \cdots & {X_{n,n} } \\ \end{array} } \right] \cdot \left[ {\begin{array}{*{20}c} {W_{1,1} - \overline{W} } & \cdots & {W_{1,n} - \overline{W} } \\ \vdots & \ddots & \vdots \\ {W_{n,1} - \overline{W} } & \cdots & {W_{n,n} - \overline{W} } \\ \end{array} } \right]. $$
(59)

Here the means \( \overline{X} \) and \( \overline{W} \) are defined as follows:

$$ \begin{aligned} \overline{X} &= {\frac{{X_{1,1} + X_{1,2} + \cdots + X_{n,n} }}{{n^{2} }}} \\ \overline{W} &= {\frac{{W_{1,1} + W_{1,2} + \cdots + W_{n,n} }}{{n^{2} }}} \\ \end{aligned} $$
(60)

By substituting Eq. 60 into Eqs. 58 and 59 and simplifying, both sides reduce to \( X \cdot W - \left( {\sum\nolimits_{i,j} {X_{i,j} } } \right)\left( {\sum\nolimits_{i,j} {W_{i,j} } } \right)/n^{2} \), so we can conclude that \( \overline{X}_{rc} W_{i} = X_{rc} \overline{W}_{i} . \)

Here is a numerical example:

$$ {\text{Let}}\quad X = \left[ {\begin{array}{*{20}c} 5 & 1 \\ 3 & 7 \\ \end{array} } \right],\quad {\text{and}}\quad W = \left[ {\begin{array}{*{20}c} 6 & 5 \\ 9 & 8 \\ \end{array} } \right]. $$

Then the normalized matrices \( \overline{X} \) and \( \overline{W} \) can be computed as:

$$ \overline{X} = \left[ {\begin{array}{*{20}c} 1 & { - 3} \\ { - 1} & 3 \\ \end{array} } \right],\quad {\rm and}\quad \overline{W} = \left[ {\begin{array}{*{20}c} { - 1} & { - 2} \\ 2 & 1 \\ \end{array} } \right]. $$

Now, the dot product between a normalized matrix and the other non-normalized one can be performed as follows:

$$ \overline{X} \cdot W = \left[ {\begin{array}{*{20}c} 1 & { - 3} \\ { - 1} & 3 \\ \end{array} } \right] \cdot \left[ {\begin{array}{*{20}c} 6 & 5 \\ 9 & 8 \\ \end{array} } \right] = 6 - 15 - 9 + 24 = 6 $$
$$ X \cdot \overline{W} = \left[ {\begin{array}{*{20}c} 5 & 1 \\ 3 & 7 \\ \end{array} } \right] \cdot \left[ {\begin{array}{*{20}c} { - 1} & { - 2} \\ 2 & 1 \\ \end{array} } \right] = - 5 - 2 + 6 + 7 = 6 $$

which shows, in general, that the dot product of the normalized matrix \( \overline{X} \) with the non-normalized matrix W equals the dot product of the non-normalized matrix X with the normalized matrix \( \overline{W} \). The full cross-correlation results, on the other hand, differ everywhere except at the center, as shown in Appendix 3.
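The identity can also be checked with random matrices, consistent with the algebraic reduction noted after Eq. 60; a quick illustrative check:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(0, 10, (4, 4)).astype(float)
W = rng.integers(0, 10, (4, 4)).astype(float)

lhs = np.sum((X - X.mean()) * W)  # (X - X̄) · W
rhs = np.sum(X * (W - W.mean()))  # X · (W - W̄)
print(np.isclose(lhs, rhs))       # True: both equal X·W - (ΣX)(ΣW)/n²
```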

About this article

Cite this article

El-Bakry, H.M. A novel high-speed neural model for fast pattern recognition. Soft Comput 14, 647–666 (2010). https://doi.org/10.1007/s00500-009-0433-1