
Primitive attempt to turn images into percepts

  • Original Article
  • Published in: International Journal of Machine Learning and Cybernetics

Abstract

Images are about pictures. Percepts are about information. People need images; machines do not. This paper suggests that machines may be able to perceive things in real time, not merely register images. That goal inspired our search for methods that can work in real time to perceive the world. In the brain, an image is processed separately in different regions, and the results of those parallel operations are somehow fused to form a percept. We have extensively developed a spectral segmentation of images using “Artificial Color,” based on the way animals use spectral information. That research showed that there is a massively parallel approach to recognizing targets or their background using Fourier texture discrimination. In this paper, we discuss a Fourier-based system that uses nonlinear discrimination to recognize the shape, size, pose, and location of a target in a scene. We then convert that information into an Artificial Percept: a “cartoon” of the situation with labeled targets giving their appearance (pose and size) and location. This is a primitive percept. The machine no longer has an image; instead, it “knows” what targets of interest are in the field, where they are, and their range and pose.
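The “primitive percept” the abstract describes — a list of labeled targets with pose, size, and location standing in for an image — can be pictured as a small data structure. The sketch below is purely illustrative; the `Target` and `Percept` names and fields are our own, not the paper's:

```python
from dataclasses import dataclass

# Illustrative only: the paper's primitive percept replaces the image with
# labeled targets and their geometry. Field names here are assumptions.
@dataclass
class Target:
    label: str   # what the target is
    pose: float  # orientation in degrees
    size: float  # apparent size (a proxy for range)
    x: float     # location in the field
    y: float

@dataclass
class Percept:
    targets: list

    def describe(self):
        """Summarize each labeled target; no pixel data is retained."""
        return [f"{t.label} at ({t.x}, {t.y}), pose {t.pose} deg, size {t.size}"
                for t in self.targets]

percept = Percept(targets=[Target("tank", 30.0, 1.5, 12.0, 7.0)])
print(percept.describe()[0])
```

The point of the structure is exactly the abstract's claim: once the percept is formed, the machine stores only labels and geometry, not the image itself.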




Acknowledgments

This work was supported under a contract from the US Army AMRDEC in Huntsville, AL.

Author information

Correspondence to Jian Fu.

Appendix: Margin Setting Algorithm

Let \( A = \{ \vec{x}_{1}, \ldots, \vec{x}_{n} \} \subset [0,\;1]^{q} \subset R^{q} \) be sample feature vectors taken from the target class, and let \( B_{l} = \{ \vec{y}_{l,1}, \ldots, \vec{y}_{l,m_{l}} \} \subset [0,\;1]^{q} \subset R^{q} \) (\( l = 1, \ldots, k \)) be sample vectors taken from the other \( k \) object classes. Denote \( B = \bigcup\limits_{l = 1}^{k} B_{l} \).

1.1 Notation:

Rand(\( F \)): a random draw from distribution \( F \).

Unif(\( D \)): the uniform distribution on set \( D \).

Card(\( D \)): the cardinality (number of elements) of set \( D \).

\( [a,\;b]^{q} \): a \( q \)-dimensional cube.

\( O(\vec{z},\,R) \): the \( q \)-dimensional ball centered at \( \vec{z} \) with radius \( R \).

\( I \): prescribed number of generations.

\( \varepsilon \): prescribed error tolerance.

\( s \): prescribed number of samples taken in each generation.

\( \delta \): prescribed size of perturbation.

\( L \): prescribed number of mutations in each generation.
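For concreteness, the notation above can be expressed as small Python helpers. This is a sketch under our own naming (`rand_unif_cube`, `card`, `in_ball` are not the paper's identifiers):

```python
import random

def rand_unif_cube(q):
    """Rand(Unif([0,1]^q)): a uniform random draw from the q-dimensional unit cube."""
    return [random.random() for _ in range(q)]

def card(D):
    """Card(D): the cardinality (number of elements) of set D."""
    return len(D)

def in_ball(x, z, R):
    """True iff point x lies in O(z, R), the ball centered at z with radius R."""
    return sum((xi - zi) ** 2 for xi, zi in zip(x, z)) ** 0.5 <= R

print(card({(0.1, 0.2), (0.3, 0.4)}))        # 2
print(in_ball((0.0, 0.0), (0.3, 0.4), 0.6))  # True: the distance is 0.5
```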

1.2 Algorithm:

  1. Compute \( N \equiv \) Card(\( A \)), and set \( i = 0 \).

  2. \( i := i + 1 \). If \( i > I \) or Card(\( A \))/\( N < \varepsilon \), output \( \{ (\vec{c}_{j},\,r_{j};\,N_{j}) : j = 1, \ldots, i - 1 \} \) and stop.

  3. Take \( \vec{c}_{i,1}, \ldots, \vec{c}_{i,s} \) from Rand(Unif(\( [0,\;1]^{q} \))) and set \( t_{i} = 0 \). For \( j = 1, \ldots, s \), compute \( r_{i,j} = \mathop {\hbox{min} }\limits_{{\vec{y} \in B}} |\vec{y} - \vec{c}_{i,j}| \). If \( N_{i,j} \equiv \) Card(\( A \cap O(\vec{c}_{i,j},\,r_{i,j}) \)) \( = 0 \), discard ball \( O(\vec{c}_{i,j},\,r_{i,j}) \); otherwise, set \( t_{i} := t_{i} + 1 \) and record \( (\vec{c}_{i,j}, r_{i,j}; N_{i,j}) \).

  4. If \( t_{i} = 0 \), record \( (\vec{c}_{i},\,r_{i};\,N_{i}) = (\vec{0},\,0;\,0) \) and go to step 2; otherwise, for \( \{ (\vec{c}_{i,j},\,r_{i,j};\,N_{i,j}) : j = 1, \ldots, t_{i} \} \), compute

$$ w_{i,j} = \frac{{N_{i,j} }}{{\sum\nolimits_{j = 1}^{{t_{i} }} {N_{i,j} } }},\quad j = 1,\,2, \ldots ,t_{i} . $$

  5. Take \( v_{i} \) from Rand(Unif(\( [0,\;1] \))). If \( v_{i} \in [\sum\nolimits_{j = 1}^{{\ell_{i,0} - 1}} w_{i,j},\;\sum\nolimits_{j = 1}^{{\ell_{i,0} }} w_{i,j}] \) for some \( \ell_{i,0} \in \{ 1,\,2, \ldots ,t_{i} \} \), set \( N_{i} = N_{{i,\ell_{i,0} }} \), take the corresponding ball as \( O(\vec{c}_{i},\,r_{i}) \), and discard all the other balls. Set \( l = 1 \).

  6. Take \( \vec{s}_{l} \) from Rand(Unif(\( [ - \delta,\;\delta]^{q} \))), mutate \( \vec{c}_{i} \) to \( \vec{c}_{i}^{1} = \vec{c}_{i} + \vec{s}_{l} \), and compute \( r_{i}^{1} = \mathop {\hbox{min} }\limits_{{\vec{y} \in B}} |\vec{y} - \vec{c}_{i}^{1}| \). If Card(\( A \cap O(\vec{c}_{i}^{1},\,r_{i}^{1}) \)) \( \le N_{i} \), record \( (\vec{c}_{i},\,r_{i}; N_{i}) \), set \( A := A \backslash O(\vec{c}_{i},\,r_{i}) \), and go to step 2; otherwise, set \( N_{i} := \) Card(\( A \cap O(\vec{c}_{i}^{1},\,r_{i}^{1}) \)), \( \vec{c}_{i} := \vec{c}_{i}^{1} \), \( r_{i} := r_{i}^{1} \), and \( l := l + 1 \). If \( l > L \), record \( (\vec{c}_{i},\,r_{i}; N_{i}) \), set \( A := A \backslash O(\vec{c}_{i},\,r_{i}) \), and go to step 2; otherwise, repeat step 6.
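The steps above can be sketched in Python. This is a simplified rendering, not the authors' code: parameter defaults are our own, a mutant is kept only when it strictly increases coverage (a compact reading of step 6), and generations that yield no usable ball are simply skipped. We assume \( B \) is nonempty so the margin radius is well defined.

```python
import random

def dist(u, v):
    """Euclidean distance between two q-dimensional points."""
    return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

def margin_setting(A, B, q, I=50, eps=0.05, s=20, delta=0.05, L=10, seed=0):
    """Sketch of the Margin Setting algorithm: cover the target samples A with
    balls whose radii just avoid the non-target samples B.
    Returns a list of recorded balls (center, radius, covered_count)."""
    rng = random.Random(seed)
    A = [tuple(x) for x in A]
    B = [tuple(y) for y in B]
    N = len(A)                                  # step 1: N = Card(A)
    balls = []
    for i in range(1, I + 1):                   # step 2: generation loop
        if not A or len(A) / N < eps:
            break
        # Step 3: sample s candidate centers; each radius is the distance
        # to the nearest B point, so no non-target sample falls inside.
        cands = []
        for _ in range(s):
            c = tuple(rng.random() for _ in range(q))
            r = min(dist(y, c) for y in B)
            n = sum(dist(x, c) <= r for x in A)
            if n > 0:
                cands.append((c, r, n))
        if not cands:                           # step 4: no usable ball
            continue
        # Step 5: roulette-wheel selection weighted by coverage w_{i,j}.
        total = sum(n for _, _, n in cands)
        v, acc = rng.random() * total, 0
        for c, r, n in cands:
            acc += n
            if v <= acc:
                break
        # Step 6: up to L mutations; keep a mutant only if it covers more of A.
        for _ in range(L):
            c1 = tuple(ci + rng.uniform(-delta, delta) for ci in c)
            r1 = min(dist(y, c1) for y in B)
            n1 = sum(dist(x, c1) <= r1 for x in A)
            if n1 > n:
                c, r, n = c1, r1, n1
        balls.append((c, r, n))
        A = [x for x in A if dist(x, c) > r]    # A := A \ O(c_i, r_i)
    return balls
```

Because every radius is the minimum distance from the center to \( B \), each recorded ball contains target samples but no non-target sample; removing covered points shrinks \( A \) every recorded generation, so the loop terminates.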


About this article

Cite this article

Fu, J., Caulfield, H.J. & Glenn, C. Primitive attempt to turn images into percepts. Int. J. Mach. Learn. & Cyber. 5, 963–970 (2014). https://doi.org/10.1007/s13042-013-0184-2
