
Lagrange Programming Neural Network Approaches for Robust Time-of-Arrival Localization


Abstract

The human brain has two interesting properties: it is massively interconnected, and it handles outlier data effectively. For instance, humans can recognize an object in an image corrupted by non-Gaussian noise. Artificial neural networks are a biologically inspired technique, and from a structural point of view many neural network models also have massively interconnected architectures. However, the traditional analog neural network approach cannot handle an l1-norm-like objective function, so it cannot be used to suppress outlier data. This paper proposes two neural network models for the robust source localization problem under the time-of-arrival (TOA) measurement model. Our development is based on the Lagrange programming neural network (LPNN) approach. To alleviate the influence of outliers, we introduce an l1-norm objective function. However, in the traditional LPNN approach, the constraints and the objective function must be differentiable. We devise two methods to handle the non-differentiable l1-norm term. The first replaces the l1-norm term with a differentiable approximation. The second uses the hidden-state concept from the locally competitive algorithm (LCA) to avoid computing the gradient vector at non-differentiable points. We also present the local stability analysis of the two proposed models. Simulations show that the proposed methods handle outliers well and that their error performance is better than that of many existing TOA algorithms.
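To make the first method concrete, the following is a minimal sketch (in Python, not taken from the paper) of robust TOA localization with an l1-like cost: the non-differentiable term |r_i| = |d_i - ||x - a_i||| is replaced by the smooth surrogate sqrt(r_i^2 + eps), and the analog LPNN dynamics are stood in for by Euler integration of a plain gradient flow. The anchor positions, noise level, outlier size, eps, and step size are all illustrative assumptions.

```python
# Minimal sketch (not the authors' exact LPNN formulation): robust TOA
# localization with an l1-like cost, using the smooth surrogate
# |u| ~ sqrt(u^2 + eps) for the non-differentiable term. The analog LPNN
# dynamics are emulated by Euler integration of a gradient flow; all
# numbers (anchors, noise, outlier, step size) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Known sensor positions a_i and the (unknown) true source position.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0],
                    [10.0, 10.0], [5.0, 12.0]])
source_true = np.array([3.0, 4.0])

# TOA range measurements d_i = ||x - a_i|| + noise, with one NLOS-style outlier.
d = np.linalg.norm(anchors - source_true, axis=1)
d += 0.05 * rng.standard_normal(len(anchors))
d[2] += 4.0  # large positive bias on the third measurement

def grad_smooth_l1(x, eps=1e-3):
    """Gradient of f(x) = sum_i sqrt(r_i^2 + eps), where r_i = d_i - ||x - a_i||."""
    diff = x - anchors                    # shape (m, 2)
    dist = np.linalg.norm(diff, axis=1)   # ||x - a_i||
    r = d - dist                          # range residuals
    w = r / np.sqrt(r**2 + eps)           # derivative of the smoothed |r_i|
    # dr_i/dx = -(x - a_i)/||x - a_i||, so the chain rule gives:
    return -(w[:, None] * diff / dist[:, None]).sum(axis=0)

# Gradient flow dx/dt = -grad f(x), started from the anchor centroid.
x = anchors.mean(axis=0)
for _ in range(5000):
    x -= 0.01 * grad_smooth_l1(x)

print("estimated source:", x)  # should land near (3, 4) despite the outlier
```

In the paper's LPNN formulation the range equations enter as equality constraints handled by Lagrange multiplier neurons; this sketch folds everything into a single unconstrained cost purely to illustrate how the smooth l1 surrogate limits the influence of the outlying measurement.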



Acknowledgements

This work was supported by a research grant from the Government of the Hong Kong Special Administrative Region (CityU 11259516) and by the Hong Kong Innovation and Technology Support Programme (ITS308/15).

Author information


Corresponding author

Correspondence to Andrew Chi Sing Leung.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.


About this article


Cite this article

Wang, H., Feng, R., Leung, A.C.S. et al. Lagrange Programming Neural Network Approaches for Robust Time-of-Arrival Localization. Cogn Comput 10, 23–34 (2018). https://doi.org/10.1007/s12559-017-9495-z

