
On Rate-Constrained Estimation in Unreliable Sensor Networks

  • Conference paper
Information Processing in Sensor Networks (IPSN 2003)

Part of the book series: Lecture Notes in Computer Science ((LNCS,volume 2634))


Abstract

We study a network of non-collaborating sensors that make noisy measurements of some physical process X and communicate their readings to a central processing unit. The sensors' limited power resources severely restrict communication rates. Sensors and their communication links are both subject to failure; however, the central unit is guaranteed to receive data from a minimum fraction of the sensors, say k out of n sensors. The goal of the central unit is to optimally estimate X from the received transmissions under a specified distortion metric. In this work, we derive an information-theoretically achievable rate-distortion region for this network under symmetric sensor measurement statistics.

When all processes are jointly Gaussian and independent, and we have a squared-error distortion metric, the proposed distributed encoding and estimation framework has the following interesting optimality property: when any k out of n rate-R bits/sec sensor transmissions are received, the central unit’s estimation quality matches the best estimation quality that can be achieved from a completely reliable network of k sensors, each transmitting at rate R. Furthermore, when more than k out of the n sensor transmissions are received, the estimation quality strictly improves.
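The "more received, better estimate" behavior can be illustrated in the rate-unconstrained (infinite-rate) limit of the Gaussian setting described above. The sketch below is not the paper's coding scheme; it only shows that the MMSE of estimating a Gaussian X from m received noisy readings is strictly decreasing in m, so receiving more than k transmissions strictly improves the estimate. The variances `s2x` and `s2n` are illustrative choices, not values from the paper.

```python
# Hedged illustration (not the paper's distributed code): X ~ N(0, s2x),
# sensor i observes Y_i = X + N_i with N_i ~ N(0, s2n), i.i.d.
# The MMSE of estimating X from m received readings is
#     D(m) = 1 / (1/s2x + m/s2n),
# which is strictly decreasing in m.

def mmse_distortion(m, s2x=1.0, s2n=0.5):
    """MMSE when m noisy sensor readings reach the central unit."""
    return 1.0 / (1.0 / s2x + m / s2n)

k, n = 4, 8  # hypothetical network parameters
d = [mmse_distortion(m) for m in range(k, n + 1)]
# Distortion strictly improves with every extra received transmission.
assert all(d[i + 1] < d[i] for i in range(len(d) - 1))
```

The rate-constrained scheme in the paper matches the reliable k-sensor benchmark at exactly k receptions; this unquantized sketch only conveys the monotonic-improvement part of the claim.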

When the network has clusters of collaborating sensors, should each cluster compress its raw measurements, or should it first estimate the source from those measurements and compress the estimates instead? For some interesting cases, we show that there is no loss of performance in the distributed compression of local estimates over the distributed compression of raw data in a rate-distortion sense, i.e., encoding the local sufficient statistics is good enough.
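The sufficient-statistic intuition behind this result can be sketched numerically. Assuming Gaussian X and i.i.d. Gaussian sensor noise within a cluster (an assumption for illustration, in the rate-unconstrained limit rather than the paper's coded setting), the cluster average of the raw readings is a sufficient statistic for X, so estimating from the average matches estimating from all raw readings:

```python
# Hedged sketch: a cluster of c sensors observes Y_i = X + N_i,
# X ~ N(0, s2x), N_i ~ N(0, s2n) i.i.d. The average
# Ybar = X + Nbar has noise variance s2n / c, and the MMSE from
# Ybar alone equals the MMSE from all c raw readings.

def mmse_from_all(c, s2x=1.0, s2n=0.5):
    """MMSE of X given all c raw readings."""
    return 1.0 / (1.0 / s2x + c / s2n)

def mmse_from_average(c, s2x=1.0, s2n=0.5):
    """MMSE of X given only the cluster average Ybar."""
    return 1.0 / (1.0 / s2x + 1.0 / (s2n / c))

for c in (1, 2, 5, 10):
    assert abs(mmse_from_all(c) - mmse_from_average(c)) < 1e-12
```

Since the two distortions coincide for every cluster size, no estimation quality is sacrificed by forwarding (and compressing) the local statistic instead of the raw data, which is the flavor of the rate-distortion claim above.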

This research was supported by NSF under grant CCR-0219722 and DARPA under grant F30602-00-2-0538.





Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ishwar, P., Puri, R., Pradhan, S.S., Ramchandran, K. (2003). On Rate-Constrained Estimation in Unreliable Sensor Networks. In: Zhao, F., Guibas, L. (eds) Information Processing in Sensor Networks. IPSN 2003. Lecture Notes in Computer Science, vol 2634. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36978-3_12


  • DOI: https://doi.org/10.1007/3-540-36978-3_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-02111-7

  • Online ISBN: 978-3-540-36978-3

  • eBook Packages: Springer Book Archive
