DOI: 10.1145/1295014.1295051
Article

Analysis of a deployed software

Published: 03 September 2007

ABSTRACT

Analyzing deployed software provides a means to characterize and leverage the software's runtime behavior as it is employed by its intended users. Preliminary studies have shown that leveraging information obtained from the field gives engineers an opportunity to improve their software testing activities. The analysis of deployed software can be performed in three stages: (1) analysis to determine, before the software is deployed, where instrumentation probes should be inserted into the software and what information they should capture; (2) analysis to determine, during deployment, when the field data should be sent back to the company; and (3) analysis to leverage the field information after deployment. To make these analysis activities feasible, we must account for the distinct differences between the development environment and the deployed environment: the deployed environment tolerates less overhead, gives engineers less control, and requires highly scalable techniques because of the high volume of information. Hence, existing approaches for in-house analysis may become ineffective, inefficient, or even useless when applied directly to the deployed environment. Existing approaches for analyzing deployed software also need to account for the fact that a technique in one analysis stage may affect the performance of a technique in another stage. This research proposal details the challenges that arise when analyzing deployed software and seeks to develop a set of techniques that address these challenges, applied either to an individual stage or across the analysis stages.
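To make the three stages concrete, the sketch below shows one way a probe inserted before deployment (stage 1) might count events at runtime and use a simple volume-based trigger (stage 2) to decide when to ship field data back for in-house analysis (stage 3). This is a minimal, hypothetical illustration only, not a technique from the proposal or its related work; the class name FieldProbe, the flush threshold, and the report endpoint are assumptions made purely for the example.

    // Hypothetical sketch of a lightweight field probe. FieldProbe, the flush
    // threshold, and the endpoint URL are illustrative assumptions only.
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.atomic.AtomicLong;

    public final class FieldProbe {
        // Stage 1 (before deployment): engineers choose which program points
        // receive a probe and what each probe records; here we simply count
        // named events.
        private static final Map<String, AtomicLong> counts = new ConcurrentHashMap<>();
        private static final long FLUSH_THRESHOLD = 10_000; // assumed trigger
        private static final String REPORT_ENDPOINT = "https://example.org/field-data"; // hypothetical

        public static void hit(String eventId) {
            long total = counts.computeIfAbsent(eventId, k -> new AtomicLong())
                               .incrementAndGet();
            // Stage 2 (during deployment): decide when to send data back; a
            // simple volume-based trigger bounds overhead and traffic.
            if (total % FLUSH_THRESHOLD == 0) {
                flush();
            }
        }

        private static void flush() {
            try {
                URL url = new URL(REPORT_ENDPOINT);
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setRequestMethod("POST");
                conn.setDoOutput(true);
                byte[] payload = counts.toString().getBytes(StandardCharsets.UTF_8);
                try (OutputStream out = conn.getOutputStream()) {
                    out.write(payload);
                }
                conn.getResponseCode(); // Stage 3 happens server-side: the
                                        // collected field data is analyzed in-house.
            } catch (Exception e) {
                // A field probe must never crash the deployed application.
            }
        }
    }

A counting probe with a periodic flush keeps the per-event cost to a single increment and bounds the data each site transmits, which reflects the low-overhead, low-control, high-volume constraints of the deployed environment described above.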


Published in
ESEC-FSE companion '07: The 6th Joint Meeting on the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering: Companion Papers
September 2007, 189 pages
ISBN: 9781595938121
DOI: 10.1145/1295014
Copyright © 2007 ACM

Publisher: Association for Computing Machinery, New York, NY, United States
