ABSTRACT
Analyzing deployed software provides a means to characterize and leverage the software's runtime behavior as it is employed by its intended users. Preliminary studies have shown that leveraging information obtained from the field gives engineers an opportunity to improve their software testing activities. The analysis of deployed software can be performed in three stages: (1) determining, before the software is deployed, where instrumentation probes should be inserted into the software and what information they should capture; (2) determining, during deployment, when the field data should be sent back to the company; and (3) leveraging the field information after deployment. To make these analysis activities feasible, we must account for the distinct differences between the development environment and the deployed environment: the deployed environment tolerates less overhead, offers engineers less control, and requires highly scalable techniques because of the high volume of information. Hence, existing approaches for in-house analysis may become ineffective, inefficient, or even useless when applied directly to the deployed environment. Existing approaches for analyzing deployed software also need to account for the fact that a technique in one analysis stage may affect the performance of a technique in another stage. This research proposal details the challenges that arise when analyzing deployed software and seeks to develop a set of techniques that address these challenges, applicable within each stage or across stages.
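The first two stages above can be illustrated with a minimal sketch: instrumentation points in the deployed code report events to an in-process probe, and the probe decides when enough data has accumulated to transfer a batch back to the engineers. This is a hypothetical illustration, not the proposal's actual technique; the names `FieldProbe`, `hit`, and `flush` are invented for the example, and "sending" a report is simulated by appending to a local list.

```python
import json
from collections import Counter

class FieldProbe:
    """Minimal in-process probe (hypothetical): counts observed events
    and decides when to report them back to the development site."""

    def __init__(self, report_threshold=3):
        self.counts = Counter()             # event name -> observed frequency
        self.report_threshold = report_threshold
        self.reports = []                   # stand-in for data sent home

    def hit(self, event):
        """Called from an instrumentation point in the deployed code."""
        self.counts[event] += 1
        # Stage-2 decision: transfer data in batches once enough has
        # accumulated, rather than per event, to keep overhead low.
        if sum(self.counts.values()) >= self.report_threshold:
            self.flush()

    def flush(self):
        """Serialize and 'send' the accumulated counts, then reset."""
        if self.counts:
            self.reports.append(json.dumps(dict(self.counts)))
            self.counts.clear()

# A toy instrumented function: probes mark which branch executed
# (stage-1 placement: one probe per branch).
probe = FieldProbe(report_threshold=3)

def abs_value(x):
    if x < 0:
        probe.hit("abs:neg")
        return -x
    probe.hit("abs:nonneg")
    return x

for v in (-2, 5, -1, 7):
    abs_value(v)
probe.flush()  # final transfer at shutdown

print(len(probe.reports))  # → 2 batches were "sent back"
```

Even in this toy form, the interaction between stages is visible: where the probes are placed (stage 1) determines how quickly the threshold fires (stage 2), which in turn shapes the data available for post-deployment analysis (stage 3).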