DOI: 10.1145/2642937.2643009 (research article, ASE conference proceedings)

PrefFinder: getting the right preference in configurable software systems

Published: 15 September 2014

ABSTRACT

Highly configurable software, such as web browsers, databases, or office applications, has a large number of preferences that the user can customize, but documentation of them may be scarce or scattered. A user, tester, or service technician may have to search through hundreds or thousands of choices across multiple documents to identify which preference will modify a particular system behavior. In this paper we present PrefFinder, a natural language framework that finds (and changes) user preferences. It is tied into an application's preference system and static documentation. We have instantiated PrefFinder as a plugin on two open source applications and as a stand-alone GUI for an industrial application. PrefFinder finds the correct answer 76-96% of the time on more than 175 queries. Compared to asking questions on a help forum or through the company's service center, it can potentially save days or even weeks of time.
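The core idea of matching a natural language query against an application's preference keys can be illustrated with a toy sketch. Note that the identifier-splitting regex and the bag-of-words overlap score below are illustrative assumptions for this sketch, not PrefFinder's actual retrieval pipeline:

```python
import re

def split_key(key):
    # Split a preference key such as "browser.startup.homepage" or
    # "warnOnClose" into lowercase word tokens, treating dots,
    # underscores, hyphens, and camelCase as word boundaries.
    words = []
    for part in re.split(r"[._\-]", key):
        words += re.findall(r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])|\d+", part)
    return [w.lower() for w in words]

def rank(query, pref_keys):
    # Score each key by how many query words appear among its tokens
    # (a crude stand-in for IR-style scoring), best matches first.
    q_words = set(query.lower().split())
    scored = [(len(q_words & set(split_key(k))), k) for k in pref_keys]
    return [k for s, k in sorted(scored, key=lambda t: (-t[0], t[1])) if s > 0]

prefs = [
    "browser.startup.homepage",
    "browser.tabs.warnOnClose",
    "network.proxy.http",
]
print(rank("change the startup home page", prefs))
# → ['browser.startup.homepage']
```

A real system would additionally expand abbreviations, consult synonyms, and weight terms (e.g. tf-idf) rather than count raw overlaps, and would search preference documentation alongside the keys themselves.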


Published in

ASE '14: Proceedings of the 29th ACM/IEEE International Conference on Automated Software Engineering
September 2014, 934 pages
ISBN: 9781450330138
DOI: 10.1145/2642937
Copyright © 2014 ACM

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

ASE '14 paper acceptance rate: 82 of 337 submissions (24%).
