Research Article · DOI: 10.1145/3338906.3338980

Preference-wise testing for Android applications

Published: 12 August 2019

ABSTRACT

Preferences, the setting options provided by Android, are an essential part of Android apps. Preferences allow users to change app features and behaviors dynamically and therefore need to be thoroughly tested. Unfortunately, the specific preferences used in test cases are typically not explicitly specified, forcing testers to set options manually or to blindly try different option combinations. To effectively test the impact of different preference options, this paper presents PREFEST, a preference-wise enhanced automatic testing approach for Android apps. Given a set of test cases, PREFEST locates the preferences that may affect the test cases through a combined static and dynamic analysis of the app under test, and executes these test cases only under the necessary option combinations. The evaluation shows that PREFEST improves code coverage by 6.8% and branch coverage by 12.3%, and finds five more real bugs than testing with the original test cases. Compared to testing under pairwise combinations of options, the test cost is reduced by 99% in both the number of test cases and the testing time.
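To illustrate the cost reduction the abstract describes, the sketch below contrasts exhaustively combining all preference options with combining only the options relevant to a given test case. The preference names and the hard-coded relevance set are illustrative assumptions, not taken from the paper; PREFEST would derive the relevant set automatically via its static/dynamic analysis.

```python
from itertools import product

# Hypothetical preference options for an app under test
# (names are illustrative, not from the paper).
prefs = {
    "dark_mode": [True, False],
    "notifications": [True, False],
    "sync_interval": ["15m", "1h", "24h"],
    "language": ["en", "de", "fr"],
}

# Assume analysis found that one test case is affected only by
# these two preferences (hard-coded here for illustration).
relevant = ["dark_mode", "sync_interval"]

def option_combinations(pref_map, keys):
    """Enumerate all full combinations over the selected preferences."""
    values = [pref_map[k] for k in keys]
    return [dict(zip(keys, combo)) for combo in product(*values)]

exhaustive = option_combinations(prefs, list(prefs))  # 2*2*3*3 = 36 runs
reduced = option_combinations(prefs, relevant)        # 2*3 = 6 runs

print(len(exhaustive), len(reduced))  # 36 6
```

Even in this tiny example, restricting each test case to its relevant preferences cuts the number of executions sixfold; with more preferences and more test cases, the gap grows multiplicatively, which is the effect behind the 99% cost reduction reported above.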


Published in: ESEC/FSE 2019: Proceedings of the 2019 27th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering, August 2019, 1264 pages. ISBN: 9781450355728. DOI: 10.1145/3338906. Copyright © 2019 ACM.


Publisher: Association for Computing Machinery, New York, NY, United States


Overall Acceptance Rate: 112 of 543 submissions, 21%
