DOI: 10.1145/1985793.1986006 · ICSE Conference Proceedings · Extended abstract

Mental models and parallel program maintenance

Published: 21 May 2011

ABSTRACT

Parallel programs are difficult to write, test, and debug. This thesis explores how programmers build mental models about parallel programs, and demonstrates, through user evaluations, that maintenance activities can be improved by incorporating theories based on such models. By doing so, this work aims to increase the reliability and performance of today's information technology infrastructure by improving the practice of maintaining and testing parallel software.


Published in

ICSE '11: Proceedings of the 33rd International Conference on Software Engineering
May 2011, 1258 pages
ISBN: 9781450304450
DOI: 10.1145/1985793

Copyright © 2011 is held by the owner/author(s).

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 276 of 1,856 submissions, 15%

