DOI: 10.1145/3278122.3278135
research-article

Orchestrating dynamic analyses of distributed processes for full-stack JavaScript programs

Published: 05 November 2018

ABSTRACT

Dynamic analyses are commonly implemented by instrumenting the program under analysis. Examples of such analyses for JavaScript range from checkers of user-defined invariants to concolic testers. For a full-stack JavaScript program, these analyses would benefit from reasoning about the state of the client-side and server-side processes of which it is composed. Lifting a dynamic analysis so that it supports full-stack programs can be challenging: it involves distributed communication to maintain the analysis state across all processes, and that communication has to be deadlock-free. In this paper, we advocate maintaining distributed analysis state in a centralized analysis process instead, with which the processes under analysis communicate. The approach is supported by a dynamic analysis platform that provides abstractions for this communication. We evaluate the approach through a case study: we use the platform to build a distributed origin analysis, capable of tracking the expressions from which values originate across process boundaries, and deploy it on a collaborative drawing application. The results show that our approach greatly simplifies the lifting process at the cost of a computational overhead, which we deem acceptable for analyses intended for use at development time.
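To make the centralized architecture concrete, below is a minimal Node.js sketch of the general idea rather than the platform described in the paper: a single analysis process owns the shared origin map, and instrumented client- and server-side processes report events to it over a socket instead of keeping analysis state themselves. Every name in the sketch (connectAnalysis, reportOrigin, the port, the event format) is hypothetical.

```js
// Minimal sketch only, not the platform's actual API: a centralized analysis
// process keeps the distributed analysis state, and instrumented processes
// (browser via a bridge, or Node.js server) report events to it.
const net = require('net');

// --- Centralized analysis process -----------------------------------------
// One shared map from value identifiers to the expression (and process) each
// value originated from; no analysis state lives in the processes under
// analysis themselves.
const origins = new Map();

const server = net.createServer((socket) => {
  let buffered = '';
  socket.on('data', (chunk) => {
    buffered += chunk;
    let newline;
    // Events arrive as newline-delimited JSON; handle each complete line.
    while ((newline = buffered.indexOf('\n')) >= 0) {
      const event = JSON.parse(buffered.slice(0, newline));
      buffered = buffered.slice(newline + 1);
      if (event.kind === 'origin') {
        // Record where the value was created, tagged with the reporting process.
        origins.set(event.valueId, { expr: event.expr, process: event.process });
      } else if (event.kind === 'query') {
        // Answer origin queries, possibly about a value created in another process.
        socket.write(JSON.stringify(origins.get(event.valueId) || null) + '\n');
      }
    }
  });
});

// --- Shim linked into an instrumented process ------------------------------
// The instrumentation inserted into the program under analysis would call
// reportOrigin at every expression of interest.
function connectAnalysis(processName) {
  const socket = net.connect(4000, '127.0.0.1');
  return {
    reportOrigin(valueId, expr) {
      socket.write(JSON.stringify({ kind: 'origin', process: processName, valueId, expr }) + '\n');
    },
  };
}

server.listen(4000, () => {
  // Demo (normally the shim runs in a separate client or server process):
  // report that value #42 originated from the expression `req.body.color`.
  const analysis = connectAnalysis('server');
  analysis.reportOrigin(42, 'req.body.color');
});
```

Centralizing the state this way sidesteps synchronizing analysis state between the client and server processes, and the deadlock concerns that come with it; the price is a message to the analysis process for every reported event, which is consistent with the computational overhead the paper deems acceptable at development time.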


      Published in

      GPCE 2018: Proceedings of the 17th ACM SIGPLAN International Conference on Generative Programming: Concepts and Experiences
      November 2018, 214 pages
      ISBN: 9781450360456
      DOI: 10.1145/3278122

      Also published in ACM SIGPLAN Notices, Volume 53, Issue 9 (GPCE '18)
      September 2018, 214 pages
      ISSN: 0362-1340
      EISSN: 1558-1160
      DOI: 10.1145/3393934
      Copyright © 2018 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

      Publisher

      Association for Computing Machinery, New York, NY, United States

      Publication History

      • Published: 5 November 2018

      Qualifiers

      • research-article

      Acceptance Rates

      Overall Acceptance Rate: 56 of 180 submissions, 31%
