DOI: 10.1145/3167132.3167436
Poster

Experience report: studying the readability of a domain specific language

Published: 09 April 2018

ABSTRACT

Domain-specific languages (DSLs) are commonly expected to improve communication with domain experts compared to general-purpose programming languages (GPLs). However, there is a large gap in the literature concerning how evidence for this expected improvement can be provided, a problem known not only for DSLs but for GPLs in general. This paper presents an experience report on applying an iterative process for evaluating the readability of a given DSL in the context of safety-critical software in robotics. The goal of this process is to conduct a randomized controlled trial that provides evidence that the DSL is more readable than a comparable GPL. In this experience report, we describe the common pitfalls we identified and possible solutions for overcoming these problems in the future.
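
The randomized controlled trial mentioned in the abstract typically implies a between-subjects design in which each participant works with only one of the two languages and a comprehension measure is compared across groups. As a minimal illustrative sketch (not taken from the paper), the Python snippet below shows how such group scores might be compared; the invented scores, group sizes, and the choice of a Mann-Whitney U test are assumptions for illustration only.

    # Hypothetical sketch: analysing a between-subjects readability trial
    # (DSL group vs. GPL group). All scores below are invented for
    # illustration; the paper's actual design and analysis may differ.
    from scipy import stats

    # Comprehension scores, e.g. correct answers per participant.
    dsl_scores = [8, 9, 7, 9, 8, 10, 7, 9]  # participants who read the DSL programs
    gpl_scores = [6, 7, 5, 8, 6, 7, 6, 7]   # participants who read the GPL programs

    # A non-parametric test avoids assuming normally distributed scores,
    # which is often prudent with the small samples typical of human studies.
    statistic, p_value = stats.mannwhitneyu(dsl_scores, gpl_scores, alternative="greater")

    print(f"Mann-Whitney U = {statistic}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("DSL comprehension scores are significantly higher at alpha = 0.05.")
    else:
        print("No significant difference detected at alpha = 0.05.")

In practice, the choice of test and outcome measure (correctness, reading time, error rate) would depend on the study design the paper describes.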


Published in:
SAC '18: Proceedings of the 33rd Annual ACM Symposium on Applied Computing, April 2018, 2327 pages
ISBN: 9781450351911
DOI: 10.1145/3167132
Copyright © 2018 Owner/Author

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher: Association for Computing Machinery, New York, NY, United States

Overall acceptance rate: 1,650 of 6,669 submissions, 25%
