
Reporting computing projects through structured abstracts: a quasi-experiment

Published in: Empirical Software Engineering

Abstract

Previous work has demonstrated that the use of structured abstracts can lead to greater completeness and clarity of information, making it easier for researchers to extract information about a study. In academic year 2007/08, Durham University’s Computer Science Department revised the format of the project report that final year students were required to write, from a ‘traditional dissertation’ format, using a conventional abstract, to that of a 20-page technical paper, together with a structured abstract. This study set out to determine whether inexperienced authors (students writing their final project reports for computing topics) find it easier to produce good abstracts, in terms of completeness and clarity, when using a structured form rather than a conventional form. We performed a controlled quasi-experiment in which a set of ‘judges’ each assessed one conventional and one structured abstract for its completeness and clarity. These abstracts were drawn from those produced by four cohorts of final year students: two preceding the change and two following it. The assessments were performed using a form of checklist similar to those used in previous experimental studies. We used 40 abstracts (10 per cohort) and 20 student ‘judges’ to perform the evaluation. Scored on a scale of 0.1–1.0, the mean for completeness increased from 0.37 to 0.61 when using a structured form. For clarity, using a scale of 1–10, the mean score increased from 5.1 to 7.2. For a minimum goal of scoring 50% for both completeness and clarity, only 3 of 19 conventional abstracts achieved this level, while only 3 of 20 structured abstracts failed to reach it. We conclude that the use of a structured form for organising the material of an abstract can assist inexperienced authors in writing technical abstracts that are clearer and more complete than those produced without the framework provided by such a mechanism.


Notes

  1. The period of university study in England is normally three years.

  2. For non-UK readers, British universities usually use three classes for degrees, with the second class being split into upper and lower seconds.


Acknowledgements

This work was performed as part of the EPIC (Evidence-based Practices Informing Computing) project, funded by the UK’s Engineering & Physical Sciences Research Council (EPSRC). We would like to thank the students who took part in the study as judges, the anonymous referees for their helpful comments and suggestions, and Professor Jim Hartley of Keele University for his advice about structured abstracts.

Author information


Corresponding author

Correspondence to David Budgen.

Additional information

Editor: Claes Wohlin

Appendices

Appendix A: Task Description

The purpose of our study is to investigate how information about student projects in software engineering and computer science can be extracted from the abstracts provided with the final dissertations. You are asked to act as a judge for the abstracts taken from two sample dissertations (allocated randomly) and for each one to complete a copy of the evaluation form supplied.

To perform the tasks, we ask that you view the two abstracts in the order specified, and that you view each of them on a computer screen, preferably in a browser based on a Mozilla engine, such as Firefox, since the layout has been optimised for this. You should complete one form for each abstract—please ensure that you complete Form 1 first and then Form 2, as the ordering is important. You may take as long as necessary to perform the task, but we would not expect it to take longer than about ten minutes.

Please also complete the third (short) form, which will help us classify your input.

David Budgen and Andy Burn.

Appendix B: Abstract Evaluation Form

Registration Code allocated to you:

Number/Title of abstract:

For each of the following questions about the abstract, you should provide one of the following responses: Yes, No, Unsure, or N/A (Not Applicable), by drawing a ring around your chosen response.

Please give an assessment of the clarity of this abstract by circling a number on the scale of 1–10 below, where a value of 1 represents Very Obscure and 10 represents Extremely Clearly Written.

1  2  3  4  5  6  7  8  9  10

Appendix C: Additional Questions

To assist us with analysing your responses, please provide us with some additional information about yourself and your previous experience. Again, please ring the relevant words where appropriate.

Is English your first language: Yes / No

(This is so that we can check whether structured abstracts are more readable for non-native English speakers.)

  1. Did you have any knowledge about structured abstracts before taking part in this study? Yes / No

     If your answer was “Yes”, then please indicate the nature of your knowledge:

     a. Heard about them, but not seen them before: Yes / No

     b. Read papers with structured abstracts: Yes / No

     c. Created structured abstracts yourself: Yes / No

     d. Other (please specify):

  2. Please describe up to three things that you like about conventional (non-structured) abstracts.

  3. Please describe up to three things that you like about structured abstracts.

  4. Overall, do you prefer to read:

     a. Structured abstracts

     b. Conventional (non-structured) abstracts

     c. No preference

  5. Overall, would you prefer to write:

     a. Structured abstracts

     b. Conventional (non-structured) abstracts

     c. No preference


About this article

Cite this article

Budgen, D., Burn, A.J. & Kitchenham, B. Reporting computing projects through structured abstracts: a quasi-experiment. Empir Software Eng 16, 244–277 (2011). https://doi.org/10.1007/s10664-010-9139-3
