
Institutional research rankings via bibliometric analysis and direct peer review: A comparative case study with policy implications


Abstract

Recent years have seen enormously increased interest in the comparative evaluation of research quality in the UK, with considerable resources devoted to ranking the output of academic institutions relative to one another at the sub-discipline level, and the disposition of even greater resources dependent on the outcome of this process. The preferred methodology has been that of traditional peer review, with expert groups of academics tasked to assess the relative worth of all research activity in ‘their’ field. Extension to institutional evaluation of a recently refined technique of journal ranking (Discipline Contribution Scoring) holds out the possibility of ‘automatic’ evaluation within a time-frame considerably less than would be required using methods based directly on citation counts within the corpus of academic work under review. This paper tests the feasibility of the technique in the sub-field of Business and Management Studies research, producing rankings which are highly correlated with those generated by the much more complex and expensive direct peer review approach. More generally, the analysis also gives a rare opportunity to compare directly the equivalence of peer review and bibliometric analysis over a whole sub-field of academic activity in a non-experimental setting.
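The comparison described above can be illustrated with a small, hypothetical sketch (not taken from the paper): given journal-level Discipline Contribution Scores and each institution's publication counts per journal, an institutional score might be formed as a weighted sum and then compared with peer-review (RAE-style) grades using a Spearman rank correlation. All names, numbers, and the aggregation rule below are illustrative assumptions, not the authors' actual data or weighting scheme.

```python
# Hypothetical sketch of the kind of ranking comparison described in the abstract.
# The DCS values, publication counts, and peer-review grades are invented.
from scipy.stats import spearmanr

# Assumed journal-level Discipline Contribution Scores (illustrative).
dcs = {"Journal A": 0.82, "Journal B": 0.55, "Journal C": 0.31}

# Assumed publication counts per institution and journal (illustrative).
output = {
    "Institution 1": {"Journal A": 12, "Journal B": 3, "Journal C": 5},
    "Institution 2": {"Journal A": 4, "Journal B": 9, "Journal C": 2},
    "Institution 3": {"Journal A": 1, "Journal B": 2, "Journal C": 10},
}

# One plausible aggregation: DCS-weighted output per institution.
bibliometric_score = {
    inst: sum(dcs[j] * n for j, n in pubs.items())
    for inst, pubs in output.items()
}

# Hypothetical peer-review grades (RAE-style ratings) for the same units.
peer_review_grade = {"Institution 1": 5, "Institution 2": 4, "Institution 3": 2}

# Rank-correlate the two rankings over a common ordering of institutions.
institutions = sorted(bibliometric_score)
rho, p_value = spearmanr(
    [bibliometric_score[i] for i in institutions],
    [peer_review_grade[i] for i in institutions],
)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```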


References

  1. HEFCE, A Report for the UFC on the Conduct of the 1992 Research Assessment Exercise, Higher Education Funding Council for England: Bristol, June 1993.

  2. HEFCE, Guidelines for the Conduct of the 1995 Research Assessment Exercise, Higher Education Funding Council for England: Bristol, 1994.

  3. P. R. Thomas, Size effects in the assessment of discipline-contribution scores: an example from the social sciences, Scientometrics, 33(2) (1995) 203–220.

  4. D. S. Watkins, Changes in the nature of UK small business research, 1980–1990. Part One: Changes in producer characteristics, Small Business and Enterprise Development, 1(3) (1994) 28–31.

  5. D. S. Watkins, Changes in the nature of UK small business research, 1980–1990. Part Two: Changes in the nature of the output, Small Business and Enterprise Development, 2(1) (1995) 59–66.

  6. J. Taylor, Measuring research performance in Business and Management Studies in the United Kingdom: The 1992 research assessment exercise, British Journal of Management, 5(4) (1994) 275–288.

  7. C. Oppenheim, The correlation between citation counts and the 1992 research assessment exercise ratings for British library and information science university departments, Journal of Documentation, 51(1) (1995) 18–27.

  8. A. M. Colman, D. Dhillon, B. Coulthard, A bibliometric evaluation of the research performance of British university politics departments: publications in leading journals, Scientometrics, 32(1) (1995) 49–66.

  9. A. F. J. Van Raan, Advanced bibliometric methods as quantitative core of peer review based evaluation and foresight studies, Scientometrics, 36(3) (1996) 397–420.

  10. H. Peters, A. Van Raan, On determinants of citation scores: a case study in Chemical Engineering, Journal of the American Society for Information Science, 45(1) (1994) 39–49.

  11. P. R. Thomas, op. cit. 3.

  12. G. N. Gilbert, Referencing as persuasion, Social Studies of Science, 7 (1977) 113–122.

  13. H. Peters, A. Van Raan, op. cit. 10.

  14. J. G. Shaw, Article-by-article citation analysis of medical journals, Scientometrics, 12 (1987) 101–110.

  15. S. E. Wiberley, Journal rankings from citation studies: a comparison of national and local data from social work, Library Quarterly, 52(4) (1982) 348–359.

  16. P. Doreian, Measuring the relative standing of disciplinary journals, Information Processing and Management, 24(1) (1988) 45–56.

  17. P. R. Thomas, op. cit. 3.

  18. H. F. Moed, W. Burger, J. Frankfort, A. Van Raan, The use of bibliometric data for the measurement of university research performance, Research Policy, 14 (1985) 131–149.

  19. E. J. Rinia, C. de Lange, H. Moed, Measuring national output in Physics: Delimitation problems, Scientometrics, 28(1) (1993) 89–110.

  20. A. Bekavec, J. Petrak, Z. Buneta, Citation behavior and place of publication in the authors from the scientific periphery: a matter of quality? Information Processing and Management, 30(1) (1994) 33–42.

  21. P. Doreian, A measure of standing for citation networks within a wider environment, Information Processing and Management, 30(1) (1994) 21–31.

  22. R. Coe, I. Weinstock, Evaluating the management journals: a second look, Academy of Management Journal, 27 (1984) 660–666.

  23. L. R. Gomez-Mejia, D. B. Balkin, Determinants of faculty pay: an agency theory perspective, Academy of Management Journal, 35 (1992) 921–955.

  24. A. M. Colman et al., op. cit. 8.

  25. M. M. Extejt, J. E. Smith, The behavioral sciences and management: an evaluation of relevant journals, Journal of Management, 16 (1990) 539–551.

  26. R. T. Gillett, Serious anomalies in the UGC comparative evaluation of the research performance of psychology departments, Bulletin of the British Psychological Society, 40 (1987) 42–49.

  27. G. Johnes, Research performance indicators in the university sector, Higher Education Quarterly, 42 (1988) 54–71.

  28. J. B. Bavelas, The social psychology of citation, Canadian Psychological Review, 19(2) (1978) 158–163.

  29. J. L. Johnson, P. M. Podsakoff, Journal influence in the field of management: an analysis using Salancik's Index in a dependency network, Academy of Management Journal, 37(5) (1994) 1392–1407.

  30. K. E. Clark, America's Psychologists: A Survey of a Growing Profession, Washington: American Psychological Association, 1957.

  31. F. Narin, Evaluative Bibliometrics, Cherry Hill, New Jersey: Computer Horizons, 1976.

  32. L. R. Gomez-Mejia et al., op. cit. 23.

  33. A. Schubert, T. Braun, Reference standards for citation based assessments, Scientometrics, 26 (1993) 21–35.

  34. A. M. Colman et al., op. cit. 8.

  35. A. J. Nederhof, A. F. J. Van Raan, A bibliometric analysis of six economics research groups: A comparison with peer review, Research Policy, 22 (1993) 353–368.

  36. A. F. J. Van Raan, op. cit. 9.

  37. A. Sandison, The use of older literature and its obsolescence, Journal of Documentation, 17 (1971) 184–189.

  38. E. Garfield, Citations-to divided by items-published gives the impact factor, Current Contents, 15 (1972) 6–7.

  39. L. M. Raisig, Mathematical evaluation of the scientific serial, Science, 131 (1960) 1417.

  40. P. Doreian, op. cit. 16.

  41. C. He, M. L. Pao, A discipline-specific journal selection algorithm, Information Processing and Management, 22 (1986) 405–416.

  42. G. Hirst, Discipline impact factor: a method for determining core journal lists, Journal of the American Society for Information Science, 29 (1978) 171–172.

  43. P. Pichappan, Identification of mainstream journals of science specialty: a method using the discipline-contribution score, Scientometrics, 27 (1993) 179–193.

  44. P. R. Thomas, op. cit. 3.

  45. P. R. Thomas, op. cit. 3, Appendix A.

  46. J. Taylor, op. cit. 6.

  47. J. L. Johnson, P. M. Podsakoff, op. cit. 29.

  48. A. M. Colman et al., op. cit. 8.

  49. C. Oppenheim, op. cit. 7.

  50. A. J. Nederhof, A. F. J. Van Raan, op. cit. 35.

  51. A. J. Nederhof, A. F. J. Van Raan, op. cit. 35, p. 413.

  52. HEFCE, op. cit. 1.

  53. ABRC, Peer Review: A Report of the Advisory Board for the Research Councils from the Working Party on Peer Review (‘Boden Report’), London: ABRC, 1990.

  54. J. S. Armstrong, Peer review for journals: Evidence on quality control, fairness, and innovation, Science and Engineering Ethics, 3(1) (1997) 63–84.

  55. J. Taylor, op. cit. 6.

  56. ISI, Journal Citation Reports for 1993, Philadelphia: Institute for Scientific Information.

  57. G. S. Howard, D. A. Cole, S. E. Maxwell, Research productivity in psychology based on publication in the journals of the American Psychological Association, American Psychologist, 42 (1987) 975–986.

  58. C. Oppenheim, op. cit. 7.

  59. L. B. Seng, P. Willett, The citedness of publications by United Kingdom schools and departments of library and information studies, Journal of Information Science, 21(1) (1995).

  60. D. S. Watkins, op. cit. 4.

  61. D. S. Watkins, op. cit. 5.

  62. B. Fender, Speech to the Association of Business Schools Conference on RAE96, Harrogate, February 1997.

  63. DfEE, Report of the National Committee of Inquiry into Higher Education: Higher Education in the Learning Society (‘Dearing Report’), London: Department for Education and Employment, July 1997.


Author information

Corresponding author

Correspondence to D. S. Watkins.


About this article

Cite this article

Thomas, P.R., Watkins, D.S. Institutional research rankings via bibliometric analysis and direct peer review: A comparative case study with policy implications. Scientometrics 41, 335–355 (1998). https://doi.org/10.1007/BF02459050
