Computer-assisted cantillation and chant research using content-aware web visualization tools

Multimedia Tools and Applications

Abstract

Chant and cantillation research is particularly interesting as it explores the transition from oral to written transmission of music. The goal of this work is to create web-based computational tools that can assist the study of how diverse recitation traditions, having their origin in primarily non-notated melodies, later became codified. One of the authors is a musicologist and music theorist who has guided the system design and development by providing manual annotations and participating in the design process. We describe novel content-based visualization and analysis algorithms that can be used for problem-seeking exploration of audio recordings of chant and recitations.


Figs. 1–9


Notes

  1. Archived Examples from the Hungarian Academy of Sciences (1968–1973).

  2. Archived Examples from Hungary and Morocco from the Feher Music Center at the Bet Hatfatsut, Tel Aviv, Israel.

  3. Godehard Joppich and Singphoniker: Gregorian Chant from St. Gallen (Georgsmarienhütte: CPO 999267-2, 1994).

  4. Examples from Indonesia and Egypt: in Approaching the Koran (Ashland: White Cloud, 1999).

  5. One cent is 1/100 of a semitone, corresponding to a frequency difference of about 0.06%.

  6. Thinking statistically, our scale is related to a distribution giving the relative probability of each possible pitch. We can think of each F0 estimate (i.e., each sampled value of the F0 envelope) as a sample drawn from this unknown distribution, so our problem becomes one of estimating the unknown distribution given the samples.

  7. http://marsyas.sourceforge.net
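Notes 5 and 6 above can be made concrete with a small sketch. The following Python is illustrative only, not code from the paper: the middle-C reference pitch, the 10-cent bin width, and the histogram estimator are all assumptions. It converts F0 estimates in Hz to cents and then estimates the unknown pitch distribution as a normalized histogram over the sampled F0 envelope:

```python
import math

def hz_to_cents(f0_hz, ref_hz=261.63):
    # One cent is 1/100 of a semitone: a frequency ratio of 2**(1/1200),
    # i.e. about a 0.06% change in frequency. The middle-C reference is an
    # illustrative choice, not one taken from the paper.
    return [1200.0 * math.log2(f / ref_hz) for f in f0_hz]

def pitch_distribution(f0_envelope_hz, bin_cents=10):
    # Treat each sampled F0 value as a draw from an unknown pitch
    # distribution and estimate that distribution with a normalized
    # histogram: bin center (in cents) -> relative probability.
    cents = hz_to_cents(f0_envelope_hz)
    counts = {}
    for c in cents:
        b = bin_cents * round(c / bin_cents)  # snap to nearest bin center
        counts[b] = counts.get(b, 0) + 1
    total = len(cents)
    return {b: n / total for b, n in sorted(counts.items())}
```

For example, `pitch_distribution([261.63, 261.63, 523.26, 523.26])` puts half the probability mass at 0 cents and half at 1200 cents (one octave above the reference).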


Acknowledgements

We would like to thank Matt Wright for initial work on this project and Emiru Tsunoo for the Marsyas implementation of dynamic time warping and similarity matrix computation used in the paper. We would also like to thank the Natural Sciences and Engineering Research Council (NSERC) and Social Sciences and Humanities Research Council (SSHRC) of Canada for their financial support.
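The acknowledgements mention a Marsyas implementation of dynamic time warping (DTW) and similarity matrix computation. As a rough sketch of the technique only (this is not the Marsyas code, and the absolute-difference local cost is an assumption), classic DTW between two pitch contours can be written as:

```python
def similarity_matrix(a, b):
    # Pairwise local costs between two pitch contours; absolute pitch
    # difference is an illustrative choice of distance measure.
    return [[abs(x - y) for y in b] for x in a]

def dtw_distance(a, b):
    # Accumulate the minimal alignment cost over the similarity matrix
    # using the classic dynamic-programming recurrence.
    d = similarity_matrix(a, b)
    n, m = len(a), len(b)
    INF = float("inf")
    acc = [[INF] * (m + 1) for _ in range(n + 1)]
    acc[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            acc[i][j] = d[i - 1][j - 1] + min(acc[i - 1][j],      # insertion
                                              acc[i][j - 1],      # deletion
                                              acc[i - 1][j - 1])  # match
    return acc[n][m]
```

Identical contours score 0, and a contour aligned against a time-stretched copy of itself also scores 0, which is what makes DTW suited to comparing recitations performed at different tempos.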

Author information


Corresponding author

Correspondence to Steven R. Ness.


About this article

Cite this article

Ness, S.R., Biró, D.P. & Tzanetakis, G. Computer-assisted cantillation and chant research using content-aware web visualization tools. Multimed Tools Appl 48, 207–224 (2010). https://doi.org/10.1007/s11042-009-0357-x

