DOI: 10.1145/3526114.3558647
Demonstration

Music Scope Pad: Video Selecting Interface by Natural Movement in VR Space

Published: 28 October 2022

ABSTRACT

This paper describes a novel video selecting interface that lets users select videos without clicking a mouse or touching a screen. Existing video players let us see and hear only one video at a time, so picking the one we want from a large number of new videos, such as music videos, means playing each piece individually, which requires many mouse and screen-touch operations. The main advantage of our interface is that it detects natural movements, such as the head and hand movements people make while listening, so users can focus on a particular sound source they want to hear. By turning their head left or right, users hear that source from a frontal position, because the tablet detects the change in the direction they are facing. By putting a hand behind their ear, users can focus on a particular sound source.
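The abstract describes two sensing-driven interactions: turning the head so a chosen source is heard from a frontal position, and putting a hand behind the ear to focus on one source. As a rough illustration of how such a mixer could map head yaw and a focus gesture to per-source gains, here is a minimal Python sketch; the SoundSource class, the angle thresholds, and the gain values are assumptions made for this example, not the paper's actual implementation.

```python
from dataclasses import dataclass


@dataclass
class SoundSource:
    name: str
    azimuth_deg: float  # where the source sits around the listener; 0 = straight ahead
    gain: float = 1.0


def update_mix(sources, head_yaw_deg, hand_behind_ear=False,
               focus_width_deg=60.0, background_gain=0.2):
    """Attenuate sources that lie away from the direction the user is facing.

    A detected hand-behind-ear gesture narrows the focus window so only the
    source the user is facing stays prominent (hypothetical parameter values).
    """
    half_width = (focus_width_deg / 4) if hand_behind_ear else (focus_width_deg / 2)
    for src in sources:
        # Signed angle between the facing direction and the source, wrapped to [-180, 180).
        offset = (src.azimuth_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
        src.gain = 1.0 if abs(offset) <= half_width else background_gain
    return sources


# Example: three music videos laid out around the listener in the VR space.
videos = [SoundSource("video_A", -60.0),
          SoundSource("video_B", 0.0),
          SoundSource("video_C", 60.0)]
update_mix(videos, head_yaw_deg=-60.0, hand_behind_ear=True)
for v in videos:
    print(v.name, v.gain)  # video_A stays at full gain; the others drop to 0.2
```

In a real system the resulting gains would drive a spatial-audio engine rendering each video's soundtrack at its position in the VR space; the gating logic above only captures the basic idea that facing a source brings it to the foreground.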


Supplemental Material

• Uist2022.mp4 (mp4, 169 MB)
• Uist2022_30Sec.mp4 (mp4, 36.3 MB)



    • Published in

      UIST '22 Adjunct: Adjunct Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology
      October 2022, 413 pages
      ISBN: 9781450393218
      DOI: 10.1145/3526114

      Copyright © 2022 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 28 October 2022


      Qualifiers

      • demonstration
      • Research
      • Refereed limited

      Acceptance Rates

      Overall acceptance rate: 842 of 3,967 submissions, 21%
