DOI: 10.1145/3325291.3325373
research-article

Development of Learning Support Equipment for Sign Language and Fingerspelling by Mixed Reality

Published: 29 May 2019

ABSTRACT

The purpose of this study is to increase the efficiency of learning sign language and fingerspelling, which are visual languages.

In general, illustrations and videos are used as learning materials for these languages. However, such materials are drawn from the viewpoint of the conversation partner, so learners must imagine how the hand shapes look from the other person's side as they study. The motions are easier to understand when they can be observed in three dimensions, so taking lessons from a teacher who has already mastered the language would be the best approach. However, that method is not suited to self-study.

We therefore devised a new method that increases learning efficiency by increasing learning opportunities: equipment that lets learners study alone. We propose learning support equipment that uses a glasses-shaped wearable device and mixed reality. With mixed reality, learners can watch sign language and fingerspelling motions in three dimensions, as if they were seeing these languages in the real world.

In previous studies, the equipment was developed using augmented reality, with an AR marker used for operation. With that equipment, learners controlled the movement and rotation of a 3D model.

In this study, we develop equipment that uses mixed reality and is operated by selecting holographic buttons with gestures. Learners can therefore operate the equipment without any other device, which makes studying more intuitive. Because the display is a glasses-shaped wearable device, learners can also study sign language and fingerspelling by mimicking the gestures with both hands.
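The hands-free interaction described above can be sketched as a small model: the learner gazes at a holographic button and performs an air-tap gesture to select it, so no handheld controller is involved. This is an illustrative sketch only; the class and method names are our assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional

@dataclass
class HolographicButton:
    label: str
    on_select: Callable[[], None]  # action triggered when the button is tapped

@dataclass
class GestureInput:
    """Resolves an air-tap gesture against whichever button is under the gaze."""
    buttons: Dict[str, HolographicButton] = field(default_factory=dict)
    gazed: Optional[str] = None

    def add_button(self, button: HolographicButton) -> None:
        self.buttons[button.label] = button

    def update_gaze(self, label: Optional[str]) -> None:
        # On the real device the gaze ray is cast from the headset each frame;
        # here we simply record which button that ray would currently hit.
        self.gazed = label if label in self.buttons else None

    def air_tap(self) -> bool:
        # The tap fires only the button under the learner's gaze.
        if self.gazed is None:
            return False
        self.buttons[self.gazed].on_select()
        return True
```

A "play animation" button, for example, would register a callback that starts the selected sign's 3D animation; gazing at it and air-tapping invokes that callback.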

Because we have built sign language learning support contents, learners can study sign language and fingerspelling with them. However, the present equipment lacks sufficient 3D animations of sign language. We therefore plan to create more 3D animations by capturing the motions that hearing-impaired people use.
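One routine step in turning captured motion into animation content is resampling: an inertial capture system (such as the PERCEPTION NEURON suit cited below) typically records far more frames per second than the animation needs. The sketch below shows that step under assumed frame rates; the function name and rates are illustrative, not taken from the paper.

```python
def downsample_keyframes(frames, capture_fps=120, target_fps=30):
    """Keep every Nth captured frame so the 3D animation plays at target_fps.

    `frames` is a sequence of per-frame poses (e.g. joint rotations) in
    capture order; the rates are assumptions for illustration.
    """
    if target_fps <= 0 or capture_fps < target_fps:
        raise ValueError("target_fps must be positive and at most capture_fps")
    step = capture_fps // target_fps
    return frames[::step]
```

For instance, one second of capture at 120 fps reduces to 30 animation keyframes, which keeps the sign's motion smooth while shrinking the data attached to each 3D model.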

We conducted a trial with a hearing-impaired person and a sign language interpreter in order to obtain real impressions. They reported that some 3D animations were inaccurate and hard to understand, and that the orientation of the 3D model was unclear after rotation.

Based on these opinions, we made the following corrections: we fixed the 3D animations, added a function that displays the track of the finger, and added a holographic reset button that restores the 3D model's orientation.
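The two added functions can be sketched as state on the displayed model: fingertip positions are recorded each animation frame and rendered as a trail, and the reset button returns the learner-applied rotation to the front-facing default. This is an assumed design for illustration, not the authors' code; all names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]  # (x, y, z) position in model space

@dataclass
class SignModel:
    rotation_deg: float = 0.0                      # yaw applied by the learner
    trail: List[Vec3] = field(default_factory=list)  # fingertip trajectory

    def rotate(self, delta_deg: float) -> None:
        # Learner-controlled rotation, kept in [0, 360).
        self.rotation_deg = (self.rotation_deg + delta_deg) % 360.0

    def record_fingertip(self, point: Vec3) -> None:
        # Each animation frame appends the fingertip position; drawing the
        # stored points as a line displays the track of the finger.
        self.trail.append(point)

    def reset(self) -> None:
        # Bound to the holographic reset button: the model returns to its
        # front-facing default, so its direction is clear again.
        self.rotation_deg = 0.0
```

Binding `reset` to a holographic button addresses the trial feedback directly: after any sequence of rotations, one selection restores a known orientation.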

References

  1. Yoshinori Fujisawa, Shoichi Ito, Kenya Kobayashi: Development of Learning Support System for Fingerspelling by Augmented Reality. Proceedings of the 5th International Conference on Intelligent Systems and Image Processing 2017, pp. 492-495, 2017.
  2. Japan Audiological Society: https://audiology-japan.jp/audiology-japan/wp-content/uploads/2014/12/a1360e77a580a13ce7e259a406858656.pdf
  3. Ministry of Health, Labour and Welfare: https://www.mhlw.go.jp/file/06-Seisakujouhou-12200000-Shakaiengokyokushougaihokenfukushibu/0000172197.pdf
  4. Ministry of Health, Labour and Welfare: https://www.mhlw.go.jp/toukei/saikin/hw/shintai/06/dl/01.pdf
  5. Microsoft HoloLens: https://www.microsoft.com/ja-jp/hololens
  6. Blender: https://www.blender.org/
  7. PERCEPTION NEURON: https://neuronmocap.com/ja

Published in:

ACIT '19: Proceedings of the 7th ACIS International Conference on Applied Computing and Information Technology
May 2019, 248 pages
ISBN: 9781450371735
DOI: 10.1145/3325291

      Copyright © 2019 ACM


Publisher: Association for Computing Machinery, New York, NY, United States
