DOI: 10.1145/3384419.3430467 · SenSys Conference Proceedings · Short paper

IWannaPlay - an eye-tracking based AI tutoring Chinese chess system: poster abstract

Published: 16 November 2020

Abstract

In this poster, we present IWannaPlay, an eye-tracking based Artificial Intelligence (AI) tutoring system for Chinese chess. Through an ordinary webcam, IWannaPlay captures the player's sight and facial expressions to acquire their gaze points and emotional status. Combining this information with the game situation, the system intelligently provides AI suggestions with visualizations. Guided by the AI, the player can strategize more thoroughly and thus make better decisions, achieving a better training outcome. When the user plays Chinese chess on IWannaPlay, the interface displays a real-time game analysis and warns against potentially endangered pieces. Upon detecting the user's anxiety, or after a period of long contemplation, the system provides graphic visualizations of dynamic multi-step strategies to assist the user in making decisions. We conducted experiments to verify the usability of our system; the user feedback demonstrated that IWannaPlay provides effective guidance to players.
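The abstract does not give implementation details for when the strategy visualization is shown. As a rough illustration only, the trigger it describes (detected anxiety or prolonged contemplation) might be sketched as follows; the threshold values, names, and data structure are all hypothetical, not taken from the paper:

```python
from dataclasses import dataclass

# Hypothetical cutoffs, not from the paper.
ANXIETY_THRESHOLD = 0.6       # anxiety confidence in [0, 1], e.g. from an emotion-recognition API
CONTEMPLATION_LIMIT_S = 30.0  # seconds of thinking treated as "long contemplation"

@dataclass
class PlayerState:
    anxiety_score: float            # estimated anxiety confidence for the current frame
    seconds_since_last_move: float  # how long the player has been contemplating

def should_show_strategy_hint(state: PlayerState) -> bool:
    """Show the multi-step strategy visualization when the player appears
    anxious or has been contemplating past the time limit."""
    return (state.anxiety_score >= ANXIETY_THRESHOLD
            or state.seconds_since_last_move >= CONTEMPLATION_LIMIT_S)
```

In such a design, either condition alone suffices to surface a hint, so a calm but stuck player still receives guidance once the contemplation window elapses.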


Cited By

  • (2023) Exploring Intuitive Visuo-Tactile Interaction Design for Culture Education: A Chinese-Chess-Based Case Study. International Journal of Human–Computer Interaction, pp. 1–21. DOI: 10.1080/10447318.2023.2223863. Online publication date: 25 June 2023.


Published In

SenSys '20: Proceedings of the 18th Conference on Embedded Networked Sensor Systems
November 2020
852 pages
ISBN:9781450375900
DOI:10.1145/3384419

Publisher

Association for Computing Machinery

New York, NY, United States




Funding Sources

  • Japan Science and Technology Agency (JST)

Acceptance Rates

Overall Acceptance Rate 198 of 990 submissions, 20%

