ABSTRACT
Thrii is a multimodal interactive installation that explores levels of movement similarity among its participants. Each of the three participants manipulates a large spherical object whose movement is tracked via an embedded accelerometer. An analysis engine computes the similarity of movement for each possible pair of objects, as well as the self-similarity (i.e., repetition of movement over time) of each object. The degree of similarity is communicated through a visualization projected on a three-sided pyramid, a non-directional audio environment, and lighting produced by the spherical objects themselves. The installation is intended to examine notions of collaboration among its participants; we have found that participants engage with Thrii by exploring collaborative gestures.
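The abstract does not specify the similarity measure used by the analysis engine; as a minimal sketch of the idea, the following assumes cosine similarity over windows of accelerometer samples. All function names (`window_similarity`, `pairwise_similarities`, `self_similarity`) and the choice of measure are hypothetical illustrations, not the authors' implementation.

```python
import math

def window_similarity(a, b):
    """Cosine similarity between two equal-length windows of
    accelerometer samples; 1.0 means proportionally identical motion."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    if na == 0 or nb == 0:
        return 0.0
    return dot / (na * nb)

def pairwise_similarities(streams):
    """Similarity for each possible pair of objects
    (three objects yield three pairs)."""
    keys = sorted(streams)
    return {
        (keys[i], keys[j]): window_similarity(streams[keys[i]], streams[keys[j]])
        for i in range(len(keys))
        for j in range(i + 1, len(keys))
    }

def self_similarity(stream, lag):
    """Compare the most recent window with the window `lag` samples
    earlier, as a crude measure of movement repetition over time."""
    n = len(stream) - lag
    return window_similarity(stream[lag:lag + n], stream[:n])
```

With three streams, `pairwise_similarities` returns one score per pair, which could then drive the projected visualization, audio, and sphere lighting.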