Abstract:
Object manipulation is a foundational behavior that emerges in infancy and improves with age. Research on children's interactions with objects is a cornerstone for understanding cognitive and motor development. Traditionally, developmental researchers rely on human video annotation to analyze object interactions by categorizing behavioral events (e.g., “banging,” “constructing”). However, human video annotation cannot provide precise, quantitative details from moment to moment about the location and orientation of objects and the movements of each hand and finger during reach, grasp, and manipulation. To overcome the challenges in acquiring real-time, continuous, quantitative data, researchers turned to high-speed motion tracking and inertial measurement units—which require children to wear markers—and to instrumenting the objects. However, “wearables” present a new set of complications as they disrupt the natural spontaneity of children's movements and sensors may fail to accurately track changes in object position and orientation. Critically, only video data captures the subtleties and complexities of manual behavior and the surrounding context. Consequently, we devised a novel, video-based, marker- and sensor-free approach that enables real-time quantification of children's coordination patterns during object interaction. We demonstrate the power of our approach in a tower-building study with children (2 to 8 years) and adults. This approach marks a paradigm shift in testing the evolving dynamics of object manipulation over development.
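To make the idea of a marker- and sensor-free, video-based pipeline concrete, the sketch below shows one way to extract continuous, quantitative hand data from ordinary video. It uses MediaPipe Hands, an off-the-shelf hand-landmark estimator; this library, the function name extract_hand_landmarks, and the file names are assumptions for illustration only and are not necessarily the toolchain used in the paper.

```python
# Illustrative sketch only: marker-free hand tracking from ordinary video
# using MediaPipe Hands (an assumed, off-the-shelf library; not necessarily
# the authors' pipeline). File names below are placeholders.
import csv
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def extract_hand_landmarks(video_path, out_csv):
    """Write per-frame, per-hand landmark coordinates to a CSV file."""
    cap = cv2.VideoCapture(video_path)
    with mp_hands.Hands(static_image_mode=False,
                        max_num_hands=2,
                        min_detection_confidence=0.5,
                        min_tracking_confidence=0.5) as hands, \
         open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["frame", "hand", "landmark", "x", "y", "z"])
        frame_idx = 0
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for hand, handedness in zip(results.multi_hand_landmarks,
                                            results.multi_handedness):
                    label = handedness.classification[0].label  # "Left" / "Right"
                    for lm_id, lm in enumerate(hand.landmark):
                        # x, y are normalized image coordinates; z is relative depth.
                        writer.writerow([frame_idx, label, lm_id, lm.x, lm.y, lm.z])
            frame_idx += 1
    cap.release()

# Example (placeholder file names):
# extract_hand_landmarks("tower_building_trial.mp4", "landmarks.csv")
```

From landmark time series of this kind one can compute, for example, grip aperture or inter-hand distance frame by frame, which is the sort of real-time, continuous coordination measure the abstract contrasts with categorical human annotation.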
Date of Conference: 20-23 May 2024
Date Added to IEEE Xplore: 27 August 2024