handInHand()

handInHand() explores new forms of human-machine interaction through embodied gestures. Hand movements are detected in real time and sent to the Gemini API, which responds by generating a virtual hand that reacts to the user's gesture. This interaction creates an evolving, choreographed exchange in which human gestures and machine responses harmonize. By transforming gestures into a shared creative language, the project examines intuitive, expressive interactivity and the potential for machines to act as co-creators in creative processes.
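The detection side of this loop can be sketched with the project's stack. Below is a minimal, hypothetical p5.js sketch assuming ml5.js's handPose model, which tracks 21 keypoints per detected hand; it illustrates the setup rather than reproducing the project's actual code.

```javascript
// Minimal sketch of the gesture-detection side: ml5.js handPose
// (21 keypoints per detected hand) drawn over a webcam feed in p5.js.
// Illustrative reconstruction, not the project's actual code.
let handPose;
let video;
let hands = [];

function preload() {
  // Load the handPose model before setup() runs.
  handPose = ml5.handPose();
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(640, 480);
  video.hide();
  // Continuously detect hands in the webcam feed.
  handPose.detectStart(video, (results) => { hands = results; });
}

function draw() {
  image(video, 0, 0);
  // Draw the 21 tracked keypoints of each detected hand.
  noStroke();
  fill(0, 255, 120);
  for (const hand of hands) {
    for (const kp of hand.keypoints) {
      circle(kp.x, kp.y, 8);
    }
  }
}
```

From here, the detected keypoints can be serialized and sent to the Gemini API, as sketched in the Process section below.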

Tools: ml5.js, p5.js, Gemini API

Video: documentation of a user interaction
Images: gesture-based interaction with the AI-generated hand; the virtual hand responding to gestures; development process of handInHand()

Process

The development process began with integrating the Gemini API to generate 21 keypoints in the shape of a hand in response to user input. Experimentation with the AI's responses led to intriguing outcomes, including gesture-based interactions that appeared to mimic the user's movement or reach toward their hand. Further exploration involved prompting the AI to generate textual reflections on these interactions, expanding the project's conceptual scope. Visual refinements, including rendering the hands as 3D geometry, made the interactions feel more fluid and immersive.
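As a rough illustration of this step, the sketch below requests 21 reacting keypoints from Gemini and renders them as simple 3D geometry in a WEBGL p5.js canvas. The endpoint follows the public Gemini REST API, but the prompt wording, model name, coordinate normalization, and helper names (requestVirtualHand, drawVirtualHand, GEMINI_API_KEY) are assumptions for illustration, not the project's actual implementation.

```javascript
// Hypothetical request/response loop: send the user's keypoints to Gemini
// and parse 21 keypoints for a reacting virtual hand.
const GEMINI_URL =
  'https://generativelanguage.googleapis.com/v1beta/models/' +
  'gemini-1.5-flash:generateContent?key=' + GEMINI_API_KEY; // key assumed to be defined

async function requestVirtualHand(userKeypoints) {
  const prompt =
    'Here are 21 hand keypoints as {x, y, z} values normalized to 0-1: ' +
    JSON.stringify(userKeypoints) +
    '. Reply with ONLY a JSON array of 21 {x, y, z} keypoints describing ' +
    'a virtual hand reacting to this gesture.';

  const res = await fetch(GEMINI_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ contents: [{ parts: [{ text: prompt }] }] }),
  });
  const data = await res.json();
  const text = data.candidates[0].content.parts[0].text;
  // The model may wrap the JSON in extra prose; extract the array span.
  return JSON.parse(text.slice(text.indexOf('['), text.lastIndexOf(']') + 1));
}

// Render the returned keypoints as spheres (requires WEBGL mode in setup).
function drawVirtualHand(keypoints) {
  for (const kp of keypoints) {
    push();
    translate((kp.x - 0.5) * width, (kp.y - 0.5) * height, kp.z * 200);
    sphere(6);
    pop();
  }
}
```

The textual-reflection experiments mentioned above could be driven by the same generateContent call, with a reflective prompt in place of the JSON-only instruction.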

Images: particle-based visualization of hand gestures; generative dots responding to motion; AI-generated text responding to gestures; AI text overlaid on hand-movement data

Publication

To expand on the project, a publication was designed using transparent sheets to overlay different AI-generated hand reactions onto various positions of a human hand. This layering technique created a tangible representation of the evolving interaction between human gestures and machine responses, visually capturing the dialogue between movement and artificial intelligence. The publication served both as documentation of the process and as an extension of the project's exploration of the expressive potential of AI-driven gestures.

Images: layered transparent sheets showing AI hand responses; overlay experiments with AI hand visuals; printed AI-generated hand-movement overlays; combined human and AI hand movements; gesture-mapping experiments; printed text interacting with gesture data; final printed composition of AI gesture responses
Images: exhibition display of handInHand(); visitors interacting with the installation; close-up of real-time AI interaction; AI-driven hand responding to user input