Project Collaborators: Nikhil Thomas and Samarth Reddy
Project Description
In our project “Digital Puppeteers: Invisible Threads in Motion,” we draw inspiration from traditional Rajasthani Kathputli puppet shows, infusing this ancient art form with cutting-edge technology. We use an Arduino Nano 33 IoT to control the movement of puppets through servo motors, which respond to data received from a Processing-based blob/color recognition sketch. This sketch analyzes live video captured by a webcam, translating colors into movement.
Our project represents a blend of the tactile charm of traditional puppetry and the precision of modern computer vision techniques. The servo motors, acting as the digital equivalent of puppet strings, are programmed to replicate the fluidity and expressiveness inherent in Kathputli performances. By employing the Firmata protocol, we establish a communication channel between the Processing environment and the Arduino, ensuring seamless data flow and puppet movement.
This project not only pays homage to the cultural richness of traditional puppetry but also opens new avenues in interactive and automated performances, where technology augments human creativity. The project is both a conceptual exploration of how folk arts can be reinterpreted in the digital age for preservation and a technical venture showcasing the potential of IoT devices and programming in artistic applications.
Final Video
Hello World
In the initial stage, we explored different possibilities for actuation from Processing to Arduino. In the video below, we divided the camera view into four equal sections. When a face is detected in a section, the corresponding servo rotates; when no face is detected, the servo resets to its initial position. For this code, we used OpenCV, the Video library, and the Firmata protocol.
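The quadrant logic behind this test can be sketched as follows. This is a minimal, hypothetical Python sketch of the mapping (the actual project used Processing with OpenCV and Firmata); the function names, resolution, and angles are illustrative assumptions:

```python
# Map a detected face's center point to one of four equal camera sections,
# then derive one angle per servo. Hypothetical sketch; the real
# implementation lived in a Processing sketch.

CAM_W, CAM_H = 640, 480  # assumed webcam resolution


def section_for_face(cx, cy, width=CAM_W, height=CAM_H):
    """Return the section index (0-3) for a face centered at (cx, cy).

    Sections are the four equal quadrants of the camera view:
      0 = top-left, 1 = top-right, 2 = bottom-left, 3 = bottom-right.
    """
    col = 0 if cx < width / 2 else 1
    row = 0 if cy < height / 2 else 1
    return row * 2 + col


def servo_angles(detected_section, active=90, rest=0):
    """Rotate the servo whose section saw a face; reset the others."""
    return [active if i == detected_section else rest for i in range(4)]
```

In the Processing version, `section_for_face` corresponds to comparing the OpenCV face rectangle's center against half the sketch width and height each frame.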
System Diagram
The software includes the Arduino IDE for Firmata communication and actuation control, and Processing for programming and handling video input along with computer vision and audio tasks. The hardware consists of a computer, a webcam, an Arduino Nano 33 IoT board, and four servo motors powered by an external source. Communication relies on the Arduino Firmata protocol, which lets Processing send data to the Arduino to control the servo motors. The libraries in use are the Audio and Video libraries, the OpenCV library for computer vision, and the Blob Detection library for identifying colors in live video; all of these run within Processing to manage the webcam input and control the hardware accordingly.
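To make the Firmata link concrete: in Processing, the Arduino (Firmata) library exposes calls such as `arduino.servoWrite(pin, angle)`, which it frames as a standard Firmata analog message over serial. The sketch below shows that framing in Python purely for illustration (the library does this internally; we never hand-encoded bytes in the project):

```python
# Illustrative sketch of the standard Firmata ANALOG_MESSAGE framing used
# for servo writes: a command byte carrying the pin number, followed by
# the value split into two 7-bit bytes (LSB first).

ANALOG_MESSAGE = 0xE0  # Firmata command for analog/servo writes


def encode_servo_write(pin, angle):
    """Pack a servo angle (0-180) into a 3-byte Firmata analog message."""
    if not 0 <= angle <= 180:
        raise ValueError("servo angle must be 0-180")
    return bytes([
        ANALOG_MESSAGE | (pin & 0x0F),  # command + pin number
        angle & 0x7F,                   # low 7 bits of the angle
        (angle >> 7) & 0x7F,            # high 7 bits of the angle
    ])
```

The two 7-bit value bytes exist because Firmata reserves the high bit of data bytes for command bytes.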
Project Context
During our research, we came across several projects, but two that particularly stood out were Availabot and Incredibox, thanks to their unique mechanisms and user-friendly experiences.
Availabot served as a significant inspiration for our project, “Digital Puppeteers.” This Instant Messenger avatar displayed a distinctive behavior by standing upright when a chat buddy came online and falling down when they logged off. We were fascinated by this personalized interaction and the simplicity of its mechanism, which inspired us to create a similar concept. In “Digital Puppeteers,” we aimed to bridge the physical and digital realms by providing users with a tangible touchpoint within a physical space. Just like Availabot, our project incorporates the use of puppets that move up and down, symbolizing interaction during play and fostering a stronger sense of connection.
On the other hand, Incredibox, available as an app on various platforms, lets users create their own music using different beatboxers in nine unique environments. These beatboxers resemble puppets controlled through the platform’s interface. The app not only provides an exciting audio experience but also offers visually appealing graphics, animations, and interactivity. While developing “Digital Puppeteers,” we drew inspiration from Rajasthani folk puppetry, and Incredibox played a pivotal role in giving a voice to our project. We experimented with the app to understand how its sound captured the player’s attention, then recorded and edited our own sounds to mimic our characters.
These two projects, Availabot and Incredibox, influenced our creative process and helped shape the unique features of “Digital Puppeteers.”
How it Works
The puppeteer wears gloves with colored tips and waves their fingers in front of a camera. The camera captures live video and sends it to Processing, where a blob detection algorithm identifies the distinct colors. Each color is linked to a specific servo motor, which in turn is connected to a character on the puppet stage. Whenever the camera detects a color, the corresponding servo motor is activated, resulting in movement. The system currently features four motors, each associated with a different color. The design is scalable: it can accommodate both hands or all ten fingers, and an increased number of moving objects on the puppet stage.
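The color-to-servo mapping described above can be sketched like this. The reference colors, threshold, and function names below are hypothetical; the project itself used the Blob Detection library in Processing rather than this nearest-color classifier:

```python
# Hypothetical sketch: classify a detected blob's average color against
# the four glove-tip reference colors and pick the matching servo.

GLOVE_COLORS = {          # servo index -> assumed fingertip RGB
    0: (255, 0, 0),       # red
    1: (0, 255, 0),       # green
    2: (0, 0, 255),       # blue
    3: (255, 255, 0),     # yellow
}


def match_servo(blob_rgb, max_distance=120):
    """Return the servo index whose reference color is nearest to the
    blob's color, or None if no reference color is close enough."""
    best, best_d = None, float("inf")
    for servo, ref in GLOVE_COLORS.items():
        d = sum((a - b) ** 2 for a, b in zip(blob_rgb, ref)) ** 0.5
        if d < best_d:
            best, best_d = servo, d
    return best if best_d <= max_distance else None
```

The distance threshold keeps background colors (which sit far from all four references) from triggering any servo.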
Circuit
Github Repo: calluxpore/CC3 (github.com)
List of Parts:
List of Libraries:
Note: We took the help of ChatGPT to learn about and modify parts of the Processing code in the initial stage.
Audio
We (Thomas) recorded audio imitating the characters we created for the project. These recordings were then processed in Audacity for noise removal and pitch modulation. Below are the four audio files, each played when the corresponding servo motor is activated.
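Since the detection loop runs every frame, a clip should start only when its servo first activates, not restart continuously while the color stays in view. A minimal, hypothetical sketch of that rising-edge trigger (the class and file names are illustrative; the project used Processing's audio library):

```python
# Hypothetical sketch: start a character's sound only on the frame its
# servo transitions from inactive to active (rising-edge detection).

class AudioTrigger:
    """Tracks each servo's previous state and reports which sound files
    should begin playing on the current frame."""

    def __init__(self, sound_files):
        self.sound_files = sound_files            # servo index -> file name
        self.was_active = [False] * len(sound_files)

    def update(self, active_now):
        """active_now: one boolean per servo for this frame.
        Returns the file names to start playing."""
        to_play = [
            f
            for f, prev, now in zip(self.sound_files, self.was_active, active_now)
            if now and not prev
        ]
        self.was_active = list(active_now)
        return to_play
```

Without this guard, the clip would be re-triggered on every frame the glove color remains visible.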
Physical Structure
The design for the structure, which houses four motors, an Arduino, and the key elements, the puppets, began with brainstorming and sketching on paper. This was followed by a 3D model to evaluate the structure’s overall dimensions. To assemble the structure, we used laser-cut MDF sheets that provide a base for all the Arduino components and securely hold the characters attached to the servo motors.
Reflection
The project was more challenging than we initially anticipated. Using multiple platforms to drive a physical component was thrilling at first, but the work grew more complex toward the end. Despite these challenges, we gained significant knowledge from the experience. We applied our Arduino knowledge from Project 1 and our p5.js coding skills from Project 2 to create a unique synthesis of the two. The most demanding part was researching and understanding the coding aspects, particularly the base files and examples, and modifying them to fit our needs. We constructed the physical structure in a day, and it suits the project well. We’d like to highlight a few specific challenges that were particularly time-consuming:
We have learned to navigate these challenges and are continuously learning more. As of now, we are proud of the outcome.