H(n)MI
Last updated
During these two weeks of the H(n)MI course, we had the opportunity to explore the intersection of the human body, technology, and creative expression. The workshop delved into concepts such as affective computing, embodiment, and creative coding, focusing on how bodily behaviors can be captured and translated into digital or sound representations. Throughout the sessions, we worked with Arduino and sensors to gather data from physical actions, used p5.js to manipulate visuals and sound, and explored ml.js for integrating machine learning into our projects. The course emphasized how the body can act as both an interface and a source of data, encouraging us to design interactive prototypes that respond to human movement and emotions.
In the first week, we built DIY pressure sensors, learning to fabricate and read them to gather real-time pressure data. We also experimented with sound as an input medium. The combination of physical computing and digital tools allowed us to create visual and auditory representations of bodily interactions.
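As a rough sketch of the kind of reading logic involved (the function name and resting baseline here are hypothetical, not from our actual code), a raw 10-bit analog value from a DIY pressure sensor in a voltage divider can be normalized into a 0–1 pressure value like this:

```javascript
// Hypothetical helper: normalize a 10-bit analog reading (0-1023),
// as Arduino's analogRead() returns, into a 0-1 pressure value.
// restingValue is an assumed no-touch baseline for the DIY sensor.
function normalizePressure(rawReading, restingValue = 40) {
  // Readings at or below the resting baseline count as "no pressure".
  const aboveRest = Math.max(rawReading - restingValue, 0);
  // Scale the remaining range to 0-1 and clamp the top end.
  return Math.min(aboveRest / (1023 - restingValue), 1);
}

console.log(normalizePressure(40));   // → 0 (resting, no pressure)
console.log(normalizePressure(1023)); // → 1 (full pressure)
```

On the p5.js side, a value like this could then drive any visual or sound parameter directly.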
For the final workshop during the second week, my group and I decided to build on the pressure sensors we had created earlier. We integrated them into Lucrezia’s ballet shoes, strategically placing one sensor in the heel and another in the sole of each shoe. The goal was to generate real-time visualizations corresponding to various dance steps and styles. The system captured the exact moment of pressure, identifying which sensor was activated and how much force was applied — with the intensity of the color saturation reflecting the strength of the pressure. This created a dynamic and responsive visual output that translated the dancer’s movements into a live digital art piece.
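The pressure-to-saturation mapping described above could be sketched roughly like this (the function, hue choices, and sensor names are illustrative assumptions, not our exact implementation):

```javascript
// Hypothetical sketch: map the active sensor (heel or sole) and a
// normalized pressure value (0-1) to an HSB color, where saturation
// scales with how much force the dancer applies.
function pressureToColor(sensor, pressure) {
  // Assumed hue per sensor position, so heel and sole steps look distinct.
  const hue = sensor === "heel" ? 200 : 330;
  // Clamp pressure to 0-1, then scale to p5.js-style 0-100 saturation.
  const clamped = Math.min(Math.max(pressure, 0), 1);
  const saturation = Math.round(100 * clamped);
  return { hue, saturation, brightness: 100 };
}

console.log(pressureToColor("heel", 0.5)); // → { hue: 200, saturation: 50, brightness: 100 }
```

In a p5.js sketch this object would feed straight into `colorMode(HSB)` drawing calls, so a harder step produces a more saturated shape on screen.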
Personally, experimenting with p5.js, Arduino, and the concept of embodied interaction opened up new creative possibilities for me. I found the process of connecting physical actions to digital outputs both challenging and exciting. How can I integrate data visualization and interactive elements into my current work?
You can find all of the documentation here! :)
git@github.com:LKField/h_n_mi.git