TECH

Blindfolded in the Air: Towards the Design of Interactive Aerial Play

Topaz, A., Montoya, M., Patibanda, R., Andres, J., Mueller, F.

This ACM publication will be released at the end of November as part of the first SportsHCI conference.

Abstract:

The intersection of aerial acrobatics (movement on a suspended apparatus where the performer is off the ground) and interactive technology remains an underexplored area in HCI. In this autoethnographic study, we investigate the interplay between augmented eyesight and proprioception in adapting to the suspended environment. We developed a motion-sensitive blindfold mixed-reality headset application that enables wearers to transition between visibility and darkness based on their body’s orientation while rotating in a two-point harness. Analyzing videos, somaesthetic maps, and interviews, we observed that our design reduces visual and social distractions, facilitating inward focus on movement and breath. However, acclimation to both the physical and mixed-reality systems is necessary for people to interact comfortably. The findings extend our understanding of designing interactive real-time visuomotor couplings between movement and mixed reality in suspended environments, offering four themes and six design considerations to support the active body, aiming to enrich the possibilities for augmented aerial play.
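As a loose illustration of the coupling the abstract describes, here is a minimal Python sketch, assuming a headset that reports body pitch and a black overlay whose opacity can be set; the threshold, fade width, and API names are hypothetical stand-ins, not the published system.

```python
# Hypothetical sketch of an orientation-driven "blindfold" coupling:
# when the wearer rotates past a threshold, fade the view to black.
# The headset calls (get_pitch_degrees, set_overlay_opacity) are
# placeholders, not the actual system described in the paper.

INVERSION_THRESHOLD = 90.0  # degrees of pitch past which the view darkens
FADE_WIDTH = 30.0           # degrees over which the fade completes

def blindfold_opacity(pitch_degrees: float) -> float:
    """Map body pitch to black-overlay opacity: upright -> visible (0.0),
    inverted -> dark (1.0), with a linear fade in between."""
    fade = (abs(pitch_degrees) - INVERSION_THRESHOLD) / FADE_WIDTH
    return min(1.0, max(0.0, fade))

# Example per-frame loop (hypothetical headset API):
# while True:
#     pitch = headset.get_pitch_degrees()
#     headset.set_overlay_opacity(blindfold_opacity(pitch))
```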

Presentation Video

 

Focused Hearing

ANU Cybernetics Group Project 2024: Izak Lindsay, Bill McAlister, Shi Pui Ng, Amanda Topaz

Focused Hearing is an augmented-hearing system designed to enable natural communication in noisy environments. Using a camera as well as a microphone, we combined a lip-reading model and a speech-recognition model through “cross-modal fusion”, which then generated closed captioning on a screen or mobile phone. The wearer directs the camera, giving agency to the user. One version was a hard hat for construction workers; a glasses version was meant to be an alternative to hearing aids.
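A rough sketch of the pipeline’s shape, with hypothetical stand-ins for the two models; the fusion step is the alignment described in the next paragraph.

```python
# Rough shape of the Focused Hearing pipeline. lip_reader and recognizer
# are hypothetical stand-ins for the actual models; fuse_transcripts is
# the transcript-alignment step sketched below.

def caption(video_clip, audio_clip, lip_reader, recognizer, fuse_transcripts):
    lip_words = lip_reader.transcribe(video_clip)   # from the wearer-directed camera
    asr_words = recognizer.transcribe(audio_clip)   # from the microphone
    # Cross-modal fusion: reconcile the two transcripts before display.
    return fuse_transcripts(lip_words, asr_words)
```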

A customized Needleman-Wunsch algorithm aligned the two transcripts generated by the lip-reading and speech-to-text models, ensuring that the addition of lip-reading increased the accuracy of the transcript before results were displayed on screen.
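For illustration, here is a minimal, generic Needleman-Wunsch aligner over word tokens; it is a sketch of the alignment idea, not the project’s customized version.

```python
# Generic Needleman-Wunsch global alignment over word tokens.

def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
    """Globally align two token sequences; returns paired tokens,
    with None marking a gap in one sequence."""
    n, m = len(a), len(b)
    # score[i][j] = best score aligning a[:i] with b[:j]
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    # Traceback from the bottom-right corner.
    pairs, i, j = [], n, m
    while i > 0 or j > 0:
        if (i > 0 and j > 0 and
                score[i][j] == score[i - 1][j - 1]
                + (match if a[i - 1] == b[j - 1] else mismatch)):
            pairs.append((a[i - 1], b[j - 1])); i -= 1; j -= 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            pairs.append((a[i - 1], None)); i -= 1
        else:
            pairs.append((None, b[j - 1])); j -= 1
    return list(reversed(pairs))

# e.g. aligning an ASR transcript with a lip-reading transcript:
# needleman_wunsch("close your tabs".split(), "close her tabs".split())
# -> [('close', 'close'), ('your', 'her'), ('tabs', 'tabs')]
```

Once the two transcripts are paired word-by-word, disagreements and gaps can be resolved (for example, by model confidence) before the caption is displayed.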

This concept was very important to me as a hearing-aid user of many years. Because the wearer directs the camera, decisions about which sounds and noises matter transfer to the user rather than to an AI.

Focused Hearing was created in 6 weeks.

Focused Hearing Prototype Documentation

Focused Hearing GitHub

Presentation Video


Get Lost in the Music

ANU Cybernetics Individual Build Project 2024

Get Lost in the Music is a mixed-reality MP3 player that forces users to walk to their song in a Minecraft-like setting. A GPS tracker follows the user’s movement, which in turn moves their avatar in the game world. There is no correlation between the game map and the physical world, so the user may get physically lost as a result (a sketch of this GPS-to-avatar coupling follows the list below). This “CPS relic” was created with three futures in mind:

Future One: The music and where it is placed in the game will always be curated by humans and can be interchanged between friends to create a digital music place where a Parisian jazz club might sit across the street from CBGBs.

Future Two: The music will be curated by an algorithm or AI and, like Spotify, will continue to homogenize musical culture.

Future Three: The music will be curated by a human or an AI as a way to share the joy of music and movement. Perhaps the AI could place a song somewhere in the digital map that would lead the human to a physical place the AI had never seen and wanted to go. It could be like having a human Uber guided by song.
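For illustration, a minimal sketch of the GPS-to-avatar coupling mentioned above, assuming the game exposes an avatar the code can move; the scale factor and function names are hypothetical.

```python
# Hypothetical sketch of the GPS-to-avatar coupling: physical displacement
# drives the avatar through the game map, which is unrelated to real
# geography -- so following the music can get you physically lost.

import math

METERS_PER_BLOCK = 2.0  # hypothetical scale: real meters per game block

def gps_delta_to_game_step(lat1, lon1, lat2, lon2):
    """Convert a change in GPS position into a (dx, dz) step in game
    blocks, using an equirectangular approximation (fine for small steps)."""
    earth_radius = 6_371_000.0  # meters
    d_lat = math.radians(lat2 - lat1)
    d_lon = math.radians(lon2 - lon1)
    dz = earth_radius * d_lat                              # north-south
    dx = earth_radius * d_lon * math.cos(math.radians(lat1))  # east-west
    return dx / METERS_PER_BLOCK, dz / METERS_PER_BLOCK

# On each GPS fix, move the avatar (hypothetical game API):
# dx, dz = gps_delta_to_game_step(prev_lat, prev_lon, lat, lon)
# avatar.move(dx, dz)
```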

Get Lost in the Music took three months from conception to exhibition. Read the process here:

Get Lost in the Music Build Journey

Presentation Video

Interface

The assignment was to create an interface between any two systems. It didn’t have to be electronic, but I enjoy the challenges of circuitry. I wanted to explore interfaces between humans and nature.

As a long-time city dweller, I don’t find it natural to go outside and enjoy nature. The sound of wildlife is often the only thing that will compel me to stop whatever I am doing and immediately exit my enclosure. Hence, I chose a distance sensor that triggers a magpie’s song to compel a human (me) to go outside.
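A minimal CircuitPython-style sketch of this interface, assuming an HC-SR04 ultrasonic distance sensor and a magpie recording stored on the board; the pins, threshold, and filename are placeholders, not the actual build.

```python
# Sketch: when someone comes within range of the sensor, play a
# recorded magpie song. Pins and "magpie.wav" are placeholders.

import time
import board
import audioio
import audiocore
from adafruit_hcsr04 import HCSR04

sonar = HCSR04(trigger_pin=board.D5, echo_pin=board.D6)
speaker = audioio.AudioOut(board.A0)

TRIGGER_CM = 100  # play the song when a person is within one meter

while True:
    try:
        if sonar.distance < TRIGGER_CM and not speaker.playing:
            with open("magpie.wav", "rb") as f:
                speaker.play(audiocore.WaveFile(f))
                while speaker.playing:
                    time.sleep(0.1)
    except RuntimeError:
        pass  # the sensor occasionally times out; just retry
    time.sleep(0.2)
```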

Presentation Video

Creature: Scrap-Py

For this Skills Work we were instructed to make a “Creature” out of a Circuit Playground microcontroller. I decided the creature should represent our first month of school, during which we went as a group to view Jordan Wolfson’s Body Sculpture, were shown a CPS Story presentation that included a very memorable axolotl-head costume, and collected various catchphrases from class, such as “Close Your Tabs”. I used a servo motor to emulate the movement and sound of Jordan Wolfson’s sculpture and my iPhone to capture sound bites from the cohort. This was the first video we were asked to make, and I felt a bit cheeky about it: I drew inspiration from 1950s demonstration videos, the theme being “My imperfections make me more human”. (Sadly there wasn’t time to procure a lab coat.) After many years of using movement as a language, I found communicating through speech a bit rusty.
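For a flavor of the build, here is a minimal CircuitPython sketch of a Scrap-Py-like creature, assuming a servo on a free pin and a sound bite stored as a WAV file on the board; the pin and filename are placeholders, not the actual build.

```python
# Sketch of a Scrap-Py-style creature on a Circuit Playground: sweep a
# servo to suggest the sculpture's motion, and play a recorded catchphrase
# when button A is pressed. Pin A1 and "tabs.wav" are placeholders.

import time
import board
import pwmio
from adafruit_motor import servo
from adafruit_circuitplayground import cp

pwm = pwmio.PWMOut(board.A1, frequency=50)
arm = servo.Servo(pwm)

while True:
    # Slow sweep back and forth, emulating the sculpture's movement.
    for angle in range(0, 180, 5):
        arm.angle = angle
        time.sleep(0.05)
    for angle in range(180, 0, -5):
        arm.angle = angle
        time.sleep(0.05)
    # Checked between sweeps: play a captured sound bite on demand.
    if cp.button_a:
        cp.play_file("tabs.wav")
```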

Presentation Video

Please wait a little longer:

The Monash Exertion Games Lab XR Aerial Play collaboration will be linked here after publication.