Impulse 02 // CoSA A(R)dventure

For my second impulse blog post, I decided to visit an exhibition at the CoSA museum in Graz together with my fellow student Vinzenz: A(R)dventure 3 – Habitat Red 6. This installation combines augmented reality through Microsoft’s HoloLens with physical interactions, creating an experience that was not only fun and engaging, but also made me think about climate change and the future possibilities of AR technology in the field of interaction design.

When I arrived, I was greeted by an enthusiastic member of the project team, who explained the concept thoroughly and helped me get started with the HoloLens. The adventure began as I followed H.I.G.G.S., a digital drone guide, through a fascinating time vortex that took me, digitally through the lenses and physically on foot, to a habitat on another planet in need of maintenance and care.
The first task was to manage the atmosphere control system. What made this interaction particularly interesting was the combination of physical valve controls with digital displays. As I turned the real knobs, I could see the atmospheric parameters changing on virtual graphs floating in front of my eyes. This immediate feedback loop between physical action and digital response created a natural and intuitive experience.
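To make that loop concrete for myself, I later sketched what such a sense-and-display cycle might look like in a few lines of Python. To be clear, everything here — the sensor read, the parameter names, the linear mappings — is my own guess at the pattern, not how the exhibit is actually implemented:

```python
import random
import time

def read_valve_knob() -> float:
    """Stand-in for the physical rotary valve; returns a position in [0, 1].
    In the installation this would be an actual hardware read."""
    return random.random()

def update_virtual_graph(oxygen_pct: float, pressure_hpa: float) -> None:
    """Stand-in for pushing fresh values to the floating AR graphs."""
    print(f"O2 {oxygen_pct:5.1f} %  |  pressure {pressure_hpa:6.1f} hPa")

# The essence of the interaction: a tight sense -> map -> display loop,
# so every turn of the physical knob is immediately visible digitally.
for _ in range(10):
    knob = read_valve_knob()              # physical input
    oxygen_pct = 15.0 + 10.0 * knob       # hypothetical linear mapping
    pressure_hpa = 900.0 + 200.0 * knob   # hypothetical linear mapping
    update_virtual_graph(oxygen_pct, pressure_hpa)
    time.sleep(0.1)
```

The point of the sketch is the structure, not the numbers: as long as the loop runs fast enough, the physical action and the digital response feel like one continuous thing.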

The next task took me to the power management system, where I found myself adjusting solar panels using a physical turning wheel. Through the HoloLens, I could see a window showing the actual panels moving outside the habitat, demonstrating how my actions directly affected the station’s power supply. This mix of tangible control and virtual feedback made adjusting something as mundane as a solar panel feel meaningful, and made me realise just how location-dependent solar power is.
One of the less interactive tasks was plant pollination. Starting in a physical test chamber, I learned about the process on a small scale through an animation before seeing it applied to a larger virtual plantation visible through digital windows. This progression from small-scale learning to larger-scale implementation worked well: I first understood the concept, then saw it applied and observed its effects on the displayed plantation biosphere.
After pollination, the water filtration system needed some attention. Using a physical joystick, I operated a virtual crane to service the water purification system, replacing an old filter with a new one. This task highlighted the amount of work required to produce clean water, something we often take for granted on Earth.

Perhaps the most tactile experience was selecting materials for heat shield repairs. I was able to physically touch and examine different materials while receiving digital information about their properties and suitability for the repair job. This combination of sensory feedback and augmented information created a powerful learning experience about materials science.
The final task involved fine-tuning the habitat’s environmental controls using spring-loaded dials. As I adjusted humidity and other parameters, the digital graphs responded in real time and, importantly, these changes, like all the others, persisted in the virtual environment, creating an immersive sense that my actions had a lasting impact.

The exhibition successfully communicated its underlying message about the Earth’s ecosystem without hitting you over the head with it, which is a difficult line to walk. By asking visitors to maintain artificial living conditions, it effectively illustrated the complexity of our planet’s natural systems, which we often take for granted. Each challenge in maintaining the habitat served as a reminder of how finely balanced the Earth’s ecosystem is. All in all, I found it a great experience and would encourage anyone to try something similar if they have the opportunity.

What struck me most about this experience was how it challenged my previous assumptions about calm technology. As someone researching this area for my Master’s thesis, I had initially leaned towards purely analogue solutions and viewed AR and VR as potentially intrusive technologies. However, Habitat Red 6 demonstrated that a thoughtful combination of physical and digital interactions can create experiences that are both engaging and intuitive. The digital layer added depth and dynamics to the physical interactions, while the tangible elements grounded the experience in reality. This hybrid approach retained the benefits of physical feedback, while using digital technology to provide additional context and visualisation that would not be possible with purely analogue interfaces.

This experience significantly influenced my perspective on interaction design. Instead of seeing analogue and digital as opposing approaches, I now see the potential in combining tangible objects with digital layers of information to create interfaces that offer the best of both worlds. This insight will definitely influence the direction of my Master’s research, opening up new possibilities for designing more intuitive and meaningful interactions.

Calm Interfaces

In this excursion into sound design, I’ll be exploring a paper from the International Conference on New Interfaces for Musical Expression (NIME23). The paper is about a new type of interface that transforms brush movements into electronic sounds, offering a natural and expressive way of making electronic music.

The paper presents the Brushing Interface, a DIY multi-touch interface designed to translate brushing gestures into expressive musical performances. It consists of 216 handmade force-sensitive resistive sensors and 8 piezo microphones for precise gesture tracking and sound production. The interface combines a unique gesture mapping strategy with continuous gesture tracking, enabling flexible and expressive performances. The hardware system, including the sensors, was built inexpensively, and the software was developed in Max 7 for real-time sound processing and gesture mapping. The interface offers four performative approaches: playing the plain brush sound, applying audio effects, performing real-time audio synthesis, and changing presets. A composition called “Drifting” demonstrates the interface’s capabilities. Overall, the Brushing Interface expands the possibilities of gestural expression in musical performance, offering richness and versatility.1
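The paper’s tracking and mapping run in Max 7, which I can’t reproduce in a blog post, but to get a feel for what “continuous gesture tracking” over a grid of force sensors might involve, I wrote myself a minimal Python sketch: a pressure-weighted centroid computed per sensor frame. The grid size, value range and centroid method are my assumptions for illustration, not the paper’s actual algorithm:

```python
from typing import List, Tuple

def track_brush_gesture(grid: List[List[float]]) -> Tuple[float, float, float]:
    """Estimate the brush contact point and overall pressure from one frame
    of force-sensitive-resistor readings (each value in [0, 1]).
    Returns (x_centroid, y_centroid, total_pressure)."""
    total = sum(sum(row) for row in grid)
    if total == 0.0:
        return 0.0, 0.0, 0.0  # no contact in this frame
    x = sum(col * v for row in grid for col, v in enumerate(row)) / total
    y = sum(row_i * v for row_i, row in enumerate(grid) for v in row) / total
    return x, y, total

# A toy 3x4 frame with pressure concentrated towards the top-right corner.
frame = [
    [0.0, 0.1, 0.4, 0.8],
    [0.0, 0.0, 0.2, 0.3],
    [0.0, 0.0, 0.0, 0.1],
]
x, y, pressure = track_brush_gesture(frame)
print(f"contact at ({x:.2f}, {y:.2f}), pressure {pressure:.2f}")
```

Run frame after frame, the trajectory of these centroids over time is what a mapping layer can then turn into sound.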

As an interaction design student, I find Jaehoon Choi’s work on the brushing interface fascinating. The concept of transforming brushing gestures into a true musical/sonic performance opens up new avenues for exploring embodied interaction and expressive communication through technology. The DIY approach to building the hardware system is in line with the interaction design idea of iterating and testing with self-created prototypes before scaling up to finished, industrialised products. It also emphasises hands-on experimentation and customisation, which can empower designers and users alike to create personalised and meaningful experiences.

One aspect of the paper that stands out is the integration of multi-dimensional parameter mapping and continuous gesture tracking, enabling an expressive performance that can be configured in a variety of ways. This emphasis on flexibility and adaptability is very much in line with the principles of interaction design, which prioritise designing for different user needs and contexts. The Brushing Interface is an example of how technology can be designed to support nuanced and intuitive forms of interaction, encouraging deeper engagement and creative expression.
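To illustrate to myself what makes such a multi-dimensional mapping configurable, here is a small sketch of one plausible way to structure it: each sound parameter is a function of the tracked gesture, and swapping the preset reconfigures the whole instrument. The feature names, parameter names and the “preset” idea are my own illustration; the paper implements its mapping in Max 7:

```python
from typing import Callable, Dict

# One frame of tracked gesture features, e.g. from a centroid tracker.
GestureFrame = Dict[str, float]  # {"x": ..., "pressure": ..., "velocity": ...}

# A mapping preset: every sound parameter gets its own function of the gesture,
# so the instrument can be reconfigured without touching the tracking code.
MappingPreset = Dict[str, Callable[[GestureFrame], float]]

airy_preset: MappingPreset = {
    "filter_cutoff_hz": lambda g: 200.0 + 4000.0 * g["pressure"],
    "grain_density":    lambda g: 5.0 + 50.0 * abs(g["velocity"]),
    "stereo_pan":       lambda g: g["x"],  # left-right follows the brush
}

def apply_mapping(preset: MappingPreset, gesture: GestureFrame) -> Dict[str, float]:
    """Turn one gesture frame into a complete set of synthesis parameters."""
    return {param: fn(gesture) for param, fn in preset.items()}

gesture = {"x": 0.7, "pressure": 0.6, "velocity": -0.2}
print(apply_mapping(airy_preset, gesture))
```

Seen this way, “changing presets” — one of the four performative approaches in the paper — would simply mean swapping one such mapping for another.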

However, while the paper provides a comprehensive overview of the design and implementation of the Brushing Interface, there are some areas that could be further elaborated or addressed. For example, while the DIY approach is commendable for its affordability and accessibility, there may be limitations in terms of scalability, reliability and reproducibility, especially for larger-scale applications or commercialisation. In addition, while the paper touches on the potential for improvisational performance, further research is needed into how the interface can support more planned and structured inputs and outputs, how easy it is to learn, and how reliably a performer can reproduce the same output.

In terms of relevance to Calm Technologies, the Brushing Interface offers an interesting perspective on how technology can be seamlessly integrated into our daily lives in a subtle and non-intrusive way. By utilising the tactile and familiar action of brushing, the interface invites users to engage in a calming and almost natural interaction.

In conclusion, the Brushing Interface represents an innovative fusion of art, design and technology, with implications for both musical performance and interaction design. While there are areas for further refinement and exploration, the work serves as a valuable contribution to the field, inspiring future research and creative endeavours in the realm of expressive gestural interfaces for musical performance, as well as calm interfaces for our everyday interactions with the digital ecosphere.

  1. Jaehoon Choi. 2023. Brushing Interface – DIY multi-touch interface for expressive gestural performance. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME23).  ↩︎