Hey creative minds! I’m on a mission to explore this topic in a practical and doable way. Here’s my game plan:

  1. Dive into Research: I’m starting by digging into research on gamification and design thinking. I’ll comb through articles and case studies to understand the latest insights.
  2. Mini Experiments:
    • Online Simulations: Instead of big workshops, I’ll organize bite-sized online simulations. Think interactive challenges that participants can tackle from the comfort of their own screens.
    • Weekly Creativity Checks: I’ll conduct weekly creativity check-ins with a small group of participants. We’ll explore different gamified techniques and track how they influence creativity over time.
    • Virtual Brainstorming: Harnessing the power of technology, I’ll host virtual brainstorming sessions with gamified elements. It’s like bringing the fun of gamification to our design discussions, minus the logistical headache!
  3. Measure What Matters:
    • Creative Output: I’ll develop simple creativity assessments to measure the quality and originality of ideas generated during our experiments.
    • Engagement Surveys: Quick surveys will help me gauge participant engagement and satisfaction with the gamified activities.
    • Process Observations: I’ll keep a close eye on the design thinking process during our experiments, noting any changes in approach or problem-solving strategies.
  4. Simplicity is Key: Keeping it simple is my motto. I’ll focus on small-scale experiments that don’t require a big budget or logistical gymnastics.
  5. Share the Knowledge: Once I’ve collected some insights, I’ll share them with the world. Whether through blog posts, social media updates, or informal chats with classmates, I’ll spread the word about what I’ve learned.

So, that’s my plan to explore gamification in design thinking, one manageable experiment at a time. It’s all about curiosity, creativity, and making the most of the resources we have as students. Let’s dive in and see where this journey takes us! ✨🎮🚀

20 | Designing A Digital Fashion Garment – The Final Project

For my final project, I decided to draft a pattern for pants in CLO3D. I am following a tutorial for the basic shape and will modify the garment according to my design preferences.

Preparing the Avatar

To begin the process, I decided to modify one of the default avatars from the CLO library to my own measurements. This was relatively easy and is a quick method to get a fairly accurate representation of your body in the digital space. However, the problem with this method is that body proportions vary greatly from person to person, and the distribution of certain measurements creates a very different base shape of the body. Even though CLO allows you to input a lot of specific measurements, such as knee, leg and calf circumference, it still calculates the areas between the custom measurements automatically, so certain idiosyncratic body shapes are lost in the process. For example, the difference between my high and low hip measurement is quite substantial and creates a dip at the hip area, but there is no way (that I could find) to represent the shape between those two measurements inside the avatar editor. So to get a very accurate representation of your own body, you would have to make a custom model from scratch and load it into the program. For this project, I decided to go with the approximately correct model achieved by modifying the default avatar.

Drafting the Pattern

After the Avatar was prepared, I started developing the pants pattern according to this tutorial. 

First, I had to calculate some measurements as listed below.

Pants measurements

Avatar:
Hip: 97 cm (38.2 in)
Half hip: 48.5 cm (19 in)
Waist: 65 cm (25.6 in)
Outseam: 95 cm (37.4 in)
Inseam: 75 cm (29.5 in)
Knees: 18 in
Ankles: 13.75 in

Measurement Percentages (of half hip in inches):
21% = 3.99 in
55% = 10.45 in
47% = 8.93 in
10% = 1.9 in
20% = 3.8 in

Math:
Front Waist: 25.6 / 4 + .375 = 6.775
Back Waist: 25.6 / 4 – .375 = 6.025
Front Knee: 18 / 2 +.75 = 9.75
Front Ankle: 13.75 / 2 – .375 = 6.5
Back Ankle: 13.75 / 2 + .375 = 7.25 
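
For reference, the same numbers can be reproduced with a few lines of code. This is only a small JavaScript sketch of the arithmetic above; the variable names and the idea of scripting it are mine, not part of the tutorial.

```javascript
// Pattern math for the pants draft, based on the avatar measurements above.
// The +/- 0.375 in offsets and the half-hip percentages follow the tutorial's
// values; adjust them if your draft uses different ease.
const waist = 25.6;   // in
const knee = 18;      // in
const ankle = 13.75;  // in
const halfHip = 19;   // in

const frontWaist = waist / 4 + 0.375; // 6.775 in
const backWaist  = waist / 4 - 0.375; // 6.025 in
const frontKnee  = knee / 2 + 0.75;   // 9.75 in
const frontAnkle = ankle / 2 - 0.375; // 6.5 in
const backAnkle  = ankle / 2 + 0.375; // 7.25 in

// Percentages of the half hip used while drafting.
const percentages = [0.21, 0.55, 0.47, 0.10, 0.20];
const hipFractions = percentages.map(p => (p * halfHip).toFixed(2)); // 3.99, 10.45, 8.93, 1.90, 3.80

console.log({ frontWaist, backWaist, frontKnee, frontAnkle, backAnkle, hipFractions });
```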

The pattern drafting process was very similar to the analog process on paper and gave me lots of flashbacks to pattern drafting class. Especially when following instructions that use different measurements, there can be some confusing moments in the process, and my lack of practice over the past few years definitely showed while I was developing the pattern.

Following the tutorial was a very helpful way of getting familiar with all the tools in the 2D pattern drafting window. Some of them turned out to be less intuitive than I originally thought while going through them in the overview tutorial series. For example, the curving tools don't respond in the way I would expect from similar tools in other programs, so they were challenging to use in the drafting process.

In the end, I did manage to construct the pattern and could move on to arranging the pieces on the avatar, sewing, and simulating the garment.

Fit Issues, Modifications and Fabric Choices 

After sewing, I noticed some significant bunching issues around the crotch area, a typical problem in pant patterns.

I tried to correct this issue by re-measuring the inseam of my avatar, going back to the pattern in the 2D window, and comparing the measurement to the inseam on the pattern. By gradually editing the curvature and approximating it to the inseam measurement, I managed to somewhat resolve the issue, although the result was still not perfect.

After I was mostly happy with the base pattern, I started modifying it by changing the leg shape, adding a waistband and a zipper. I also experimented with the fabric options from the CLO library and settled on a black woven cotton fabric for the final pant model. 

Sewing Issues and Draping

After I made my modifications, I simulated the sewing again and ran into some issues, especially with the new pattern pieces for the waistband and zipper fly that were supposed to be sewn over top of each other. As you can see in the screen capture below, the program was confused about how to arrange the overlapping seam lines, and unfortunately, I was not yet able to find a solution for this issue. Another problem was developing the waistband, which was not included in the pattern pieces of my original pattern, so I had to draft it myself, and as mentioned before, my lack of practice showed once again. The waistband could be improved, especially in the curvature at the side seams.

After sewing, I draped the garment on the avatar, which was fun but sometimes also confusing, because the mesh of the avatar occasionally interacts strangely with the mesh of the garment model.

When I was happy with the draping, I took the garment model into the final processing step in CLO.

Animation, Render and Import into Blender

As a final step in CLO I took the model into the animation workspace. This workspace is quite easy to use and intuitive for anyone who has used a similar 3D program before.

I chose a preset walk cycle from the CLO library, which worked very well and smoothly. For the final render, I went with the invisible avatar render. I exported the model, including the walk cycle animation, as an Alembic (.abc) file. I then loaded the model into Blender, where I had some issues with the fabric of the model not translating well from CLO. Because I wasn't able to find the cause of the issue, I decided to use a similar cotton fabric material from BlenderKit for the pants model. Finally, I created a little scene, animated some camera movement, and rendered the model with Cycles.

Final Product

Final Thoughts

I really enjoyed the process of getting familiar with CLO3D. During the initial phase of learning the basic functions of the program, I had the impression that it would be quite easy to work out the final project. However, while working on my final piece, I realized which functions of the program are still not as clear to me as I thought and which features I need more practice with. Additionally, I learned that there are other methods of developing patterns in CLO, as I have seen tutorials where users cut out and drape on the 3D model in a more free-form approach. This is something I would like to try out in future projects, because I realized while developing the pattern in the „traditional“ way that my knowledge of pattern development is quite rusty and I need to either brush up on it or find other methods for reaching a good end product. Another aspect I would like to work on more in the future is the fabric export and modification options, because I had some issues with this in my final project.

Overall, I will definitely keep learning more about CLO and hopefully get better and figure out solutions to problems I was not able to fix this time around.

Compositional Methods and Implementation

Compositional Methods

Before addressing the composition for each scene individually, it is essential to ensure that the game’s soundtrack feels cohesive. A unified musical experience maintains player immersion while balancing it with unique characteristics for each scene to keep the experience interesting. To achieve this, I plan to use common instrumentation across various tracks, incorporating unique elements for each scene.

The soundtrack will blend orchestral and electronic elements. The orchestral components, realized with MIDI instruments, align with the game’s historical theme and match the relatively low fidelity of the game’s visuals. This mix of virtual acoustic and electronic instruments enhances the fantasy world of Mechanical God Saga. Matching the music’s fidelity to the visuals is deliberate; it creates a coherent experience and can positively impact the game. A deliberate mismatch can also work and even uplift a game, an approach sometimes used for titles developed for smaller devices with limited computing power, but that is not the case for our game.

For composing specific scenarios, it is crucial to convey the appropriate emotions for each scene, such as peacefulness, relief, or stress. Understanding the scene’s context and potential events is key to achieving this. Triggers for musical changes can happen at any time, necessitating quick synchronization with gameplay. It is imperative to identify every possible moment in the music where a change might be needed and adjust accordingly to maintain immersion.

For example, in the forest scene, different musical changes are applied depending on which section of the song is playing when enemies appear. Figure 1 illustrates the initial mapping to achieve a state of tension for this scenario. Other music themes will require multiple states of tension, depending on the evolving situations. However, for the initial forest scene, only a single state of tension is necessary.

By carefully mapping out these musical changes and ensuring quick adaptation to gameplay triggers, the soundtrack will enhance the overall experience of Mechanical God Saga, creating a seamless and immersive journey for players.

The music structure is as follows:

Intro > Segment A > Segment B > Segment C

For each section, the necessary changes have to be mapped, as the table below shows for the first state of tension we want to portray. Synthesized sounds with a metallic character are being considered for the enemy soldiers, referencing their metal vests and their connection to the mystery of the nuclear event. Since the structure is not linear, the order above only reflects how the piece was composed; different sections will be looped depending on gameplay, except for the intro.

INTRO: Always the same, only plays once.
Segment A: Increased cutoff frequency for the bass; replace flutes with synthesized trumpets.
Segment B: Replace instrumentation with more synthesized/metallic sounds, pumping effect; change in the melody.
Segment C: Increased cutoff frequency for the bass; replace flutes and violins with detuned synthesized sounds.
Fig.1 Adaptive process for a first state of tension.

As the intro starts during an introductory dialogue, it is not possible to get to the enemies while it is playing, therefore there is no need to map changes to this initial part of the music. The other sections have defined changes that are designed to increase a state of alertness and a sense of danger.

Technical Implementation

Creating adaptive music for Mechanical God Saga requires audio middleware to connect the music to the game, reacting to triggers from the game engine. For this project, we will use FMOD, which can be integrated with RPG Maker, the game’s programming software. While there is no official connection between FMOD and RPG Maker, additional programming through JavaScript will be necessary to establish a functional link. Daniel Malhadas, the game’s creator and programmer, is currently researching and developing this connection.
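
Since the actual link is still being researched, the following is only a hypothetical sketch of what the JavaScript glue could look like. FmodBridge, its methods, and the event path are invented placeholder names, not an existing RPG Maker or FMOD API.

```javascript
// Hypothetical sketch of the JavaScript glue between RPG Maker and FMOD.
// None of these names come from an official integration (there is none);
// "FmodBridge" stands in for whatever wrapper ends up being written around
// the FMOD Studio API.
const FmodBridge = {
  events: {}, // would cache loaded FMOD event instances, keyed by path

  // Start a looping music event for a scene, e.g. "event:/Music/IridirForest".
  playMusic(eventPath) {
    // ...create an event instance via the FMOD Studio API and start it...
    console.log(`[FMOD] start ${eventPath}`);
  },

  // Set a game-driven parameter, e.g. "Tension" from 0 (calm) to 1 (enemies).
  setParameter(name, value) {
    // ...forward to the running event instance's parameter of the same name...
    console.log(`[FMOD] ${name} = ${value}`);
  },
};

// Example: the map event that spawns the first three soldiers could call
FmodBridge.setParameter("Tension", 1.0);
```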

Within the adaptive system, there are branching and layering needs. As a starting point to understand these systems, the next figure shows a simple system designed for the first forest theme, as there is only one trigger happening, related to a state of tension. The red segments represent the altered music for the state of tension.

Fig. 2 Adaptive mapping system for the initial forest theme.

All transitions and overlay triggers need to be quantized to the BPM of the source cue and have a meter and quantization grid set up. All starts and stops should have adjustable fade-in and fade-out lengths. This will ensure smooth and quick transitions between segments.

For the layering approach, instead of exporting individual layers, I exported each version as a full mix and will blend between them when a trigger occurs. This method addresses the need for continuous changes, such as increasing the bass filter’s cutoff frequency to achieve tension. Ideally, these adjustments would be controlled within the audio engine, but due to software limitations with the virtual instruments used, this is not possible. This approach offers a practical solution to those limitations. FMOD’s built-in parameters, such as equalization curves, pitch, reverb, and delay, will be used to create effects like dulling the music when the player is injured, enhancing immersion with equalization curves and possibly stereo field reduction.
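
As an illustration of this blending idea, a trigger could drive an equal-power crossfade between the calm and tense full mixes. This is a minimal sketch assuming the two mixes run as synchronized tracks with simple gain controls; the names are placeholders, not FMOD calls.

```javascript
// Equal-power blend between two pre-rendered full mixes (calm vs. tense).
// "calmGain" and "tenseGain" stand for the gain controls of two synchronized
// tracks in the audio engine; the names are illustrative only.
function blendLayers(tension /* 0 = calm, 1 = tense */, calmGain, tenseGain) {
  const t = Math.min(Math.max(tension, 0), 1);  // clamp to [0, 1]
  calmGain.value  = Math.cos(t * Math.PI / 2);  // 1 -> 0
  tenseGain.value = Math.sin(t * Math.PI / 2);  // 0 -> 1
}

// Example: halfway through a ramp both layers sit at ~0.707, keeping loudness roughly constant.
const calmGain = { value: 1 };
const tenseGain = { value: 0 };
blendLayers(0.5, calmGain, tenseGain);
```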

As mentioned before, control knobs (either continuous or stepped) are planned to adjust the music based on the scene’s tension, the protagonist’s position, and the health or success rate during battles. If the game’s programmer can access this knob system, applying the necessary changes will be much easier, simplifying communication between the music and game sections.

Scene Analysis and Adaptive Mapping

How to adapt the music to specific situations in the game is a key element of the research, as is considering all possible scenarios in order to support every possible situation. This paper explains the current process, from scene and emotional description to the mapping of adaptive needs and the resulting changes in the music. These methods are tailored to Mechanical God Saga but can also be applied to other games of a similar nature.

First, we will analyze the scenes of the game’s first episode, which includes Iridir Forest and Iridir Prison. The music should take part in the storytelling, or at least support the narrative, so it is very relevant to understand what is happening in each scene.

Iridir Forest

  • Scene Description 

This is the first scene of the game, where the player has to infiltrate the prison to “set god free” or, in other words, execute him. The scene takes place in a forest with a lake. The atmosphere is calm and peaceful until the soldiers appear for a first fight.

  • Story and Gameplay

Our character can wander freely in the forest until he is approached by three soldiers who claim that he killed various soldiers of their squad. He had sworn allegiance to them at the time; they are part of the “empire”. Then comes the first fight against them. After the battle there will be more soldiers walking around, and there is always the possibility to fight them individually (although sometimes additional soldiers join the fight) if the player chooses to, earning points and getting stronger in case of victory. Later there will be another fight with a so-called “beast”, which is stronger than the common soldiers.

  • Music Soundtrack

For this scene I thought it appropriate to have two musical pieces, excluding the fighting music, so that the player doesn’t stay on any one piece for too long, as it can take a while to trigger the next one.

What should the music portray?

First Song (forest wandering): Peacefulness; sense of having to conquer something; the start of a new journey; fantasy.

Second Song (after winning the fight): Relief; sense of victory; positive feeling.

  • Triggers

First Song

  1. When the first three soldiers appear. Add tension.

Second Song

  1. Transitioning to the prison side. Subtle hint of progress, add instrumentation.

  2. Getting to the “rug” that leads to the prison, when the first beast figure appears. It should get more tense, but be careful, as there appears to be a possibility of leaving the area.

  3. In case of winning the battle, return to a less tense and more relaxed version. This segment will end quickly as we transition to the next scene inside the prison.

Iridir Prison

  • Scene Description 

The goal of this location is to rescue an old friend who is imprisoned. It is an underground location, illuminated by old fire lamps. There should be an eerie sense of danger and challenge, as it is a location controlled by the empire and anything can happen at any moment. Metal is a relevant element that might be used in the composition, as it represents the jails.

  • Story and Gameplay

The imprisoned friend was caught robbing the city’s medicine cabinet to save his mother. He was also being watched, as it was claimed that he started a workers’ union and incited strikes, which is true.

At first, when entering the prison, there is the possibility to read a book called “Destructionism and the will to live”, which claims that demons are “man-made” and reflects on their will to live. The prisoner can then be found in the next room, and to release him the player has to find a switch in the room and unlock the prison by solving a small “puzzle”. After this, the prison lock opens and two beasts appear for a fight shortly after. This fight is more advanced in terms of strategy; there is more complexity and use of nature elements.

After the fight there is the possibility of carrying on to other zones, with several more enemies to fight and treasure chests with important items to collect. There are five different zones, and from the last one it is possible to reach the Warden Chambers, where the final beast will be. Here the final “boss” fight of this scene takes place, and the music should be different, more complex and perhaps using uncommon time signatures. The battle can take up to approximately 10 minutes. After the fight there will be another piece of music, and it should become very triumphant when using the travel circle at the end!

  • Music Soundtrack

For this scene there will be two musical pieces, excluding the fighting music. The first will accompany the player throughout the quest and the other one will represent the success of having accomplished the desired goals.

What should the music portray?

First Song: Eeriness, danger, challenge, use metallic elements to represent the jails.

Second Song: Sense of victory; triumphant, conclusion of an important chapter.

  • Triggers

First song:

  1. Entering Iridir Prison F1 room, as a subtle sign of progress.
  2. Flipping the switch that opens the prison. Friend is released. Coloration change, refreshment, higher frequencies.
  3. Beasts appearing after friend is released. More tension.
  4. Entering the second F1 zone. Various fights will happen here; convey more tension.
  5. Entering Iridir F2, subtle sign of progress.
  6. Entering Iridir F3, subtle sign of progress.
  7. Entering Iridir F4, subtle sign of progress.
  8. Opening the golden chest, subtle sign of progress.
  9. Entering F5 zone, subtle sign of progress.
  10. Entering the Warden Chambers where the final beast is located. Peak of tension, epic.

Notes:

Getting back to previous zones is a necessary step in order to progress within this scene, so there shouldn’t be a musical regression when going back to a previous room. The second song is shorter and has no necessary triggers aside from the one that starts it; it should be triggered when reaching the travel circle. Getting back to the forest represents the end of this first episode.
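
On the implementation side, a long trigger list like this could be kept as plain data that the audio bridge reads at runtime. The sketch below only illustrates that idea; the trigger identifiers and property names are invented placeholders that mirror the list above.

```javascript
// Illustrative trigger table for the first prison song. Each entry maps a
// game-side trigger to the musical change the bridge should apply; the
// property names are placeholders, not an existing API.
const iridirPrisonTriggers = [
  { trigger: "enter:IridirPrison_F1", change: { progress: +1 } },
  { trigger: "switch:PrisonUnlocked", change: { coloration: "brighter" } },
  { trigger: "spawn:Beasts",          change: { tension: +1 } },
  { trigger: "enter:IridirPrison_F2", change: { progress: +1 } },
  { trigger: "enter:WardenChambers",  change: { tension: "peak" } },
];

// Returning to an already-visited zone applies no change (no musical regression).
function onGameTrigger(id, state) {
  const entry = iridirPrisonTriggers.find(t => t.trigger === id);
  if (entry && !state.seen.has(id)) {
    state.seen.add(id);
    // ...hand entry.change to the music system (e.g. bump a tension parameter)...
  }
}
```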

Adaptive Elements

The adaptive elements are triggered by control signals from the game. Typical control signals are listed below.

  • Position of the protagonist;
  • Position or distance from the antagonist;
  • Environment: weather/time of the day;
  • Game status (tension/relaxation);
  • Emotional state of the player;
  • Health status of the protagonist;
  • Interaction with Non-Playable Characters (NPCs).

For Mechanical God Saga, the most relevant musical adaptations are linked to the game status and player’s position. These aspects are crucial due to the game’s inherent contrast between tense battle scenes and more relaxed exploratory moments. The music will change as the player moves between different rooms, subtly guiding and marking progress. Additionally, the health status and success of the protagonist in battles will influence the musical adaptation during combat.

Musical elements such as structure, tempo, melodic contour, harmony, and loudness can convey a wide range of emotions to support various gameplay scenarios. To evoke tension, common techniques include increasing tempo, changing time signatures, adding low percussion elements, raising the cutoff frequency of filters, incorporating horn instruments, and using tremolo effects. For instance, Age of Empires enhances tension with low percussion and tremolo strings, while Corrupted Dungeons increases percussion and uses horns when enemies appear.

During battles, injury is often portrayed as a dulling of the senses, which can be represented by muffling the sound, typically using a low-pass filter to make audio elements less distinct. Enhanced breathing and heartbeat sounds can also indicate exhaustion, although these techniques are more effective in a 3D first-person environment and may not be suitable for Mechanical God Saga. Another interesting technique is the reduction of the stereo field, as used in Nier: Automata, which will be tested in our game.

For technical implementation, control knobs are planned to adjust the music based on the scene’s tension, the protagonist’s position, and the health or success rate during battles. The specific implementation details will be explored in the technical chapter.

Regarding battle outcomes, different musical endings will be triggered depending on whether the player wins or loses. Drawing inspiration from Final Fantasy VII, a victory will be marked by an energetic and triumphant arrangement, while a loss will have a dramatic tone that shifts to a hopeful piece, encouraging the player to try again. This principle will be integrated into our game, as players have the opportunity to instantly retry battles.

Game Research

In this documentation, we take a step further into the music creation for the Role-Playing Game (RPG) Mechanical God Saga: Life’s Ultimate Gamble and its adaptive elements. After exploring dynamic implementation techniques in the previous semester, we now apply and tailor them to best serve the gameplay experience. First, it is crucial to understand the game’s concept and how it is played, which will be analyzed in this paper. After understanding the game, there is a better foundation for deciding how to approach the music and what qualities it should have, keeping in mind that it should ideally sound cohesive throughout the game, maintaining a sense of unity between different scenes.

What is the game about?

“Mechanical God Saga” is a video game that places a strong emphasis on storytelling, exploration and the impactful consequences of player choices. It blends key elements of both Japanese and Western RPGs, aiming to revolutionize both genres and craft an immersive experience.

The narrative seeks to explore the themes of totalitarianism, genetic mutation, and the clash between humanity and altered beings in a post-apocalyptic society. It delves into the nature of power, the impact of historical cataclysms, and the potential for coexistence amid oppression and prejudice.

Centuries after a devastating nuclear event, the remnants of humanity are trapped in a totalitarian society that has emerged from the ruins of their former civilization. Genetic mutations, stemming from the nuclear disaster and historical experiments, have given some insects and animals extraordinary abilities, enabling them to coexist with humans as equals.

Among humans, a specific gene continues to manifest, resulting in physical deformities such as wings or scarred skin, which society deems demonic. This has led to persecution by the God Emperor, who seeks to enslave or exterminate these individuals in a religious campaign.

Amid this turmoil, a mysterious entity known as „The Mechanical God“ observes the unfolding events, while its presence and connection to the nuclear catastrophe remain enigmatic.

The narrative raises critical questions:

  • What drives the God Emperor’s genocidal obsession?
  • Can humans and these so-called demons ever achieve peaceful coexistence, or will one group always dominate the other?
  • What dark secrets link the Mechanical God to the origins of the nuclear disaster and the current societal conflicts?

Through these inquiries, the story examines the possibility of overcoming deeply ingrained prejudices and the quest for truth in a world scarred by its past.

What is the goal of the gameplay?

As a Role-Playing Game, there is a significant emphasis on battles and the evolution of playable characters. The player is encouraged to continually enhance the fighting abilities of the characters, whether by exploring various scenarios to find better items and equipment or by manipulating the flow of battle to maximize gained experience.

Gameplay also serves the narrative (and vice versa) in the sense that the player will constantly have to make decisions, sometimes in dialogues but often in actions during battles, affecting how the game world reacts to the characters. Therefore, on the journey to becoming increasingly capable in battles, the player will inevitably have to decide what kind of character they are building and what actions they are willing to take to accumulate more power. These decisions will be based on, but not limited to, moral and interpersonal dilemmas, prompting the player to seek a balance between the moral compass envisioned for the main character and their insatiable quest for power, which is essential to overcome and surpass the challenges presented by the narrative.

How is it played?

There are different ways of interacting with the game:

  • Exploration

The player will physically navigate the character through a series of interconnected scenarios filled with secrets and hidden optional paths. For players who prefer a more straightforward experience, there will be a linear path available to follow from beginning to end. However, players that take the time to explore and carefully consider clues presented through dialogue and visual context will be rewarded with unique items and abilities that, when used strategically, can significantly alter their gameplay experience. 

  • Dialogue and Decisions

Throughout the game, the player will encounter numerous opportunities to interact with other characters and the world by choosing from multiple options. These choices not only affect the overall narrative but also have an impact on how characters and the world react to the player. For example, consistently exhibiting aggressive behavior toward a faction will lead its members to share that information and respond with displeasure. This principle applies equally to villagers, city residents, and even allies.

  • Battles

The game features a complex and challenging battle system. Battles occur in turns, with each character performing their action in sequence.

Unlike other games with more traditional turn-based systems, where battle interaction is limited to choosing options from a menu, in Mechanical God Saga, the player has the possibility to perform a sequence of inputs to maximize damage to the enemy. Similarly, when under attack, the player, by pressing the defense button at the exact moment the enemy attack hits, can defend themselves, reducing damage by 50%. Thus, actions in battle are interactive and dynamic.

When attacking, the player accumulates a combo counter. By successfully defending enemy attacks in their entirety, they can maintain the combo from turn to turn, increasing it each time. Increasing the combo also proportionally increases the damage the player inflicts on enemies. Consequently, by increasing the damage done with each action, the player also proportionally increases the experience points earned when executing their turn. In this way, the defense mechanism is not only suitable for reducing received damage but also for increasing the maximum potential damage to be dealt and for gaining experience points more quickly, efficiently, and intelligently.

This combo counter is retained from battle to battle. This requires the player to always be attentive and carefully execute their actions, even when facing weaker enemies on the path to a challenging enemy, as maintaining this combo can be the difference between victory and defeat.
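
To make the proportional relationships above concrete, here is a rough sketch. The 10%-per-combo multiplier and the base values are invented for illustration only; the real game will use its own balancing formulas. Only the 50% damage reduction on a timed defense comes from the description above.

```javascript
// Rough illustration of the proportional relationships described above.
// The base values and the 10%-per-combo multiplier are invented for the
// example; the actual game will use its own balancing numbers.
function resolveAttack(baseDamage, baseXp, combo) {
  const multiplier = 1 + 0.1 * combo;          // damage grows with the combo
  const damage = Math.round(baseDamage * multiplier);
  const xp = Math.round(baseXp * multiplier);  // XP scales with the damage dealt
  return { damage, xp };
}

// A successful timed defense keeps the combo and halves incoming damage;
// treating a missed defense as a combo reset is an assumption.
function resolveDefense(incomingDamage, combo, timedCorrectly) {
  return timedCorrectly
    ? { damageTaken: Math.round(incomingDamage * 0.5), combo }
    : { damageTaken: incomingDamage, combo: 0 };
}
```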

Audio Programming & Mapping Process

The audio programming process took up basically the entire semester, and it continues to be a focus as I refine it to better align with the compositional idea and ensure the patches are as practical as possible.

I started by mapping the sensors in Max MSP. Each sensor can be mapped to seven parameters using MIDI numbers; the seventh parameter is a measure of total acceleration obtained directly from the SOMI-1 receiver. It is calculated using a Pythagorean formula based on the acceleration data, excluding gravitational force to improve accuracy.

Figure I. Movement parameters of SOMI-1 around the X, Y, Z axis
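
As a quick illustration of that seventh parameter, the sketch below (which could run, for example, inside a Max [js] object) computes a total-acceleration value from the three axes. It assumes gravity has already been removed per axis and that the result is scaled to a MIDI-style 0–127 range; the scaling range is an assumption, since the SOMI-1 receiver provides this value directly.

```javascript
// Sketch of the "total acceleration" parameter described above: a Pythagorean
// magnitude of the three acceleration axes, with gravity already excluded.
// The 0-127 output and the maxAccel range are assumptions made for the example.
function totalAcceleration(ax, ay, az, maxAccel = 2.0 /* assumed range, in g */) {
  const magnitude = Math.sqrt(ax * ax + ay * ay + az * az);
  const normalized = Math.min(magnitude / maxAccel, 1);
  return Math.round(normalized * 127); // MIDI-style 0-127 value
}
```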

What I learned from the mapping process is that even though I tried to be specific with the movements, you generally cannot isolate just one of these parameters while moving. This is a key difference I need to consider when mapping movement sensors compared to other, more stable MIDI controllers. To effectively map the sensors and keep the application of movements in mind, I divided the motions into the seven parameters for each sensor:

  • Rotation X
  • Rotation Y
  • Rotation Z
  • Acceleration X
  • Acceleration Y
  • Acceleration Z
  • Total Acceleration

   Figure II. An overview of sensor mapping in Max MSP

After completing the initial movement mapping, I began creating patches for the interaction part using the aforementioned list. This process is ongoing and will likely continue until the project’s completion. Meanwhile, a crucial aspect I am keen to focus on this summer is implementing patches to the sensors and testing their compatibility both independently and with the violin.

Throughout my learning, I have become aware that the violin limits my movement mapping, and I am still trying to figure out whether I can wear the sensors on my hands or elsewhere; it is also clear that the majority of the mapping concerns the right hand, i.e. the bowing. However, there is also the possibility of mapping the sensors not only to the usual gestures that occur while playing the violin but also to some unconventional movements that trigger certain parameters, which would require more precise mapping. In general, there are a few possibilities for the mapping process that I need to consider and examine thoroughly.

There are several types of mapping strategies that can be employed, regardless of whether the relationship between the control input and the parameter output is linear or non-linear:

One-to-One Mapping: This is the simplest form of mapping, where each control or sensor input directly corresponds to a specific parameter or output.

Multi-Parameter Mapping: This approach allows a single sensor input to influence several aspects of the sound, controlling multiple parameters simultaneously or sequentially. There is also the possibility of switching what the sensors do via a pedal, so that different tasks can be combined for the same sensors.
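
To make the two strategies concrete, here is a small sketch assuming the sensor values arrive as 0–127 MIDI numbers. The destination parameters (filter cutoff, grain density, delay feedback, reverb mix) are placeholders for illustration, not names from an existing patch.

```javascript
// Both mapping strategies, assuming incoming sensor values are 0-127 MIDI numbers.
// The destination parameters are placeholders, not part of an existing patch.
const scale = (v, outMin, outMax) => outMin + (v / 127) * (outMax - outMin);

// One-to-one: one sensor axis drives exactly one parameter.
function mapRotationXToCutoff(rotX) {
  return { filterCutoff: scale(rotX, 200, 8000) }; // Hz
}

// Multi-parameter (one-to-many): a single axis shapes several aspects at once,
// each with its own range and curve.
function mapTotalAcceleration(accel) {
  return {
    grainDensity: scale(accel, 5, 60),         // grains per second
    delayFeedback: scale(accel, 0.1, 0.7),     // linear
    reverbMix: Math.pow(accel / 127, 2) * 0.8, // gentler, non-linear curve
  };
}
```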

What I also have in mind is to avoid counter-intuitive mapping, which involves controlling parameters in unexpected ways, as it adds an element of unpredictability to the performance. I believe this is unnecessary for the project. Instead, my goal for the mapping part is to ensure that my movements have a clear and meaningful relationship with the parameters they control.

Types of Interactions

After finalizing the compositional idea, I attempted to compile a list of all the signal processing types and audio effects that I intend to use in my piece. To practice with fixed elements, the aim was to control the parameters rather than randomize them, so that I can bring the project closer to what I envisioned and keep it executable. In other words, I determined which aspects should be under my control and which should not.

Throughout the semester, I experimented with these different types of interactions to explore my sound possibilities and predict how they would affect the sounds of the violin.

  • Multiple loops (evolving musical patterns)
    – On/Off through the monitor?
    – On/Off via the sensors? This needs more precise and complicated patches, and the patches should track unusual gestures rather than the way I normally play the violin.
  • Audio signal processing: pitch shifting, time stretching, amplitude/dynamics, reverb/delay, timbre
  • Chorus: to create a thicker sound by duplicating the signal and slightly varying the pitch and timing
  • Sound synthesis variations: additive synthesis, FM (frequency modulation) synthesis, granular synthesis
  • Granular synthesis (see the sketch after this list):
    – Grain size: shorter grains are noisier and more textural, longer grains are more faithful to the recorded sounds in the buffer
    – Grain density: higher density sounds thicker and more continuous, lower density gives a scattered texture with more noticeable individual grains
    – Grain shape (windowing function): the envelope applied to each grain to shape its amplitude over time; currently I am using the Hamming window
    – Grain position (start time): the start point of each grain within the original audio sample
    – Grain playback direction: forward or backward!
    – Grain spatialization: one idea is to have the grains move around the listener from everywhere, like rain!
    – Grain sequencing: playing the grains in a different order for more chaotic textures
    – Randomizing the parameters: not my goal, but another possibility!
  • Spatialization: The goal is to begin with headphones that track movements and initially use a binaural mix. Additionally, I plan to explore spatialization, utilizing various IEM plugins, including those designed for granular purposes.
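
Since the grain window shapes the texture so directly, here is a minimal sketch of the Hamming-window grain envelope mentioned in the list above. The buffer handling is simplified and the grain length is an example value; a real Max MSP patch would do this inside the granular engine.

```javascript
// Hamming-window envelope for a single grain, as mentioned in the list above.
// Multiplying each grain by this envelope removes clicks at the grain edges;
// the grain length and buffer handling are illustrative only.
function hammingWindow(length) {
  const w = new Float32Array(length);
  for (let n = 0; n < length; n++) {
    w[n] = 0.54 - 0.46 * Math.cos((2 * Math.PI * n) / (length - 1));
  }
  return w;
}

// Apply the envelope to a grain copied out of a recorded buffer.
function makeGrain(buffer, start, length) {
  const win = hammingWindow(length);
  const grain = new Float32Array(length);
  for (let n = 0; n < length; n++) {
    grain[n] = (buffer[start + n] || 0) * win[n];
  }
  return grain;
}
```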

Compositional Aspect

As I started with audio programming, it felt like I had walked into an unknown world with many possibilities, and it wasn’t clear to me which direction I should take. I came across a conference paper on New Interfaces for Musical Expression by Perry Cook, which helped clarify my initial steps. The paper advises: “Make a piece, not an instrument or controller.” So I decided to come up with the idea of a piece, a specific musical composition, because I didn’t want to create interesting research questions without a real product or future direction.
During the semester, I came up with two compositional ideas, and I aim to work on both of them since the interaction parts between them are not significantly different.

Piece No. I
The first concept comes from my own experience as a classical violinist who transitioned into electronic music. The overall idea involves contrasting aspects of my experience, demonstrating how every action reflects on other elements, and showcasing the complexity of identity. This piece has three sections:

Section I
1. Introduction
2. Then it goes to looping and layering
3. Atonality
4. Dissonant Intervals
5. Full of aggressive bowing
6. Using violin extended techniques
• For the transition from section I to section II, I will either put the violin aside or hold it without playing, and instead control the recorded sound and manipulate it with the sensors.
• Then I will control the amplitude and make a smooth decrescendo to prepare for the second section.

Section II
1. After some choreography, build the contrast on top of a distorted, fading-out loop
2. Play tonal patterns
3. With clear sentences
4. Subtle movements
5. Everything in this section is about contrast
6. Probably no loops at all!

Section III
1. Back to the chaos but this time overlapping and layering consonant intervals and melodies
2. End the piece immediately!
So the concept for this piece is mostly based on multiple loops, evolving patterns, and layering them to create a sort of polyphony. The goal is to alter these layers through various types of interaction to showcase the role of the sensors. I will also build a contrast between the manipulated sounds and the raw sound of the violin, primarily in section two, and then take ideas from each of the first two sections to build the final, concluding section.

Piece No. II
The second piece also evolves from the first one, but with a difference: the violin is used only at the very beginning, and the focus then shifts to the transition from acoustic to electronic sounds.
1. Play the violin melodically, record it into the buffer, and create loops
2. Put the violin aside
3. Wear the sensors
4. Start with different types of interactions step by step
5. The end consists of many synth loops blending into each other

Conclusion and Last Interview

General information about the interviewee: 

Age: 22 years
Gender: Female
Occupation: Student

1.  Understanding Facial Emotion Recognition:

  • Can you describe what the term „Facial Emotion Recognition“ means to you?

The process of identifying human emotion: interpreting human emotions from facial expressions by analysing facial features such as the movement of the eyes, eyebrows, or mouth, and identifying feelings such as happiness, anger, fear, or surprise.

2.  Analyzing Facial Expressions:

  • What helps you analyze the facial expression of another person when you see them?

– The eyes: Changes in the size of the eyes, squinting, or the direction of gaze.

– The eyebrows: Raised, furrowed, or drawn together.

– The mouth: Smiles, frowns, pursed lips, or other changes in the mouth’s shape.

– The environment and context in which the interaction takes place.

– Observing if the facial expression is consistent with the body language.

  • Which specific features and aspects do you pay attention to in this scenario?

I look at whether the eyes are wide or narrow, if there is direct eye contact or not, also in combination with the eyebrows if they are raised or furrowed. I look at the mouth, if someone is smiling, tight-lipped or frowning. I look at the forehead. 

3.  Proportion and Arrangement:

  • Are the proportions and arrangement of facial features important to you when perceiving a face? If so, how?

Yes, the proportions and arrangement of facial features are important when perceiving a face. I generally perceive symmetrical faces as more attractive. Balanced features tend to create a more pleasing and harmonious look. On the other hand, unique proportions and arrangements can make a face more memorable and help in recognizing and remembering faces.

4.  Influence on Emotions:

  • Can certain facial features and expressions influence the way you feel? How do they affect your emotions?

Yes, certain facial features and expressions can indeed influence the way I feel. Seeing someone smile can naturally make me feel happier and more connected. Just as a smile can create happiness in me, witnessing someone’s sadness, anger, or fear makes me feel sad, angry, or fearful as well. If someone shows a fearful expression, I might also feel a sense of urgency or alertness, especially if the context suggests a shared threat or danger. Positive facial expressions, such as nodding and smiling, can provide positive reinforcement, boosting my mood and confidence.

5.  Attractiveness of Faces:

  • What attributes make a face attractive to you? What role does symmetry play?

Faces that are more symmetrical are generally perceived as more attractive. The overall harmony and balance of facial features contribute to attractiveness; features should complement each other without any one feature overpowering the others. A genuine smile that reaches the eyes also helps. I find that unique or distinctive features, such as a particular eye shape, freckles, or a charming smile, can make a face stand out and be more memorable.

  • Is there a difference for you between the attractiveness of male and female faces?

There is a difference in the way I perceive the attractiveness of male and female faces. Certain attributes make a male face more attractive and vice versa. For example, facial hair like a beard or stubble, can add to the attractiveness by emphasizing masculinity. On the other hand softer and more delicate features emphasize femininity. 

6.  Preferences for Makeup:

  • Do you prefer faces with a lot of makeup, very little makeup, or no makeup at all? What degree of makeup do you like the most and why?

I appreciate faces with a natural look enhanced by a touch of makeup. A light touch of makeup can enhance natural features like the eyes, cheekbones and lips but without overpowering them. Minimal makeup maintains the authenticity of the person’s appearance, making them look more genuine and approachable. 

7.  Facial Shapes in Everyday Objects:

  • Do you sometimes recognize facial shapes in everyday objects around you? If yes, can you provide a few examples?

No, I don’t recognize facial shapes in everyday objects around me. 

  • Does this recognition affect the way you feel about those objects or the way you interact with them?

No recognition of facial shapes in everyday objects, therefore no influence on the way I feel or interact with them. 

8.  Controlling Facial Expressions:

  • Do you sometimes try to control your own facial expressions to hide your true feelings or to display emotions you aren’t actually feeling? If yes, why do you do it, and how do you manage it?

I rarely do it, only in particular social situations out of politeness. Sometimes, I prefer to hide anger or frustration to prevent escalating a conflict. Maintaining a neutral or positive expression in professional settings is important for appearing competent and approachable. For example, in roles that involve customer interaction, showing positive emotions can enhance customer satisfaction, even if those emotions aren’t genuine. Deep breathing can help in calming down and maintaining a neutral expression. Moreover, being aware of your own facial expressions and body language can help you manage them.

9.  Relevance to Art and Design:

  • How do you think understanding facial shapes and expressions can benefit artists and designers in their work?

Understanding facial shapes can potentially help artists create more realistic and accurate portraits, as well as more dynamic and expressive faces. In addition, it can help designers create more effective marketing campaigns that evoke specific emotions, making a campaign more compelling and persuasive. Looking to the future, it could also help with creating realistic and expressive avatars, enhancing user interaction and experience in virtual and augmented reality.

  • Can you think of any examples where the perception of facial features has influenced a piece of art or a design project?

The first art piece that I thought of is Leonardo da Vinci’s “Mona Lisa”, as it has intrigued viewers for so many years. Another example could be the Memoji or Animoji by Apple, a feature on the phone that uses facial recognition technology to create a character that mimics the user’s facial expressions. Advertisements often use faces with specific expressions to evoke emotions such as happiness, trust or excitement.

Conclusion

Working with a variety of media—graphical images in different styles, photography, and direct interviews—proved to be highly engaging and insightful for our experiments. This diversity enriched our design research by providing multiple perspectives and deeper insights into user experiences and emotions. The next step is to utilize this material to create a comprehensive survey for a larger test group. By simplifying the topic, directly asking targeted questions, and incorporating different images to evoke and assign emotions, we aim to gather valuable data to further refine our design approach.

19. Series of Interviews 02

General information about the interviewee:

Age: 62
Gender: Male
Occupation: communications engineer

1.  Understanding Facial Emotion Recognition:

  • Can you describe what the term „Facial Emotion Recognition“ means to you?

    Facial Emotion Recognition is a technique for recognising human mood and emotions based on a person’s facial expression. It plays a role in areas such as medicine, security, market research and social research. Automated methods, AI and machine learning algorithms offer great potential for the future use of this technology.

2.  Analyzing Facial Expressions:

  • What helps you analyze the facial expression of another person when you see them?

    When analysing another person’s facial expression, visual cues such as movements of the eyebrows, eyes, mouth and other facial muscles play a crucial role.
  • Which specific features and aspects do you pay attention to in this scenario?

    In the course of evolution, reactions such as the movement of the eyebrows, eyes, mouth, forehead or other areas of the face have become established in human facial expressions. The purpose of these facial expressions is non-verbal communication with the other person, which takes place unconsciously. These facial reactions, which cannot be fully controlled, can be used to draw conclusions about the emotions that trigger them.

3.  Proportion and Arrangement:

  • Are the proportions and arrangement of facial features important to you when perceiving a face? If so, how?

    Yes, they play an important role in distinguishing between different faces. Computer algorithms for facial recognition work with a few distinctive proportions that can identify a face quite clearly. In addition to other features such as colour or skin texture, proportions play an important role in assessing the beauty of a face. Both general and personal standards are used to assess the beauty of a face.

4.  Influence on Emotions:

  • Can certain facial features and expressions influence the way you feel? How do they affect your emotions?

    Certain facial features do have an effect on how a person is perceived. A person with a pretty face is perceived as more likeable. Emotions in the other person’s face directly influence your own emotions, such as pity, joy, confidence or affection.

5.  Attractiveness of Faces:

  • What attributes make a face attractive to you? What role does symmetry play?

    Attributes that make a face attractive are colour, symmetry, proportionality, clear skin, youthfulness, and positive expressions. Symmetry plays a crucial role because it signals genetic health and developmental stability, making the face more visually appealing.
  • Is there a difference for you between the attractiveness of male and female faces?

    Absolutely! Hard and angular faces can definitely look attractive on men, whereas I prefer soft faces on women. A woman’s beard is irritating, whereas it is normal for a man.

6.  Preferences for Makeup:

  • Do you prefer faces with a lot of makeup, very little makeup, or no makeup at all? What degree of makeup do you like the most and why?

    I find make-up rather distracting on men. If at all, it has to be very minimal and inconspicuous.
    For women, I also prefer less make-up, which is always appropriate and gives a natural look. However, some types can tolerate more make-up, which can create a somewhat more exotic look. However, this comes with the risk that it may not suit your own type.

7.  Facial Shapes in Everyday Objects:

  • Do you sometimes recognize facial shapes in everyday objects around you? If yes, can you provide a few examples?

    Yes, sometimes you can recognise facial shapes in objects. A well-known example is the man in the moon, but you can also recognise facial shapes in clouds or in the leaves of trees.
    Some things, such as emojis or pumpkin faces, are modelled on the face.
  • Does this recognition affect the way you feel about those objects or the way you interact with them?

    Absolutely. The moon, for example, is humanised, given a kind of soul. The „smile“ emoji, for example, was developed to generate a positive reaction.

8.  Controlling Facial Expressions:

  • Do you sometimes try to control your own facial expressions to hide your true feelings or to display emotions you aren’t actually feeling? If yes, why do you do it, and how do you manage it?

    Yes, it is often necessary to hide your emotions as much as possible. It is often not customary to show emotions at business meetings. Emotions can signal weakness to the other person, which can be undesirable, especially in competitive situations.
    Finally, a tendency to partially conceal feelings has also developed as a result of upbringing.
    However, certain facial expressions occur unconsciously and can only be controlled imperfectly.

9.  Relevance to Art and Design:

  • How do you think understanding facial shapes and expressions can benefit artists and designers in their work?

    There are two aspects to assessing facial shapes and expressions in art and design.
    Firstly, artists often want to convey or express a mood or emotion. In doing so, they can utilise the effect of facial expressions on other people, which has developed over the course of evolution.
    On the other hand, artists are often dependent on consumer feedback. Here too, an understanding of facial shapes and expressions can be useful.
  • Can you think of any examples where the perception of facial features has influenced a piece of art or a design project?

    Mona Lisa, for example. Or the painting „The Scream“ by Edvard Munch.
    The Greek statues had rather flat facial features, many monarchs had themselves portrayed with determined facial features. In film art, facial expression is an essential element.

General information about the interviewee:

Age: 24 years
Gender: Male
Occupation: PhD Student in Visual Analytics
TU Graz

1. Understanding Facial Emotion Recognition:

Can you describe what the term „Facial Emotion Recognition“ means to you?

When we want to express our feelings, we show them to others by changing our face muscles – known as facial expressions. Another person picks up these emotions through our face which is facial emotion recognition. 

2. Analyzing Facial Expressions:

What helps you analyze the facial expression of another person when you see them?

Shape of mouth (laughing, engineer smile 😐, etc.), muscles around eyes, eyebrows

Which specific features and aspects do you pay attention to in this scenario? See previous question

3. Proportion and Arrangement:

Are the proportions and arrangement of facial features important to you when perceiving a face? If so, how?
A symmetrical face is beautiful to look at, but asymmetrical or disproportional features catch our attention more easily.

4. Influence on Emotions:

Can certain facial features and expressions influence the way you feel? How do they affect your emotions?
When a person expresses sadness through drooping eyebrows, a sad smile, or tears streaming down their face, we naturally won’t feel extremely happy about that – it also makes us feel sadness in some way.

5. Attractiveness of Faces:

What attributes make a face attractive to you? What role does symmetry play?
Correct proportions – this is possible through symmetry. Well adjusted lips – not that botox stuff, especially when the botox thing goes wrong, it destroys symmetry. A skin with correct color tone indicates healthiness -> attractive. 

Is there a difference for you between the attractiveness of male and female faces?
Female faces are attractive by their majestic and elegant shape, they look smooth and symmetrical. Attractive male faces have distinctive features, they look more angular, more “rough” which might indicate strength and robustness in some way (I believe?) 

6. Preferences for Makeup:

Do you prefer faces with a lot of makeup, very little makeup, or no makeup at all? What degree of makeup do you like the most and why?
Of course a shitload of makeup is certainly not attractive and makes the face look artificial and plastic. Very little makeup might help to hide unwanted features or put more emphasis on good features such as lips.

7. Facial Shapes in Everyday Objects:

Do you sometimes recognize facial shapes in everyday objects around you? If yes, can you provide a few examples?
Cars – every single car has that face with the two headlights and the radiator grille. There is even a saying in Austria “Do hob I erstmoi gschaut wie a Auto”

Does this recognition affect the way you feel about those objects or the way you interact with them
Probably a question for psychologists, since this belongs to the research area of our subconscious behavior.

8. Controlling Facial Expressions:

Do you sometimes try to control your own facial expressions to hide your true feelings or to display emotions you aren’t actually feeling? If yes, why do you do it, and how do you manage it?
There are certain funny situations, for example when playing a game with friends which is about lying, you need to have a “Pokerface”

9. Relevance to Art and Design:

How do you think understanding facial shapes and expressions can benefit artists and designers in their work?
If you want to make people buy certain products, you want them to feel happy – so maybe appeal to their subconscious with the help of a “hidden” happy face.

Can you think of any examples where the perception of facial features has influenced a piece of art or a design project? Sorry, no.