#20 | Demonstration video & reflections

At the end of this semester, I would like to give a short demonstration of how my prototype works, so I created a short video that shows its functionality.

Reflections

All in all, I enjoyed the process of Design & Research this semester. This time the work was more hands-on, consolidating my research from the first semester into a rough prototype. I was able to overcome my initial doubts as to how I could make a valuable contribution to my chosen topic, as there are already existing solutions. The potential I saw in my idea was confirmed by the feedback interview I conducted with the Institut für Epilepsie in Graz.

As one can see, this prototype is at a very early stage. Based on future feedback, it needs to be refined in its interaction logic and real content, as well as in its sound and visual design, to also address emotional perception. This prototype could serve as a test object in evaluation practices such as expert reviews, interviews and usability tests to further develop the concept.

Resources

Freesound.org. Downtown Vancouver BC Canada by theblockofsound235. Retrieved June 26, 2024, from https://freesound.org/people/theblockofsound235/sounds/350951/

Freesound.org. VIZZINI_Romain_2018_2019_Emergency alarm by univ_lyon3. Retrieved June 26, 2024, from https://freesound.org/people/univ_lyon3/sounds/443074/

Freesound.org. Busy cafeteria environment in university – Ambient by lastraindrop. Retrieved June 26, 2024, from https://freesound.org/people/lastraindrop/sounds/717748/

#18 | Prototyping tailored emergency information

Now for the most important feature: Allowing people with epilepsy to tailor emergency information to their own form of the disease could actually make a difference, since epilepsy can manifest itself in many ways. I iterated on this for the digital prototype.

Start an alert manually

As I found out in my interview with the Epilepsy Institute in Graz, sensors for detecting an oncoming seizure would need to become more precise and reliable, and many people do not feel the so-called aura before a seizure at all. For them, a warning to bystanders would ideally be triggered automatically. Providing a manual way to start the alarm, however, could make additional detection devices unnecessary for those people with epilepsy who do notice their aura. So I rewrote the label on the app’s start screen to add the „manually“ hint.

Providing general and customized emergency information

In order to differentiate between the pre-made default content for general seizure care and the option to display customized emergency information, I changed the selection component: I replaced the dropdown, which is mainly intended for choosing among multiple options, with a single switch between the default and custom states to simplify the interaction.

The default „General seizure information“ cannot be customized, but is displayed if selected. Once a user selects „Custom Information“, each emergency step can be changed visually and textually in the input fields. At this point, the default information serves as a starting point, so the user does not have to write everything from scratch and can keep parts of it. In addition, users can optionally upload graphics or capture photos to visually support the instructions.

Amount of emergency information

While designing the emergency information settings, I asked myself how much information should be displayed in the settings, and what the maximum amount of information is that people with epilepsy should be able to create.

Because emergency situations are stressful and call for short instructions, I decided to limit the number of steps to five and the ambulance information to no more than the default nine items. Five steps are more likely to be remembered, although even that is uncertain under such circumstances, and it is easier to navigate between fewer steps. It is always possible, and preferable, to provide less information.
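
The prototype itself lives in Figma, so there is no real logic behind these limits yet. As a minimal sketch, assuming a plain JavaScript data model with invented names, the restriction could later be enforced like this:

```javascript
// Hypothetical sketch, not part of the Figma prototype: enforce the
// limits decided above when users add custom content.
const MAX_EMERGENCY_STEPS = 5;  // short enough to remember and navigate
const MAX_AMBULANCE_ITEMS = 9;  // matches the default ambulance information

function addEmergencyStep(settings, step) {
  if (settings.steps.length >= MAX_EMERGENCY_STEPS) {
    throw new Error(`At most ${MAX_EMERGENCY_STEPS} emergency steps are allowed`);
  }
  settings.steps.push(step);
}

// Usage: start from the default content so nobody has to write from scratch.
const settings = { steps: ["Stay calm and stay with the person"] };
addEmergencyStep(settings, "Cushion the head with something soft");
```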

Aspects to be discussed and next steps

Many unanswered questions about customization came up during the prototyping process.

  • Custom emergency information: How do we bold text or add bullets? Do we need these aspects here?
  • User profile: Where do we customize information about the person with epilepsy, such as their name, picture, and how they want to address first responders before the emergency steps are displayed?
  • Organization of custom emergency steps: How do we delete or rearrange emergency steps? How do we reset custom emergency steps to the default content?
  • Thank you screen: Should the thank you screen be customizable and where can it be edited?

Answering these questions may become necessary in the future, but in order to validate this app concept and see whether potential users even understand it and get value out of it, I will leave it at that for now.

#17 | Prototyping gratitude advancement

In order to improve the interaction and overall experience between the person with epilepsy and their first responder, I introduced a new idea in the paper prototype: Once the emergency steps are complete, first responders can optionally leave their contact information in case the person is unconscious or is taken by ambulance. Using the contact information, the person with epilepsy can contact them afterwards to show their gratitude. I iterated on this for the digital prototype.

Types of contact

The first thing I considered for this feature was how a person could get back in touch with their first responder, and what type of contact would be best for that purpose. As both people are likely to be strangers, the contact method should be as low-threshold as possible when it comes to personal data. Either a phone number or an email address would meet these requirements from my point of view. Since the input is not mandatory, I added a button to skip this step.
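
Since the prototype is click-through only, the following is just a sketch of how this optional input could behave, assuming plain JavaScript and invented names: a rough phone number or an email address is accepted, and skipping stores nothing at all.

```javascript
// Hypothetical sketch: validate the optional contact input of a first
// responder. Skipping the step simply stores no data.
function submitContact(input) {
  if (input === null) return { skipped: true }; // the "Skip" button
  const looksLikePhone = /^\+?[\d\s/-]{6,20}$/.test(input);
  const looksLikeEmail = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(input);
  if (!looksLikePhone && !looksLikeEmail) {
    return { error: "Please enter a phone number or an email address" };
  }
  return { contact: input.trim() };
}

console.log(submitContact("+43 660 1234567")); // { contact: "+43 660 1234567" }
console.log(submitContact(null));              // { skipped: true }
```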

Thoughts on individualization

During this iteration, a thought emerged: reaching out personally could be one option, but leaving a personal message, such as a video or voice recording, could be another. The first option would require less effort in terms of how people with epilepsy set up their app, while the latter could provide more direct feedback. However, a recording of any kind would need to be fairly generalized, as the circumstances of incidents vary. For this reason, I decided to go with the first option, which seemed simpler.

End of usage

The last consideration was how a first responder’s use of the app should end after they have started the process, followed the emergency steps, completed them, and optionally left contact information. Once the first responder’s task is complete, their access to the smartphone should be minimized, which could be done by returning to the lock screen. If the person with epilepsy needs to rest for a while, or if medical professionals take over after the incident, it is still useful to keep the medical ID accessible and visible on the lock screen.

Next steps

So far, I have focused mainly on the practical use of this prototype. But for a holistic user experience, the emotional aspect needs to be addressed as well. This could be done by adding more graphical material and defining the overall visual design.

#16 | Prototyping emergency steps

I iterated on my paper prototype by working on the emergency steps and putting more thought into them. For advanced interactivity, I decided to use Figma as a digital prototyping tool.

Added Done button

First, my paper prototype did not yet include an interaction to end an alert after a person with epilepsy has had a seizure. Considering all the possible interactions that need to be displayed on these screens, I decided to place a button next to the elapsed time and named it „Done“. Since this interaction is related to the whole screen and the elapsed time, this seemed to be the right area.

This position could also prevent people from unintentionally ending the emergency information. It is still questionable whether this button is needed on every single screen throughout the steps, but I think it makes sense.

However, the main interactions of displaying the medical ID or making an emergency call can remain in the bottom area this way.

Added confirmation to end emergency steps

To prevent unintentional exits, I also added a confirmation dialog to make sure people know when and if they really want to end the emergency steps.
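
In a web-based implementation this could be as simple as a native confirmation dialog; a minimal sketch, assuming a hypothetical „Done“ button in the markup:

```javascript
// Minimal sketch, not taken from the Figma prototype: ask for confirmation
// before the emergency steps can be ended.
const doneButton = document.querySelector("#done-button"); // assumed element id
doneButton.addEventListener("click", () => {
  const reallyDone = window.confirm(
    "End the emergency steps? Only do this once the seizure is over."
  );
  if (reallyDone) {
    // e.g. navigate to the thank-you screen
  }
});
```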

Thoughts about displaying a person’s name

At the same time, I was thinking about where to display the name of the person with epilepsy. By giving first responders the ability to refer to the person by name, I assume the interaction will feel more personal. Since the emergency step screens already contain a lot of elements, I decided to display the name once at the beginning, before going into the emergency steps.

Added overlay for „When to call an ambulance“

I also added a collapsible element to the prototype that contains information about when it is actually necessary for first responders to call an ambulance. This can be opened at any time during the emergency steps. This should prevent the user from making premature emergency calls that are not necessary in every situation. I tried to keep the information short and easy to scan.
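
On the web, such a collapsible element could be sketched with a button that toggles the overlay’s visibility; the element ids below are assumptions, not taken from the prototype:

```javascript
// Hypothetical sketch: toggle the "When to call an ambulance" overlay,
// which stays reachable on every emergency step screen.
const toggle = document.querySelector("#ambulance-toggle");
const overlay = document.querySelector("#ambulance-overlay");
overlay.hidden = true; // collapsed by default

toggle.addEventListener("click", () => {
  overlay.hidden = !overlay.hidden;
});
```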

Next steps

At this stage of the prototype, two supporting aspects of the emergency steps are still missing: Visual support through graphical material for each step, and auditory cues to make the whole process more noticeable and usable when not looking at the phone. These features will need to be added in further development.

05 – Coding with AI to prototype

Introduction

Alternate Reality Games (ARGs) are an intriguing fusion of real-world and digital experiences, designed to engage participants in interactive narratives and puzzles. These games rely heavily on web technologies to bridge the gap between fictional elements and players interacting in the real world. My participation in an AI Coding workshop during the International Design Week at FH-Joanneum opened my eyes to the revolutionary potential of Artificial Intelligence (AI) in web development, specifically for creating rapid prototypes of websites. In this blog post, I want to explore how AI can be leveraged to enhance the development process of ARGs, ensuring both efficiency and innovation.

What can AI do for an ARG creator?

AI’s integration into software development has been transformative, offering tools that automate coding, streamline processes, and optimize user interactions. In web development, AI technologies have begun to play a pivotal role, especially in automating repetitive tasks and generating code from natural language inputs. For ARGs, which require dynamic and immersive web environments, AI can be a game-changer, offering rapid prototyping capabilities that accommodate the complex, evolving nature of these games.

GPT-4, with its large context window, is an extremely powerful tool for developing HTML, CSS and JavaScript code, and it is able to work with specific libraries such as p5.js, ml5.js or OpenProcessing.

The AI Coding workshop at the International Design Week provided practical insights into these tools. One key takeaway was the capacity of AI not only to understand and generate code, but also to adapt to the developer’s style and project-specific requirements, which is crucial for the unique narratives and interactive elements of ARGs. The workshop emphasized AI’s role as a collaborator, enabling a more intuitive design process that aligns with the creative demands of the ARG developer, which in the case of my thesis project is me.

In the context of ARGs, AI can streamline the entire development lifecycle. During the initial concept phase, AI can help simulate different narrative pathways, allowing developers to refine the story before coding begins. In the design phase, AI-powered tools can suggest web design elements that match the theme of the game. For coding, AI can quickly generate responsive layouts and interactive elements, essential for an ARG that might include puzzles or clues embedded in the website.
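
To make this concrete, here is the kind of small snippet an AI assistant can generate from a one-sentence prompt; the secret word and the clue text are invented for illustration:

```javascript
// Sketch of an AI-generated ARG mechanic: typing a secret word anywhere
// on the page reveals a hidden clue.
const SECRET = "rabbit";
let typed = "";

document.addEventListener("keydown", (event) => {
  typed = (typed + event.key).slice(-SECRET.length);
  if (typed === SECRET) {
    const clue = document.createElement("p");
    clue.textContent = "The next clue waits where the story began.";
    document.body.append(clue);
  }
});
```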

Conclusion

The integration of AI into the development of ARG websites, as inspired by the AI Coding workshop at the International Design Week of FH-Joanneum, presents a compelling advancement in how interactive narratives are crafted and experienced. As AI tools continue to evolve, so too will the possibilities for creating more engaging, immersive, and personalized interactive narratives.

#14 | Paper prototyping Ad-Hoc First Aid Care

With regard to the example of Aura: Seizure First Aid mentioned in the last blog post, I iterated on a paper prototype regarding „Ad Hoc First Aid Collaboration with the Public“. This prototype had its origin in an exercise at the very beginning of the course Design & Research 2.

Underlying concept idea

The present prototype represents a mobile app that has to be manually activated by a person with epilepsy in case of an emergency, if they notice an upcoming aura. Surrounding bystanders are then addressed by visual and auditory cues to pay attention to the smartphone.

Having a look at the smartphone, bystanders are shown what is the matter with the person and how to help. This follows a general set of emergency information that is applicable to nearly every type of epileptic seizure. Bystanders also receive information about when it is necessary to call an ambulance.

Advancement: Tailored emergency information

One of the biggest advancements might be the possibility to individually customize the emergency information to an affected person’s specific condition. For this, I added customizability to the app’s settings. Besides the general emergency information, which is set as the default, users are able to adapt the displayed information steps to their own needs. A visual representation (picture, pre-made illustration, etc.) and a short textual instruction of a few sentences can be chosen, and further steps can be added.

Advancement: Findability of medical ID

Another addition is a lock screen widget that appears once an alert has been activated by the person with epilepsy. In case of a locked phone, this piece helps to give an understanding of the situation and provides access to the first aid instructions as well as the medical ID. The latter is an often hidden feature of mobile operating systems, which becomes more visible through the widget once an alert has been started.

Advancement: Expressing gratitude to bystanders

Lastly, another advancement could be an extended way to end the app experience once a seizure has been overcome: besides communicating appreciation and recognition for the help provided, bystanders can optionally leave their contact details (e.g. a mobile number) for the affected person. Afterwards, the rescued person is able to get in touch with their helper.

Auditory cues

As recommended by the experts mentioned in the previous blog post, the auditory level should play an important role as well when it comes to the smartphone’s findability and to supporting bystanders in following the first aid instructions. Obviously, a paper prototype cannot provide sound due to its material. This is where an, at least partly, digital component has to come in.
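
As a first digital building block, the alert sound could be sketched in a few lines of JavaScript; the file path and button id below are placeholders, for example pointing to one of the Freesound samples listed in post #20:

```javascript
// Minimal sketch of the auditory cue: loop an alarm sample to draw
// bystanders' attention once the alert has been started manually.
const alarm = new Audio("sounds/emergency-alarm.mp3"); // placeholder file
alarm.loop = true;

// Browsers only allow playback after a user gesture, which fits the
// manual alert button anyway.
document.querySelector("#start-alert").addEventListener("click", () => {
  alarm.play();
});
```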

Next steps

Iterating on and extending the paper prototype was quite easy and quick. However, its details have to undergo refinement, and eventualities for various circumstances have to be considered. At this point, it makes sense to transfer this haptic paper prototype into a digital prototype in order to add interactivity and sound.

First Prototypes

In the first lecture of our course Design & Research 2, we were instructed to develop 1-3 prototypes within a three-day timeframe. These prototypes were expected to be „quick and dirty“, yet still relevant to our chosen topic. Initially, I thought this would be challenging and time-consuming. However, it turned out to be both fun and helpful. 

I began by brainstorming ideas. Because I am still in the early stages of the design process, I found it hard to envision potential solutions at this point. Setting a four-minute timer, I began writing down any ideas that came to mind on post-it notes. Under time pressure, my brain generated numerous concepts. I then collected these ideas and proceeded to sketch out various concepts.

Using this method results in the creation of low-fidelity concepts without extensive contemplation. I might not end up using any of the concepts, but it was a great way to start the creative thinking process.

PROTOTYPES

I ended up creating prototypes out of two different ideas: 

Multiplayer Keyboard

This concept is inspired by the observation that many musicians experience a lack of social interaction in their practice. They often end up prioritizing team sports such as football over solo instruments, drawn to the motivational aspects of teamwork. Playing in a band or orchestra is an option, but this idea aims to facilitate collaborative piano playing among friends. Whether with two or four players, the proposed product would enable the creation of harmonious melodies and the exploration of various tones with friends.

Improvisation Motivator

During my research, I observed the benefits of improvisation in music. Musicians of all ages can struggle with motivation and with finding joy in playing their instruments. Practice typically becomes overly rigid and challenging, leading to a decrease in musical self-esteem. With this proposed concept, the aim is to make music practice more fun and playful. The product contains a built-in metronome and a speaker capable of playing various chords and drumbeats. The improvisation motivator is intentionally meant to lower the threshold for solo improvisation practice. Traditional sheet music can be limiting, hindering users from learning in a more enjoyable and interactive manner. To use the product, users simply press the blue button located on the top, enabling them to play over self-selected chords and beats. This approach makes it easier for individuals to experiment with the right tones and create new melodies on their own.
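
Although the improvisation motivator is imagined as a physical product, its core could be tried out quickly in the browser. Here is a minimal sketch of the built-in metronome using the Web Audio API, assuming a fixed tempo; in the real product this would run on the device’s own speaker:

```javascript
// Hypothetical sketch: a bare-bones metronome click via the Web Audio API.
// Browsers require a user gesture (e.g. the blue button) before audio starts.
const ctx = new AudioContext();
const bpm = 90;

function click(time) {
  const osc = ctx.createOscillator();
  osc.frequency.value = 1000; // short, high beep as the click
  osc.connect(ctx.destination);
  osc.start(time);
  osc.stop(time + 0.05);
}

// Schedule eight beats at the chosen tempo.
for (let beat = 0; beat < 8; beat++) {
  click(ctx.currentTime + beat * (60 / bpm));
}
```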