Case Studies of VR in Industrial Training

1. https://www.boeing.com/

2. https://www.siemens.com/

3. https://www.ford.com/

4. https://www.nasa.gov/

5. https://www.ge.com/

User Experience Design for Industrial VR Applications

Calm Technology // 15

This week I tackled the problem of the power supply and decided to make a more polished prototype for it, as I do not expect any changes to this part of the project for the time being. This means that I want to enclose all the functional parts in housings and make it look like a normal power supply as used in household or consumer electronics.

To start with, I ordered a 240V to 12V transformer with a 2.5A output, which will give me a bit of headroom and also power the Wemos board. The rest of the setup consists of a textile power cord, some Arduino wires, a plug, some shrink sleeves and my custom printed case for the transformer. As you can see in the picture below, there are two versions of my custom printed case. I would like to say it is an iteration, but in truth the first version is unusable because of a measurement error on my part. Which just goes to show that the old carpenter’s adage „measure twice, cut once“ also applies to 3D printing in rapid prototyping.
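As a quick sanity check on the headroom claim, here is a tiny sketch; the individual current draws below are illustrative placeholders, not measured values from my build:

```python
# Rough current-budget check for the 12 V / 2.5 A supply.
# The load values are hypothetical examples, not measurements.

def headroom_amps(supply_amps, loads_amps):
    """Remaining current budget after summing all loads."""
    return supply_amps - sum(loads_amps)

# e.g. two motors plus the Wemos board (placeholder figures):
loads = [0.8, 0.8, 0.2]  # amps
```

With those example figures the supply would keep roughly 0.7 A in reserve, which is the kind of margin I mean by „a bit of headroom“.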

Parts

I started by connecting the textile power cord to my custom case, and while the glue was drying on that end, I attached the plug to one side and the Arduino wires for use in the breadboard to the other. When that was done, I connected both ends of the cable to the transformer and fixed the transformer in its custom housing. The final result with the assembled housing is shown in the picture below.

Connecting & Soldering
Final Outcome

Now that the power supply is finished, the setup for controlling both motors at full power is complete. The next step will be to build a rough functional prototype capable of generating the movements needed for Tap.

LoopBoxes – Evaluation of a Collaborative Accessible Digital Musical Instrument

The article „LoopBoxes – Evaluation of a Collaborative Accessible Digital Musical Instrument“ sheds light on an innovative approach to music-making tailored for children with special educational needs (SEN). It presents the development and evaluation of LoopBoxes, an accessible digital musical instrument comprising three modules designed to facilitate both individual and collaborative music creation. The evaluation consists of a pilot study conducted at a music festival in Berlin and a case study in a SEN school setting.

In the pilot study, informal observation and questionnaires from 39 participants were used to assess the instrument’s functionality. The feedback indicated that LoopBoxes successfully enabled music making for individuals of all ages and musical backgrounds, fostering collaborative musical processes. Participants found the instrument aesthetically appealing and easy to use, with positive feedback on its tangible interaction and direct feedback mechanisms.

The case study in the SEN school setting involved observations during music workshops and a semi-structured interview with a music teacher. The study revealed that while some students were able to engage with the instrument independently, others, particularly those with complex disabilities, faced challenges with certain interaction aspects. The modular design of LoopBoxes was praised for its flexibility, allowing students of varying abilities to participate in music-making activities. Future improvements for LoopBoxes include expanding the modules to cater to a wider range of users, providing didactic materials to support collaborative music making, and enhancing the instrument’s flexibility and ease of use. The article emphasizes the importance of individual exploration, scaffolding for collaborative music making, and the need for technology that is ready for immediate use in school settings.

Let’s delve into some aspects of this research and its implications.

Firstly, the development of LoopBoxes demonstrates a commendable effort to create an inclusive musical environment. By targeting children with SEN, the researchers address a demographic often overlooked in mainstream music education. The modular design of the instrument is particularly noteworthy, as it allows for customization and adaptation to suit varying abilities and preferences. This flexibility is crucial in catering to the diverse needs of the target audience.

The findings from both the pilot study and the case study provide valuable insights into the effectiveness and usability of LoopBoxes. The positive feedback regarding the instrument’s functionality and accessibility is encouraging, indicating its potential to facilitate music-making experiences for individuals of all ages and backgrounds. Moreover, the emphasis on collaborative music creation is commendable, as it promotes social interaction and teamwork among participants.

However, the study also highlights some challenges and areas for improvement. It is concerning that some students, particularly those with complex disabilities, struggled with certain interaction aspects of the instrument. This underscores the importance of continuous refinement and adaptation of assistive technologies to ensure they truly meet the needs of all users. Additionally, the need for didactic materials to support collaborative music-making suggests a gap in resource provision, which should be addressed to maximize the educational benefits of LoopBoxes in school settings.

Looking ahead, the proposed future improvements for LoopBoxes offer promising avenues for enhancing its functionality and accessibility. Expanding the modules to cater to a wider range of users and enhancing flexibility and ease of use are crucial steps in ensuring inclusivity and usability. Moreover, the emphasis on individual exploration and scaffolding for collaborative music-making aligns with best practices in special education and should be further integrated into the design and implementation of LoopBoxes.

In conclusion, the research on LoopBoxes represents a significant step towards creating inclusive musical environments for children with special educational needs. While there are challenges to overcome and areas for improvement, the findings underscore the potential of digital musical instruments to foster creativity, social interaction, and learning among individuals of diverse abilities. With continued refinement and investment, LoopBoxes have the potential to make a meaningful impact in the field of music education and accessibility.

ID1 – Where Few NIMEs Have Gone Before: Lessons in instrument design from Star Trek

I picked the paper titled „Where Few NIMEs Have Gone Before: Lessons in instrument design from Star Trek“ by S. M. Astrid Bin. I chose this paper specifically because I was intrigued by the novel idea of basing scientific research and experimentation on ideas shown in popular media and culture. In the following, I will give a short summary of the paper and provide my own reflections and opinions about it.

In the paper, the author details her journey from her discovery of the musical instruments shown in the sci-fi multimedia series „Star Trek“, to analyzing their structure and purpose in the show, to the creation of a functional prototype, a DMI (Digital Musical Instrument), of one such instrument.

She starts off by talking about the role which music and musical instruments serve in Star Trek. Namely, they serve as storytelling devices – they indicate intelligence, are a symbol for humanity and serve to show how different (or similar) alien cultures are to our own. The author briefly talks about three instruments, but ultimately focuses on a fourth one, the Aldean instrument.

A clip showing the instrument as it appears in the show. In the series, it functions mainly as a storytelling device, not as an actual, usable musical instrument.

She starts by analyzing the instrument and identifying its core features. She also mentions how she was able to get in contact with Andrew Probert, the prop designer for the instrument. Once she had a basic set of guidelines to follow, she started work on the physical prototype. The main body of the instrument was made by laser cutting layers of board and sticking them together. Then other features such as lights, a hand grip and decorative designs were added.

I found the way she translated the instrument’s ability (as shown in the show) to „translate thoughts“ (i.e., to express one’s thoughts and feelings through sound) into actual reality by utilizing just two sensors quite fascinating. The author decided to keep it simple and recreate the two basic responses shown in the show: calmness and excitement. To do so, she utilized an acceleration sensor (which we have also already used via ZigSim/Sensors2OSC) and a Trill Bar to detect touch (how many fingers are touching? How hard are they gripping?). With just these simple sensors, she was already able to achieve quite a high level of sophistication with regard to actually playing the instrument.
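To make the two-sensor mapping more tangible, here is a hypothetical sketch of how movement and grip readings could be folded into a calm/excited decision. The function name, weights and thresholds are my own guesses, not taken from the paper:

```python
# Hypothetical mapping of the two sensors described above:
# an accelerometer (movement energy) and a Trill touch bar
# (finger count and contact size) decide between a "calm"
# and an "excited" sonic response.

def classify_gesture(accel_magnitude, touch_count, touch_size):
    """Return ('calm' | 'excited', intensity in [0, 1]).

    accel_magnitude: overall acceleration in g (1.0 = at rest)
    touch_count:     number of fingers on the Trill bar
    touch_size:      average contact size reported by the bar, 0..1
    """
    movement = abs(accel_magnitude - 1.0)   # deviation from resting gravity
    grip = touch_size * touch_count / 5.0   # rough "how many / how hard"
    energy = min(1.0, 0.7 * movement + 0.3 * grip)
    mode = "excited" if energy > 0.4 else "calm"
    return mode, energy
```

A light touch on a still instrument would land in „calm“, while shaking it with a full-handed grip would tip the mapping into „excited“ – mirroring the two responses the author recreated.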

She also talked about how she combined sound and interaction: she created the sounds in Ableton Live and made them interactive in the software Pure Data.

The finished reconstruction of the Aldean instrument.
Image taken from the paper.

Lastly, the author talks about how it feels to play it and about her reflections and takeaways. The instrument isn’t hard to control but requires a certain level of self-control and calmness. Depending on how many fingers are used, how tightly it is gripped and the way it is moved, the sound is either a soft, sparkly sound or a more aggressive sound.

I was able to find a video of her playing the instrument. I was also able to find a link to her website which can be found here: S. Astrid Bin.

One aspect the author mentions, which I found quite intriguing, is how designing a functioning DMI and a TV prop differ. When designing a DMI, the technology is always the main focus and center of the design process. Here, questions such as, „how is it powered?“ and „what can it do and connect to?“ are of vital importance.
The Star Trek instruments, on the other hand, were designed in total opposition to this. They did not have to work, because the sounds were added later in postproduction. Instead, they were designed as storytelling devices: how can this instrument convey the message the writers want to get across? What kind of alien culture would come up with such an instrument? How is it played? When is music played in this culture? Even though both viewpoints deal with the creation of a musical instrument, they are radically different. I found this an interesting point.

The most important takeaway from this process is that an instrument’s cultural context, its life as an object that has a role and purpose in an artistic setting, provides more useful boundaries for instrument design than technical requirements or available technical affordances.

Quotation directly from the paper – see 5. Conclusion

I also found it interesting how she came up with the idea of using an accelerometer for the instrument. When she was texting on Instagram, she dropped her phone and an error message popped up saying „rage shake“ – the app thought she was shaking her phone in rage (interestingly, the most common response to frustration seems to be shaking the device). While humorous and unexpected, it also raises the question of why Instagram tracks accelerometer data. In any case, I found this quite interesting – inspiration can come unexpectedly and from anywhere.

All in all, I found this paper to be quite interesting. It was approachable for somebody who is not well versed in music/audio production and terminology, such as myself, and it focused a lot on interaction design, which was very interesting to me. Basing it on a commonly known, popular media franchise also served to create interest.

15 | Singing Aid

During my exploration of digitalisation in theatre, I noticed that I was not making progress and had difficulties in developing concrete solutions. In our other project „Projection Mapping“, where we create stage visualisations, our group places great emphasis on real-time visualisation and audio reactivity. Therefore, my thoughts often revolved around sound, music, audio, and real-time feedback. This made me aware of an issue in my everyday life that, while not directly related to theatre, is still artistically relevant.

At the beginning of the semester, I decided to join a choir and start singing again. I quickly realised that little remained of my past choral experience and my voice was equally rusty. During rehearsals, I had great difficulty hearing myself to determine if I was hitting the notes and the rhythm, especially in four-part singing. It is enormously difficult to hit the correct note precisely after a long break. I often wished to receive direct feedback during rehearsal to understand if my self-perception matched the actual singing and to adjust the pitch immediately.

I have developed a small prototype of how I envision such assistance. Initially, it was important to determine which aspects make up singing and where direct feedback is useful:

  • Pitch
  • Beat rhythm
  • Melodic rhythm
  • Volume
  • Emphasis, pronunciation

For the prototype, I focused on feedback regarding pitch and melodic rhythm.

Idea

My idea is based on a small device that provides direct feedback through vibration about correct and incorrect intonation. If sung too high or too low, it gives corresponding vibration feedback. An extension of this tool could also relay the rhythm of the piece through vibration feedback.

Concept

The tool consists of a small microphone and a vibration motor. This device can be clipped near the mouth on one’s clothing, capturing the individual’s singing and vibrating if sung incorrectly.
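A minimal sketch of the decision logic such a device could run, assuming the pitch detection itself is handled elsewhere; the tolerance value and function names are placeholders of my own, not a finished design:

```python
import math

# Compare the detected pitch to the target note and decide which
# vibration cue to send. A deviation is measured in cents
# (1200 cents = one octave), which matches how musicians think
# about being "slightly sharp" or "slightly flat".

def cents_off(detected_hz, target_hz):
    """Deviation of the sung pitch from the target, in cents."""
    return 1200.0 * math.log2(detected_hz / target_hz)

def vibration_cue(detected_hz, target_hz, tolerance_cents=25.0):
    """Return 'too_low', 'ok' or 'too_high' for the vibration motor."""
    off = cents_off(detected_hz, target_hz)
    if off < -tolerance_cents:
        return "too_low"
    if off > tolerance_cents:
        return "too_high"
    return "ok"
```

Singing a semitone (about 100 cents) below the target would trigger the „too low“ vibration, while anything within the tolerance band stays silent – the kind of direct, unobtrusive feedback I missed during rehearsal.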

Another type of silent feedback could be sending a small light signal. With this approach, it would be possible to connect a small type of clip with the microphone that can be attached to the music score. This too would signal errors by lighting up if the singing is too high or too low.

Since analysing mistakes after rehearsal is also crucial, an AR app would be a possible implementation to display errors directly on the music score. If the score is also available digitally, this could be directly entered in the app. This allows for preparation for the next rehearsal.

05 – Coding with AI to prototype

Introduction

Alternate Reality Games (ARGs) are an intriguing fusion of real-world and digital experiences, designed to engage participants in interactive narratives and puzzles. These games rely heavily on web technologies to bridge the gap between fictional elements and players interacting in the real world. My participation in an AI Coding workshop during the International Design Week at FH-Joanneum opened my eyes to the revolutionary potential of Artificial Intelligence (AI) in web development, specifically for creating rapid prototypes of websites. In this blogpost I want to explore how AI can be leveraged to enhance the development process of ARGs, ensuring both efficiency and innovation.

What can AI do for an ARG creator?

AI’s integration into software development has been transformative, offering tools that automate coding, streamline processes, and optimize user interactions. In web development, AI technologies have begun to play a pivotal role, especially in automating repetitive tasks and generating code from natural language inputs. For ARGs, which require dynamic and immersive web environments, AI can be a game-changer, offering rapid prototyping capabilities that accommodate the complex, evolving nature of these games.

GPT-4, with its large context window, is an extremely powerful tool for developing HTML, CSS and JavaScript code; it is also able to use specific libraries such as p5.js and ml5.js, or to target platforms like OpenProcessing.

The AI Coding workshop at the International Design Week provided practical insights into these tools. One key takeaway was the capacity of AI not only to understand and generate code but also to adapt to the developer’s style and project-specific requirements, which is crucial for the unique narratives and interactive elements of ARGs. The workshop emphasized AI’s role as a collaborator, enabling a more intuitive design process that aligns with the creative demands of the ARG developer – which, in the case of my thesis project, is me.

In the context of ARGs, AI can streamline the entire development lifecycle. During the initial concept phase, AI can help simulate different narrative pathways, allowing developers to refine the story before coding begins. In the design phase, AI-powered tools can suggest web design elements that match the theme of the game. For coding, AI can quickly generate responsive layouts and interactive elements, essential for an ARG that might include puzzles or clues embedded in the website.

Conclusion

The integration of AI into the development of ARG websites, as inspired by the AI Coding workshop at the International Design Week of FH-Joanneum, presents a compelling advancement in how interactive narratives are crafted and experienced. As AI tools continue to evolve, so too will the possibilities for creating more engaging, immersive, and personalized interactive narratives.

04 – Plum St: Live Digital Storytelling with Remote Browsers

Introduction

The paper Plum St: Live Digital Storytelling with Remote Browsers by Ben Taylor and Jesse Allison is closely in line with my topic of research, which is why I wanted to analyse their take on Internet Art and Digital Storytelling.

The paper explores the integration of Internet art into remote music performances, focusing on live audiovisual storytelling through web browsers. It discusses the use of socket technology to establish real-time connections between performers and audiences, enabling direct control of audiovisual media within the audience’s browsers. The authors present „Plum Street“ as an example of an online multimedia performance that utilizes various web media, including Google Maps and Web Audio, to convey stories and engage with audiences in a contemporary context.

Summary

These are the main points they go through:

  1. Context: The paper discusses the evolution of remote music performance paradigms and the emergence of internet art movements, highlighting the use of web-based tools and interactive installations in artistic expression.
  2. Plum Street: The authors introduce Plum Street as a platform for remote storytelling, leveraging web technologies like sockets and JavaScript to enable real-time interaction with audiences through their web browsers.
  3. Gesture Distribution: Plum Street enables performers to control audience browsers, creating a distributed performance experience where viewers can actively participate in the narrative.
  4. Media Components: The performance utilizes various web elements such as HTML, JavaScript, APIs, and Web Audio to craft its narrative, focusing on themes of absence, invisibility, and daily life experiences.
  5. Conclusion: The paper concludes by highlighting the potential of web browsers as a medium for live performance, particularly in blending electronic music composition with networked media art. It suggests that advancements in technology, such as JavaScript server toolkits and the Web Audio API, offer exciting opportunities for innovative performance paradigms.
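To make the gesture-distribution idea (point 3) concrete for myself, here is a hypothetical sketch of how one performer gesture could be packaged for broadcast to audience browsers over a socket connection. The message fields are my own invention, not the authors’ actual protocol:

```python
import json
import time

# Package a single performer gesture as a JSON message, the kind of
# payload a socket server could broadcast so every audience browser
# reacts to the same action (e.g. panning a map, starting audio).

def gesture_message(action, target, params, timestamp=None):
    """Encode one performer gesture as a JSON string for broadcast."""
    return json.dumps({
        "action": action,    # e.g. "pan_map", "play_sample" (hypothetical)
        "target": target,    # which browser element the gesture drives
        "params": params,    # action-specific parameters
        "t": timestamp if timestamp is not None else time.time(),
    })
```

On the receiving side, each browser would decode the message and apply the action locally – which is what turns many separate screens into one distributed performance surface.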

Conclusions

The paper underscores the transformative potential of integrating internet art with remote music performance, presenting Plum Street as an innovative example. It emphasizes the significance of utilizing web browsers as dynamic instruments for storytelling and suggests that the convergence of technology and artistic expression in online performances opens new avenues for creative exploration at the intersection of electronic music composition and networked media art. I believe that this is a piece in the puzzle of my research that I can actually take and utilize as one part of my phygital prototype.

Reference

Ben Taylor and Jesse Allison. 2013. Plum St: Live Digital Storytelling with Remote Browsers. In Proceedings of the International Conference on New Interfaces for Musical Expression (NIME).

03 – Community based storytelling – Lore & Creepypasta

Introduction

Exit Reality is a book written by Valentina Tanni, digital art historian and associate professor at Politecnico di Milano, as well as my Master’s Thesis mentor. The book talks about internet aesthetics, trends and phenomena that are born online and create community and a sense of belonging for those who spend time watching or creating that content. In this blogpost I am going to analyze the 4th chapter of the book, called Lore – Vertigini procedurali: creepypasta, Backrooms e liminal spaces, and draw inspiration from these concepts for my ARG project, its shape and content.

The Slender Man, SCP Foundation, Backrooms

Creepypastas are horror stories that start, spread and become famous through the internet and its communities. They are then expanded, modified and pushed into mainstream media through fan-made content, creating a web of connected narrative pieces that paint a hidden lore on the intangible internet canvas. One of the most famous examples is The Slender Man, a creepypasta that started from a blurry photoshopped photo and a post on the Something Awful forum. Spreading as if it were a virus, the story took form in the minds of the online userbase through more forum posts, videogames, videos, books and movies – all of them generated by community members.

First image of the Slenderman by Eric Knudsen on Something Awful forum

The SCP Foundation project started on /x/ (paranormal), a famous 4chan board, and uses a storytelling method that is slightly different from the previous example. Where Slender Man was the subject of many users’ ideas that anyone could create and post online, the SCP Foundation is a collaborative narrative: users of the forum are welcome to submit their stories, and after a check by the head members of the website they can be accepted if they adhere to the tone of the main story, creating a complex and detailed lore.

The Backrooms is a more recent example of a creepypasta turned ARG. It started with a photo of an empty room with yellow floors and walls and bright neon lights on the ceiling; the uncanny sensation of familiarity sparked in 4chan users a sense of mystery. Bits and pieces of story were created by community members speculating on what that strange building could be, until a YouTube channel by the name of Kane Pixels started uploading a series of videos depicting people entering these „backrooms“ – and this is where the ARG starts. A clear example of the „This Is Not A Game“ concept: online communities began theorising about where this place can be found, what it is and who created it. This ARG is still ongoing, and many players are trying to piece the story together to understand not only what is being shown but why – what message the author, or Puppet Master in ARG terms, wants to convey.

Differences between the Traditional Storytelling Model and The Transmedia Model according to media artist Jeff Watson

Insights and conclusions

I personally find it extremely interesting how these different storytelling methods were born as a byproduct of the internet’s decentralised structure. Both user-generated content, like Slender Man and its lore, and collaborative narratives like the SCP Foundation are, in my opinion, extremely effective ways to tell a compelling and engaging story. The work done by Kane Pixels demonstrates that starting from such material and then evolving it into an ARG is a very coherent process. That is why, for my project, it could be useful to start from an existing fan base or internet subject and add a piece of the narrative myself.

Reference

Valentina Tanni. Exit Reality. Chapter 4: „Lore – Vertigini procedurali: creepypasta, Backrooms e liminal spaces“.

02 – What is Interactive Storytelling and why ARG are an effective form of it

Introduction

In a world where narratives are increasingly multi-dimensional and user-driven, the concept of interactive storytelling has emerged as a powerful medium through which audiences are not just passive consumers but active participants in narrative experiences. This blogpost delves into the intricacies of interactive storytelling, highlighting its history, various forms, and the defining characteristics of Alternate Reality Games (ARGs). Furthermore, it explores the strategic use of digital and physical interactions in crafting compelling stories and concludes with practical insights on effectively integrating these elements into ARG projects.

Definition of Interactive Storytelling

Interactive storytelling refers to a narrative form that allows the audience to influence or shape the story’s progression through their decisions and actions. Unlike traditional storytelling, where the narrative is fixed, interactive storytelling is dynamic, with multiple potential outcomes and pathways that depend on audience engagement. This form of storytelling is characterized by a high degree of participant agency, which can significantly affect the narrative’s course and conclusion.

History of Interactive Storytelling and Media Examples

Interactive storytelling is not a new phenomenon. Its roots can be traced back to early role-playing games and choose-your-own-adventure books, which allowed readers to make choices that affected the story’s outcome. With technological advancements, interactive storytelling has expanded into various media including:

  • Video Games: Games like „The Walking Dead“ by Telltale Games and „Mass Effect“ offer narrative choices that impact the game’s world and outcomes.
  • Interactive Cinema: Movies such as „Bandersnatch“ on Netflix allow viewers to make decisions that alter the story’s direction.
  • Virtual and Augmented Reality: These technologies provide immersive experiences where users can interact with the narrative environment in meaningful ways.
  • Online and Social Media Platforms: These platforms facilitate interactive web series and social experiments where audience inputs directly influence the unfolding events.

ARG and Its Role in Interactive Storytelling

Alternate Reality Games (ARGs) represent a unique blend of real-world and digital storytelling, where players collaboratively solve puzzles and uncover layers of a story that exists across multiple platforms. ARGs are distinct because they blur the lines between in-game and out-of-game experiences, creating a pervasive narrative that engages players deeply and personally.

Key Elements & terminology of ARG

Some of the terms essential to understanding discussions about ARGs are:

  • Puppet-master – A puppet-master or „PM“ is an individual involved in designing and/or running an ARG. Puppet-masters are simultaneously allies and adversaries to the player base, creating obstacles and providing resources for overcoming them in the course of telling the game’s story. Puppet-masters generally remain behind the curtain while a game is running. The real identity of puppet-masters may or may not be known ahead of time.
  • The Curtain – The curtain, drawing from the phrase, „Pay no attention to the man behind the curtain,“ is generally a metaphor for the separation between the puppet-masters and the players. This can take the traditional form of absolute secrecy regarding the puppet-masters‘ identities and involvement with the production, or refer merely to the convention that puppet-masters do not communicate directly with players through the game, interacting instead through the characters and the game’s design.
  • Rabbit-hole/Trailhead – A rabbit-hole, or trailhead, marks the first media artifact, be it a website, contact, or puzzle, that draws in players. Most ARGs employ a number of trailheads in several media to maximize the probability of people discovering the game. Typically, the rabbit-hole is a website, the most easily updated, cost-effective option.
  • This Is Not A Game (TINAG) – Setting the ARG form apart from other games is the This Is Not A Game sentiment popularized by the players themselves. It is the belief that „one of the main goals of the ARG is to deny and disguise the fact that it is even a game at all“.

Effective Storytelling Through Interaction

Effective storytelling through interaction transcends traditional narrative forms by transforming passive observers into active participants. This paradigm shift fundamentally alters how stories are consumed, perceived, and remembered, offering a profound depth of engagement that static narratives cannot achieve.

Firstly, interaction in storytelling significantly heightens engagement. When audience members are given the opportunity to influence the narrative, their investment in the content naturally increases. This is because they are no longer merely absorbing information but are also actively shaping the course of events. The sense of agency this provides can make the narrative experience deeply personal and much more engaging. For instance, when a player in an ARG makes a decision that leads to a notable consequence within the game, the emotional stakes are heightened. Their choices feel impactful, which can lead to increased attentiveness and eagerness to see the outcomes of their interactions.

Moreover, this form of storytelling enhances emotional investment. As participants navigate through the story, making choices and experiencing the repercussions of those choices, they develop a connection to the narrative that is far stronger than if they were simply being told the story. This connection is not just about making decisions; it’s about seeing themselves reflected in the outcomes. When a storyline adjusts based on user input, it creates a personalized narrative arc that can resonate on a deeper emotional level, making the overall experience more meaningful.

Additionally, interactive storytelling facilitates a deeper understanding of the narrative and its underlying themes. By engaging directly with the story’s elements, participants can uncover nuances and explore complexities in ways that passive consumption does not allow. This hands-on approach encourages a more active form of learning and comprehension. For example, in an ARG, unraveling a puzzle requires understanding its context within the larger story, thereby promoting a more nuanced engagement with the narrative’s themes and messages.

Beyond just understanding, this method of storytelling also promotes empathy and perspective-taking. Participants who find themselves making moral or strategic decisions in a story can begin to empathize with characters facing similar choices, understanding their motivations and dilemmas on a more intimate level. This empathy is not merely academic but is felt, as participants navigate the emotional landscape of the characters they interact with or portray.

Finally, interactive storytelling often involves a mix of digital and physical interactions. Digital interactions might include navigating web interfaces, making choices in a video game, or participating in social media-driven events. Physical interactions can involve attending real-world locations, interacting with physical objects, or engaging in live events. Each mode of interaction offers different strengths in engagement and immersive potential.

Applying Interactive Elements in ARG Projects

Creating an ARG is a complex endeavor, especially in an experimental environment such as my master’s thesis project. Here are a few points on how I might approach this challenge, infusing my personal vision and academic goals into the project.

Phygital Interactions

A combination of digital and physical platforms is a must to capture the essence of ARGs. For example, I might develop a website that serves as the central hub for my ARG, supplemented by social media accounts that characters in the game use to communicate with players. Additionally, incorporating real-world tasks – such as visiting specific locations to collect clues or interacting with objects – can significantly enhance the immersive quality of my ARG. By integrating various media, I ensure that participants experience a rich, multi-dimensional story that leverages the strengths of each platform.

Maintaining a coherent narrative

One of the greatest challenges in designing an ARG is keeping the narrative coherent and engaging across all platforms and interactions. I want to tackle this by carefully planning the storyline and having a clear understanding of how each puzzle and task fits into the overall narrative arc. I am considering creating a detailed timeline and a flowchart that maps out all the main events and decision points in my ARG. This will help me ensure that, regardless of the players’ choices or discoveries, the story remains cohesive and compelling.
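A flowchart of events can even be checked programmatically. Here is a toy sketch (all event names are placeholders of my own) that treats the narrative as a directed graph and verifies that an ending stays reachable from the rabbit-hole, whatever branch a player takes:

```python
# Model the ARG narrative as a directed graph: each key maps an
# event to the events a player can reach next. A simple traversal
# then tells us which endings are reachable from the entry point.

def reachable_endings(graph, start, endings):
    """Return the set of endings reachable from `start`."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(graph.get(node, []))
    return seen & set(endings)

# Placeholder storyline: two trailhead branches converge on a puzzle.
story = {
    "rabbit_hole": ["clue_site", "social_post"],
    "clue_site": ["puzzle_1"],
    "social_post": ["puzzle_1"],
    "puzzle_1": ["finale"],
}
```

Running this kind of check whenever I change the flowchart would catch dead-end branches early, before a playtester gets stuck in one.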

Iterate and Test

Since my project is a prototype, iterative testing and feedback are vital. I could organize test runs with fellow students or volunteers, gather their feedback on various elements of the game, and use this information to refine the experience. This iterative process not only improves the quality of my ARG but also demonstrates my commitment to user-centered design in my thesis, an important aspect of interactive storytelling.

Conclusion

Interactive storytelling, particularly through ARGs, represents a frontier in narrative techniques where the line between storyteller and audience is fluid and collaborative.

By understanding its principles and applying its strategies effectively, creators can craft immersive and impactful narratives that engage participants in profound and unique ways. As we move forward, harnessing the full potential of interactive elements will be crucial in evolving storytelling practices and delivering richer, more participative narrative experiences.

References

https://benhoguet.medium.com/what-is-interactive-storytelling-46bfdd2a8780

https://benhoguet.medium.com/a-short-history-of-interactivity-6fe72f7defea

https://en.wikipedia.org/wiki/Alternate_reality_game