6 | Theatre in the digital time

In my last post, I presented a work from the project Im/material theatre spaces, which offers a potential answer to my question about a digital construction rehearsal.

As the project Im/material theatre spaces encompasses further works that delve into digital aspects, particularly the theme of virtual and augmented reality in the theater environment, I would like to discuss additional projects, as they can serve as inspiration for my own thoughts.

This research project explores the synergy between immersive technologies and centuries-old theater knowledge. The researchers posit that theater and virtual/augmented reality (VR/AR) share spatial immersion and methods, addressing questions of participation and changing perspectives. VR, through complete immersion, opens up new storytelling possibilities, allowing shifts in perspective and the embodiment of different roles. AR, on the other hand, enriches reality by overlaying it with digital content, creating a fusion between the real and digital worlds.

Research Questions

The project addresses key questions to unlock the potential of VR/AR in theater, exploring practical applications in architecture, stage design, and theater technology. Specific inquiries include the use of augmented reality in planning theater renovations, improving safety standards backstage with digital technologies, and employing immersive technologies to provide innovative access to cultural heritage.

Goals

The overarching goal of the research project is to establish theaters and event venues as ongoing hubs of technical innovation. By investigating the intersection of analogue and digital worlds, the project aims to make these new technical spaces usable for theater practitioners. The focus lies on developing prototypical solutions, communicating findings to the theater landscape, and fostering a sustained dialogue through workshops, lectures, and blog posts. The publication serves as a comprehensive overview of the project’s results, methods, and an exploration of potential future developments in the theater and cultural landscape.

Background:

The project addresses the lack of knowledge about assembling and dismantling complex equipment in the events industry. Not everything can be adequately conveyed through training, and many assembly and operating instructions are impractical or too vague in paper form. To ensure safety during construction, this project aims to develop digital support.

A functional prototype, specifically addressing the AR-supported setup of a curtain rail, has been developed, serving as a practical foundation for further discussions. Through an interactive website, assembly instructions are displayed in detailed steps, supported by 3D animations. The technology allows usage on conventional screens or immersively through Augmented Reality glasses or AR functions on smartphones. The website offers flexibility for future instruction updates without the need for end-device updates.
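
To make this more concrete, here is a minimal sketch of how such a browser-based, step-by-step instruction viewer could be organised. It is written in TypeScript; the step data, element IDs and API route are invented for illustration, and the actual 3D/AR rendering (for instance via a web component such as Google's model-viewer) is only hinted at in a comment.

```typescript
// Hypothetical data model for one assembly instruction set (e.g. the curtain rail).
interface AssemblyStep {
  title: string;
  description: string;
  modelUrl: string; // URL of a glTF animation illustrating this step
}

// Steps are fetched from the server, so instructions can be updated
// centrally without touching the end devices.
async function loadSteps(instructionId: string): Promise<AssemblyStep[]> {
  const response = await fetch(`/api/instructions/${instructionId}`); // assumed route
  return response.json();
}

// Very small state machine that walks through the steps and updates the page.
class InstructionViewer {
  private index = 0;
  constructor(private steps: AssemblyStep[]) {}

  render(): void {
    const step = this.steps[this.index];
    (document.getElementById('step-title') as HTMLElement).textContent =
      `${this.index + 1}/${this.steps.length} – ${step.title}`;
    (document.getElementById('step-text') as HTMLElement).textContent = step.description;
    // The viewer element could be a <model-viewer> tag, which also offers an AR mode
    // on supported smartphones; here we only swap its model source.
    document.getElementById('step-model')?.setAttribute('src', step.modelUrl);
  }

  next(): void {
    if (this.index < this.steps.length - 1) { this.index += 1; this.render(); }
  }
  previous(): void {
    if (this.index > 0) { this.index -= 1; this.render(); }
  }
}

// Usage (assumed element IDs and instruction ID):
loadSteps('curtain-rail').then((steps) => {
  const viewer = new InstructionViewer(steps);
  viewer.render();
  document.getElementById('next')?.addEventListener('click', () => viewer.next());
  document.getElementById('prev')?.addEventListener('click', () => viewer.previous());
});
```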

Feedback:

The digital assembly aid was generally deemed helpful, especially due to the detailed representation of complex steps. Realistic representation was considered necessary, particularly for quick component identification. Usage on a tablet or touchscreen was preferred, while Augmented Reality glasses were viewed as promising for the future. The desire for a personal account was expressed to customize existing instructions. The application could be used for notes and specific solutions within the house or for different productions. It was noted that the application could be useful for additional instructions and the visualization of theater projects. A technical obstacle lies in providing and maintaining high-quality 3D data.

Additional Areas for Digitization in Internal Processes:

  • Inventory control systems
  • Calculation tools
  • Warehousing
  • CRM systems for customer service

How could the project be continued, and what future applications could arise from the initial prototypes? Based on project feedback, the development of an individualized and fully automated creation of stage mounting systems could be pursued. A website could be created, allowing free configuration of a modular rail system. The individualized system could then serve as the basis for the automatic generation of precise assembly instructions.
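
As a thought experiment, the configurator idea could look roughly like this: a data structure describes the individually configured rail system, and a small function derives ordered assembly steps from it. All names, fields and rules below are assumptions for illustration, not part of the actual project.

```typescript
// Hypothetical description of a configured modular rail system.
interface RailSegment {
  lengthMm: number;
  coupler: 'none' | 'screw' | 'quick-lock';
}

interface RailConfiguration {
  segments: RailSegment[];
  suspensionPointsEveryMm: number;
}

// Derives an ordered list of human-readable assembly steps from a configuration.
// In a real system each step would also reference the matching 3D animation.
function generateInstructions(config: RailConfiguration): string[] {
  const steps: string[] = [];
  const totalLength = config.segments.reduce((sum, s) => sum + s.lengthMm, 0);

  steps.push(`Lay out ${config.segments.length} rail segments (total length ${totalLength} mm).`);
  config.segments.forEach((segment, i) => {
    if (i > 0) {
      steps.push(`Join segment ${i} and ${i + 1} using the ${segment.coupler} coupler.`);
    }
  });

  const suspensionCount = Math.floor(totalLength / config.suspensionPointsEveryMm) + 1;
  steps.push(`Attach ${suspensionCount} suspension points, spaced every ${config.suspensionPointsEveryMm} mm.`);
  steps.push('Check alignment and torque of all couplers before hoisting.');
  return steps;
}

// Example: three 2 m segments with quick-lock couplers.
const instructions = generateInstructions({
  segments: [
    { lengthMm: 2000, coupler: 'none' },
    { lengthMm: 2000, coupler: 'quick-lock' },
    { lengthMm: 2000, coupler: 'quick-lock' },
  ],
  suspensionPointsEveryMm: 1500,
});
instructions.forEach((s) => console.log(s));
```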

DTHG: Final publication of the research project „Im/material Theatre Spaces“

Nudging Festival Attendees

This week's blog post will be a bit shorter because of the workload this week. I want to continue on the theme of nudging and reflect a bit on how it might be utilized in a festival setting.

However, first I want to reflect a tiny bit on the ethics of using nudges. Nudging as a technique has been criticized as a form of social control and as potentially unethical. This criticism, however, mainly concerns the use of nudges in the design of public spaces and by public officials. In these cases, as a member of the general public, you do not really have a choice about whether to interact with them or not. This is a nuanced debate with no clear answers, but I will not delve into it in this project. The spaces I am focused on are not public spaces, and audience members can choose whether to be there, and also whether they wish to return.

According to the article that last week's blog post was primarily based on, one of the key issues in utilizing nudges effectively is evaluating them. To most designers this seems like a no-brainer; iteration and user feedback are essential to most of our processes. However, this might not be as simple if you are part of festival staff that is only hired for the festival and moves on to other projects and jobs after the festival is complete. So systems to effectively evaluate your nudges would have to be in place before you even start thinking about which ones to try. However, I had a discussion with Karoline, whom I interviewed two weeks ago, and post-festival evaluations were conducted for at least two of the larger festivals she had worked on. This is a very small sample size, but it suggests that at least larger festivals have systems in place to evaluate how well everything went, and these could be extended to nudges.

In my research this week I found the article “How nudging inspires sustainable behavior among festival attendees: A qualitative analysis of selected music festivals”. This is an extremely relevant article for my research, as it looks directly at the unique aspects of festivals when it comes to implementing nudges and changing behaviors. As the article is focused on sustainable behaviors, most of the concrete suggestions are not relevant for my research, but there are still some very interesting findings.

Key findings:

  • The relationship between festival attendees and organizers influences how easily the attendees respond to prompts and nudges. A more personal relationship makes it easier for attendees to follow instructions.
  • Larger crowds, where people are more anonymous, are harder to nudge.
  • Expectations and communication before the event can have a great effect.
  • Sanctions and consequences for negative behaviors do not have a great effect when people feel anonymous in a crowd.
  • Creating a sense of inclusion in the group and as a part of the festival is key.

These are the findings for this week, and I want to continue exploring this area next week.

https://www.mdpi.com/2071-1050/14/10/6321

5 | Theatre in the digital time

I have identified model building and mock-ups as focal points, as new technologies such as VR and AR find a suitable application, particularly in space planning. They allow for the visualization of project ideas in actual size, enabling movement within the space, direct adjustments to the stage set, and evaluation of the impact of individual elements without the need for large physical models. I am exploring the potential of virtual spatial extensions for stage set planning and will also delve into existing projects that have developed concepts in this area.

Virtual modeling for stage sets offers diverse potential and brings numerous advantages:

  1. Efficient Conceptualization: Digital modeling enables precise and efficient conceptualization of stage sets. Creative ideas can be visualized and adjusted quickly.
  2. Collaboration and Teamwork: Virtual models promote improved collaboration across the entire production team. All stakeholders, from the set designer to the director and lighting designers, can work on the same digital model in real-time and provide feedback.
  3. Resource Efficiency: Avoiding physical models significantly reduces material consumption, contributing not only to cost savings but also to environmental friendliness.
  4. Sustainability: Digital models allow for a more sustainable approach. By avoiding physical materials and using eco-friendly software solutions, the ecological footprint can be minimized.
  5. Flexibility and Adaptation: Digital models are easily adaptable, providing flexibility for changes during the design process, crucial in an industry often characterized by spontaneous ideas and creative adjustments.
  6. Simulation of Light and Effects: Virtual modeling allows the simulation of lighting conditions and effects, aiding in understanding and optimizing the visual impact of the stage set under different conditions.
  7. Archiving and Reuse: Digital models can be archived and reused for future productions, saving time and resources for upcoming projects.

Overall, virtual modeling for stage sets offers a contemporary and innovative approach that not only optimizes workflow but also provides ecological and economic benefits.

Complex programs like Autodesk’s AutoCAD enable the design and visualization of ideas for a stage production, supporting the process from design through modeling and prototype creation to production. Both 2D and 3D renderings can be designed for better concept communication. However, as AutoCAD focuses heavily on technical aspects and offers numerous features, it may not be suitable for quickly creating stage set concepts. A program with the potential to unite all elements of a stage set (construction, materials, lighting) and make them easily adjustable would be beneficial.

A subproject of the research project „Im/material Theater Spaces“ focused on developing methods and tools for virtual construction rehearsals that can take place in the virtual space and are location-independent.

During the construction rehearsal, the artistic and technical teams come together for the first time in a large group. Here, the previously submitted stage set design model is marked on the stage to visualize ideas and dimensions in a 1:1 scale. Through discussions, the technical feasibility is examined. It is crucial that during the rehearsal, the overall impression, dimension, and atmosphere of the stage set can be experienced for the first time, providing insight into the impact the stage set has on the audience. Additionally, depending on the stage set, the materiality and the use of light and projections are tested for the first time. The approach of the construction rehearsal can also be compared to Greyboxing in gaming, where relevant parts are prototypically simulated. Wouldn’t it be practical to incorporate this aspect directly at the beginning of the design and conceptualization process and use its effect as a design and idea driver?

The result of the project includes methods and tools for conducting virtual construction rehearsals. A workshop format called „How to go Virtual“ was developed to test practical applications with theaters and venues on-site. The focus was on practical application, experimenting with existing applications and programs, and playing through individual usage scenarios.

Results of the project: the project developed workflows and methods for virtual construction rehearsals that serve as guides and tutorials for the theatrical landscape. Three different workflows were developed:

Sketchfab / Zoom „Semi-virtual Preliminary Rehearsal“ (see the sketch after this list)
  • Utilizes the Sketchfab platform for model presentation.
  • Enables discussions via video conferences and shared screens.
  • No live editing of the 3D model during the discussion.
Mozilla Hubs „Rehearsal for Everyone“
  • Uses the Mozilla Hubs platform for virtual rehearsals.
  • Easy accessibility for up to 30 people.
  • No live editing of the 3D model during the discussion.
Virtual Construction Rehearsal with VR-Sketch
  • Utilizes the paid plugin VR-Sketch for SketchUp.
  • Enables discussions and construction rehearsals via VR headsets.
  • Live editing of the virtual 3D model during the discussion.
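
To illustrate the first workflow, here is a small TypeScript sketch of how a shared stage-set model could be embedded on a web page via Sketchfab's standard embed URL and then discussed over a video call. The model ID and element ID are placeholders, and the embed options are only an assumption of typical settings.

```typescript
// Minimal sketch: embedding a shared stage-set model for the "semi-virtual
// preliminary rehearsal". The Sketchfab model ID below is a placeholder; in
// practice the designer uploads the model and shares the page with the team.
function embedSketchfabModel(container: HTMLElement, modelId: string): void {
  const iframe = document.createElement('iframe');
  iframe.src = `https://sketchfab.com/models/${modelId}/embed`;
  iframe.width = '100%';
  iframe.height = '480';
  iframe.allow = 'autoplay; fullscreen; xr-spatial-tracking'; // assumed permissions
  container.appendChild(iframe);
}

// Usage during a Zoom call: everyone opens the same page, or one person shares their screen.
const container = document.getElementById('rehearsal-model');
if (container) {
  embedSketchfabModel(container, 'YOUR-MODEL-ID'); // placeholder ID
}
```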

Thinking further, a virtual construction rehearsal could not only include the stage set itself but also integrate costume design, with 3D elements related to the stage set. I came across the research project „connecting.stitches“ by Luside Ehrenwerth, where she combines costume design and technologies. Her focus is on the possibility of experimenting with conductive fabrics, 3D printing, and sensors before transferring them into production. This free experimentation could be directly incorporated into the design of an entire scene in VR to examine the overall concept more precisely. This way, all products from different workshops could be coordinated to be viewed in exact scale.

Finally, it’s worth noting that real materials, paper, and pencils are often used for sketching ideas and concepts. Could it be a possible approach to take these 2D representations as they are and transport them directly into the virtual 3D space to make quick decisions for the progressing design process?

Deutsche Theater Gesellschaft – Services  

Multi-Sensory in UX: A Journey into the Multi-Sensory Experience

In the user experience (UX) design field, the recent focus on Multi-Sensory UX Design reveals a critical insight that designers often overlook. Traditional design approaches that focus on visual and auditory elements fall short of taking advantage of the profound impact on emotions and memories that engaging multiple senses simultaneously can have.

What is Sensory Design?

The intentional consideration and manipulation of sensory elements, like sight, sound, touch, taste, and smell, in order to induce specific responses or emotions in users is known as sensory design. Sensory design in UX aims to create a holistic experience that goes beyond the visual and combines various senses to engage and captivate users. Every user interaction with a product is a sensory experience. Sensory design aims to make that sensory engagement deeper and more multifaceted.

What is Multi-Sensory Design?

Multi-sensory design extends sensory design by seamlessly integrating multiple senses into the user experience. It emphasizes that users interact with digital products through various visual, auditory, tactile, and even olfactory channels. Multi-sensory design enhances the overall impression and memorability of a product or service. It is the practice of creating an experience that is more than just visuals: to make it more meaningful, it plays with feelings, connects with sounds and smells, communicates with the environment, and builds a physical space.

What is the impact of Multi-Sensory Design on UX?

Multi-sensory design has an important effect on user engagement and satisfaction. Designers can create more memorable and emotionally compelling interactions by appealing to a wider variety of sensory experiences. This deeper involvement can lead to higher user retention, stronger brand loyalty, and a distinct competitive advantage in the market.

Consider the emotional impact that a beautiful sunset, a nostalgic piece of music, or a familiar fragrance can have. Multi-sensory UX uses the connection between our senses and emotions to build powerful and long-lasting memories in the digital world.

Why is multi-sensory design important?

Participation and Memorability:

  • It increases memory retention with various sensory stimuli.
  • It stimulates multiple senses, creating a captivating user experience.

Accessibility and Emotional Connection:

  • It benefits users with various sensory needs by increasing accessibility.
  • It evokes emotions by creating a stronger connection between users and products or services.

Feedback, Usability and Brand Differentiation:

  • It improves navigation and usability by providing reassuring sensory feedback.
  • It enables brands to stand out with different and memorable experiences.

Cross-Modal Redundancy and Reduced Cognitive Load:

  • It improves comprehension by presenting information through multiple senses.
  • It reduces cognitive load by distributing information across the senses efficiently.

Innovation and creativity:

  • It encourages creative exploration by encouraging innovative solutions in UX design.

Multi-sensory design has become a game changer when it comes to improving user experiences. It goes beyond what we see and hear in order to create interactive and memorable connections. This increases satisfaction and makes it easier to use websites or apps. It helps remember a brand by providing unique and unforgettable experiences. Standing out in a marketplace full of competitors is essential, and multi-sensory design gives brands an advantage by making them unique and easy to remember.

  • https://www.toptal.com/designers/ux/sensory-design
  • https://www.front-commerce.com/sensory-ux-in-the-digital-era/
  • https://bootcamp.uxdesign.cc/sensory-appeal-in-ux-design-the-secret-to-enhancing-user-experience-ac46755eae3f
  • https://www.yellowslice.in/bed/why-multisensory-designs-create-memorable-experiences-for-users/

XR 3 // VR Interactions: Controller vs Body Tracking

Virtual Reality (VR) has revolutionized the way we experience digital content, offering immersive and interactive environments. One important aspect of VR is the method of interaction, which can greatly impact the user experience and effectiveness of training applications. In this blog post, we will delve into the differences between controller-based and body tracking interactions in VR, their implications for training, and recommendations for interaction design.

The Importance of Natural Interaction

Research has shown that natural and intuitive interactions in VR enhance presence, immersion, and usability. When users can interact with virtual objects in a way that mimics real-world actions (e.g., by actually grabbing objects or pushing buttons with the virtual controllers), they can focus on the learning experience rather than learning how to use the equipment. This allows for a deeper level of engagement and better retention of information. As Abich and colleagues (2021) argue, training in VR is most useful when it allows for embodied, experiential interaction. The spectrum of interactions in VR ranges from „arbitrary“ interactions (e.g., double-clicking a mouse to select an object) to „natural“ interactions (e.g., rotating the hand clockwise to rotate a valve in the same direction). The more natural an interaction seems, the higher the presence, immersion, and usability it affords.

Controller-Based Interactions

Controller-based interactions in VR function similarly to game console controllers. The difference lies in the tracking capabilities of VR controllers, which accurately represent the user’s hand position and movement in the virtual environment. This enables users to interact with virtual objects in a more precise manner. However, a challenge with controller-based interactions is the lack of standardized control schemes. Different VR applications may require users to interact with virtual objects using various methods, such as reaching out or using a laser pointer. This inconsistency can lead to confusion and cognitive load for users. It is important for developers to consider implementing standardized control schemes to provide a consistent and intuitive user experience across different VR applications.
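
One way to soften this inconsistency is an input abstraction layer that maps raw controller buttons to semantic actions, so that each application (or user profile) defines the mapping in one place. The following TypeScript sketch uses the WebXR gamepad interface; the concrete button indices and action names are assumptions and would differ per controller profile.

```typescript
// Sketch of a small input abstraction layer for WebXR controllers: the
// application only ever sees semantic actions ("grab", "teleport", "openMenu"),
// while the mapping from physical buttons is defined once.
type Action = 'grab' | 'teleport' | 'openMenu';

const buttonMapping: Record<number, Action> = {
  1: 'grab',      // squeeze/grip button in the xr-standard mapping
  0: 'teleport',  // trigger
  4: 'openMenu',  // A/X button (assumed index, varies per controller)
};

// Called once per frame from the XR render loop. Note: this fires every frame
// while a button is held; a real implementation would track press/release edges.
function pollActions(session: XRSession, onAction: (action: Action, hand: string) => void): void {
  for (const source of Array.from(session.inputSources)) {
    const gamepad = source.gamepad;
    if (!gamepad) continue;
    gamepad.buttons.forEach((button, index) => {
      const action = buttonMapping[index];
      if (action && button.pressed) {
        onAction(action, source.handedness);
      }
    });
  }
}

// Usage: pollActions(session, (action, hand) => console.log(`${hand}: ${action}`));
```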

Body Tracking Interactions

Advancements in technology have enabled body tracking interactions in VR, such as eye tracking and hand/body tracking. Hand tracking, in particular, allows users to interact with virtual objects in a more natural and intuitive manner, for example by using pinch gestures to select objects or pressing virtual buttons. Hand tracking can enhance the sense of presence and immersion in VR experiences. However, similar to controller-based interactions, there is a lack of standardized gestures for hand tracking. Users may need to learn specific gestures for different applications, which can be a barrier to intuitive interaction. To address this challenge, companies like Ultraleap are working towards establishing standards for gesture controls in VR. By introducing standardized gestures, users can intuitively know how to perform actions without the need for extensive tutorials or guesswork.
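
As an illustration of how such a gesture could be detected, here is a hedged TypeScript sketch based on the WebXR Hand Input module: a pinch is simply the thumb tip and index fingertip coming closer than a small threshold. The threshold value and the way the result would be used are assumptions.

```typescript
// Sketch of pinch detection with the WebXR Hand Input module: a "pinch" is
// registered when thumb tip and index fingertip come closer than a threshold.
// The 2.5 cm threshold is an assumption and would need tuning per application.
const PINCH_THRESHOLD_M = 0.025;

function isPinching(frame: XRFrame, hand: XRHand, referenceSpace: XRReferenceSpace): boolean {
  const thumb = hand.get('thumb-tip');
  const index = hand.get('index-finger-tip');
  if (!thumb || !index) return false;

  const thumbPose = frame.getJointPose?.(thumb, referenceSpace);
  const indexPose = frame.getJointPose?.(index, referenceSpace);
  if (!thumbPose || !indexPose) return false;

  const dx = thumbPose.transform.position.x - indexPose.transform.position.x;
  const dy = thumbPose.transform.position.y - indexPose.transform.position.y;
  const dz = thumbPose.transform.position.z - indexPose.transform.position.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz) < PINCH_THRESHOLD_M;
}

// In the frame loop: for each input source with a hand, treat a pinch like a "select".
// session.inputSources.forEach((source) => {
//   if (source.hand && isPinching(frame, source.hand, referenceSpace)) selectHoveredObject();
// });
```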

Recommendations for Interaction Design

To optimize the user experience in VR, it is crucial to provide clear and concise tutorials for both controller-based and body tracking interactions. Instead of relying on textual instructions, show users how to perform actions through visual cues and demonstrations. This „show, don’t tell“ approach can help users quickly grasp the interaction methods without the need for extensive reading or trial and error. Additionally, consider placing important buttons and menus within reach of the user to minimize the need for physical movement within the virtual environment. For example, utilizing a popup menu on the wrist of one virtual controller can provide quick access to important functions without requiring users to navigate across the virtual room. These design considerations can help reduce cognitive load and ensure a seamless and intuitive user experience in VR.

Menu solution by Ultraleap
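
To sketch how the wrist-menu idea mentioned above could be built, the following TypeScript snippet uses three.js to parent a small panel to the controller grip space, so the menu always stays within arm's reach. The panel size, offset and styling are rough assumptions, not Ultraleap's actual implementation.

```typescript
import * as THREE from 'three';

// A small panel is parented to the controller grip space, so it follows the
// hand and never requires the user to walk across the virtual room.
function attachWristMenu(renderer: THREE.WebGLRenderer, scene: THREE.Scene): THREE.Mesh {
  const grip = renderer.xr.getControllerGrip(0); // one controller's grip space
  scene.add(grip);

  const panel = new THREE.Mesh(
    new THREE.PlaneGeometry(0.12, 0.08),                        // ~12 x 8 cm panel
    new THREE.MeshBasicMaterial({ color: 0x222222, side: THREE.DoubleSide })
  );
  panel.position.set(0, 0.05, -0.05); // hover slightly above and behind the grip (assumed offset)
  panel.rotation.x = -Math.PI / 4;    // tilt it towards the user's eyes
  grip.add(panel);                    // parenting keeps it locked to the wrist
  return panel;
}
```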

In conclusion, the choice between controller-based and body tracking interactions in VR depends on the specific application and the desired level of immersion and naturalness. Both interaction methods have their advantages and challenges, but with thoughtful design, standardized gestures, and concise tutorials, VR can offer truly immersive and effective training experiences. As VR technology continues to evolve, it is important for developers and researchers to collaborate in establishing best practices and standards for interaction design to unlock the full potential of VR for training and other applications.

03 | Sustainability and the Internet (Part 2) 🌱


Airport design and nudging behaviors

This week, my main focus has been on exploring other fields of design that might provide relevant information for my topic. I started by looking into airport design, as it is a field that fascinates me and has potential applications for my research.

When designing modern airports, there are two main goals: efficiently moving people through security, check-in, and other practical tasks, and maximizing revenue by encouraging shopping once passengers are inside the secure area. I can draw parallels between this and the concert music venue experience. The venue wants the audience to quickly and efficiently pass through ticket checks so that they have more time to spend inside, purchasing drinks and snacks. However, it is easier to find information about what airports need to do and why, rather than how different airports achieve these goals and what strategies work best.

One thing I discovered is that airports use spatial design to guide passengers and help them navigate. For example, large art pieces in different airport terminals serve as landmarks and help passengers distinguish between different areas. Additionally, using clear and distinct colors on carpets and walls provides passengers with visual cues about their location.

Through this line of thinking, I came across another interesting concept that I believe is relevant to my project: nudging. Nudging means subtly signalling to passengers, often without them consciously noticing, to guide them toward the desired decision in a given scenario. This approach can be employed in various ways to steer people towards making the right choices without explicitly instructing or prohibiting certain behaviors. I find it particularly intriguing to explore how this approach can be applied in a music venue.

Overall, these findings have opened up new avenues for further research in my project.

Sources:

https://www.smithsdetection.com/insights/60-seconds-with-desmond-lian/
https://www.bbc.com/future/article/20190430-psychological-tricks-of-airport-design
https://www.bbc.com/future/article/20140917-how-to-trick-terrible-travellers
https://inudgeyou.com/wp-content/uploads/2017/08/OP-ENG-What_is_nudging.pdf
https://www.sciencedirect.com/science/article/abs/pii/S0926580523002005
https://simpleflying.com/how-airports-are-designed-to-optimize-passenger-flow/

#02 Diving into Virtual Reality

Like I said in the first blog post, I want to use these posts as a means to delve into different topics. I look forward to widening the scope of this topic. But before that, I want to note down a specific topic I have in mind. Virtual reality therapy. I will try to keep this as concise as possible. Therefore, without further ado, let us jump right into it.

What is Virtual Reality?

Before delving into the subject matter, it is essential to establish a shared understanding of the term Virtual Reality (VR). The Oxford English dictionary defines it as follows:

The computer-generated simulation of a three-dimensional image or environment that can be interacted with in a seemingly real or physical way by a person using special electronic equipment, such as a helmet with a screen inside or gloves fitted with sensors.

This is a very technical description, but it defines VR quite clearly. A VR setup is comprised of three parts. First, there is a computational machine which performs the calculations needed, in other words, a computer. The content it generates can be a simple image, a video, or an entire 3D-generated world. These 3D worlds are often created using a game engine, which allows for interactive real-time environments; two notable examples that support VR are Unity and Unreal. Then there is an output device which displays the generated images. Traditionally this would be a screen; in this case, it is a VR headset, also referred to as a Head-Mounted Display (HMD). Finally, though not strictly mandatory, there is usually an input system. This spans from conventional controllers to VR hand-tracked controllers. Or, as we recently saw in Matt Corall’s presentation about Ultraleap, there is also the possibility of tracking the hands and using them for input without any controller. Haptic feedback, the simulation of touch, is also a notable component which can drastically increase the immersion and effectiveness of VR.
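
For a web-based example of how these parts come together, the following TypeScript sketch asks the browser for an immersive VR session via the WebXR API. The rendering loop itself is omitted, and hand tracking is only requested as an optional feature; this is an illustration, not a complete application.

```typescript
// Minimal sketch of the output/input side of a web-based VR setup: the browser
// (running on the "computational machine") asks the connected HMD for an
// immersive session. Rendering would then happen in the session's frame loop.
async function startVr(): Promise<XRSession | null> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-vr'))) {
    console.log('No VR-capable headset or browser available.');
    return null;
  }
  const session = await navigator.xr.requestSession('immersive-vr', {
    optionalFeatures: ['hand-tracking'], // controller-free input, if the device supports it
  });
  session.addEventListener('end', () => console.log('VR session ended.'));
  return session;
}
```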

However futuristic it may seem, the roots of VR extend significantly into the past. In 1838 the concept of stereopsis, the fact that the brain overlays two images to create a 3D image with depth, was first described. In the 1950s Morton Heilig created Sensorama, a device with the goal of fully immersing the user by using a stereoscopic 3D image, sound, smell, vibrations and simulated wind.

Sensorama

In 1960, Morton Heilig, the innovator behind Sensorama also patented the Telesphere Mask, which can be considered the first HMD. Skipping ahead, in 1997 Georgia Tech and Emory University collaborated to utilize VR as a therapy method for the treatment of Post-Traumatic Stress Disorder (PTSD) in war veterans.

With this short history overview, I wanted to shake the notion that VR is something entirely new. While I provided a brief glimpse, I glossed over many other captivating inventions. If you are interested, I recommend having a read of the full articles – they really are fascinating. The links can be accessed in the Sources section below.

Use cases in therapy?

Though there is often an overlap, I would differentiate between two use cases:

  • physical
  • mental

Let us begin by considering the physical use case, for example, partial paralysis of a body part or side. While conventional treatment methods exist, Virtual Reality (VR) presents distinctive advantages. The therapy experience can be tailored to exact use cases which would be hard to train reliably in real-life scenarios, such as relearning how to drive. A driving simulator setup, i.e. a chair with a steering wheel and shift lever, should not be used instead of VR; rather, the two should work in tandem to increase immersion and effectiveness. Furthermore, the experience can easily be gamified, meaning the process of therapy is turned into a fun game. This may especially help when dealing with children who may not have the discipline or motivation to push through rigorous training programs.

VR is also especially useful in the treatment of mental health problems. A notable use case involves the treatment of specific phobias, including but not limited to the fear of flying, arachnophobia, elevator anxiety, or social anxiety. Treating a fear like the fear of flying is difficult with more traditional treatment methods: arranging a plane, traveling to a specific location, and repeating such processes multiple times can be logistically difficult and time-consuming. Using VR, a 3D scene can be created comparatively easily and the treatment can be done in a very controlled fashion. Furthermore, as previously mentioned, VR has been employed by the military for the treatment of war veterans grappling with Post-Traumatic Stress Disorder (PTSD).

BraveMind – a VR treatment method for war veterans struggling with PTSD

In this context it is used to allow soldiers to relive their traumatic experiences and work through them with a specialist, in a carefully designed and controllable manner. Soldiers who are unable to cope with their experiences may, in the worst case, take their own lives. The application of VR in trauma-focused therapy therefore provides a crucial and potentially life-saving intervention for individuals dealing with the profound impact of their military service.

Personal experience

In my previous blog post, I delved into my personal motivations surrounding this topic. Since then, I have talked with my brother about the kind of experiences he had using VR therapy. In his particular case, VR served as a tool for training the left side of his body, which experienced partial paralysis resulting in reduced speed. Additionally, VR was employed to address issues related to his partially impaired field of vision.

He recounted three different programs which were used in his treatment. Firstly, a car driving simulator, which was used to train both his motor function and his ability to perceive traffic. Secondly, a virtual room in which he needed to search for objects and, on occasion, connect different objects using wire. And lastly, a game in which balls were thrown at him and he had to deflect them using his hands. He expressed a strong preference for the visual feedback of seeing his hands in the VR environment. He also noted that he talked quite a lot with his therapist and that, at least in Austria, the range of VR treatment programs is very limited. Few programs exist, and they cannot really be customized to the needs of the user. In the last example, my brother wanted to train his left side more but have the objects move more slowly, and the therapist said that this, unfortunately, cannot be changed. This seems to be a common problem with these programs – the customization options for individual patients are limited.

Summary

In summary, VR has a long history, yet its potential as a treatment method remains underutilized, presenting a lot of potential for innovation and research in this area. The versatility of VR spans both physical and mental health topics, offering the distinct advantage of tailoring experiences to individual users while proving to be time- and cost-effective. As we continue to uncover the multifaceted applications of VR in the realm of therapy, its transformative impact on healthcare interventions is poised for further realization and advancement.

Sources

The history of virtual reality
History Of Virtual Reality – Virtual Reality Society (vrs.org.uk)
History of VR – Timeline of Events and Tech Development (virtualspeech.com)

BraveMind video
Virtual Reality Therapy: PTSD Treatment for Veterans (soldierstrong.org)

Design for Five Senses: A Journey Into The Multi-Sensory Experience

Multisensory design is a new technique that extends beyond the typical focus on sight and sound to encompass all five senses: sight, hearing, touch, taste and smell. It aims to develop places, products and experiences that engage users on a deeper level by stimulating various senses simultaneously. This technique is not limited to one sector; it can be applied to architecture, product design, marketing, user experience design and other fields.

Industrial designer Jinsop Lee believes that great design appeals to all five senses; he calls this the Five Senses Theory. Jinsop also gave a TED talk on the subject some years ago. He believes that all experiences can be rated using all five senses. For example, eating noodles can be rated by sight, smell, touch, taste and sound.

5 Senses Graph, created by Jinsop Lee

He continues to discuss his theory by evaluating different experiences he has had in his life in terms of the five senses and plotting them on ‚The 5 Senses Graph‘ he made.

This is how the perfect experience would look on the 5 Senses Graph – a horizontal line along the top. In his years of gathering data, Jinsop Lee says, the only experience that has managed to come close to being perfect is sex.

Jinsop Lee mentions that many designers, including himself, focus on making things look beautiful and somewhat tactile, while ignoring the other three senses. He is trying to change that completely. Lee wants to apply his theory to future designs and hopes to inspire others to do the same, making designs that engage all our senses.

The five senses theory is a very helpful way of evaluating various life experiences and then hopefully incorporating those best experiences into any form of design.

02 | Sustainability and the Internet (Part 1) 🌱
