I got so frustrated with Unity’s increasing restrictions and issues that I’ve decided to temporarily abandon my current prototype and shift to learning Unreal Engine. My goal for this semester is to create an interactive prototype in Unreal, starting with a sci-fi scene to grasp the basics, then adding interactivity through a smartphone or game controller. This change aims to leverage Unreal’s capabilities for more complex and visually stunning interactive and XR experiences.
XR 12 // Meta SDK
Exploring Meta’s Quest Integration for Unity: My Experience
I took my first steps by experimenting with Meta’s Quest Integration for Unity. Here’s a quick rundown of my journey and the challenges I faced.
Easy Setup for Quick Prototyping
The initial setup was straightforward and user-friendly. Meta provides a lot of built-in functionality, making it ideal for quick prototyping. Getting a basic project up and running took minimal effort, thanks to the clear setup guides.
Complications Arise with Advanced Features
However, as I moved beyond the basics, things became complicated. The documentation from Meta is often outdated or incomplete, making it difficult to implement more advanced features.
What I Built
I created a terrain with high-definition textures and added teleport locomotion and object interaction. The process involved Unity’s XR Interaction Toolkit, which, despite some confusing moments caused by sparse documentation, helped achieve the desired interactions.
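To make the locomotion part concrete, here is a minimal, language-agnostic sketch of what the teleport flow boils down to: aim a ray from the controller, validate the hit as a walkable surface, then fade and move the whole camera rig. The names, the slope check and the distance limit below are illustrative assumptions, not the actual XR Interaction Toolkit API (which is C# inside Unity).

```python
MAX_TELEPORT_DISTANCE_M = 10.0  # illustrative limit for the teleport ray
MAX_SURFACE_SLOPE_DEG = 30.0    # reject walls and steep slopes

def find_teleport_target(ray_origin, ray_direction, raycast):
    """Cast the controller ray and return a valid landing point, or None."""
    hit = raycast(ray_origin, ray_direction, MAX_TELEPORT_DISTANCE_M)
    if hit is None or hit.slope_deg > MAX_SURFACE_SLOPE_DEG:
        return None  # nothing hit, or not a walkable surface
    return hit.point

def teleport(rig, target, fade):
    """Fade out, move the whole camera rig, fade back in."""
    fade(to_black=True)   # a short fade hides the jump and reduces motion sickness
    rig.position = target
    fade(to_black=False)
```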
Performance Issues
A significant issue I encountered was performance. Despite the simplicity of my scene, I noticed major frame rate drops, falling below 38 fps. This highlighted the need for optimization, even in basic projects, to maintain a smooth VR experience.
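For context, the arithmetic behind those numbers is simple: the per-frame budget is the reciprocal of the target frame rate. A quick sketch (72 Hz is a common Quest refresh rate; treat it as an illustrative target):

```python
# Frame-time budget: how many milliseconds each frame may take.
def frame_budget_ms(target_fps: float) -> float:
    return 1000.0 / target_fps

print(frame_budget_ms(72))  # ~13.9 ms -- the budget at a 72 Hz target
print(frame_budget_ms(38))  # ~26.3 ms -- what my scene was actually taking
```

In other words, at 38 fps each frame took almost twice the budget of a 72 Hz target, which is immediately noticeable in a headset.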
Takeaways
While Meta’s Quest Integration offers a solid foundation for VR development, the lack of updated documentation can be a barrier. For those starting out, begin with simple projects and utilize community resources extensively. Despite the challenges, the potential for creating immersive VR experiences is exciting and worth the effort.
09 | More than just buttons – Interactions and interfaces in VR, AR and MR experiences
What this blog post is all about
Building upon my previous blog post on immersive level design, this post explores immersion and interaction within VR, MR and AR in more depth by looking into different interface and interactivity solutions currently available, some already mentioned, some new, that may increase intuitiveness and user engagement. While AR and MR solutions are still covered, the main focus of this blog post lies on VR, as it is more in line with my future plans and planned research.
Tracking, controllers, recognition and other interaction methods
With the wide variety of VR, AR and MR headsets and technologies comes an equally wide variety of input devices, interfaces and ways to interact with the created virtual environment. In VR alone, there is a seemingly endless number of different controllers, with each headset developer putting their own spin on them.
Different headsets, different controllers – HTC Vive Pro, Meta Quest 3 Pro, PS Move and Valve Index Pro (left to right)
However, controllers like these are by no means the sole means of interacting with virtual environments anymore. With advancements in tracking, movement recognition and voice recognition, a vast variety of input and interface methods has been developed alongside conventional controller-based inputs.
Hands, eyes, voice and haptic feedback
As previously mentioned, constant advancements in available computing power, frequent optimisations as well as new technologies make it possible to create virtual experiences that are more immersive than ever.
One such advancement lies in tracking and in how the tracked movement and data get processed. While hand and gesture tracking has long been a staple of AR headsets thanks to their inbuilt sensors, it has also become one in VR and MR applications. Differentiating between hand tracking, controller-based tracking and gesture tracking, more commonly known as gesture recognition, all of which may appear similar at a glance, is quite simple. Hand tracking, as the name suggests, tracks the actual movement of the hand within the virtual space.
Ultraleap’s 3Di, a small tracking camera for hand tracking, comes with its own integrated interface
Unlike controller-based tracking, it frees the hands for interactions without relying on buttons or other inputs. Controller tracking, in comparison, also tracks the hands’ movement, but instead of doing so directly, it tracks the hand-held controllers. These controllers usually come with a wide variety of buttons, joysticks and other triggers that can be programmed and used to interact with the environment and input information.
Last but not least, gesture recognition interprets specific hand movements or gestures made by the user and reacts in specific ways, allowing interaction and enabling control over certain parts of the virtual space. It can be understood as a specific form of hand tracking, as specific parts of the hand get tracked, though in this case the gesture made is usually more important than the position of the hand relative to the rest of the body.
Ultraleap Leap Motion Controller 2, a gesture tracking controller with a wide variety of applications
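As a minimal illustration of the difference: gesture recognition only needs a few tracked joints and a rule. The sketch below detects a pinch from the distance between thumb and index fingertips; the joint names and the 2 cm threshold are assumptions for illustration, not any particular SDK’s API.

```python
import math

PINCH_THRESHOLD_M = 0.02  # 2 cm between fingertips; illustrative value

def is_pinching(joints: dict) -> bool:
    """Minimal gesture check: thumb tip close enough to index tip.

    `joints` maps joint names to (x, y, z) positions in metres,
    as a hand-tracking runtime would provide them.
    """
    return math.dist(joints["thumb_tip"], joints["index_tip"]) < PINCH_THRESHOLD_M
```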
It may now seem that the main focus of current interaction lies in tracking the movement of extremities, mainly the hands, but this is not true. Eye tracking, for example, is a gaze-based form of interaction that follows the user’s eye movements to enhance realism, allow interaction and render specific parts of the scene in more or less detail, thus deepening immersion as needed. Meanwhile, voice assistants like Amazon’s Alexa, Microsoft’s Cortana or Google’s own Voice Assistant have long been usable in VR and MR as well, to control and interact with the virtual environment using vocal commands. Combining these different tracking technologies can make the user’s environment feel much more responsive.
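A common gaze-based pattern is dwell selection: whatever the eye-tracking ray keeps hitting for a short time gets selected. Below is a small sketch of that logic; the dwell time is an arbitrary illustrative value.

```python
DWELL_TIME_S = 0.8  # how long the gaze must rest on a target; illustrative

class DwellSelector:
    """Selects the object the gaze ray has rested on for DWELL_TIME_S seconds."""

    def __init__(self):
        self.current = None
        self.timer = 0.0

    def update(self, gazed_object, dt: float):
        if gazed_object is not self.current:
            # Gaze moved to something else: restart the dwell timer.
            self.current, self.timer = gazed_object, 0.0
            return None
        self.timer += dt
        if self.current is not None and self.timer >= DWELL_TIME_S:
            self.timer = 0.0
            return self.current  # dwell complete: report a selection
        return None
```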
But it is not only the user’s inputs that can be enhanced by new technologies to create a more immersive experience. Haptic feedback systems, spatial computing and hardware solutions that create the illusion of seemingly endless virtual spaces within a very limited physical environment can increase the immersion of the user and the responsiveness of the environment even further. While haptic feedback gloves have already been mentioned in a previous blog post, it is important to note that haptic feedback, in the broader sense, is no longer limited to the tactile layer.
teslasuit – a full-body haptic feedback suit that also tracks movement and can be used for motion capture
Haptic feedback suits, like the one shown above, can provide live responses of the virtual environment to the user via vibrations, creating the illusion of a physical component being present in the environment. Furthermore, spatial computing, especially in combination with multimedia rooms like TU Graz’s VR Cave, can be used to merge physical and digital worlds more seamlessly, allowing physical objects to be tracked and influence the virtual while also allowing virtual objects to interact with the real environment. Additional hardware that allows for extended movement through the virtual space even within smaller real spaces, like an omnidirectional treadmill, can further blur the line between the virtual and the real.
Virtuix Omni One – an omnidirectional treadmill developed for gaming in VR
Things to consider
When presented with all these different options for tracking the user’s input and the data they provide, it is easy to be overwhelmed or to lose track of why these movements should be tracked and/or used in the first place: to provide a smooth, fun and immersive experience that allows a maximum amount of user interaction with a minimum amount of effort on the user’s side. To ensure that, there are a few important steps to consider when designing such an interaction.
Abstraction, intuitiveness, optimisation and sense of security
A good first step when designing user interaction and interfaces for the virtual is mimicking the real world and its interactions, to increase both intuitiveness for the user and clarity when providing feedback. By adapting the sense of realism or choosing a certain level of abstraction for the inputs and/or interfaces, they can be simplified and made to fit the desired experience without distracting the user.
Frequent user testing, followed by refinement and optimisation of the employed systems, can increase responsiveness and accessibility and create a sense of security in the user when confronted with the virtual environment. Furthermore, the higher the continuity of the created content, both in design and in experience, and the more seamless the transition between the physical and the virtual, the easier it is for the user to engage, which also boosts confidence and security.
All in all, by making use of the different technologies described above, while being aware of the challenges and opportunities they bring and optimising and adapting the desired experience to the user’s needs, it is already possible to create amazingly responsive environments. It remains important to be aware of the ever-present limitations of current hardware, but with how rapidly technology and development keep progressing, the next solution might already be around the corner.
Next steps:
- Look further into different VR and MR solutions and their respective issues
- Research essential tools for creating immersive virtual environments as well as different game engines and their advantages and disadvantages
- Check methods of engagement and interaction within these digital environments
- Look into accessibility and how to ensure it
- Research into immersion and storytelling
Sources:
1. Springer / Gabler: Virtuelle Realität, in: Gabler Wirtschaftslexikon, n.y., https://wirtschaftslexikon.gabler.de/definition/virtuelle-realitaet-54243/, online in: https://wirtschaftslexikon.gabler.de/ [08.02.2024].
2. n.a.: Was ist Augmented Reality?, in: Omnia360, 2020, https://omnia360.de/blog/was-ist-augmented-reality/, online in: https://omnia360.de/ [08.02.2024].
3. n.a.: Mixed Reality: Wenn sich Reales und Virtuelles vermischt, in: Omnia360, 2023, https://omnia360.de/blog/mixed-reality/, online in: https://omnia360.de/ [08.02.2024].
4. n.a.: Extended Reality, in: RyteWiki, n.y., https://de.ryte.com/wiki/Extended_Reality, online in: https://de.ryte.com/wiki/Hauptseite [08.02.2024].
5. Hayden, S.: Vision Pro Teardown Shows Balancing Act Between Cutting Edge Tech & Weighty Design, in: ROADTOVR, 2024, https://www.roadtovr.com/apple-vision-pro-teardown-ifixit/, online in: https://www.roadtovr.com/ [08.02.2024].
6. Hayden, S.: Quest 3 Teardown Shows Just How Slim the Headset Really Is, in: ROADTOVR, 2023, https://www.roadtovr.com/meta-quest-3-teardown-ifixit-repair/, online in: https://www.roadtovr.com/ [08.02.2024].
7. Hayden, S.: Vive Ultimate Tracker Gets Beta Support for Third-Party PC VR Headsets, in: ROADTOVR, 2024, https://www.roadtovr.com/vive-ultimate-tracker-quest-index-pico/, online in: https://www.roadtovr.com/ [08.02.2024].
8. n.a.: What to Watch: February 2024 Highlights, in: Meta Quest-Blog, 2024, https://www.meta.com/de-de/blog/quest/what-to-watch-free-meta-quest-tv-vr-film, online in: https://www.meta.com/at/ [08.02.2024].
9. CGV Channel: TU Graz Virtual Reality Cave, https://www.youtube.com/watch?v=aeTHlAZtlAI [08.02.2024].
06 | VR, AR, MR, XR – Exploring the future of extended reality and its technologies
What this blog post is all about
To explore one of my possible research topics further, this blog post looks into the strengths and weaknesses of virtual, augmented, mixed and extended reality systems, their exact definitions and current technological trends, to gain a better understanding of which system or technology to use in future endeavours when aiming for different experiences.
VR vs AR vs XR/MR – a comparison
To better understand the differences and similarities between these technologies, it is first important to understand their definitions as well as their strengths and weaknesses. For this reason, a short overview is provided.
Virtual Reality (VR)
Virtual reality creates a computer-generated environment that immerses the user in a completely digital, three-dimensional space, often experienced through specialized VR headsets, providing a sense of presence and interaction within the virtual world. The user is thereby completely separated from the real world, and any stimuli they experience are entirely computer-generated. As a result, it usually allows for a much deeper immersion than the other solutions researched in this blog post, but it has its own strengths and weaknesses to be aware of.
Strengths:
- Offers a completely immersive experience, perfect for training, gaming and simulations
- Can easily create spaces that are not normally accessible and / or provide space where normally there would be none
- Can be used in healthcare, especially in therapeutic applications, to provide immersive therapy, exposure therapy, pain management and rehabilitation
Weaknesses:
- Isolation from the real world may cause emotional distress (solitude)
- Needs special equipment that may be costly and / or not readily available
Augmented Reality (AR)
Augmented reality overlays digitally created content onto our real world, enhancing the user’s perception of their surroundings by integrating computer-generated information such as images, text, or 3D models into the real-time environment. These are typically viewed through devices like AR glasses, tablets or smartphone screens, though in recent years, more applications have surfaced. In the automotive industry especially, heads-up displays make use of AR to show necessary information to the driver by projecting it directly onto the windshield.
Strengths:
- Real-world information overlay that relays information in real time and provides additional input
- Allows for hands-free interaction, letting the user engage with digital content while staying aware of the real world
- Useful for product visualisation and trying out products before making a buying decision
Weaknesses:
- Limited field of view, especially on smartphone screens or tablets
- Mobile dependency means less computing power, limiting display performance and creating a need for optimisation
Mixed Reality (MR)
Mixed reality combines elements of both virtual and augmented reality, allowing digital and physical objects to coexist and interact in real time. It seamlessly blends the virtual and real worlds and allows switching between them, enabling users to engage with both types of content simultaneously. While this can, of course, prove difficult to grasp at first, it also allows for a much deeper influence on the user’s perceived reality.
Strengths:
- A high level of versatility: as it combines both VR and AR, it allows for a broader range of experiences to be created
- Enables both in-room and virtual connection, communication and collaboration
- Can, like VR, be used in a wide variety of industries for training purposes, while also allowing for direct testing in the real world via AR
Weaknesses:
- Different technologies and their implementation can cause performance and optimisation issues, posing technical difficulties
- The cost of adoption is currently still very high, especially when compared to pure VR or AR solutions
Extended Reality (XR)
Finally, extended reality is an umbrella term encompassing VR, AR, and MR. It refers to the spectrum of immersive technologies that extend, enhance, or blend reality. XR is a comprehensive term covering the entire range of experiences, from completely virtual to fully augmented, and aims to offer a holistic approach to immersive technologies. As such, it comes with all of the previously mentioned strengths and can, if used correctly, mitigate some of their weaknesses. The opposite, when used incorrectly, is also true, however.
Current trends and technologies
While the market around VR, AR, MR and similar technological solutions is, of course, constantly evolving, it is still important to understand the general direction of these developments in order to better understand and work with them. As such, some of the most important trends are listed here.
Stand-alone, wireless VR, AR and MR headsets without the need for external trackers:
HTC recently presented their new inside-out tracker, which allows for inbuilt tracking on a multitude of existing third-party headsets
While some of the previous generation’s systems still require external trackers or tracking stations to be set up, or the headset to be connected to a PC via cable, in order to ensure a smooth and immersive experience, current trends have since begun to diverge from that. The trend is towards stand-alone, wireless VR, AR and MR solutions that need no separate computer or similar unit for calculations and that offer tracking via inbuilt sensors instead of external stations. The tracker shown in the picture above, for example, allows for completely controllerless tracking of a person’s arms, legs, torso and even head, simply by attaching it to the limb in question, and up to eight trackers can currently be used together to provide a smooth and easy experience.
Hybrid systems:
Both the Apple Vision Pro and the Meta Quest 3 offer hybrid solutions when it comes to VR and AR.
While there are certainly still solutions that focus on either VR or AR in particular, MR seems to be the much more common trend, with passthrough headsets becoming more and more widespread. This is especially obvious when looking at Apple’s recently released Vision Pro, but also Meta’s Quest 3.
Wider application in everyday life:
Looking at the adoption rate of VR, AR and MR solutions, it quickly becomes apparent that the possible applications have skyrocketed. While originally more of a niche development, a wide variety of experiences is offered nowadays, ranging from immersive nature documentaries to sports events, games, movies and more. Furthermore, the increasing variety of headsets to choose from and their increasing computing power bring lower prices – except for Apple’s solution, of course – along with increased quality of experience, making the technology much more accessible in everyday life.
Next steps:
- Look further into different XR solutions and their respective issues
- Research essential tools for creating immersive virtual environments
- Check methods of engagement and interaction within these digital environments
- Look into accessibility and how to ensure it
- Research into immersion and storytelling
Sources:
1. Springer / Gabler: Virtuelle Realität, in: Gabler Wirtschaftslexikon, n.y., https://wirtschaftslexikon.gabler.de/definition/virtuelle-realitaet-54243/, online in: https://wirtschaftslexikon.gabler.de/ [08.02.2024].
2. n.a.: Was ist Augmented Reality?, in: Omnia360, 2020, https://omnia360.de/blog/was-ist-augmented-reality/, online in: https://omnia360.de/ [08.02.2024].
3. n.a.: Mixed Reality: Wenn sich Reales und Virtuelles vermischt, in: Omnia360, 2023, https://omnia360.de/blog/mixed-reality/, online in: https://omnia360.de/ [08.02.2024].
4. n.a.: Extended Reality, in: RyteWiki, n.y., https://de.ryte.com/wiki/Extended_Reality, online in: https://de.ryte.com/wiki/Hauptseite [08.02.2024].
5. Hayden, S.: Vision Pro Teardown Shows Balancing Act Between Cutting Edge Tech & Weighty Design, in: ROADTOVR, 2024, https://www.roadtovr.com/apple-vision-pro-teardown-ifixit/, online in: https://www.roadtovr.com/ [08.02.2024].
6. Hayden, S.: Quest 3 Teardown Shows Just How Slim the Headset Really Is, in: ROADTOVR, 2023, https://www.roadtovr.com/meta-quest-3-teardown-ifixit-repair/, online in: https://www.roadtovr.com/ [08.02.2024].
7. Hayden, S.: Vive Ultimate Tracker Gets Beta Support for Third-Party PC VR Headsets, in: ROADTOVR, 2024, https://www.roadtovr.com/vive-ultimate-tracker-quest-index-pico/, online in: https://www.roadtovr.com/ [08.02.2024].
8. n.a.: What to Watch: February 2024 Highlights, in: Meta Quest-Blog, 2024, https://www.meta.com/de-de/blog/quest/what-to-watch-free-meta-quest-tv-vr-film, online in: https://www.meta.com/at/ [08.02.2024].
XR 10 // Wrap up
As we close the chapter on a semester filled with extensive learning about Extended Reality (XR), it’s an opportune moment to reflect on the ground we’ve covered and to anticipate the exciting journey ahead. Our exploration has spanned a diverse range of topics, each offering a unique perspective on the rapidly evolving world of XR.
- XR 1// Evolving Interaction and Experiences in XR: I began this journey by delving into the dynamic world of XR interactions, examining how they’ve transformed over time and what the future holds.
- XR 2 // The Evolution and Current Landscape of Extended Reality: This post offered a comprehensive view of XR’s growth trajectory and its current state, setting the stage for the subsequent deep dives.
- XR 3 // VR Interactions: Controller vs Body Tracking: A detailed comparison between controller-based and body-tracking interfaces in VR, highlighting their unique advantages and potential applications.
- XR 4 // Enhancing Virtual Reality: The Power of Haptic Feedback: I explored the sensory dimension of VR, focusing on how haptic feedback intensifies the immersive experience.
- XR 5 // Interesting Case Study: HoloStudio UI and Interaction Design by Microsoft: This case study provided insights into practical applications of XR, emphasizing user interface and interaction design.
- XR 6 // UX in Mixed Reality: I discussed the intricacies of user experience design in the mixed reality spectrum, emphasizing its importance in creating engaging and intuitive applications.
- XR 7 // Dive into Vision OS Guidelines: This post was dedicated to understanding the best practices and guidelines in designing for XR platforms, particularly the Vision OS.
- XR 8 // Beyond Gaming: XR in the Entertainment Industry: I expanded our view to see how XR is revolutionizing the broader entertainment sector, beyond just gaming.
- XR 9 // XR in the Military Complex: My exploration concluded with an examination of XR’s applications in military training and strategy, showcasing its diverse utility.
What now?
Last semester’s curriculum primarily revolved around theoretical aspects. For the upcoming semester, I aim to adopt a more practical approach towards the subject. This will involve actively engaging with various XR applications and models for comparative analysis and learning. Additionally, I plan to implement the concepts learned in my own projects and create case studies for them.
XR 9 // XR in the Military Complex
In this semester’s research project, I explored various facets of Extended Reality (XR). Today, let’s delve into a lesser-discussed but significant aspect: the role of XR in the military defense complex. Notably, the military sector is one of the leading investors and developers in this technology.
Training with Mixed Reality
Mixed Reality (MR), blending elements of Augmented Reality (AR) and Virtual Reality (VR), has revolutionized military training. Historically, MR applications like the Swiss tank-driving simulator from the 1970s laid the groundwork for today’s sophisticated systems.
One prominent example is the U.S. Army’s Synthetic Training Environment (STE). This advanced system merges virtual and augmented reality to simulate a wide range of scenarios, from urban warfare to counterinsurgency operations, thus providing immersive training experiences. The STE is a comprehensive platform integrating virtual, live, and collective training elements, designed to be portable and cost-effective. It includes the Reconfigurable Virtual Collective Trainer (RVCT), which offers training for various military vehicles and infantry units.
Mixed Reality in Combat
MR’s role extends beyond training to actual combat operations. It significantly enhances situational awareness by providing soldiers with real-time information through contextually relevant visualizations. This includes displaying crucial data like maps, navigation, and enemy locations seamlessly.
A key development in this area is the Integrated Visual Augmentation System (IVAS), a collaborative effort between Microsoft and the U.S. Army. Based on Microsoft’s HoloLens technology, IVAS delivers advanced capabilities such as rapid target acquisition, enhanced situational awareness, and improved navigational tools. It integrates various technologies like thermal imagery, sensors, GPS, and night vision to give soldiers a comprehensive view of the battlefield. This technology is not only pivotal for training but also holds immense potential for real-world combat operations, allowing soldiers to plan and execute missions with enhanced precision and information.
Support Functions
MR’s applications in the military also extend to support functions. It can transform maintenance and repair processes by overlaying relevant instructions onto real-world objects, aiding technicians and mechanics in performing tasks more efficiently.
In medical support and telemedicine, MR can overlay digital content such as instructions and patient data, facilitating accurate and efficient medical procedures in challenging environments.
Conclusion
MR technology is a game-changer in military applications, enhancing various aspects of operations. While it offers immense benefits in training, situational awareness, and support functions, there are challenges to consider. For instance, overreliance on technology can lead to operational inefficiencies if not managed properly. The concept of “HUD-Cripple,” prevalent among Navy aviators, highlights the risk of becoming overly dependent on technological aids to the extent that performance without them is significantly impaired.
Moreover, the use of MR in combat situations introduces ethical dilemmas around warfare conduct. The enhanced capabilities provided by MR could lead to debates about the fairness and humanitarian implications of using such advanced technology in conflicts. This necessitates a balance between technological advancement and adherence to international warfare norms and ethics.
The responsibility of XR designers and developers in this context is profound. They must not only focus on the technical and functional aspects of MR systems but also consider their broader societal implications. This includes ensuring that the technology is used responsibly and in accordance with ethical standards. Designers and developers need to collaborate closely with military experts, ethicists, and psychologists to understand the full spectrum of impacts their creations might have. Furthermore, there should be ongoing assessment and adjustment of these technologies to align with evolving ethical standards and societal values.
As we venture further into this technologically advanced era, the responsibility of XR professionals extends beyond innovation, encompassing the ethical stewardship of their creations in the complex domain of military applications.
Sources
- https://capsulesight.com/mixedreality/real-examples-and-use-cases-of-mixed-reality-in-military/#:~:text=MR
- https://ieeexplore.ieee.org/abstract/document/8550993
- https://www.sciencedirect.com/science/article/abs/pii/S026322411930154X
- https://www.armyupress.army.mil/Journals/Military-Review/English-Edition-Archives/May-June-2022/Kallberg/
- https://www.warfighterpodcast.com/blog/how-is-the-military-uses-mixed-reality-mr-for-training-and-operations/
- https://www.youtube.com/watch?v=AcQifPHcMLE
XR 8 // Beyond Gaming: XR in the Entertainment Industry
XR is a technology that has been gaining popularity in the entertainment industry. While gaming is a major part of XR, this post explores its other applications in entertainment, such as virtual concerts, immersive theater, and interactive art installations.
Virtual Concerts: A New Stage for Artists and Fans
Imagine attending a concert by your favorite artist from the comfort of your living room, yet feeling as though you’re right there in the front row. XR makes this possible. Virtual concerts in XR are not just about streaming a live performance; they are about creating an immersive, interactive experience. Fans can choose different viewpoints, interact with the environment, and even feel the vibration of the music through haptic feedback technology.
Artists like Travis Scott and Marshmello have already experimented with these concepts, drawing millions of virtual attendees. These events aren’t just concerts; they’re hyper-realistic experiences blending music, visual art, and digital interaction.
Meta is also pushing strongly in this direction by hosting live concerts on their Meta Quest platform. For example, there will be a live concert by Imagine Dragons on June 15th on this platform.
Immersive Theater: Blurring the Lines Between Audience and Performer
Theater has always been an immersive experience, but XR takes this immersion to a new level. Unlike traditional theater, where the audience is a passive observer, XR theater can make viewers a part of the performance. Through VR headsets or AR applications, audience members can experience different narratives from multiple perspectives, interact with the performers, or even influence the outcome of the story.
Companies like Punchdrunk and Magic Leap are pioneering in this space, creating experiences where the line between audience and performer is blurred, leading to a more intimate and personal form of storytelling.
Interactive Art Installations: Stepping into the Canvas
Art has always been a medium for expression and experience, but XR adds an interactive dimension to it. Interactive art installations using XR technologies allow viewers to step into the artwork, manipulate elements, and experience the art in a multi-sensory manner. This form of art is not just to be seen but to be experienced and interacted with.
Artists like Refik Anadol and teamLab are at the forefront, creating stunning visual landscapes that respond to and evolve with viewer interactions. These installations are not static; they are dynamic and alive, offering a personalized experience to every viewer.
Conclusion: A New Era of Entertainment
XR in entertainment is more than a technological advancement; it’s a paradigm shift in how we experience art, music, and storytelling. It’s about creating worlds that we can step into, interact with, and be a part of. As we look to the future, the possibilities are boundless. We’re not just witnessing a change in entertainment; we’re participating in the birth of entirely new forms of expression and experience.
This is just the beginning. As XR technologies continue to evolve, we can expect to see even more innovative and immersive experiences that challenge our perceptions of reality and entertainment. The future of entertainment is here, and it’s virtual, augmented, and mixed.
XR 7 // Dive into Vision OS Guidelines
Apple is stepping into the future with its highly anticipated mixed reality headset, introducing a groundbreaking operating system: Vision OS. This isn’t just another tech release; it’s a glimpse into what could shape the future of mixed reality. Diving into Apple’s developer resources, we’re offered a preview of this innovative landscape. Interestingly, Apple steers clear of terms like “Virtual Reality” or “Augmented Reality,” opting for “Spatial Design.” This isn’t just a play on words; it’s a strategic move to set Vision OS apart from other mixed reality platforms.
A Familiar Yet Revolutionary Interface
Vision OS brings a familiar feel to an entirely new dimension. The interface mirrors iPad app designs but in a dynamic 3D space. It’s not just about aesthetics; it’s about functionality. Windows in Vision OS adapt to lighting conditions and introduce intuitive controls for movement, resizing, and closing. The system also integrates extra depth layers to establish a hierarchy between elements, all while maintaining a spacing akin to iPad apps. Apple’s strategy here is clear: use familiar paradigms and patterns to ease users into this new spatial environment.
Human-Centric Design
At its core, Vision OS is designed with a keen focus on human interaction. The view plane is usually centered and horizontally aligned, aligning with the natural human line of sight. Apple makes a notable design choice here: windows don’t follow your head movements; they’re anchored in 3D space. Familiar gestures, like pinch to zoom, are still part of the experience, offering users various ways to interact with apps.
Dimensional Depth
In Vision OS, apps are designed to be in tune with the user’s real-world surroundings. Apple emphasizes UI elements crafted from a glass-like material, allowing backgrounds to subtly shine through and create a sense of real 3D objects blending into the room. Controls and menus are thoughtfully positioned closer to the user, making them more accessible and easier to perceive. Apple’s attention to detail extends to how windows and apps interact with their environment, casting realistic shadows and emitting light. The depth usage is subtle, and the windows closer to the user are smaller, enhancing the sense of spatial awareness.
Immersive Experiences
Vision OS categorizes immersion into three levels:
- An app window floating in front of the user
- A panoramic window wrapping around the user
- A completely surrounding VR experience
The system smartly fades out the surrounding environment to focus on the selected window. With 3D audio enhancement, these immersive experiences are reserved for moments that truly matter, always allowing an easy return to reality.
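Conceptually, the three levels above can be thought of as a single “immersion” setting that controls how much of the real surroundings stays visible. The sketch below is a language-agnostic illustration of that idea, not the actual visionOS API, and the opacity values are made up:

```python
from enum import Enum

class Immersion(Enum):
    WINDOW = 1    # app window floating in front of the user
    PANORAMA = 2  # panoramic window wrapping around the user
    FULL = 3      # completely surrounding VR experience

def passthrough_opacity(level: Immersion) -> float:
    """How much of the real surroundings stays visible (illustrative values)."""
    return {Immersion.WINDOW: 1.0, Immersion.PANORAMA: 0.4, Immersion.FULL: 0.0}[level]
```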
Authenticity Is Key
In Vision OS, authenticity is paramount. Apps are expected to be engaging, immersive, and make sense for this new device. It’s not just about creating something new; it’s about creating something that feels right for the platform.
Conclusion
Vision OS is Apple’s bold statement in the mixed reality arena, blending the familiar with innovative spatial design. With its human-centric approach, dimensional depth, varying levels of immersion, and emphasis on authenticity, Vision OS is poised to revolutionize how we interact with technology. It’s more than an operating system; it’s a new way to experience the digital world.
Sources
- https://developer.apple.com/design/
- https://developer.apple.com/design/human-interface-guidelines/ornaments/
- https://developer.apple.com/design/human-interface-guidelines/immersive-experiences/
- https://developer.apple.com/design/human-interface-guidelines/designing-for-visionos/
- https://developer.apple.com/videos/play/wwdc2023/10072/
- https://www.figma.com/community/file/1253443272911187215
XR 6 // UX in Mixed Reality
Physical Considerations
- Environmental Interface: Designers must consider the entire surrounding environment as a potential interface, moving beyond the confines of a flat screen.
- Comfortable Focusing Range: Interactive elements should be placed within a range of half a meter to 20 meters, the comfortable focusing distance for human eyes.
- Beyond Reach: For interacting with objects more than 20 meters away, MR utilizes tools like handheld controllers or technologies such as eye tracking and hand recognition.
Eye Movement
The human eye comfortably moves 30°-35° in all directions, creating a field of view (FoV) of about 60°. Key UI elements should be placed within this range for easy accessibility.
Arms Reach
The average arm’s length is 50–70 cm. Essential interactions should be positioned at this distance for ease of use.
Designing for Distance
Drawing from Kharis O’Connell’s “Designing for Mixed Reality”, the interaction space is divided into three layers (sketched in code after the list):
- Interaction Plane: Core UI elements are placed within arm’s reach.
- Mid-Zone: For placement of virtual objects in MR.
- Legibility Horizon: The limit for comfortable focus and reading, approximately 20 meters. Beyond this, only images should be used.
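Put together with the comfortable focusing range above, these layers boil down to a simple distance check. A minimal sketch, with the arm-reach and horizon values taken from the numbers in this post:

```python
ARM_REACH_M = 0.7            # upper end of the 50-70 cm average arm's length
LEGIBILITY_HORIZON_M = 20.0  # limit for comfortable focus and reading

def placement_zone(distance_m: float) -> str:
    """Classify a UI element's distance into the three layers described above."""
    if distance_m < 0.5:
        return "too close"          # inside the minimum comfortable focusing distance
    if distance_m <= ARM_REACH_M:
        return "interaction plane"  # core UI, directly reachable
    if distance_m <= LEGIBILITY_HORIZON_M:
        return "mid-zone"           # virtual objects, text still legible
    return "beyond horizon"         # images only, no text
```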
Addressing User Fatigue
- Ease of Exit: Always provide a straightforward method to exit or pause, like a button.
- Save Functionality: Allow users to save progress to prevent data loss and alleviate exit anxiety.
Scaling and Interaction
- Button Size: Ensure buttons are large enough, with a minimum size of 2 centimeters (see the quick check after this list).
- Natural Interactions: Mimic real-world interactions, like picking up a mug by its handle.
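The 2 cm minimum becomes more tangible once converted into a visual angle. A quick check, using the standard visual-angle formula and an arm’s-length viewing distance:

```python
import math

def visual_angle_deg(size_m: float, distance_m: float) -> float:
    """Visual angle subtended by an object of size `size_m` at `distance_m`."""
    return math.degrees(2 * math.atan(size_m / (2 * distance_m)))

print(visual_angle_deg(0.02, 0.6))  # a 2 cm button at ~60 cm subtends ~1.9 degrees
```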
Poses and Gestures
- Clear Instructions: Given the novelty of MR, provide explicit instructions for poses and gestures.
- Simplicity: Use poses and gestures sparingly to avoid overwhelming users.
Feedback and Guidance
- System Feedback: Implement feedback mechanisms like haptic feedback or color changes when interacting with virtual elements.
- Clear Guidance: Offer concise and clear instructions, crucial in the unfamiliar terrain of MR.
Mixed Reality is not just a new technology; it’s a new way of interacting with our world. As we design for MR, we must consider the unique physical and perceptual aspects of this medium. By focusing on intuitive interactions, comfortable viewing distances, and clear instructions, we can create MR experiences that are not only engaging but also accessible and user-friendly. The future of MR is bright, and as designers and technologists, it’s our responsibility to pave the way for this exciting new era of digital interaction.
Sources
- https://medium.com/ux-planet/ux-101-for-virtual-and-mixed-reality-part-2-working-with-the-senses-c39fbd502494
- https://medium.com/ux-planet/ux-101-for-virtual-and-mixed-reality-part-1-physicality-3fed072f371
- “Designing for Mixed Reality” by Kharis O’Connell
- https://www.inderscienceonline.com/doi/abs/10.1504/IJTMKT.2019.104600
XR 5 // Interesting Case Study: HoloStudio UI and Interaction Design by Microsoft
This case study from Microsoft’s HoloStudio highlights the unique challenges and innovative solutions in designing UI and interaction experiences in mixed reality. It emphasizes the importance of user comfort, non-intrusive alerts, and seamless interaction between UI elements and holograms.
For more in-depth insights and details, you can read the full case study on Microsoft’s official website.
Problem 1: Reluctance to Move in a Virtual Environment
In HoloStudio, Microsoft initially designed the Workbench as a rectangle, akin to a real-world desk. However, they noticed a behavioral pattern: users were hesitant to move around. This reluctance was attributed to a lifetime of conditioning to stay still while working at a desk or computer. To counteract this, the Workbench was redesigned into a circular shape, eliminating the notion of a ‘front’ position. This encouraged users to move around and explore their 3D creations from all angles.
Key Learning: Comfort for the user is paramount. Essential UI elements, for instance, could be anchored to the virtual hand, reducing the need for physical movement to access them.
Problem 2: Disruptive Modal Dialogs
In 3D environments, traditional modal dialogs can be intrusive, popping up unexpectedly and disrupting the user experience. Microsoft experimented with various solutions and finally adopted a “thought bubble” system. This system used visual cues like pulsing tendrils to subtly direct user attention where needed in the application, avoiding the abruptness of traditional pop-ups.
Key Learning: Alerting users in 3D environments requires more finesse. Using attention directors like spatial sound, light rays, or thought bubbles can effectively guide users without being obtrusive.
Problem 3: UI Obstruction by Other Holograms
A common challenge in mixed reality is the obstruction of UI controls by other holograms. Microsoft’s initial solution of moving UI controls closer to the user proved uncomfortable, as it created a disconnect between the control and the associated hologram. The final solution was to ‘ghost’ the UI control at the same distance as its associated hologram, maintaining a sense of connection while ensuring visibility and accessibility.
Key Learning: Accessibility of UI controls is crucial, even when obstructed. Innovative solutions are needed to ensure users can interact with holograms and controls seamlessly in the mixed reality environment.
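The ghosting idea reduces to simple vector math: keep the control in its own direction from the user, but push it out to the same distance as its hologram. A minimal sketch of that geometry (numpy vectors; names chosen for illustration, not Microsoft’s actual implementation):

```python
import numpy as np

def ghost_control_position(head_pos, control_dir, hologram_pos):
    """Place a UI control along its own direction from the user's head,
    but at the same distance as the hologram it belongs to."""
    hologram_distance = np.linalg.norm(hologram_pos - head_pos)
    direction = control_dir / np.linalg.norm(control_dir)
    return head_pos + direction * hologram_distance
```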
In the following blog post, I will go more in-depth into the topic of UX / UI in mixed reality.