What this blog post is all about
Building upon my previous blog post on immersive level design, this post explores the topic of immersion and interaction within VR, MR and AR in more depth by looking into the different interface and interactivity solutions currently available, some already mentioned, some new, that may increase intuitiveness and engagement on the user's side. While AR and MR solutions are still covered, the main focus of this blog post lies on VR, as it is more in line with my future plans and planned research.
Tracking, controllers, recognition and other interaction methods
With a wide variety of different VR, AR and MR headsets and technologies comes an equally wide variety of input devices, interfaces and ways to interact with the created virtual environment. In VR alone, there exists a seemingly endless number of different controllers, with each headset developer putting their own spin on them.
Different headsets, different controllers – HTC Vive Pro, Meta Quest 3, PS Move and Valve Index (left to right)
However, controllers like these are by no means the sole means of interacting with virtual environments anymore: with advancements in tracking, movement recognition and voice recognition, a vast variety of input and interface methods has been developed alongside conventional controller-based inputs.
Hands, eyes, voice and haptic feedback
As previously mentioned, constant advancements in available computing power, frequent optimisations as well as new technologies make it possible to create virtual experiences that are more immersive than ever.
One such advancement lies in tracking and in how the tracked movement data gets processed. While hand and gesture tracking has long been a staple especially of AR headsets, due to their inbuilt sensors and tracking, it has also become one in VR and MR applications. Differentiating between hand-tracking, controller-based tracking and gesture-tracking, more commonly known as gesture recognition, all of which may appear similar at a glance, is quite simple: hand-tracking, as the name suggests, tracks the actual movement of the hand within the virtual space.
Ultraleap’s 3Di, a small tracking camera for hand-tracking, comes with its own integrated interface
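How this raw tracking data gets processed matters just as much as capturing it: tracked joint positions are typically noisy, so they are usually smoothed before use. Below is a minimal sketch of one common approach, exponential smoothing; the sample values are invented and the code is not tied to any specific SDK.

```python
# A minimal sketch of exponential smoothing for raw hand-tracking samples,
# a common post-processing step to reduce sensor jitter before rendering.
def smooth(samples: list[tuple[float, float, float]], alpha: float = 0.3):
    """Blend each raw (x, y, z) sample with the previous smoothed one.
    Lower alpha means smoother but laggier hand motion."""
    smoothed = [samples[0]]
    for raw in samples[1:]:
        prev = smoothed[-1]
        smoothed.append(tuple(alpha * r + (1 - alpha) * p
                              for r, p in zip(raw, prev)))
    return smoothed

# Invented, slightly jittery samples of an index fingertip, in metres.
raw = [(0.100, 1.200, -0.300), (0.104, 1.198, -0.302), (0.098, 1.203, -0.299)]
print(smooth(raw)[-1])  # the smoothed position lags slightly behind the raw data
```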
Unlike controller-based tracking, hand-tracking keeps the hands free for interactions, without relying on buttons or other physical inputs. Controller-based tracking, in comparison, also tracks the hands' movement, but instead of doing so directly, it tracks the hand-held controllers. These controllers usually come with a wide variety of buttons, joysticks and other triggers that can be programmed and used to interact with the environment and to input information.
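To show what "programmable" means in practice, here is a minimal sketch of an input-mapping layer that turns raw button and trigger states into abstract actions. All names (ControllerState, the button and action labels, the 0.8 trigger threshold) are hypothetical placeholders for what a real runtime such as OpenXR would provide.

```python
from dataclasses import dataclass, field

# Hypothetical snapshot of a tracked controller for one frame; a real
# runtime (e.g. OpenXR) would fill in a structure like this each frame.
@dataclass
class ControllerState:
    position: tuple[float, float, float]      # tracked position in metres
    trigger: float = 0.0                      # analog trigger, 0.0 .. 1.0
    joystick: tuple[float, float] = (0.0, 0.0)
    buttons: dict[str, bool] = field(default_factory=dict)

# Map raw inputs to abstract actions, so game logic never sees raw buttons.
BINDINGS = {
    "a_button": "jump",
    "b_button": "open_menu",
}

def poll_actions(state: ControllerState) -> list[str]:
    """Translate one frame of controller state into abstract actions."""
    actions = [action for button, action in BINDINGS.items()
               if state.buttons.get(button)]
    if state.trigger > 0.8:          # analog trigger squeezed past grab threshold
        actions.append("grab")
    return actions

# Example frame: the user squeezes the trigger and presses 'A'.
frame = ControllerState(position=(0.1, 1.2, -0.3), trigger=0.9,
                        buttons={"a_button": True})
print(poll_actions(frame))           # ['jump', 'grab']
```

Keeping such a mapping layer between hardware and application logic also makes it easy to rebind controls per headset or per user.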
Last but not least, gesture recognition interprets specific hand movements or gestures made by the user and reacts to them in specific ways, allowing interaction with and control over certain parts of the virtual space. It can be understood as a specific form of hand-tracking, as specific parts of the hand get tracked, though in this case the gesture made is usually more important than the position of the hand relative to the rest of the body.
Ultraleap Leap Motion Controller 2, a gesture tracking controller with a wide variety of applications
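To make the distinction concrete, the sketch below implements gesture recognition on top of hand-tracking data: it detects a simple pinch from two tracked fingertip positions. The joint names, coordinates and the 3 cm threshold are assumptions for illustration, not values from Ultraleap's actual SDK.

```python
import math

# Hypothetical per-frame hand-tracking output: joint name -> (x, y, z) in
# metres. A real tracker delivers many more joints per hand than this.
HandFrame = dict[str, tuple[float, float, float]]

def is_pinching(frame: HandFrame, threshold_m: float = 0.03) -> bool:
    """Gesture recognition on top of hand-tracking: a 'pinch' is simply
    thumb tip and index tip closer than ~3 cm, regardless of where the
    hand sits relative to the rest of the body."""
    return math.dist(frame["thumb_tip"], frame["index_tip"]) < threshold_m

# Two example frames: fingers apart vs. fingers together.
open_hand  = {"thumb_tip": (0.00, 1.20, -0.30), "index_tip": (0.08, 1.25, -0.30)}
pinch_hand = {"thumb_tip": (0.00, 1.20, -0.30), "index_tip": (0.01, 1.21, -0.30)}
print(is_pinching(open_hand))   # False
print(is_pinching(pinch_hand))  # True
```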
While it may now seem that the main focus of current interaction lies in tracking the movement of extremities, mainly the hands, this is not the whole picture. Eye tracking, for example, is a gaze-based form of interaction that uses sensors to follow the user's eye movements, enhancing realism, allowing interaction and rendering specific parts of the scene in more or less detail, thus deepening immersion as needed. Meanwhile, assistants like Amazon's Alexa, Microsoft's Cortana or Google's own Voice Assistant have long been usable in VR and MR as well, to control and interact with the virtual environment using vocal commands. Combining these different tracking technologies can make the user's environment feel much more responsive.
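The selective rendering mentioned above is commonly called foveated rendering: full detail only where the gaze actually lands. Below is a minimal sketch of a fixed-threshold variant; the angle thresholds are illustrative and not taken from any real headset.

```python
import math

def angle_between(gaze: tuple[float, float, float],
                  to_object: tuple[float, float, float]) -> float:
    """Angle in degrees between the gaze direction and the direction to an object."""
    dot = sum(g * o for g, o in zip(gaze, to_object))
    norm = (math.sqrt(sum(g * g for g in gaze))
            * math.sqrt(sum(o * o for o in to_object)))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def detail_level(gaze_dir, object_dir) -> str:
    """Foveated rendering in a nutshell: full detail only where the user looks."""
    angle = angle_between(gaze_dir, object_dir)
    if angle < 10:       # foveal region: render at full resolution
        return "high"
    if angle < 30:       # near periphery: reduced detail
        return "medium"
    return "low"         # far periphery: cheapest representation

# The user looks straight ahead; one object ahead, one far off to the side.
print(detail_level((0, 0, -1), (0.05, 0.0, -1.0)))  # 'high'
print(detail_level((0, 0, -1), (1.0, 0.0, -0.2)))   # 'low'
```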
But it is not only the user's inputs that can be enhanced by new technologies to create a more immersive experience. Using haptic feedback systems, spatial computing and hardware solutions that create the illusion of seemingly endless virtual spaces even within a very limited physical environment, the immersion of the user and the responsiveness of the environment can be increased even further. While haptic feedback gloves have already been mentioned in a previous blog post, it is important to note that haptic feedback, in the broader sense, is no longer limited to the hands.
teslasuit – a full-body haptic feedback suit that also tracks movement and can be used for motion capture
Haptic feedback suits, like the one shown above, can provide live responses of the virtual environment to the user via vibrations, creating the illusion of a physical component being present in the environment. Furthermore, spatial computing, especially in combination with multimedia rooms like TU Graz's VR Cave, can be used to merge physical and digital worlds more seamlessly, allowing physical objects to be tracked and to influence the virtual, while also allowing virtual objects to interact with the real environment. Additional hardware that allows for extended movement of the user through the virtual space even when limited by smaller real spaces, like an omnidirectional treadmill, can further blur the line between the virtual and the real.
Virtuix Omni One – an omnidirectional treadmill developed for gaming in VR
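As a rough sketch of how the vibration responses described above might be driven, the example below maps a virtual impact to per-zone intensities that fall off with distance from the contact point. The zone layout, positions and falloff are invented for illustration; a real suit like the teslasuit exposes its own SDK for this.

```python
import math

# Hypothetical vibration zones of a haptic suit: name -> (x, y, z) position
# on the body, in metres. A real suit has far more zones than this.
ZONES = {
    "chest":     (0.0, 1.3, 0.1),
    "back":      (0.0, 1.3, -0.1),
    "left_arm":  (-0.4, 1.3, 0.0),
    "right_arm": (0.4, 1.3, 0.0),
}

def impact_response(contact: tuple[float, float, float],
                    strength: float, falloff_m: float = 0.5) -> dict[str, float]:
    """Map a virtual impact to per-zone vibration intensities (0.0 .. 1.0).
    Zones closer to the contact point vibrate more strongly, which is what
    creates the illusion of a physical object actually touching the user."""
    intensities = {}
    for zone, pos in ZONES.items():
        distance = math.dist(contact, pos)
        intensities[zone] = round(max(0.0, strength * (1 - distance / falloff_m)), 2)
    return intensities

# Example: a virtual ball hits the user's right shoulder at full strength.
print(impact_response((0.35, 1.3, 0.0), strength=1.0))
```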
Things to consider
When presented with all these different options for tracking the user's input, and with the data they provide, it can be very easy to get overwhelmed or to lose track of why these movements should be tracked and/or used in the first place: namely, to provide a smooth, fun and immersive experience that allows a maximum amount of user interaction with a minimum amount of effort on the user's side. To ensure that, there are a few important points to consider when designing such an interaction.
Abstraction, intuitiveness, optimisation and sense of security
A good first step when approaching the design of user interactions and interfaces for the virtual is to mimic the real world and its interactions, increasing both intuitiveness for the user and clarity when providing feedback. By adapting the sense of realism, or choosing a certain level of abstraction for the inputs and/or interfaces, they can be simplified and made to fit the desired experience without distracting the user.
Frequent user testing, followed by refinement and optimisation of the employed systems, can increase responsiveness and accessibility and create a sense of security in the user when confronted with the virtual environment. Furthermore, the more consistent the created content, both in design and in experience, and the more seamless the transition between the physical and the virtual, the easier it is for the user to engage, which also boosts confidence and a feeling of safety.
All in all, by making use of the different technologies described above, while being aware of the challenges and opportunities they bring, and by optimising and adapting the desired experience to the user's needs, it is already possible to create amazingly responsive environments. It is, however, still important to keep the ever-present limitations of current hardware in mind; yet with how rapidly technology and development keep progressing, the next solution might already be around the corner.
Next steps:
- Look further into different VR and MR solutions and their respective issues
- Research essential tools for creating immersive virtual environments as well as different game engines and their advantages and disadvantages
- Check methods of engagement and interaction within these digital environments
- Look into accessibility and how to ensure it
- Research into immersion and storytelling
Sources:
1. Springer Gabler: Virtuelle Realität, in: Gabler Wirtschaftslexikon, n.y., https://wirtschaftslexikon.gabler.de/definition/virtuelle-realitaet-54243/, online in: https://wirtschaftslexikon.gabler.de/ [08.02.2024].
2. n.a.: Was ist Augmented Reality?, in: Omnia360, 2020, https://omnia360.de/blog/was-ist-augmented-reality/, online in: https://omnia360.de/ [08.02.2024].
3. n.a.: Mixed Reality: Wenn sich Reales und Virtuelles vermischt, in: Omnia360, 2023, https://omnia360.de/blog/mixed-reality/, online in: https://omnia360.de/ [08.02.2024].
4. n.a.: Extended Reality, in: RyteWiki, n.y., https://de.ryte.com/wiki/Extended_Reality, online in: https://de.ryte.com/wiki/Hauptseite [08.02.2024].
5. Hayden, S.: Vision Pro Teardown Shows Balancing Act Between Cutting Edge Tech & Weighty Design, in: ROADTOVR, 2024, https://www.roadtovr.com/apple-vision-pro-teardown-ifixit/, online in: https://www.roadtovr.com/ [08.02.2024].
6. Hayden, S.: Quest 3 Teardown Shows Just How Slim the Headset Really Is, in: ROADTOVR, 2023, https://www.roadtovr.com/meta-quest-3-teardown-ifixit-repair/, online in: https://www.roadtovr.com/ [08.02.2024].
7. Hayden, S.: Vive Ultimate Tracker Gets Beta Support for Third-Party PC VR Headsets, in: ROADTOVR, 2024, https://www.roadtovr.com/vive-ultimate-tracker-quest-index-pico/, online in: https://www.roadtovr.com/ [08.02.2024].
8. n.a.: What to Watch: February 2024 Highlights, in: Meta Quest-Blog, 2024, https://www.meta.com/de-de/blog/quest/what-to-watch-free-meta-quest-tv-vr-film, online in: https://www.meta.com/at/ [08.02.2024].
9. CGV Channel: TU Graz Virtual Reality Cave, https://www.youtube.com/watch?v=aeTHlAZtlAI [08.02.2024].