Design Principles and User Interfaces of Erkki Kurenniemi’s Electronic Musical Instruments of the 1960’s and 1970’s

The paper "Design Principles and User Interfaces of Erkki Kurenniemi’s Electronic Musical Instruments of the 1960’s and 1970’s" by Mikko Ojanen et al., published at the NIME 2007 conference, offers a comprehensive examination of the innovative electronic musical instruments designed by Erkki Kurenniemi. These instruments, created in the 1960s and 1970s, were avant-garde for their time, significantly influencing the field of electronic music with their experimental approaches to user interfaces and interaction designs.

DIMI-A

Kurenniemi’s instruments, like the DIMI series, utilized digital logic and unconventional interactive control methods that departed from traditional musical interfaces. His creations incorporated novel ideas such as real-time control and non-traditional input mechanisms (like biofeedback and video interfaces), and they were designed with an experimental ethos that challenged conventional musical instrument design. This approach allowed for a unique artistic expression that mirrored the radical cultural shifts occurring during that era.

However, despite their innovative designs, Kurenniemi’s instruments faced challenges related to user accessibility and market acceptance. The complexity and unfamiliarity of the interfaces possibly hindered wider adoption and commercial success. This reflects a common tension in design between innovation and usability—groundbreaking ideas may not always align with user expectations or needs, which can impact their practical application and acceptance.

Critically, this paper not only documents historical innovations but also provides insights into the relationship between technological capabilities and artistic expression in electronic music. It emphasizes the importance of understanding user interaction in the design of new musical instruments and the potential barriers innovators may face when their creations precede current technological and cultural readiness.

Overall, the paper serves as a valuable resource for understanding the evolution of electronic musical instruments and offers critical lessons on the impact of user interface design on the adoption of new technologies in art. It’s a recommended read for those interested in the intersection of music technology, design, and user experience.

XR 10 // Wrap up

As we close the chapter on a semester filled with extensive learning about Extended Reality (XR), it’s an opportune moment to reflect on the ground we’ve covered and to anticipate the exciting journey ahead. Our exploration has spanned a diverse range of topics, each offering a unique perspective on the rapidly evolving world of XR.

  1. XR 1// Evolving Interaction and Experiences in XR: I began this journey by delving into the dynamic world of XR interactions, examining how they’ve transformed over time and what the future holds.
  2. XR 2 // The Evolution and Current Landscape of Extended Reality: This post offered a comprehensive view of XR’s growth trajectory and its current state, setting the stage for the subsequent deep dives.
  3. XR 3 // VR Interactions: Controller vs Body Tracking: A detailed comparison between controller-based and body-tracking interfaces in VR, highlighting their unique advantages and potential applications.
  4. XR 4 // Enhancing Virtual Reality: The Power of Haptic Feedback: I explored the sensory dimension of VR, focusing on how haptic feedback intensifies the immersive experience.
  5. XR 5 // Interesting Case Study: HoloStudio UI and Interaction Design by Microsoft: This case study provided insights into practical applications of XR, emphasizing user interface and interaction design.
  6. XR 6 // UX in Mixed Reality: I discussed the intricacies of user experience design in the mixed reality spectrum, emphasizing its importance in creating engaging and intuitive applications.
  7. XR 7 // Dive into Vision OS Guidelines: This post was dedicated to understanding the best practices and guidelines in designing for XR platforms, particularly the Vision OS.
  8. XR 8 // Beyond Gaming: XR in the Entertainment Industry: I expanded our view to see how XR is revolutionizing the broader entertainment sector, beyond just gaming.
  9. XR 9 // XR in the Military Complex: My exploration concluded with an examination of XR’s applications in military training and strategy, showcasing its diverse utility.

What now?

Last semester’s curriculum primarily revolved around theoretical aspects. For the upcoming semester, I aim to adopt a more practical approach towards the subject. This will involve actively engaging with various XR applications and models for comparative analysis and learning. Additionally, I plan to implement the concepts learned in my own projects and create case studies for them.

XR 9 // XR in the Military Complex

In this semester’s research project, I explored various facets of Extended Reality (XR). Today, let’s delve into a lesser-discussed but significant aspect: the role of XR in the military defense complex. Notably, the military sector is one of the leading investors and developers in this technology.

Training with Mixed Reality

Mixed Reality (MR), blending elements of Augmented Reality (AR) and Virtual Reality (VR), has revolutionized military training. Historically, MR applications like the Swiss tank-driving simulator from the 1970s laid the groundwork for today’s sophisticated systems.

One prominent example is the U.S. Army’s Synthetic Training Environment (STE). This advanced system merges virtual and augmented reality to simulate a wide range of scenarios, from urban warfare to counterinsurgency operations, thus providing immersive training experiences. The STE is a comprehensive platform integrating virtual, live, and collective training elements, designed to be portable and cost-effective. It includes the Reconfigurable Virtual Collective Trainer (RVCT), which offers training for various military vehicles and infantry units.

Mixed Reality in Combat

MR’s role extends beyond training to actual combat operations. It significantly enhances situational awareness by providing soldiers with real-time information through contextually relevant visualizations. This includes displaying crucial data like maps, navigation, and enemy locations seamlessly.

Soldiers training with Microsoft’s IVAS

A key development in this area is the Integrated Visual Augmentation System (IVAS), a collaborative effort between Microsoft and the U.S. Army. Based on Microsoft’s HoloLens technology, IVAS delivers advanced capabilities such as rapid target acquisition, enhanced situational awareness, and improved navigational tools. It integrates various technologies like thermal imagery, sensors, GPS, and night vision to give soldiers a comprehensive view of the battlefield. This technology is not only pivotal for training but also holds immense potential for real-world combat operations, allowing soldiers to plan and execute missions with enhanced precision and information.

Support Functions

MR’s applications in the military also extend to support functions. It can transform maintenance and repair processes by overlaying relevant instructions onto real-world objects, aiding technicians and mechanics in performing tasks more efficiently.

In medical support and telemedicine, MR can overlay digital content such as instructions and patient data, facilitating accurate and efficient medical procedures in challenging environments.

Conclusion

MR technology is a game-changer in military applications, enhancing various aspects of operations. While it offers immense benefits in training, situational awareness, and support functions, there are challenges to consider. For instance, overreliance on technology can lead to operational inefficiencies if not managed properly. The concept of "HUD-Cripple," prevalent among Navy aviators, highlights the risk of becoming overly dependent on technological aids to the extent that performance without them is significantly impaired.

Moreover, the use of MR in combat situations introduces ethical dilemmas around warfare conduct. The enhanced capabilities provided by MR could lead to debates about the fairness and humanitarian implications of using such advanced technology in conflicts. This necessitates a balance between technological advancement and adherence to international warfare norms and ethics.

The responsibility of XR designers and developers in this context is profound. They must not only focus on the technical and functional aspects of MR systems but also consider their broader societal implications. This includes ensuring that the technology is used responsibly and in accordance with ethical standards. Designers and developers need to collaborate closely with military experts, ethicists, and psychologists to understand the full spectrum of impacts their creations might have. Furthermore, there should be ongoing assessment and adjustment of these technologies to align with evolving ethical standards and societal values.

As we venture further into this technologically advanced era, the responsibility of XR professionals extends beyond innovation, encompassing the ethical stewardship of their creations in the complex domain of military applications.

XR 8 // Beyond Gaming: XR in the Entertainment Industry

XR is a technology that has been gaining popularity in the entertainment industry. While gaming is a major part of XR, this post explores its other applications in entertainment, such as virtual concerts, immersive theater, and interactive art installations.

Virtual Concerts: A New Stage for Artists and Fans

Imagine attending a concert by your favorite artist from the comfort of your living room, yet feeling as though you’re right there in the front row. XR makes this possible. Virtual concerts in XR are not just about streaming a live performance; they are about creating an immersive, interactive experience. Fans can choose different viewpoints, interact with the environment, and even feel the vibration of the music through haptic feedback technology.

Artists like Travis Scott and Marshmello have already experimented with these concepts, drawing millions of virtual attendees. These events aren’t just concerts; they’re hyper-realistic experiences blending music, visual art, and digital interaction.

Meta is also pushing strongly in this direction by hosting live concerts on its Meta Quest platform. For example, a live concert by Imagine Dragons will take place on June 15th on this platform.

Immersive Theater: Blurring the Lines Between Audience and Performer

Theater has always been an immersive experience, but XR takes this immersion to a new level. Unlike traditional theater, where the audience is a passive observer, XR theater can make viewers a part of the performance. Through VR headsets or AR applications, audience members can experience different narratives from multiple perspectives, interact with the performers, or even influence the outcome of the story.

Companies like Punchdrunk and Magic Leap are pioneering in this space, creating experiences where the line between audience and performer is blurred, leading to a more intimate and personal form of storytelling.

Interactive Art Installations: Stepping into the Canvas

Art has always been a medium for expression and experience, but XR adds an interactive dimension to it. Interactive art installations using XR technologies allow viewers to step into the artwork, manipulate elements, and experience the art in a multi-sensory manner. This form of art is not just to be seen but to be experienced and interacted with.

Artists like Refik Anadol and teamLab are at the forefront, creating stunning visual landscapes that respond to and evolve with viewer interactions. These installations are not static; they are dynamic and alive, offering a personalized experience to every viewer.

Conclusion: A New Era of Entertainment

XR in entertainment is more than a technological advancement; it’s a paradigm shift in how we experience art, music, and storytelling. It’s about creating worlds that we can step into, interact with, and be a part of. As we look to the future, the possibilities are boundless. We’re not just witnessing a change in entertainment; we’re participating in the birth of entirely new forms of expression and experience.

This is just the beginning. As XR technologies continue to evolve, we can expect to see even more innovative and immersive experiences that challenge our perceptions of reality and entertainment. The future of entertainment is here, and it’s virtual, augmented, and mixed.

XR 7 // Dive into Vision OS Guidelines

Apple is stepping into the future with its highly anticipated mixed reality headset, introducing a groundbreaking operating system: Vision OS. This isn’t just another tech release; it’s a glimpse into what could shape the future of mixed reality. Diving into Apple’s developer resources, we’re offered a preview of this innovative landscape. Interestingly, Apple steers clear of terms like "Virtual Reality" or "Augmented Reality," opting for "Spatial Design." This isn’t just a play on words; it’s a strategic move to set Vision OS apart from other mixed reality platforms.

A Familiar Yet Revolutionary Interface

Vision OS brings a familiar feel to an entirely new dimension. The interface mirrors iPad app designs but in a dynamic 3D space. It’s not just about aesthetics; it’s about functionality. Windows in Vision OS adapt to lighting conditions and introduce intuitive controls for movement, resizing, and closing. The system also integrates extra depth layers to establish a hierarchy between elements, all while maintaining spacing akin to iPad apps. Apple’s strategy here is clear: use familiar paradigms and patterns to ease users into this new spatial environment.

Human-Centric Design

At its core, Vision OS is designed with a keen focus on human interaction. The view plane is usually centered and horizontally aligned, matching the natural human line of sight. Apple makes a notable design choice here: windows don’t follow your head movements; they’re anchored in 3D space. Familiar gestures, like pinch to zoom, are still part of the experience, offering users various ways to interact with apps.

Dimensional Depth

In Vision OS, apps are designed to be in tune with the user’s real-world surroundings. Apple emphasizes UI elements crafted from a glass-like material, allowing backgrounds to subtly shine through and create a sense of real 3D objects blending into the room. Controls and menus are thoughtfully positioned closer to the user, making them more accessible and easier to perceive. Apple’s attention to detail extends to how windows and apps interact with their environment, casting realistic shadows and emitting light. The depth usage is subtle, and the windows closer to the user are smaller, enhancing the sense of spatial awareness.

Immersive Experiences

Vision OS categorizes immersion into three levels:

  1. An app window floating in front of the user
  2. A panoramic window wrapping around the user
  3. A fully surrounding VR experience

The system smartly fades out the surrounding environment to focus on the selected window. With 3D audio enhancement, these immersive experiences are reserved for moments that truly matter, always allowing an easy return to reality.
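
Apple’s actual tooling for these levels lives in Swift and RealityKit, which I won’t reproduce here. As a conceptual analogy, though, WebXR offers a similar spectrum through its session modes. A minimal TypeScript sketch, assuming a browser with WebXR and its type definitions available:

```typescript
// Conceptual analogy only: Vision OS itself is programmed against Apple's
// Swift/RealityKit stack. WebXR spans a similar range: an 'inline' session
// renders inside a page (like a floating window), while 'immersive-vr'
// completely surrounds the user.

async function startExperience(
  level: "windowed" | "fully-immersive"
): Promise<XRSession | null> {
  const xr = navigator.xr;
  if (!xr) return null; // WebXR unavailable in this browser

  const mode: XRSessionMode = level === "fully-immersive" ? "immersive-vr" : "inline";
  if (!(await xr.isSessionSupported(mode))) return null;

  // Optional features are negotiated up front when the session starts.
  return xr.requestSession(mode, { optionalFeatures: ["local-floor"] });
}
```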

Authenticity Is Key

In Vision OS, authenticity is paramount. Apps are expected to be engaging, immersive, and make sense for this new device. It’s not just about creating something new; it’s about creating something that feels right for the platform.

Conclusion

Vision OS is Apple’s bold statement in the mixed reality arena, blending the familiar with innovative spatial design. With its human-centric approach, dimensional depth, varying levels of immersion, and emphasis on authenticity, Vision OS is poised to revolutionize how we interact with technology. It’s more than an operating system; it’s a new way to experience the digital world.

XR 6 // UX in Mixed Reality

Physical Considerations

  • Environmental Interface: Designers must consider the entire surrounding environment as a potential interface, moving beyond the confines of a flat screen.
  • Comfortable Focusing Range: Interactive elements should be placed within a range of half a meter to 20 meters, the comfortable focusing distance for human eyes.
  • Beyond Reach: For interacting with objects up to 20 meters away, MR utilizes tools like handheld controllers or technologies such as eye tracking and hand recognition.

Eye Movement

The human eye comfortably moves 30°-35° in all directions, creating a field of view (FoV) of about 60°. Key UI elements should be placed within this range for easy accessibility.

Key elements are arranged within a FoV of ~60°

Arm’s Reach

The average arm’s length is 50–70 cm. Essential interactions should be positioned at this distance for ease of use.

Designing for Distance

Drawing from Kharis O’Connell’s “Designing for Mixed Reality”, the interaction space is divided into three layers:

  1. Interaction Plane: Core UI elements are placed within arm’s reach.
  2. Mid-Zone: For placement of virtual objects in MR.
  3. Legibility Horizon: The limit for comfortable focus and reading, approximately 20 meters. Beyond this, only images should be used (see the sketch after this list).
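
Here is a minimal TypeScript sketch of these three layers, using the distances from this post as zone boundaries; the exact cut-offs are assumptions to be tuned per device and content:

```typescript
// The three layers from Kharis O'Connell's model, with boundaries taken from
// the figures in this post; treat the exact cut-offs as tunable assumptions.

type Zone = "interaction-plane" | "mid-zone" | "legibility-horizon";

const ARMS_REACH_M = 0.7;      // upper end of the 50-70 cm average arm's length
const LEGIBILITY_LIMIT_M = 20; // beyond this, prefer images over text

function zoneForDistance(distanceM: number): Zone {
  if (distanceM <= ARMS_REACH_M) return "interaction-plane"; // core UI elements
  if (distanceM <= LEGIBILITY_LIMIT_M) return "mid-zone";    // placed virtual objects
  return "legibility-horizon";                               // images only
}

// Example: a label at 25 m falls beyond the legibility horizon.
console.log(zoneForDistance(25)); // "legibility-horizon"
```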

Addressing User Fatigue

  • Ease of Exit: Always provide a straightforward method to exit or pause, like a button.
  • Save Functionality: Allow users to save progress to prevent data loss and alleviate exit anxiety.

Scaling and Interaction

  • Button Size: Ensure buttons are large enough, with a minimum size of 2 centimeters (a sizing sketch follows this list).
  • Natural Interactions: Mimic real-world interactions, like picking up a mug by its handle.
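
One way to honor the 2-centimeter minimum as buttons move farther away is to keep their angular size constant. A hypothetical helper, assuming the guideline is meant for roughly arm’s length (~60 cm):

```typescript
// Hypothetical helper, not a published guideline formula: it assumes the 2 cm
// minimum applies at roughly arm's length (~0.6 m) and keeps the button's
// angular size constant as it moves farther away.

const MIN_BUTTON_M = 0.02;    // 2 cm minimum from the guideline above
const REFERENCE_DIST_M = 0.6; // assumed arm's-length viewing distance

function minButtonSizeAt(distanceM: number): number {
  return MIN_BUTTON_M * Math.max(1, distanceM / REFERENCE_DIST_M);
}

console.log(minButtonSizeAt(0.6)); // 0.02 -> 2 cm at arm's length
console.log(minButtonSizeAt(3));   // 0.1  -> 10 cm at 3 m, same apparent size
```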

Poses and Gestures

  • Clear Instructions: Given the novelty of MR, provide explicit instructions for poses and gestures.
  • Simplicity: Use poses and gestures sparingly to avoid overwhelming users.

Feedback and Guidance

  • System Feedback: Implement feedback mechanisms like haptic feedback or color changes when interacting with virtual elements.
  • Clear Guidance: Offer concise and clear instructions, crucial in the unfamiliar terrain of MR.

Mixed Reality is not just a new technology; it’s a new way of interacting with our world. As we design for MR, we must consider the unique physical and perceptual aspects of this medium. By focusing on intuitive interactions, comfortable viewing distances, and clear instructions, we can create MR experiences that are not only engaging but also accessible and user-friendly. The future of MR is bright, and as designers and technologists, it’s our responsibility to pave the way for this exciting new era of digital interaction.

XR 5 // Interesting Case Study: HoloStudio UI and Interaction Design by Microsoft

This case study from Microsoft’s HoloStudio highlights the unique challenges and innovative solutions in designing UI and interaction experiences in mixed reality. It emphasizes the importance of user comfort, non-intrusive alerts, and seamless interaction between UI elements and holograms.

For more in-depth insights and details, you can read the full case study on Microsoft’s official website.

Problem 1: Reluctance to Move in a Virtual Environment

In HoloStudio, Microsoft initially designed the Workbench as a rectangle, akin to a real-world desk. However, they noticed a behavioral pattern: users were hesitant to move around. This reluctance was attributed to a lifetime of conditioning to stay still while working at a desk or computer. To counteract this, the Workbench was redesigned into a circular shape, eliminating the notion of a "front" position. This encouraged users to move around and explore their 3D creations from all angles.

Circular environment to encourage users to move around.

Key Learning: Comfort for the user is paramount. Essential UI elements, for instance, could be anchored to the virtual hand, reducing the need for physical movement to access them.

Problem 2: Disruptive Modal Dialogs

In 3D environments, traditional modal dialogs can be intrusive, popping up unexpectedly and disrupting the user experience. Microsoft experimented with various solutions and finally adopted a "thought bubble" system. This system used visual cues like pulsing tendrils to subtly direct user attention where needed in the application, avoiding the abruptness of traditional pop-ups.

The "Thought Bubble" system included pulsing tendrils which provided a sense of direction, leading users to where their attention was needed in the app.
Dialogue Window guiding User to Action

Key Learning: Alerting users in 3D environments requires more finesse. Using attention directors like spatial sound, light rays, or thought bubbles can effectively guide users without being obtrusive.
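
As a rough illustration of such an attention director, the sketch below uses three.js (my assumption for the rendering layer; HoloStudio is not built on it) to keep a pulsing arrow in view that points toward the hologram needing attention:

```typescript
// Rough sketch with three.js (an assumption; HoloStudio uses its own engine).
// A small arrow stays near the bottom of the user's view and points toward an
// off-screen hologram, pulsing gently instead of popping up a modal dialog.
import * as THREE from "three";

function updateAttentionDirector(
  arrow: THREE.Mesh,
  camera: THREE.Camera,
  target: THREE.Vector3,
  timeS: number
): void {
  // Anchor the indicator ~1.5 m ahead of the camera, slightly below center.
  const anchor = new THREE.Vector3(0, -0.3, -1.5).applyMatrix4(camera.matrixWorld);
  arrow.position.copy(anchor);

  // Point the arrow at the hologram that needs attention.
  arrow.lookAt(target);

  // A soft opacity pulse draws the eye without demanding it.
  const material = arrow.material as THREE.MeshBasicMaterial;
  material.transparent = true;
  material.opacity = 0.6 + 0.4 * Math.sin(timeS * 4);
}
```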

Problem 3: UI Obstruction by Other Holograms

A common challenge in mixed reality is the obstruction of UI controls by other holograms. Microsoft’s initial solution of moving UI controls closer to the user proved uncomfortable, as it created a disconnect between the control and the associated hologram. The final solution was to "ghost" the UI control at the same distance as its associated hologram, maintaining a sense of connection while ensuring visibility and accessibility.

The solution: we ghosted the UI control, which both allowed interaction with the control and made it feel connected to the hologram it was affecting.
More Accessible UI

Key Learning: Accessibility of UI controls is crucial, even when obstructed. Innovative solutions are needed to ensure users can interact with holograms and controls seamlessly in the mixed reality environment.

In the following blog post, I will go more in-depth into the topic of UX/UI in mixed reality.

XR 4 // Enhancing Virtual Reality: The Power of Haptic Feedback

Virtual Reality (VR) has transformed from a futuristic concept into a practical tool in various fields. However, the true immersion in VR is not just about what we see or hear; it’s also about what we feel. This is where haptic feedback plays a crucial role.

Prototype Glove by Meta

The Essence of Haptic Feedback

Haptic feedback refers to the use of touch or force to communicate with users in a digital environment. In VR, this technology simulates the tactile experience, making virtual interactions more realistic and engaging.
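
As a concrete example, many VR controllers already expose simple haptics through WebXR. A minimal TypeScript sketch, noting that the exact actuator API (pulse() vs. playEffect()) varies by browser:

```typescript
// Minimal sketch using the WebXR/Gamepad APIs; actuator support and method
// names differ across browsers (pulse() vs. playEffect()), hence the
// defensive access below.

function buzzOnSelect(session: XRSession): void {
  session.addEventListener("select", (event: XRInputSourceEvent) => {
    const gamepad = event.inputSource.gamepad;
    if (!gamepad) return; // hand tracking exposes no gamepad, hence no rumble

    const actuator = (gamepad as any).hapticActuators?.[0];
    actuator?.pulse?.(0.8, 100); // ~80% intensity for 100 ms on virtual contact
  });
}
```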

Why Haptic Feedback Matters

  1. Improved Immersion and Realism: Haptic feedback bridges the gap between virtual and real experiences. It allows users to ‚feel‘ objects and textures in VR, enhancing the sense of presence in the virtual world.
  2. Enhanced Learning and Training: In educational and training simulations, haptic feedback can significantly improve learning outcomes. For instance, medical students can practice surgeries in a VR environment, feeling the texture and resistance of virtual tissues and organs.
  3. Increased Accessibility: For individuals with visual impairments, haptic feedback opens up new possibilities in VR, allowing them to interact with and understand virtual environments through touch.

The Haptic Fidelity Framework

A recent study introduces the Haptic Fidelity Framework, an innovative tool designed to assess the realism of haptic feedback in VR. The framework is organized into Sensing, Hardware, and Software categories, encompassing 14 criteria that collectively define the quality and realism of haptic experiences.

The study’s evaluation of 38 papers using this framework reveals a strong correlation between the Haptic Fidelity score and the perceived realism of haptic feedback. This finding underscores the framework’s effectiveness in providing a standardized method to quantify haptic feedback’s impact in VR environments.

Scatter plot of the papers analyzed in the study.
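
The paper’s exact scoring method is not reproduced here; purely as an illustration of the idea, a naive aggregation might average the criterion ratings across the three categories:

```typescript
// Purely illustrative: the framework's 14 criteria span Sensing, Hardware and
// Software, but the paper's actual aggregation into a Fidelity score is not
// reproduced here. A naive version might simply average all criterion ratings.

interface FidelityRatings {
  sensing: number[];  // per-criterion ratings, scale assumed (e.g. 0-4)
  hardware: number[];
  software: number[];
}

function naiveFidelityScore(r: FidelityRatings): number {
  const all = [...r.sensing, ...r.hardware, ...r.software];
  return all.reduce((sum, v) => sum + v, 0) / all.length;
}
```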

This framework is a game-changer for VR developers. By using the Haptic Fidelity Framework as a guideline, developers can enhance the tactile dimension of VR, leading to more realistic and engaging user experiences.

Challenges and Future Directions

While haptic technology is promising, it faces challenges like the precise alignment of virtual and real-world interactions. Future research is focused on improving the accuracy and range of sensations that can be simulated.

XR 3 // VR Interactions: Controller vs Body Tracking

Virtual Reality (VR) has revolutionized the way we experience digital content, offering immersive and interactive environments. One important aspect of VR is the method of interaction, which can greatly impact the user experience and effectiveness of training applications. In this blog post, we will delve into the differences between controller-based and body tracking interactions in VR, their implications for training, and recommendations for interaction design.

The Importance of Natural Interaction

Research has shown that natural and intuitive interactions in VR enhance presence, immersion, and usability. When users can interact with virtual objects in a way that mimics real-world actions (e.g., by actually grabbing objects or pushing buttons with the virtual controllers), they can focus on the learning experience rather than learning how to use the equipment. This allows for a deeper level of engagement and better retention of information. As Abich and colleagues (2021) argue, training in VR is most useful when it allows for embodied, experiential interaction. The spectrum of interactions in VR ranges from "arbitrary" interactions (e.g., double-clicking a mouse to select an object) to "natural" interactions (e.g., rotating the hand clockwise to rotate a valve in the same direction). The more natural an interaction seems, the higher presence, immersion, and usability it affords.

Controller-Based Interactions

Controller-based interactions in VR function similarly to game console controllers. The difference lies in the tracking capabilities of VR controllers, which accurately represent the user’s hand position and movement in the virtual environment. This enables users to interact with virtual objects in a more precise manner. However, a challenge with controller-based interactions is the lack of standardized control schemes. Different VR applications may require users to interact with virtual objects using various methods, such as reaching out or using a laser pointer. This inconsistency can lead to confusion and cognitive load for users. It is important for developers to consider implementing standardized control schemes to provide a consistent and intuitive user experience across different VR applications.

Body Tracking Interactions

Advancements in technology have enabled body tracking interactions in VR, such as eye tracking and hand/body tracking. Hand tracking, in particular, allows users to interact with virtual objects in a more natural and intuitive manner, for example by using pinch gestures to select objects or pressing virtual buttons. Hand tracking can enhance the sense of presence and immersion in VR experiences. However, similar to controller-based interactions, there is a lack of standardized gestures for hand tracking. Users may need to learn specific gestures for different applications, which can be a barrier to intuitive interaction. To address this challenge, companies like Ultraleap are working towards establishing standards for gesture controls in VR. By introducing standardized gestures, users can intuitively know how to perform actions without the need for extensive tutorials or guesswork.
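
In WebXR, this distinction between controllers and hands is visible directly on the input sources: a tracked hand exposes a joint map, while a controller exposes a gamepad. A short sketch, assuming the session was requested with the 'hand-tracking' optional feature:

```typescript
// Sketch of the controller/hand branching in WebXR: a tracked hand exposes a
// `hand` joint map, a controller exposes a `gamepad`. Assumes the session was
// requested with the 'hand-tracking' optional feature.

function describeInputSources(session: XRSession): void {
  for (const source of session.inputSources) {
    if (source.hand) {
      // Joint poses enable natural gestures such as pinching to select.
      console.log(`Hand (${source.handedness}), ${source.hand.size} tracked joints`);
    } else if (source.gamepad) {
      // Buttons and axes follow the 'xr-standard' gamepad mapping.
      console.log(`Controller (${source.handedness}), ${source.gamepad.buttons.length} buttons`);
    }
  }
}
```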

Recommendations for Interaction Design

To optimize the user experience in VR, it is crucial to provide clear and concise tutorials for both controller-based and body tracking interactions. Instead of relying on textual instructions, show users how to perform actions through visual cues and demonstrations. This „show, don’t tell“ approach can help users quickly grasp the interaction methods without the need for extensive reading or trial and error. Additionally, consider placing important buttons and menus within reach of the user to minimize the need for physical movement within the virtual environment. For example, utilizing a popup menu on the wrist of one virtual controller can provide quick access to important functions without requiring users to navigate across the virtual room. These design considerations can help reduce cognitive load and ensure a seamless and intuitive user experience in VR.

Menu solution by Ultraleap
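
In WebXR terms, such a wrist menu boils down to re-anchoring the menu to the controller’s grip pose every frame. A sketch, where `menu` is a hypothetical stand-in for whatever scene object carries the UI:

```typescript
// Sketch of a wrist-anchored menu in WebXR: re-anchor the menu to the
// controller's grip pose every frame. `menu` is a hypothetical stand-in for
// whatever scene object carries the UI in your engine of choice.

function placeWristMenu(
  frame: XRFrame,
  refSpace: XRReferenceSpace,
  inputSource: XRInputSource,
  menu: { setPose(position: DOMPointReadOnly, orientation: DOMPointReadOnly): void }
): void {
  if (!inputSource.gripSpace) return; // e.g. gaze-only input has no grip space

  const pose = frame.getPose(inputSource.gripSpace, refSpace);
  if (!pose) return; // tracking can drop out for a frame

  menu.setPose(pose.transform.position, pose.transform.orientation);
}
```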

In conclusion, the choice between controller-based and body tracking interactions in VR depends on the specific application and the desired level of immersion and naturalness. Both interaction methods have their advantages and challenges, but with thoughtful design, standardized gestures, and concise tutorials, VR can offer truly immersive and effective training experiences. As VR technology continues to evolve, it is important for developers and researchers to collaborate in establishing best practices and standards for interaction design to unlock the full potential of VR for training and other applications.

XR 2 // The Evolution and Current Landscape of Extended Reality

From Sensorama to Apple Vision Pro: A Journey Through XR’s History

The Beginnings: Sensorama and the First HMD

The journey of Extended Reality (XR) dates back to 1956 when cinematographer Morton Heilig created Sensorama, the first Virtual Reality (VR) machine.

This innovative movie booth combined 3D, stereoscopic color video with audio, smells, and a vibrating chair, immersing viewers in a unique cinematic experience. Heilig’s pioneering work didn’t stop there; in 1960, he patented the first head-mounted display (HMD), merging stereoscopic 3D images with stereo sound, laying the groundwork for future VR technologies.

Early Steps in Augmented Reality: The Sword of Damocles

In 1968, the field of XR took another significant leap with Ivan Sutherland’s development of "The Sword of Damocles." Considered the first augmented reality (AR) HMD and tracking system, it aimed to enhance users’ perception of the world. Despite its primitive user interface and simple wireframe graphics, it marked a crucial step in the evolution of AR.

Mediated Reality and the Reality-Virtuality Continuum

In the 1970s, Steve Mann’s research into mediated reality, which later influenced tech giants like Google, Apple, and Microsoft, focused on augmenting human perception through digital overlays in the real world. Building on this concept, Paul Milgram and Fumio Kishino introduced the Reality-Virtuality continuum in 1994, illustrating a spectrum of experiences from purely real to purely virtual environments.

The 1990s: Pioneering AR and the Birth of Sportvision

The 1990s saw Thomas Caudell and David Mizell develop an early see-through HMD at Boeing, coining the term "augmented reality." In a significant mainstream breakthrough, Sportvision introduced its virtual yellow first-down marker in a live NFL broadcast in 1998, revolutionizing sports broadcasting.

Modern Advancements: The 2010s Onward

The 2010s heralded rapid advancements in XR technology. Key developments included:

  • First Oculus Rift Prototype: A milestone in VR technology.
  • 2014 – A Landmark Year: Sony and Samsung jumped into the VR headset market, while Google launched the affordable Cardboard VR viewer and the Google Glass AR glasses.
  • Microsoft’s HoloLens: Released in 2016, it introduced a more interactive AR experience, often referred to as „mixed reality.“
  • Pokémon GO: This 2016 game brought AR to the masses, demonstrating the technology’s mainstream appeal.

A Push into the Mainstream

Apple’s ARKit and Google’s ARCore made AR accessible on smartphones, broadening the technology’s reach. In 2017, the IKEA Place app showcased AR’s practical use in retail, allowing users to visualize furniture in their homes before purchasing.

The Current State: Meta Quest 3 and Apple Vision Pro

Today, we see state-of-the-art AR and VR combinations through devices like Meta Quest 3. The recent announcement of Apple Vision Pro signals a potential expansion in audience reach, acceptance, and continued research and development in mixed reality technologies.
