Undersea

Role: Lead UI/UX Designer

Overview

Undersea started from a simple demo where a single fish interacted with the user’s hand as it led them towards a human-sized spherical aquarium featuring a coral reef and many varieties of fish. Though that initial demo was fairly simple, it resonated with the team as an opportunity to showcase the potential of spatial computing. It was built using Unreal Engine 4 and Vulkan 3.1 mobile on Magic Leap One.

I was brought into the project to help design an experience that went beyond what had been achieved with the original prototype. I led UI/UX and Interaction Design for the experience, and I was responsible for all design documentation, user flows, menu mock-ups, and icon sets. I also produced usability test frameworks and formulated the surveys and questionnaires that a junior designer used to gather user insights for refinement.

This experience became part of an ongoing series of curated content, which helped internal and external developers advance their understanding of creative, technical, and production challenges that are unique to Spatial Computing platforms.

Crafting A Believable Experience

We defined our core experience pillars early in the process, and they guided the design goal of giving users an opportunity to immerse themselves in the experience as their room transformed into a coral reef biome.

  1. Observation – Give users an opportunity to sit back, suspend their disbelief, and become immersed in the experience as photo-realistic fish and coral merge with the physical environment.
  2. Exploration – Reward curiosity; as users move around their space, provide moments of delight and surprise.
  3. Interaction – Fulfill user expectations during encounters with photo-realistic digital creatures.

Transforming the User’s Space//

A unique challenge when designing for Spatial Computing (Mixed Reality) is not being able to predict the environment in which a user will launch the experience. Understanding this constraint is essential to the design process.

To populate the environment, we utilized a semantic understanding of the user’s space (walls, floors, etc.) to anchor the core components of the experience: Vistas and Clusters.

Vistas served as gateways into an underwater world, which enabled AI-driven creatures to freely navigate in and out of the user’s space. These were placed on flat vertical surfaces like walls.

Clusters served as anchors for creatures to spawn from, shelter in, and navigate to and from. These were placed on flat horizontal surfaces such as tables or the floor.
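As a rough illustration of this placement logic (a minimal sketch, not the shipped Unreal Engine code; the type names and the alignment threshold are assumptions), classifying a detected plane by its surface normal might look like this:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    enum class AnchorType { Vista, Cluster, None };

    // Assumes a y-up world and a unit-length surface normal.
    AnchorType ClassifySurface(const Vec3& normal) {
        const float kAlignment = 0.85f;  // how strictly "vertical" or "horizontal"
        float up = std::fabs(normal.y);
        if (up > kAlignment)        return AnchorType::Cluster; // floor or tabletop
        if (up < 1.0f - kAlignment) return AnchorType::Vista;   // wall
        return AnchorType::None;         // slanted surfaces are skipped
    }

Slanted surfaces fall through to None, matching the restriction above to flat vertical and flat horizontal surfaces.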

To account for different room scales, we defined three scale tiers (Small, Medium, and Large) for both Vistas and Clusters. This approach allowed us to maximize the available real estate without sacrificing the experience.
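A tier could then be picked from the detected surface’s extents. The following sketch is illustrative only; the cutoff dimensions are assumptions, not shipped values:

    #include <algorithm>

    enum class ScaleTier { Small, Medium, Large };

    // Pick the largest tier whose footprint plausibly fits the surface,
    // so large rooms get large Vistas and a small desk still gets a Cluster.
    ScaleTier SelectTier(float surfaceWidthM, float surfaceHeightM) {
        float minExtent = std::min(surfaceWidthM, surfaceHeightM);
        if (minExtent >= 2.0f) return ScaleTier::Large;
        if (minExtent >= 1.0f) return ScaleTier::Medium;
        return ScaleTier::Small;
    }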

At first, we explored the possibility of content placement being a user-driven process; however, we realized that this approach affected the overall tone of the experience and would often result in false expectations and choice paralysis from a user’s perspective.

Knowing this allowed us to focus on simplifying the setup process and offload the majority of the work to back-end systems. We ended up shipping with a basic setup process that only required users to quickly map their space and then select from an array of dynamically placed Vistas in their room.

Onboarding Sequence //
Once users had selected a Vista from the provided options, a reveal sequence would play: the Vista surface would smoothly transform into a window looking into an underwater scene. After a brief moment, a fish would enter the user’s space from the Vista and navigate towards a Coral Cluster located nearby, redirecting the user’s attention away from the wall and onto their space, which was now transformed into a coral reef.

Fish were given relatively simple AI-pathing rules that considered the user’s position, their hands, and their physical space. Fish eyes featured a “look-at” system, which enabled them to make eye contact with users as they navigated around them; this made the fish feel more alive and believable. Different fish were designed to behave distinctly from one another: some would flee as users approached, while others would gracefully navigate towards the user’s hands and nibble their fingers.
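The “look-at” idea can be sketched roughly as follows (a minimal illustration, assuming the target has already been transformed into the fish head’s local space; the names and the deflection limit are invented):

    #include <algorithm>
    #include <cmath>

    struct Vec3 { float x, y, z; };

    struct EyeAngles { float yawDeg, pitchDeg; };

    // Aim an eye at the user's head, clamping deflection so the gaze
    // stays anatomically plausible instead of swiveling unnaturally.
    EyeAngles ComputeEyeLookAt(const Vec3& localTarget) {
        const float kRadToDeg = 57.29578f;
        float yaw   = std::atan2(localTarget.x, localTarget.z) * kRadToDeg;
        float pitch = std::atan2(localTarget.y,
                                 std::sqrt(localTarget.x * localTarget.x +
                                           localTarget.z * localTarget.z)) * kRadToDeg;
        const float kMaxDeflectionDeg = 35.0f;
        yaw   = std::clamp(yaw,   -kMaxDeflectionDeg, kMaxDeflectionDeg);
        pitch = std::clamp(pitch, -kMaxDeflectionDeg, kMaxDeflectionDeg);
        return { yaw, pitch };
    }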

The most complex fish were dubbed “hero creatures”; these featured unique flourish moments to reward user observation. For example, a blowfish would gently navigate into the room and, after specific conditions were met, start inflating its body. These kinds of behaviors are truly magical if one is ever lucky enough to observe them out in the wild.
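One way to picture such condition-gated behavior is a small state machine. The trigger thresholds below are purely hypothetical, since the actual conditions aren’t documented here:

    enum class BlowfishState { Roaming, Inflating, Deflating };

    struct HeroContext {
        float userDistanceM;    // distance from fish to the user's head
        float userGazeSeconds;  // how long the fish has been in view
        float userSpeedMps;     // how fast the user is moving
    };

    BlowfishState UpdateBlowfish(BlowfishState state, const HeroContext& ctx) {
        switch (state) {
        case BlowfishState::Roaming:
            if (ctx.userDistanceM < 1.5f && ctx.userGazeSeconds > 3.0f &&
                ctx.userSpeedMps < 0.2f)
                return BlowfishState::Inflating;  // reward patient observation
            return BlowfishState::Roaming;
        case BlowfishState::Inflating:
            if (ctx.userDistanceM < 0.5f || ctx.userSpeedMps > 0.5f)
                return BlowfishState::Deflating;  // startled by sudden approach
            return BlowfishState::Inflating;
        case BlowfishState::Deflating:
            return BlowfishState::Roaming;
        }
        return state;
    }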

As we conducted user test sessions, we observed that users kept wanting to use their hands to interact with the content; given the appearance and behaviors of the fish and coral, they couldn’t resist the urge to reach out and touch the fish. As such, we decided to remove controller input during the experience and added hand-tracking support that affected the behavior of the fish, making them either flee from the user’s hand or swim towards it.
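A simplified per-species response might look like the following; the temperament split, reaction radius, and names are assumptions for illustration:

    #include <cmath>

    struct Vec3 { float x, y, z; };

    enum class Temperament { Timid, Friendly };

    // Returns a steering direction for the fish; zero when the hand is out of range.
    Vec3 HandResponse(const Vec3& fishPos, const Vec3& handPos, Temperament t) {
        Vec3 toHand { handPos.x - fishPos.x,
                      handPos.y - fishPos.y,
                      handPos.z - fishPos.z };
        float dist = std::sqrt(toHand.x * toHand.x +
                               toHand.y * toHand.y +
                               toHand.z * toHand.z);
        const float kReactRadiusM = 0.6f;
        if (dist > kReactRadiusM || dist < 1e-4f) return { 0, 0, 0 };
        // Normalize, then flip the direction for timid fish so they flee.
        float s = (t == Temperament::Friendly ? 1.0f : -1.0f) / dist;
        return { toHand.x * s, toHand.y * s, toHand.z * s };
    }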

This is a side-by-side comparison of the original user flow (left) and the final version we shipped (right). The user-facing process was thoroughly simplified, which resulted in a much more enjoyable experience.

Options Menu//

The options menu was designed to expose a very limited set of parameters to the user, specifically audio volume controls and a room-map visibility toggle. Other TRC (Technical Release Candidate) requirements were also included, such as a Privacy Policy and an App Exit button.

The menu is revealed when the user presses the “home” button on the controller; once displayed, the controller projects a ray-cast laser to easily target any of the available options. To select an option, users simply pull the trigger. This approach was originally intended to be an accessibility fallback, as we wanted to keep interactions consistent and allow users to utilize their hand as the input that instigated the menu.
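Conceptually, the targeting step is a ray–plane intersection against the menu panel. This sketch uses invented names and is not the shipped UI code:

    #include <cmath>
    #include <optional>

    struct Vec3 { float x, y, z; };

    static float Dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Intersect the controller ray with the menu's plane. Returns the hit
    // point, or nothing when the ray is parallel or the panel is behind it.
    std::optional<Vec3> RaycastMenuPlane(const Vec3& rayOrigin, const Vec3& rayDir,
                                         const Vec3& planePoint, const Vec3& planeNormal) {
        float denom = Dot(rayDir, planeNormal);
        if (std::fabs(denom) < 1e-5f) return std::nullopt;
        Vec3 toPlane { planePoint.x - rayOrigin.x,
                       planePoint.y - rayOrigin.y,
                       planePoint.z - rayOrigin.z };
        float t = Dot(toPlane, planeNormal) / denom;
        if (t < 0.0f) return std::nullopt;
        return Vec3 { rayOrigin.x + rayDir.x * t,
                      rayOrigin.y + rayDir.y * t,
                      rayOrigin.z + rayDir.z * t };
    }

The hit point would then be tested against each button’s bounds to determine what the trigger pull selects.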

Unfortunately, due to tight deadlines and limited resources, we could not implement hand-based instigation in time to ship with the final version of the experience. Another feature we couldn’t implement in time was a method for changing biomes mid-experience; it would have allowed users to seamlessly transition from one biome to another without having to exit the experience and go through the initial setup process again.

The wireframe image below represents the final Options Menu that we shipped. I used Adobe Illustrator CC to create the mock-ups as well as the icons, which were first sketched on paper.

This experience became particularly important to me because it enabled people who had never been underwater to get a glimpse of the beauty of the oceans, while establishing a more personal connection and deeper understanding of the importance of these ecosystems. It’s extremely gratifying to be able to work on an experience that transcends entertainment and provides educational value to the audience!

Undersea was selected as a finalist for the 2020 SXSW Innovation Awards.