Designing in VR

This is a collection of thoughts around my ongoing experimentation using VR as a design, prototyping, user testing, storytelling, and visualization tool.  

I was lucky enough to speak at Interaction '18 on this topic. The video is a condensed (and a bit dated) version of my thoughts/examples below.

Resources:

Why VR?

VR revolutionizes the way we interact with computers. Users can now leverage physical inputs like their head, hands, eyes, voice, and body. This is especially powerful when these inputs work together in concert. The sketch below is a recreation of a slide from Michael Abrash’s OC6 presentation, “The Future of VR,” where he perfectly articulated this point.

VR also enables qualities of user experience that flat screens can’t match: immersion, scale, embodiment, and presence.

As a VR designer, you will become a multi-disciplinary force of creativity. Spatial designers will need to think about interaction. Interaction designers will need to think spatially. Static artists will explore animation. 2D designers will pick up 3D, and so on.


The Beginning

I’ve been obsessed with VR since the Oculus Rift’s 2012 Kickstarter and had a chance to demo the DK1 at the Bay Area Maker Faire (2014, I think), which sealed the deal. I finally got my hands on a DK2 in 2015 and quickly learned Unity. I spent a lot of time digging through forums to troubleshoot problems and share solutions. It was a wild time. Anything was possible and it felt like we were all on a crazy adventure to find new patterns. I specifically remember having conversations like: How in the world is locomotion going to work? (Joystick-driven free locomotion and teleportation weren’t common patterns at that point.) Do we need a warehouse to walk around? What about this crazy idea of redirected walking? I knew right away we were in a very special moment in time.

VR + Architecture

My first professional experience with VR was in architecture. My goal, while at Gensler SF, was to find out what value VR could bring to architecture and spatial design. So naturally I gravitated to the lowest-hanging fruit: visualization. I discovered that high-fidelity (realistically rendered) 3D environments were great for interactive visualization but required a separate workstream, lots of effort, and could never keep up with iterative design workstreams. If VR was to work as a spatial design tool, the workflow had to be more integrated and keep pace with the core design workstream. Basically, real-time. So low/medium-fidelity VR content became the sweet spot. This was a time before tools like IrisVR and Enscape 3D solved the real-time + fidelity problem.

This was one of my early test scenes in Unity 5. It included a VR camera (OVR player controller not in video), player controls, crude character AI, proximity-triggered audio, a hacked-together mini-map showing real-time player tracking, and Unity's lighting system.
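
For anyone curious what one of those pieces looks like in practice, here's a minimal sketch of proximity-triggered audio using the standard Unity trigger-collider approach. The class and tag names are illustrative, not from the original scene.

```csharp
using UnityEngine;

// Minimal sketch of proximity-triggered audio in Unity: attach to an object
// with an AudioSource and a SphereCollider set to "Is Trigger."
[RequireComponent(typeof(AudioSource))]
public class ProximityAudio : MonoBehaviour
{
    AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
        source.playOnAwake = false;
        source.loop = true;
        source.spatialBlend = 1f; // fully 3D, so volume falls off with distance
    }

    void OnTriggerEnter(Collider other)
    {
        // Assumes the VR rig's collider is tagged "Player."
        if (other.CompareTag("Player") && !source.isPlaying)
            source.Play();
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player"))
            source.Stop();
    }
}
```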

I also experimented with basic AI in Unity with the hope of testing various in-store conditions. This test simulated 3 checkout types and 10 different shopper behavior types, with some randomness added for the employees (card failures, bagging time, etc.). The GIF below shows the beginnings of that simple effort.
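
To give a flavor of how simple that kind of simulation logic can be, here's a hedged sketch of one piece: randomized checkout processing with occasional card failures. The class, fields, and numbers are hypothetical, not the original project's code.

```csharp
using System.Collections;
using UnityEngine;

// Illustrative sketch (not the original project's code) of randomized
// employee behavior: variable bagging time plus occasional card failures.
public class CheckoutStation : MonoBehaviour
{
    [Range(0f, 1f)] public float cardFailureChance = 0.05f;
    public Vector2 baggingTimeRange = new Vector2(5f, 20f); // seconds
    public float cardRetryDelay = 10f;                      // seconds

    // An AI shopper would start this coroutine on reaching the register.
    public IEnumerator ProcessShopper()
    {
        // Bagging time varies per shopper.
        yield return new WaitForSeconds(
            Random.Range(baggingTimeRange.x, baggingTimeRange.y));

        // Occasionally the card fails and the shopper retries.
        if (Random.value < cardFailureChance)
            yield return new WaitForSeconds(cardRetryDelay);
    }
}
```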

Usability Testing in VR

Even though visualization was the low-hanging fruit, I found the most valuable use case for VR in architecture to be user testing. User testing is rare in architecture for many reasons. You either need to build a portion of the design as a foam-core model in a warehouse, lay tape to mimic a 1:1 plan (see the movie “The Founder”), or build an actual working store in a warehouse and test there. Only Apple, P&G, Walmart, and a few others can do this. All options are time-consuming and expensive. That’s where VR comes in: it enables architects and spatial designers to user-test their designs in real time and at very low cost.

One example of usability testing in VR came from the need to design a complex café checkout experience. The team was having trouble aligning on a direction. The problem was our lack of understanding of how people would ACTUALLY navigate a complex space like the one we designed. VR was the best tool for getting testers into the design and observing them complete simulated tasks. I started with a test script, set up our 3D models for testing, and had users role-play a few scenarios. Hidden problems with the design were uncovered immediately, and some of our hypotheses were confirmed.

Testing often, and doing it IN your design, is so, so critical. We know this in digital product/UX already, and now architecture has a way to deploy the same design practices. I strongly believe every spatial designer should add VR to their toolset to complete the capabilities loop, which looks something like:

  1. Design tool (Fast 3D modeler like SketchUp or Blender).

  2. Testing tool (VR).

  3. Build tool for BIM (Revit).

Photo credit: Tuomas Sahramaa

Learnings from Usability Testing in VR

  1. Limit locomotion.
    There are certain locomotion conventions in VR that give users movement "superpowers." These encourage testers to travel through the experience at an unnatural speed, which can affect the data output (see the sketch after this list).

  2. Standing "room-scale" VR over seated VR.
    Standing VR helps testers feel more engaged in the space as their scale and eye level are more natural. This also encourages them to look around naturally instead of swiveling in a chair.

  3. Allow time for testers to get comfortable in VR before running the test.
    You don't want the novelty of VR pulling mental energy away from completing tasks and verbalizing thoughts.

  4. Focus the VR content (3D models) on how the space functions.
    If details or aesthetics serve a functional purpose, then include them in the 3D model.

  5. Moderator participation is necessary to carry out complex interactions.
    This will get exciting soon when multiple VR users are able to interact in the same room. Role-playing as staff (moderator) with customers (testers) in a VR environment will become common practice. Update: IrisVR, The Wild, and other VR tools are now focused on multi-user collaboration.
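
On point 1, here's a minimal sketch of what "limiting locomotion" can look like in practice: capping joystick movement at a natural walking pace (~1.4 m/s). The input plumbing below is a placeholder; wire it to whatever XR input system you use.

```csharp
using UnityEngine;

// Hypothetical sketch: clamp free locomotion to average walking speed so
// testers move through the space at realistic, natural speeds.
public class TestLocomotion : MonoBehaviour
{
    public Transform head;        // the HMD transform, used for move direction
    public float maxSpeed = 1.4f; // average human walking speed, in m/s

    void Update()
    {
        // Placeholder input; replace with OVRInput, XR Interaction Toolkit, etc.
        Vector2 stick = new Vector2(Input.GetAxis("Horizontal"),
                                    Input.GetAxis("Vertical"));

        // Move relative to where the tester is looking, flattened to the floor.
        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right   = Vector3.ProjectOnPlane(head.right,   Vector3.up).normalized;
        Vector3 move    = Vector3.ClampMagnitude(forward * stick.y + right * stick.x, 1f);

        transform.position += move * maxSpeed * Time.deltaTime;
    }
}
```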

  

Sketching Architecture in VR

Creative VR tools, like Quill, allow designers to sketch spatial concepts at the same speed as doodling on paper. The added benefit of a VR sketch is that the designer now has a 3D artifact they can explore at scale.

VR as a Prototyping Tool for Architecture

This example shows how easy it is for spatial designers to jump in and out of VR with "one click to VR" solutions like Enscape 3D and IrisVR. This workflow allows designers to set up a design/test cycle without building out separate workstreams. This video also shows how to use Unity, with VRTK's prebuilt VR mechanics, to prototype design concepts. Unity is great for interaction and dynamic elements like simulated video panels.

Demoing VR

For most people trying your VR demo, it will be their first introduction to the tech. This means pre-planning the demo's UX is crucial if you want participants to retain the content you prepared. Most likely, participants will only have enough mental bandwidth to appreciate the fact that they are in VR, and for that reason the in-headset content is often overlooked. To combat this, simplify the number of things the VR user needs to learn. Removing control options and abilities frees up mental bandwidth, which will help participants focus a little more. Comfort features also ensure participants leave with a positive experience. These options may include a seated VR version. Also consider movement options, like teleportation and snap rotation, for standing demos. Finally, consider the social aspect of demoing VR. Avoid scenarios where the VR user feels like they are on display. If they feel like they are being watched, they may be reluctant to try.


VR Tools for Spatial Design


Lo-Fi Prototyping in VR

Sketching your ideas, or even bringing in your flat UI designs, in Quill is a fast way to test how a design feels in context. This is especially useful when you're trying to find those sweet spots between UI size and spatial position. Animating your design takes it further, allowing you to mock functional intent and communicate flows to your team.

Visualizing VR Mechanics (Gun Game Sketch)

Complex object interaction is one of the more fun aspects of VR because it brings together all your physical inputs. Playing with objects that mimic physical interfaces in VR allows users to manipulate tools with powers beyond reality. It also taps into a child-like fascination with buttons, switches, and other physical interfaces. In this concept, players are challenged with a procedural set of weapon/tool scenarios during gameplay. This puzzle of scenarios would create a dynamic player experience where they need to power on, repair, recharge, reload, unjam, clear, toggle modes, unplug/plug in, open/adjust, etc. This all happens while the player is facing dynamic enemy and environmental challenges. This motion sketch is a quick way to communicate some of the intended interactions before jumping into detailed assets and/or code.

Designing AR in VR (G Maps Sketch)

This mini-project demonstrates how designers can use Quill to concept AR UX and place it in situ with the Wavy Music app. This project explores what a Google AR Maps UI (2018) for your car could look like. To make sure the layout was accurate, I imported a scale 3D model of my car into Quill for reference. This reference also helped when positioning UI components outside the driver's required FOV. I used my phone to visualize the UI, but in reality this could display on the windshield. The UI components are based on Google's 2018 AR Maps announcement. All the UI components are shown at once so you can see the whole set; this is not a UX flow. Seeing all this animated UI at once would not be ideal in real conditions 😄

2D Design Tools vs. Designing Directly in Headset

A question I get a lot is “what does a VR design workflow look like?” A lot of designers new to the space assume the work is done only in VR. The short answer is you’re going to be working in both VR and 2D. Traditional UX/UI tools, like Figma, are still VERY important when designing for VR. A lot of the time your design is best laid out in 2D first, just like how some architects design in plan view before moving on to 3D. You're working out elements of the design that are better controlled via 2D. The core reason to use 2D design tools is for creating, and maintaining, a design system: grids, alignments, color and type styles, component libraries, precise layout variables like margins/padding, z-axis dimensions, etc.

One trick I learned designing between 2D and VR is to set an equal conversion between pixels and real-world measurements. Since everything in VR is set to metric units (or imperial? I hope not 😄), just make one pixel equal to one millimeter. That way engineers can easily understand your 2D design dimensions when inspecting mock-ups for VR.
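
Here's what that convention looks like in code, as a tiny hypothetical helper rather than any particular tool's API: with 1 px = 1 mm, converting a mock-up dimension to Unity's meters is a single divide.

```csharp
// Hypothetical helper illustrating the 1 px = 1 mm convention.
public static class DesignUnits
{
    // Unity world units are meters; 1000 mm (= 1000 px) per meter.
    public static float PixelsToMeters(float pixels) => pixels / 1000f;
}

// Example: a 320 px wide panel in Figma becomes a 0.32 m wide panel in VR.
// float panelWidth = DesignUnits.PixelsToMeters(320f); // 0.32f
```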

All that said, I’m a huge champion for designing directly in headset. There are problems and discoveries that will stay hidden until you’re in situ. I’ve experienced these types of discoveries many times. I’ve also seen ideas I know could be better fail to take off because they weren’t designed directly in VR. It’s similar to car design, where you have a killer concept sketch but when it’s time to model in 3D you realize it doesn’t work. Another way to think about it is limited input = limited output. If you’re designing on a platform that doesn’t allow for all the inputs VR does, then you’re limiting the potential output.

Additional thoughts on VR UX/UI design:

Direct Touch vs. “Laser Pointer” Ray Cast

I see a lot of apps that default to one interaction model. While I understand this reduces cognitive load for beginners, and is a cleaner approach for engineering, I don’t believe it’s the optimal user experience. Half-Life: Alyx proved multiple locomotion interactions, used in concert, can empower users with new heightened abilities. I believe the same is true for the direct touch vs. laser pointer debate. Why not allow for both and design around their unique strengths? For example, near-field UI/menus in VR are much easier, and a pleasure to use in my opinion, when you “touch” them. Direct touch also allows for greater finger input manipulation. Sometimes you need far-field UI to make images and text bigger for browsing. Sometimes you need near-field UI for functionality that carries with you through the whole experience, like search, settings, or filters. Or maybe you need breakout UI for things like playback controls. Different use cases = different solutions.
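
One simple way to sketch the "why not both" idea in code: pick the model per interaction based on reach, with direct touch when the UI is within arm's length and ray casting otherwise. The names and threshold here are illustrative assumptions, not a production heuristic.

```csharp
using UnityEngine;

// Illustrative sketch: choose direct touch for near-field UI and a laser
// pointer for far-field UI, instead of forcing one model everywhere.
public class HybridPointer : MonoBehaviour
{
    public Transform hand;
    public float touchRange = 0.5f; // roughly arm's reach, in meters

    public bool ShouldUseDirectTouch(Transform ui)
    {
        return Vector3.Distance(hand.position, ui.position) <= touchRange;
    }
}
```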

I personally gravitate towards touch interaction, and would like to use it as much as appropriate, because it takes advantage of VR’s unique affordances. It’s a differentiator. When done right, touch can also be more responsive. Take the example of a keyboard. With touch you’re quickly “tapping” around, but with a laser pointer it’s a trigger press and release for each tap. The operation has one extra micro-step, which really adds up when you're typing a lot. I believe the ultimate way to type in VR is with voice, but that’s a whole other topic.

3D vs Flat UI

Seeing flat UI (spatialized panels and buttons) in VR just feels unfinished to me. As in, it's an interim compromise on the way to a more 3D future. I know 3D UI can be very expensive (perf cost, draw calls, etc.), but I think it's worth the investment to solve for perf cost if your app/platform is UI-heavy/focused: creation tools, utilitarian apps, and OS-level UI, for example. There are some areas where 3D is not worth the cost, like distant UI such as large discovery walls, info panels, etc. There are also some special cases with near-field UI that don’t need 3D, drop-down lists for example. But my general approach is to make almost everything in your personal space 3D.

3D UI just makes VR unique and fun. It's a clear differentiator that helps address the question “why should I put this thing on?” But it’s not only about a “nice to have” experience. 3D is also functional. Flat buttons can be mistaken for disabled buttons, or not buttons at all. 3D is just more “tappable.”

Controllers vs. Hand Tracking

We use tools for a reason! Yes, hand tracking is great for casual users/use cases and will likely be the primary input modality for Ray-Ban-sized AR. However, I don't believe hand tracking will replace controllers in VR. Just like all these VR debates, I think the right direction is to offer both options and design towards their unique strengths. I'm pretty confident controllers will stick around based on what we (the public) learned in previous efforts here. Xbox Kinect is the example that comes to mind. The optimistic vision for Kinect was a controllerless future. That didn’t happen, for a number of reasons, and we still have controllers today. Controllers have been with us since the first video game console for a reason. The Vision Pro has also proven that eye + hand tracking has a usability ceiling. Controllers are still undefeated 😆 I'm not saying the form factor is perfect and shouldn’t be innovated. It's that controllers, or wearables like gloves, allow for critical affordances: mainly feedback, presence, and a greater number of inputs. So complex use cases, like VR creation and its pro users, can access as many inputs as possible with greater freedom to combine operations. Most importantly, controllers enable multi-tasking. As the Vision Pro has shown us, you need the ability to decouple inputs. Your hands can do things while your eyes, head, and body do other things, still working together in concert, just with greater freedom.

One key use case here is brush pressure sensitivity for VR painting. As a pro creator, I need the physical trigger’s spring-tension feedback so that I can be as precise as possible with things like stroke weight and taper. Without this physical feedback, features like brush sensitivity wouldn’t be usable. I also look at it like this: sure, you can play drums out of thin air. There are actually products that make it possible. But why? Drummers need something to hit. They need the feedback from the instrument if they are to play it with skill.
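
To make the brush example concrete, here's a sketch of mapping the trigger's analog pull to stroke width. The OVRInput call is from Meta's Oculus Integration for Unity; the class and ranges are hypothetical.

```csharp
using UnityEngine;

// Sketch: map the trigger's analog pull (0..1) to brush stroke width.
// The spring tension is what makes holding a precise partial pull, and
// therefore a precise width, physically possible.
public class BrushPressure : MonoBehaviour
{
    public float minWidth = 0.001f; // meters
    public float maxWidth = 0.02f;

    public float CurrentStrokeWidth()
    {
        // From Meta's Oculus Integration; swap for your input system.
        float pull = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger);
        return Mathf.Lerp(minWidth, maxWidth, pull);
    }
}
```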

Or think of hand tracking as the only input modality in VR this way... Do you want to play HL: Alyx by making a finger gun and saying “pew pew”?


Learning from the Best: Thoughts on Half-Life: Alyx’s Design

The best VR game/experience to date, in my opinion, is 2020’s Half-Life: Alyx. It’s a 10/10 for me. One of my top 5 games of all time. It also hits on all the things I've mentioned here and has set the standard for many key VR interactions.

Locomotion

Before Alyx, most devs/designers struggled with choosing a primary locomotion design. It was hard to nail down a target audience, so one solution was to build all types and let the user choose. But which do you set as default? Gotta pick one, right? Alyx came on the scene and said “why not allow for multiple locomotion styles at once?” The cool thing is I started using both locomotion styles in concert, which opened up a whole new way of getting around in VR. The free locomotion speed is just slow enough that you're encouraged to use teleport to get around larger spaces quicker. The slower free locomotion speed also encourages you to treasure hunt for useful items. When a headcrab surprises you, just teleport away, line up your shots, and evade via strafing. Each locomotion style has its own strengths, and together they add up to a whole new mental model for VR controls. Combining locomotion styles, and designing for “choice,” should be the standard for all VR apps going forward.

Quick Menu

To me this design is perfect. Keeping it to 4 options (a simple axis layout) allows the player to select without even looking, like shifting gears in a car. It's crazy efficient AND you feel super cool every time you use it. The design details are also key to its success: the subtle follow, how the UI dismisses if you move out of the trigger origin range while still letting you select, the sound design, and so on. It’s worth getting in there and just playing around with this mechanic. It feels amazing.
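
Mechanically, that “select without looking” quality mostly comes from snapping the thumbstick direction to one of four quadrants with a dead zone. Here's a hypothetical sketch of that core selection logic, not Valve's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch of a 4-option quick menu's core logic:
// snap the stick direction to the nearest axis quadrant.
public static class QuickMenu
{
    // Returns 0 = right, 1 = up, 2 = down, 3 = left, or -1 for no selection.
    public static int QuadrantFromStick(Vector2 stick)
    {
        if (stick.magnitude < 0.5f) return -1; // dead zone: nothing selected

        float angle = Mathf.Atan2(stick.y, stick.x) * Mathf.Rad2Deg; // -180..180
        if (angle > -45f  && angle <=  45f) return 0; // right
        if (angle >  45f  && angle <= 135f) return 1; // up
        if (angle <= -45f && angle > -135f) return 2; // down
        return 3;                                     // left
    }
}
```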

Object Interaction

This is probably the most famous mechanic in the game, as it really nails the problem of grabbing objects outside of your physical boundaries, or items on the floor. Games like Blade and Sorcery pioneered the “Jedi grab” style mechanic, but Alyx took it to a whole new level by adding a catch to further gamify it. To me this is its secret sauce. The team even went to great lengths with their trajectory prediction in order to find that sweet spot between feeling like a superpower and still allowing failures/misses. It’s easy enough, but you also feel like it’s something you can master. So you feel like a badass every time you pull it off.
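
The underlying toss is plain projectile kinematics. Here's a minimal sketch of that math (Valve's real trajectory prediction is far more sophisticated): solve the equation of motion for the initial velocity that carries an object from its resting spot to the hand in a fixed flight time.

```csharp
using UnityEngine;

// Minimal sketch of "force grab" toss math. From d = v*t + 0.5*g*t^2,
// solve for the initial velocity v that lands the object at the hand:
// v = (d - 0.5*g*t^2) / t.
public static class ForceGrab
{
    public static Vector3 TossVelocity(Vector3 from, Vector3 to, float flightTime)
    {
        Vector3 displacement = to - from;
        return (displacement - 0.5f * Physics.gravity * flightTime * flightTime)
               / flightTime;
    }
}

// Usage: rb.velocity = ForceGrab.TossVelocity(item.position, hand.position, 0.8f);
```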

Physical/Diegetic UI

Dead Space was one of the early pioneers of diegetic UI design. I recommend checking out Dino Ignacio’s GDC talk “Crafting Destruction: The Evolution of the Dead Space User Interface” for the definitive details on this approach. Many VR apps have taken diegetic UI further, given the physical affordances, and Half-Life: Alyx is my favorite example here. Each hand plays a role, whether it's your right for checking ammo clip count or your left for reviewing things like health status. Other design elements include activating sound and haptics via gaze.


More Favorite VR UX Patterns

Patterns that take advantage of VR’s physical affordances (A Piece of the Universe, Hot Dogs, Horseshoes & Hand Grenades, Tvori, Elite Dangerous).

A Piece of the Universe: Fun, in-context menus. You rarely see this much creative attention put into menus. Naam has treated “A Piece of the Universe” as a playground/canvas for this type of creative VR UX, adding new ideas over the years, and I love it so much.

Hot Dogs, Horseshoes & Hand Grenades (H3VR): Anton Hand is another one of my favorite VR devs, especially for his contribution to physically based tool belts and item management systems. These include a variety of configs you can wear and carry with you. To me this is the second half of H3VR’s gameplay hook: Manage <> Engage. It works because managing your tactical setup is insanely fun here and 100% sells the immersion.

Tvori: Tvori’s original take on animation was SO simple and perfect for VR. They fully embraced the physical nature of animation and made it super engaging via playful UX. IMO physical performance is the best way to animate in VR because it takes advantage of the tech’s unique affordances.

Elite Dangerous: Simple gaze controls to trigger spatial menu system visibility. Imagine this in MR/AR use cases!
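
Gaze triggers like these reduce to a simple cone test against the head's forward vector. Here's a hedged sketch; the names and angle threshold are illustrative assumptions.

```csharp
using UnityEngine;

// Illustrative sketch of a gaze trigger: show a spatial menu when the
// player's gaze falls within a cone around it, hide it when gaze leaves.
public class GazeMenu : MonoBehaviour
{
    public Transform head;           // the HMD transform
    public GameObject menu;          // the spatial menu to toggle
    public float maxGazeAngle = 15f; // degrees, half-angle of the gaze cone

    void Update()
    {
        Vector3 toMenu = (transform.position - head.position).normalized;
        bool gazed = Vector3.Angle(head.forward, toMenu) < maxGazeAngle;
        menu.SetActive(gazed);
    }
}
```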

VR Tools for UX/UI


Animating in VR

The beautiful thing about animation in VR is that you can become an animator, without training, thanks to VR’s physical affordances. You can skip over the technical aspects (though I think it’s important to have a grasp of the principles) and jump straight to the acting part. I think most animators agree that's where the fun is. Animating in VR is a lot like puppeteering. You’re performing the animation in real time as opposed to slowly planning frame by frame. Frame-by-frame is still just as powerful in VR and can allow you to layer on fidelity. So when a trained animator picks up VR, the results are just insane. Seeing this, and having a lower barrier to entry thanks to VR’s affordances, has motivated me (not a trained animator) to go back and really learn the fundamentals. There’s something magical in that aspect alone. Another great benefit of VR (real-time) puppeteering is that it brings out your personality. The results are totally unique and will differentiate you from other animators.


Discovery Through Play

VR is an inherently fun technology. It’s a playground for silliness (see Job Simulator and Gorn). VR encourages users to explore and break the rules, which opens up new avenues for creativity. For example, before Quill had animation, I found a fun way to animate by grabbing layers and screen recording the "performance." I discovered this while painting a silly character for our baby announcement video. This is now a feature thanks to Quill 2.0’s timeline update, which allows for recording your movements via keyframes.

Micro-Stories

For “Alex’s Sci-Fi World” I naturally gravitated to non-linear/exploratory storytelling. Where's Waldo was a big inspiration. I loved the game of looking for Waldo, but once I found him it was the micro-stories that kept me coming back. I wanted to explore every inch of the illustration and discover all the artist’s secrets. VR satisfies a similar craving, only now artists have access to the power of immersion! This is the magic I’m always looking to capture.

I find this type of storytelling really shines in VR. Linear storytelling is great but takes some of the viewer's agency away. A non-linear, “scene full of stuff” narrative can be designed for viewer agency from the start.

Spatial Storytelling

With most of my VR animation and storytelling, I design places I want to hang out in. I think this criterion captures a lot of what makes VR unique and engaging. I always consider viewer agency, environmental design, lighting, scale, presence, and especially spatial audio. Architecture, video games, and immersive theater are big influences. I will often hide Easter eggs in places that require the viewer to physically search for them, sometimes under a desk or behind walls. I know people explore in VR, and I want to encourage more of that behavior with these little rewards.

Storyboards and Animatics

Traditional 2D storyboards are still great for capturing cinematic shots (aligning things to the grid for 2D trailer compositions) and planning your scope. There’s a lot of value in seeing a snapshot of your entire story at a glance. I find it’s important to quickly move on after planning and start designing directly in VR. There you will be solving different narrative and consumption problems, especially ones around viewer agency and spatial, immersive storytelling.

Another huge value VR adds in pre-production is the ability to create 3D/immersive animatics at speed. You will discover better camera angles, or viewpoints in the scene, that would be difficult to spot when planning in 2D. It’s also important to test how your characters move through the sets, as the story progresses, in relation to the viewer. That relationship between the scene and the viewer will be one of the main areas of iteration: dialing up/down various elements like spatial audio cues, pools of light, and animation to make sure the relationship is harmonious. To me this is where the creative action is. It’s such a new and exciting space to design in.

Here are a couple examples of how you can use VR in the pre-production workflow. One (Remember When? Animatic) shows how you can take a scribbled 2D storyboard and use VR’s physical affordances to animate cameras and sketches at crazy speed. The others (Pilot: Alien Landscape Animatic and the decision scene test) show how you can create an entire sketched animatic in VR for an immersive narrative.

VR Tools for Animation

Other Notable VR Creation Tools