Designing in VR
This is a collection of thoughts around my ongoing experimentation using VR as a design, prototyping, user testing, storytelling, and visualization tool.
My VR blog here
My "getting started in VR" resource doc here
Interview with Animation Nights New York on this topic here
I was lucky enough to speak at Interaction '18 on this topic. The video is a condensed version of my thoughts/examples below.
VR is a completely new space that's ripe for creativity and innovation. Every day brings a new discovery and a unique problem to solve.
It challenges designers to become multi-disciplinary forces of creativity. Spatial designers will need to think about interaction. Interaction designers will need to think spatially. Static artists will explore animation. 2D designers will pick up 3D, and so on.
VR also offers a completely new and natural way of interacting with computers. Users now have access to physical inputs like their head, hands, and body. This empowers users to physically navigate software as opposed to clicking or tapping through a 2D experience.
My experimentation in VR started with the Oculus DK2. I spent a lot of time digging through forums to troubleshoot problems and share solutions. My goal was to find what value VR adds to architecture and design while at Gensler. I discovered that high fidelity (realistically rendered) 3D environments were great for interactive visualization but required a separate work stream, lots of effort, and could never keep up with the ever-changing design work stream. If VR was to work as a design tool, the workflow had to be more integrated and keep pace with the core design work stream. Low / medium fidelity VR content became the sweet spot.
This was one of my early test scenes in Unity 5 that included a VR camera (OVR player controller not in video), player controls, crude AI, proximity-triggered audio, a hacked together mini-map showing real-time player tracking, and Unity's lighting system.
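Stripped of engine specifics, the proximity-triggered audio in that scene boils down to a per-frame distance check. Here's a minimal, engine-agnostic sketch in Python (the `AudioTrigger` class and its names are my own invention for illustration, not Unity's API):

```python
import math

class AudioTrigger:
    """Starts a sound when the player enters a radius, stops it when they leave."""

    def __init__(self, position, radius):
        self.position = position  # (x, y, z) world position of the trigger
        self.radius = radius      # activation distance in meters
        self.playing = False

    def update(self, player_position):
        """Call once per frame with the player's current position."""
        distance = math.dist(self.position, player_position)
        inside = distance <= self.radius
        if inside and not self.playing:
            self.playing = True   # an engine would start the audio source here
        elif not inside and self.playing:
            self.playing = False  # ...and stop it here
        return self.playing
```

In Unity you'd typically get the same behavior without any distance math, using a sphere collider marked as a trigger plus `OnTriggerEnter`/`OnTriggerExit` callbacks on the audio object.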
I also experimented with AI in Unity with the hopes of testing various in-store conditions. This test simulated 3 checkout types, 10 different shopper behavior types, and some randomness added for the employees (card failures, bagging time, etc.). The GIF below shows the beginnings of that effort.
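At its core that kind of test is a queue model: shoppers with different behavior profiles arrive, take the first free register, and occasionally hit random employee delays. A toy Python sketch of that logic (the profile names, times, and failure rates here are made up for illustration, not the values from the actual Unity test):

```python
import random

# Hypothetical behavior profiles: base checkout time in seconds
SHOPPER_PROFILES = {"quick": 30, "average": 60, "browser": 90}

def checkout_time(profile, rng):
    """Time for one shopper to clear a register, with random employee delays."""
    time = SHOPPER_PROFILES[profile]
    if rng.random() < 0.05:      # occasional card failure
        time += 45
    time += rng.uniform(5, 20)   # variable bagging time
    return time

def simulate(shoppers, num_registers, seed=0):
    """Assign each (arrival_time, profile) shopper to the register that frees up first."""
    rng = random.Random(seed)
    registers = [0.0] * num_registers  # time at which each register becomes free
    waits = []
    for arrival, profile in shoppers:
        i = min(range(num_registers), key=lambda r: registers[r])
        start = max(arrival, registers[i])
        waits.append(start - arrival)
        registers[i] = start + checkout_time(profile, rng)
    return waits
```

Running something like this with different register counts and shopper mixes gives you the wait-time distributions the VR visualization then plays back spatially.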
Usability Testing in VR
User testing in architecture is rare for many reasons. You need to either build a foam core model in a warehouse, lay down tape to mimic a 1:1 plan, or test the real thing. All options are time-consuming and expensive. VR now allows architects and spatial designers to user test their designs in real-time. In my mind this is VR's most valuable use case in architecture/spatial design.
One example of usability testing in VR came from the need to design a complex cafe experience. The team was having trouble aligning on a direction. The problem was our lack of understanding of how people might actually use the space. VR was the best tool for getting testers into the space and observing them complete tasks. I started with a script, set up our 3D models for testing, and had users role-play a few scenarios. Problems with the design were uncovered right away, and some of our hypotheses were confirmed.
Learnings from Usability Testing in VR
1. Limit locomotion.
There are certain locomotion conventions in VR that give users movement "super powers." This encourages testers to travel through the experience at an unnatural speed, which can affect the data output.
2. Standing "room scale" VR over seated VR.
Standing VR helps testers feel more engaged in the space, as their scale and eye level are more natural. This also encourages them to look around naturally instead of swiveling in a chair.
3. Allow time for testers to get comfortable in VR before running the test.
You don't want testers taking mental energy away from completing their task and verbalizing their thoughts.
4. Focus the VR content (3D models) on how the space functions.
If details or aesthetics serve a functional purpose then include them in the 3D model.
5. Moderator participation is necessary to carry out complex interactions.
This will get exciting soon when multiple VR users are able to interact in the same room. Role-playing as staff (moderator) with customers (testers) in a VR environment will become common practice... I hope.
Sketching in VR
Creative VR tools, like Quill, allow designers to sketch spatial concepts at the same speed as doodling on paper. The added benefit of VR sketches is that the designer not only has a visualization of their ideas, but also a 3D artifact that they may explore at scale.
VR as a Prototyping Tool
This example shows how easy it is for spatial designers to jump in and out of VR with "one click to VR" solutions like Enscape 3D and IrisVR. This workflow allows designers to set up a design/test cycle without building out separate tracks. This video also shows how to use Unity, with VRTK's prebuilt VR mechanics, to prototype design concepts. The added benefit of using Unity is that you may include interaction and dynamic elements like simulated video panels.
Modeling Spaces in VR
After spending time exploring VR Sketch for SketchUp, I found a couple of standout use cases/advantages:
1. Use your hands to work in SketchUp. The physical UX patterns and inputs are more intuitive than their 2D counterparts.
2. Make changes to a SketchUp model while immersed in it. Users can now design and model spaces directly in VR. This allows users to FEEL their design and will inspire richer outputs.
3. Real-time multi-user collaboration. The sync feature allows one user in the headset and one on desktop to work in the same file simultaneously.
There is a slight learning curve with VR Sketch. Luckily, the devs supply detailed documentation on their website. Once designers master a workflow in VR, their efficiency and output will improve dramatically. I think a focus on increasing speed, through UX improvements, will make VR Sketch a powerful tool for all spatial designers.
Detailed thoughts/feedback here
Interactive Spatial Design
VR offers unique abilities beyond traditional design tools. This example shows a Unity scene, built to serve as a template, that includes many unique features: interactive/grabbable objects (VRTK), interactive drawers, teleportation (VRTK), animated objects (clouds, graphic messages), the Oculus Avatar SDK (hands/controllers), spatial audio, video textures, complex 3D models built with the VR creative tools Gravity Sketch and Google Blocks, controller instructions, real-time lighting, floating particles, easter eggs, and more.
I spend a lot of time demoing VR to various clients, teams, and directors. For most people who try these demos, it will be their first introduction to the tech. This means pre-planning the demo's UX is crucial if you want your users to retain the specific content. Most users will only have enough mental load to appreciate the fact that they are in VR, so the specific content is often overlooked. To combat this, simplify the number of things the VR user needs to learn. Removing control options and abilities frees up mental load, which enables users to focus on your content.

Comfort features also ensure users leave with a positive experience. These may include a seated version of your experience, as well as movement options like teleportation and snap rotation for standing demos.

Finally, consider the social aspect of demoing VR. Avoid scenarios where the VR user feels like they are on display. If everyone is sitting and watching, others may feel reluctant to try.
Growing collection of VR UX patterns here
Concepting VR UX
My dream VR tool would be a new SketchUp product built for virtual reality. I put this example together to show how designers can whiteboard (spatially) and concept VR UX/UI directly in the headset. I just want this product/tool to exist. I have since discovered VR Sketch, whose devs are working toward a similar goal.
Visualizing VR Mechanics
Complex object interaction is one of the more fun aspects of VR because it brings together all of your physical inputs. Playing with objects that mimic physical interfaces in VR allows users to manipulate tools with powers beyond reality. It also taps into a childlike fascination with buttons, switches, and other physical interfaces. The concept below challenges players to master the procedural game of a dynamic weapon/tool. The puzzle of scenarios creates a dynamic player experience: they need to power on, repair, recharge, reload, un-jam, clear, toggle modes, unplug/plug in, open/adjust, etc., all while facing dynamic enemy and environmental challenges. This motion sketch is a quick way to communicate some of the intended interactions before jumping into detailed assets and/or code.
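One cheap way to plan a procedural interaction like this before touching assets or engine code is to write the tool's states and transitions down as a small state machine. A hypothetical Python sketch (the states and event names are mine, invented for illustration, not from the actual concept):

```python
# Hypothetical states and transitions for a dynamic VR weapon/tool.
# Keys are (current_state, event); values are the resulting state.
TRANSITIONS = {
    ("off", "power_on"): "ready",
    ("ready", "fire"): "ready",
    ("ready", "jam"): "jammed",
    ("jammed", "clear"): "ready",
    ("ready", "deplete"): "empty",
    ("empty", "reload"): "ready",
    ("ready", "break"): "broken",
    ("broken", "repair"): "off",
}

class Tool:
    def __init__(self):
        self.state = "off"

    def handle(self, event):
        """Apply an event; events invalid in the current state are ignored."""
        key = (self.state, event)
        if key in TRANSITIONS:
            self.state = TRANSITIONS[key]
        return self.state
```

Note that `Tool().handle("fire")` does nothing while the tool is off: the player must power it on first, which is exactly the kind of procedural mastery the concept is after. The table also doubles as a checklist of interactions to animate.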
Designing AR in VR
This mini project demonstrates how designers can use Quill to concept AR UX and place it in situ with the Wavy Music app. This project explores what Google AR Maps for your car could look like. To make sure the layout was accurate I imported a scale 3D model of my car into Quill for reference. This reference also helped when positioning UI components outside the driver's required FOV. I used my phone to visualize the UI but in reality this could display on the windshield. The UI components are based on Google's 2018 AR Maps announcement.
Safety is a critical problem to solve in this use case so I broke out the concept into two states of cognitive load.
1. Driving mode only allows essential wayfinding/nav features.
2. Stopped mode allows search and encourages exploration.
This visualization shows stopped mode (all animations running). It does not show how users could interact with the UI. I imagine a way to capture user input would be through sensors. These sensors could pick up your voice for search and your hands (or eye gaze) for select, trigger, or expand actions.
Animating in VR
Current VR hardware augments users with many inputs mapped to the body. These inputs control a camera as well as objects in a scene. Animation is created from a physical performance alongside planned-out techniques and workflows. The end result feels organic and hand-crafted, which differentiates VR animation tools from their screen-based counterparts.
Discovery Through Play
VR is an inherently fun technology. Encouraging users to explore and break the rules opens up new avenues for creativity. For example, I found a fun way to animate in Quill by "puppeteering" layers and screen recording the "performance." I discovered this while painting a silly character for our baby announcement video. This approach eliminates the rigid (and not fun) aspects of animating and lets you focus on acting out your character's personality through organic movements. To me, puppeteering is what makes VR animation so powerful because it leverages the technology's unique affordances.
Update on Puppeteering!
You can now record your puppeteering in Quill. Before, I could only puppeteer and screen record to show the results in videos. Now you can do it all, and watch it back, in VR. Here's a quick tutorial on how to puppeteer with Quill 2.0.
Storytelling in VR will only continue to grow now that Quill has made animation natural and accessible to all skill levels and disciplines.
For my “Alex’s Sci-Fi World” VR animation I wanted to lead the viewer from one micro-story (event) to the next and encourage immersive exploration. Where's Waldo was a big inspiration. I loved the game of looking for Waldo, but once I found him it was the micro-stories that kept me coming back. I wanted to explore every inch of the illustration and discover all the artist’s secrets. VR satisfies a similar craving, only now artists have access to the power of immersion! This is the magic I’m always looking to capture.
Thoughts on Storyboarding
Traditional storyboards are still necessary to plan out a video or animation's timing and flow. Working in VR enhances your storyboard by giving you more control of the camera and animated assets, and that control exposes new and better shots. Storyboard artists and concept designers should all start using VR creative tools!