How VR Enhanced Our Design Process

March 8, 2018 | David Wallin

We're constantly looking for opportunities to apply the latest technologies, such as augmented and virtual reality (AR and VR), to business problems. That last part is crucial: AR and VR are often used simply as gimmicks, without real consideration of whether they're actually appropriate for a project or capable of meaningfully increasing its ROI.

Here, we take a more practical approach. We recently used VR on a client project to help us work more efficiently and ultimately deliver a better end product, sans gimmicks.

Bringing Wawa to the Big Screen

Our long-time client Wawa approached us to design an interactive experience for a large touchscreen installation that would be the centerpiece of its Washington, DC store. The installation comprised a trio of 65” screens, each hung sideways. The size alone made this project unusual: we have ample experience designing for digital signage and ordering screens, but this was the largest interactive sign we’ve ever worked on.

Aside from the UX challenges we would encounter on such a large screen, we were also tasked with ensuring the experience was ADA compliant. This entailed a few accommodations, including mounting the screens fairly low so they would be accessible to customers in wheelchairs.

To complicate matters, we wouldn't have access to the screens for several weeks, and the timeline dictated that we start designing and building the experience before the hardware arrived for testing.

Enter Virtual Reality

As a self-proclaimed VR evangelist, I knew this was a perfect opportunity to incorporate VR into our design process. Car companies like Ford are already using AR and VR in their design processes, since these technologies let them avoid building a physical model of a car every time they want to get a feel for a new design. In our case, we would be using VR to build a 3D model of the screens to get a feel for their unique affordances.

We made our 3D model in Blender, a free and open-source modeling application, working from architectural images and measurements. From there, we brought the model into Unity, a popular 3D game engine that is well-suited to VR applications. Because our Unity setup treats 1 unit of measurement as 1 meter, it was easy to match the model to the scale of its real-world counterpart.
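To make the scale-matching step concrete, here is a minimal Blender scripting (bpy) sketch of how the screens could be blocked out at real-world scale, with 1 unit equal to 1 meter. The dimensions, gap, and mounting height below are illustrative assumptions, not the project's actual measurements; all we rely on from the project is that the screens were 65-inch panels hung in portrait.

```python
# Illustrative Blender (bpy) sketch: block out three portrait screens at
# real-world scale (1 Blender unit = 1 meter, matching Unity's convention).
# The exact dimensions, spacing, and mounting height are assumptions for
# demonstration, not the measurements used on the actual project.
import bpy

SCREEN_W = 0.81       # approx. width of a 65" 16:9 panel in portrait (meters)
SCREEN_H = 1.43       # approx. height of a 65" 16:9 panel in portrait (meters)
MOUNT_BOTTOM = 0.40   # assumed height of the screen's bottom edge off the floor
GAP = 0.05            # assumed gap between adjacent screens

for i in range(3):
    bpy.ops.mesh.primitive_cube_add()
    screen = bpy.context.object
    screen.name = f"Screen_{i + 1}"
    # A new cube defaults to 2 m per side, so set its dimensions explicitly.
    screen.dimensions = (SCREEN_W, 0.05, SCREEN_H)  # thin slab for the display
    screen.location = (
        (i - 1) * (SCREEN_W + GAP),        # three screens side by side on X
        0.0,
        MOUNT_BOTTOM + SCREEN_H / 2,       # center of the panel above the floor
    )
```

From there, the model can be exported (for example, as FBX) and imported into Unity, where its meter-based dimensions line up with Unity's 1-unit-per-meter convention.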

Right away, our designers were able to get their preliminary UI sketches displayed on the virtual model. My team simply took their designs and applied them as textures on the model's virtual screens. Even at the earliest stages of the design process, it was helpful to get a feel for how big text and buttons should be and to start understanding UX and accessibility challenges. For example, since the screens were mounted quite low, the bottom third of the screen would be difficult for a standing person to comfortably reach, while content in the upper third would be difficult or impossible for a person seated in a wheelchair to reach. This meant we couldn't use any fixed UI at the top or bottom of the experience, because it would be unreachable for some customers.
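To make the reachability constraint concrete, here is a rough, back-of-the-envelope sketch of the reasoning in Python. The mounting height and reach bands are assumptions loosely based on ADA unobstructed-reach guidance (roughly 15 to 48 inches above the floor for a seated user); the project itself relied on the actual requirements and on what we observed in VR.

```python
# Rough sketch of the reachability reasoning. All numbers are assumptions
# for illustration; the project followed the actual ADA guidance and what
# we observed in the VR walkthroughs.

SCREEN_BOTTOM = 0.40   # assumed height of the screen's bottom edge (meters)
SCREEN_HEIGHT = 1.43   # approx. 65" panel in portrait (meters)
SCREEN_TOP = SCREEN_BOTTOM + SCREEN_HEIGHT

# Approximate comfortable touch ranges above the floor (meters).
SEATED_REACH = (0.38, 1.22)    # roughly the ADA 15"-48" unobstructed reach band
STANDING_REACH = (0.75, 1.90)  # assumed comfortable range for a standing adult

def reachable_band(reach, bottom=SCREEN_BOTTOM, top=SCREEN_TOP):
    """Return the portion of the screen (meters above the floor) a user can touch."""
    low = max(bottom, reach[0])
    high = min(top, reach[1])
    return (low, high) if low < high else None

for label, reach in [("seated", SEATED_REACH), ("standing", STANDING_REACH)]:
    band = reachable_band(reach)
    if band is None:
        print(f"{label}: cannot touch the screen at all")
        continue
    lo, hi = band
    share = (hi - lo) / SCREEN_HEIGHT
    print(f"{label}: can touch {lo:.2f}-{hi:.2f} m, about {share:.0%} of the screen")

# Neither band covers the full screen, so any UI pinned to the very top or
# very bottom is unreachable for someone; scrollable content avoids that.
```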

[Image: A team member wearing VR equipment during a VR test]

These insights led us to a vertical scrolling approach. This would enable anyone, regardless of height, to enjoy the full touchscreen experience by simply scrolling until out-of-reach elements came within reach.

In design, we opted to arrange the experience into sections based on content type: social posts, details about Wawa's history and mission, employee features, and finally information about food quality. Since the majority of the content was images, we devised varying formats for image carousels. These would be easy to operate on a touchscreen, with large buttons and the ability to swipe or tap through content.
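As a sketch of that interaction model (not the production code, whose implementation isn't covered here), the carousel behavior boils down to a small amount of state: large tap targets step through items one at a time, and a swipe past a distance threshold advances in the swipe's direction.

```python
# Minimal, illustrative model of the carousel interaction: tap targets step
# through items, and a sufficiently long swipe advances or goes back.
from dataclasses import dataclass

@dataclass
class Carousel:
    item_count: int
    index: int = 0

    def tap_next(self) -> int:
        self.index = (self.index + 1) % self.item_count
        return self.index

    def tap_previous(self) -> int:
        self.index = (self.index - 1) % self.item_count
        return self.index

    def swipe(self, delta_x: float, threshold: float = 50.0) -> int:
        # A left swipe (negative delta) advances; a right swipe goes back.
        if delta_x <= -threshold:
            return self.tap_next()
        if delta_x >= threshold:
            return self.tap_previous()
        return self.index
```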

We also determined the ideal size for the experience’s images and text with the aid of VR. Viewing the designs in this way helped our team wrap their heads around the scale of the 65” screens. We added objects to the scene to provide a frame of reference. For example, we made the controllers visible, since we knew exactly how big they were in real life, added realistic textures to the walls and floors, and added a human-sized character. In the future, adding realistic hands in place of controllers could also help.
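One way to frame the sizing question the VR walkthroughs answered by feel: how tall does text need to be to stay readable from where people actually stand? The sketch below uses a rule-of-thumb visual-angle target and assumed viewing distances; the real answer came from standing in front of the virtual (and later the physical) screens.

```python
# Sketch of the kind of scale check the VR walkthroughs made tangible: the
# physical text height needed to stay readable from a given standing distance.
# The visual-angle target and distances are rule-of-thumb assumptions.
import math

def min_text_height(distance_m: float, visual_angle_arcmin: float = 22.0) -> float:
    """Character height (meters) that subtends the given visual angle at a distance."""
    angle_rad = math.radians(visual_angle_arcmin / 60.0)
    return 2 * distance_m * math.tan(angle_rad / 2)

for d in (0.6, 1.0, 2.0):  # assumed viewing distances in meters
    h_mm = min_text_height(d) * 1000
    print(f"at {d:.1f} m, body text should be at least ~{h_mm:.0f} mm tall")
```

The same relationship explains the lesson below: stand farther away than real customers do, and everything you size by eye ends up too large.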

[Image: A man using the prototype developed from the VR test]

A Learning Experience

Even with the VR model, we still made some changes after seeing our work on the real touchscreens. We discovered that everything (images, text, CTA buttons, and so on) still felt a bit too large. It turns out that, within our virtual world, we tended to stand farther away from the screens than people do in real life. The difference mostly comes down to the lack of haptic (touch) feedback, and it underscores the importance of reference objects. It was a simple fix, but a valuable lesson.

[Image: A woman using the prototype board]

Though it was our first foray into VR-assisted design, the technology proved to be an asset throughout the creative process. It even helped us present the work to Wawa. We used the VR prototype to take several of our clients through the touchscreen experience, and it was clear that virtually interacting with the design enhanced their understanding of our vision. If a picture is worth a thousand words, an interactive 3D model might be worth a library.