At Wayfair, we’re always looking forward and exploring new and promising technologies. Through our partnership with Magic Leap we’ve made some groundbreaking contributions to an emerging computing paradigm – spatial computing.

In August, we launched our first experience on Magic Leap One. We created a web experience for Helio, Magic Leap’s spatial web browser, that applies the idea of spatial computing to the traditionally 2D world of the web, bringing 3D content and digital replicas of our products into the space around us. That experience is now included as a featured bookmark on every new device.

Today, we’re going to talk about our second experience, Wayfair Spaces, which was showcased at L.E.A.P., Magic Leap’s first developer conference, in October. This experience is a native application built with the Unity engine that takes advantage of Magic Leap One’s unique six-degrees-of-freedom (6DOF) controller. Wayfair Spaces is currently available in Magic Leap World, the platform’s app store.

Wayfair Spaces is an inspirational interior design experience that takes you from beautifully curated room ideas to designing your own room. You start by selecting a style you like from a palette of dollhouse-like scenes styled by Wayfair 3D artists and designers. Once a scene is placed in your room, you can drag products out of the dollhouse and place them on the floor, walls, or other surfaces around you, interacting with virtual 3D products the same way you would with real ones. You can then view product information, swap in similar-looking products, and browse further options to see what fits and looks best in the room. Design experimentation is very intuitive here: within minutes we’ve seen people get comfortable and rapidly move furniture around, amazed at how easy it is to visualize life-sized products right in their own space.

Leveraging our Previous Work

The development of Wayfair Spaces follows an interesting path. We’re constantly experimenting, and the idea of immersive room design in virtual reality (VR) has taken different shapes and forms over the past three years; in fact, that reflects the culture of our team, Wayfair Next. One of the more impactful interior design prototypes was a room-scale VR experience, built with the Unity engine, that we code-named ‘Sketch’ internally.

A key element of this VR interior design prototype was intuitive interaction using 6DOF controllers, which accurately track the position of your hands in space. We loved that Magic Leap built a 6DOF controller for the Magic Leap One, which allows your hands to really be part of the experience. For our interior design experiences, interacting with the content is just as important as visualizing it. With the intuitive interaction Magic Leap’s 6DOF controller offered, and the core design engine we’d already built in Unity, we were able to reuse most of the code and some of the interaction design, which helped us bring our design experience from the virtual room to a real room.
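As a minimal sketch of what that enables, here’s how a 6DOF controller pose can drive a grabbed product in Unity. This uses Unity’s generic XR input API rather than the Magic Leap-specific one, and is illustrative, not our production code:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: a grabbed product follows the controller's full
// 6DOF pose (position *and* orientation), which is what makes the
// interaction feel like holding a real object.
public class ControllerGrab : MonoBehaviour
{
    public Transform grabbedProduct; // assigned when the user picks a product up

    void Update()
    {
        InputDevice controller = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        if (grabbedProduct != null &&
            controller.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position) &&
            controller.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
        {
            // Note: these values are in tracking space; a real app would map
            // them through its camera rig before applying them to the scene.
            grabbedProduct.SetPositionAndRotation(position, rotation);
        }
    }
}
```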

Making the Interface Super Simple

One of our biggest challenges was simplifying the interface. We wanted everything to be accomplished with a single controller, using only pointing and one button. Most of the interface in our VR experiences involved two controllers, with much of the UI anchored to the second (non-dominant) controller. We needed to bring it down to one controller and make the UI more contextual.

For example, we have a product palette (a set of miniature 3D product models that you can browse) that intuitively floats in front of the user for selection, and a product interaction menu that opens over the product, where the user can simply point to select actions such as info, swap, copy, and delete. The visual effects, sounds, and haptic feedback we added made the interaction feel more natural – like the thud of a product being placed on a table, or a swoosh of a product going from dollhouse-size to real size, or the haptic feedback when (literally) picking up a virtual product.
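To make the point-and-select idea concrete, here’s a hedged sketch of the pattern: cast a ray from the controller pose and let a single button confirm the action. The ProductMenu component is a hypothetical stand-in, not a real class from our codebase:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative point-and-select: one controller, one button.
public class PointerSelect : MonoBehaviour
{
    void Update()
    {
        InputDevice controller = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);

        if (controller.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 origin) &&
            controller.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation) &&
            controller.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) &&
            pressed && // a real app would detect the press edge, not the held state
            Physics.Raycast(origin, rotation * Vector3.forward, out RaycastHit hit))
        {
            // Open the contextual menu (info / swap / copy / delete) over the product.
            ProductMenu menu = hit.collider.GetComponentInParent<ProductMenu>();
            if (menu != null) menu.Open(hit.point);
        }
    }
}

// Hypothetical stand-in for the product's contextual menu component.
public class ProductMenu : MonoBehaviour
{
    public void Open(Vector3 at) { /* show info / swap / copy / delete actions */ }
}
```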

To move products around, the user simply points and drags. You can literally pick up virtual products by reaching out and grabbing them with the controller. When we watch people using the application (in the secret lab we’d built for sensitive projects in our Boston, MA office), there is almost always a moment when the experience clicks for them. That “Aha!” moment might come when they drag wall art out of the dollhouse onto a real wall, or when they pick up a piece of virtual decor the way they would in real life, walk around with it, and then place it down again. People are fascinated by how easy it is to arrange virtual products just as they would real items (but without the heavy lifting) and visualize their design. We’ve even found some folks talking to themselves about their design, focusing more on the design itself than on the technology. It can get pretty meta when you move a virtual table lamp, sitting on a virtual table, onto a real table.

Hitting Frame Rate

Our previous VR interior design prototypes ran on powerful desktop computers, with the VR headsets tethered to the computer. Magic Leap One offers untethered freedom of movement in a much smaller form factor, with performance closer to that of mobile or notebook platforms.

As with any head-mounted device (HMD), the user sees graphics rendered in real time in response to their head orientation, so the application must meet strict frame rate requirements. To achieve a smooth and comfortable experience, that requirement is 60 frames per second. This means that everything drawn for the user to see – furniture, user interface elements, effects, and so on – has to be rendered 60 times every second without overwhelming the GPU and CPU or consuming too much power.
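In concrete terms, 60 frames per second leaves a budget of 1000 ms / 60 ≈ 16.7 ms per frame for everything. A trivial way to sanity-check that budget in Unity (illustrative, not from our codebase) looks like this:

```csharp
using UnityEngine;

// Logs a warning whenever a frame blows the 60 fps budget.
public class FrameBudgetMonitor : MonoBehaviour
{
    const float BudgetMs = 1000f / 60f; // ≈ 16.7 ms per frame

    void Update()
    {
        float frameMs = Time.unscaledDeltaTime * 1000f;
        if (frameMs > BudgetMs)
        {
            Debug.LogWarning($"Frame took {frameMs:F1} ms (budget {BudgetMs:F1} ms)");
        }
    }
}
```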

To meet these criteria, we optimized the application to reduce rendering requirements. We simplified the lighting and removed unnecessary features such as real-time shadows and full-screen camera effects.
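A minimal sketch of that kind of trimming in Unity, assuming the built-in quality settings (these are illustrative values, not our actual configuration):

```csharp
using UnityEngine;

// Illustrative rendering trims for a mobile-class headset.
public class RenderingBudget : MonoBehaviour
{
    void Awake()
    {
        // Drop real-time shadows entirely.
        QualitySettings.shadows = ShadowQuality.Disable;

        // Keep per-pixel lighting to a single light; extra lights fall
        // back to cheaper per-vertex shading.
        QualitySettings.pixelLightCount = 1;

        // Full-screen camera effects (post-processing components) are
        // simply removed from the camera in the scene.
    }
}
```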

The big win, however, came from improvements made to the Unity engine in how frames are rendered to headsets that have two camera views to draw (one for each eye). Unity supports a technique called single pass rendering which sends information from the CPU to the GPU in one step, skipping work that is essentially the same for both eyes. The Magic Leap One supports a more advanced version of this technique, called single pass instanced rendering, that further reduces the amount of work the CPU must do.
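For what it’s worth, in Unity versions of that era this was an editor-side setting (Player Settings → XR Settings) rather than runtime code; the snippet below is an illustrative editor script, and note that custom shaders also have to declare instancing support to participate:

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Illustrative: switch the project's stereo rendering path to
// single pass instanced from an editor menu item.
public static class StereoRenderingSetup
{
    [MenuItem("Tools/Use Single Pass Instanced")]
    static void Apply()
    {
        PlayerSettings.stereoRenderingPath = StereoRenderingPath.Instancing;
    }
}
#endif
```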

This approach to rendering made it possible for us to show a whole scene of furniture, rugs, wall art, and numerous decorative items, and not overload the CPU and GPU that were trying to draw them.

Utilizing our GLTF Models

The last key ingredient in the development of Wayfair Spaces was the content. All of the product models used in Wayfair Spaces are GLTF models derived from the more detailed V-Ray models we use for photo-real rendering.

GLTF is a new standard that is being rapidly adopted as a way to distribute 3D content over the web – think of it as the JPEG of 3D. These are the same models currently used by the View in Room 3D screen-based augmented reality (AR) feature in Wayfair’s iOS and Android apps.

If you’re interested in learning more about our modeling practices, check out Wayfair 3D University.

GLTF models store material information for a rendering technique called physically based rendering (PBR): a collection of mathematical equations that simulate the interaction of light and surfaces in a realistic way. A GLTF model stores the inputs to these equations for a given material as a set of parameters (numbers, texture images, and so on), and an engine like Unity calculates the lighting using PBR and shades the geometry each material is associated with. What’s great about PBR is that, since it’s based on the physical properties of light, you can get the same look for a material across engines.
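For the mathematically inclined: the specular part of the glTF metallic-roughness model follows the standard microfacet form, so the BRDF looks roughly like this (simplified; see the glTF specification for the exact terms):

```latex
f(\mathbf{l}, \mathbf{v}) =
    \underbrace{\frac{c_{\mathrm{diff}}}{\pi}}_{\text{Lambertian diffuse}}
  + \underbrace{\frac{D(\mathbf{h})\, G(\mathbf{l}, \mathbf{v})\, F(\mathbf{v}, \mathbf{h})}
                     {4\, (\mathbf{n} \cdot \mathbf{l})\, (\mathbf{n} \cdot \mathbf{v})}}_{\text{microfacet specular}}
```

Here n is the surface normal, l and v are the light and view directions, and h is their half-vector; D (the microfacet distribution) is driven by the roughness parameter, while F (Fresnel) and the diffuse color c_diff are driven by the base color and metallic parameters.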

Wayfair’s GLTF models provide engine-independent PBR models that can be rendered consistently in real-time in various lighting conditions across platforms. For generic 3D applications the lighting is often predefined, but for AR and mixed reality (MR) this lighting should correlate to the real world in which the virtual object is rendered; it should look like it belongs there. Currently, the ability of devices to capture and mimic lighting in real world spaces is limited, but evolving.

For Wayfair Spaces we used a simple overhead light and a generic studio environment map for reflections. The feeling of presence is improved significantly when lighting information from the environment is used to render the virtual content, and as augmented and mixed reality devices evolve, the content will feel more real and natural.
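Here’s a hedged sketch of a setup along those lines in Unity: one overhead directional light plus a custom cubemap as the reflection source (the cubemap asset itself is a placeholder here):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Illustrative lighting rig: simple overhead light + studio reflections.
public class StudioLighting : MonoBehaviour
{
    public Cubemap studioEnvironment; // generic studio environment map (placeholder asset)

    void Awake()
    {
        // Use the studio map, rather than the skybox, for reflections.
        RenderSettings.defaultReflectionMode = DefaultReflectionMode.Custom;
        RenderSettings.customReflection = studioEnvironment;

        // A single directional light pointing straight down.
        var overhead = new GameObject("Overhead Light").AddComponent<Light>();
        overhead.type = LightType.Directional;
        overhead.transform.rotation = Quaternion.Euler(90f, 0f, 0f);
    }
}
```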

Launching and Demoing at L.E.A.P.

Wayfair Spaces was announced at the first Magic Leap developer conference, L.E.A.P. We also showcased it at the conference, giving live demos to press, attendees, and VIPs for two days.

And, as with anything developed at Wayfair, we’re never done. We’re thinking beyond the 2D screen, constantly experimenting and evolving as the technology evolves. For more interesting projects, watch this space!