
Mixed Reality User Flows: A New Kind of Template

Posted by Lillian Warner

As designers, we create user flows and give them to developers, product managers, clients, and sometimes even users for testing. At its best, a user flow is a concise, clean, and compelling way to showcase the scope and vision of your application before it is developed. Oftentimes user flows are the key piece of documentation that we provide developers to actually bring our apps to life.

Most user flows for web and mobile products look something like this:

A traditional web/mobile user flow (from KM Design House)

This user flow can be deciphered pretty easily by anyone. It lays out a clear roadmap for how a user can accomplish tasks within the application. There is only one input option available: clicking.

But if we were to adapt this format for a mixed reality experience, it would get cluttered quickly. Imagine adding voice input options to this flow. And 3D objects that can be resized and rotated. And 360 sound. And Cortana. And multiple application windows…and more. Translated directly into the medium of mixed reality, this same format would quickly become messy and difficult to read.

Mixed Reality User Flows: Scene, not Screen

In mixed reality, there are so many different components to design for that the format of a standard web/mobile user flow doesn’t work very well. There’s a lot going on in any given mixed reality experience. Designers working in mixed reality aren’t creating screen-based experiences; we’re creating scene-based experiences. Devices like the HoloLens bring computing to your personal landscape. Mixed reality has everything to do with the physical world, timing, and a variety of input actions. A user can accomplish the same task in a number of different ways, all while other things are happening (e.g. a user can select an object while audio and video are playing). On top of that, the actions and objects in a mixed reality scene are probably influenced by the physical landscape they’re placed in.

So as designers, how can we account for all of this in a user flow? How can we describe mixed reality scenes in a clear and concise way to developers, product managers, and other stakeholders? How can we help others understand the intention of our mixed reality app, as well as the precise interactions available to users in any given MR scene?

We need a different format for mixed reality user flows: something clean, clear, and concise, but also something that shows the full scope of our application and all the various components that make up a mixed reality experience. Ideally, this document should also provide specific interaction details, so that we can hand our mixed reality user flows to a developer and they can begin coding with a clear understanding of the design, flow, and interactions in each scene.

Last year, I worked with a small team to design and develop an app for the HoloLens. This was our solution for creating user flows for mixed reality:

This is actually a partial user flow. Our flow for this particular user journey has 3 more panels.

Our design process was messy (you can read more about that here). We clocked dozens of hours deciding which interactions would be available in each scene. We didn’t usually have a HoloLens on hand during our design meetings, so we came up with other ways to prototype our ideas. We kept meticulous track of our interaction design decisions in a spreadsheet. And then, the amazing Katie Chen boiled it all down into the clean and concise document above (she is a genius!).

My teammate Cindy, testing things out on the HoloLens.

The mixed reality user flow my team and I created describes a clear user journey and takes into account the variety of components inherent in a mixed reality experience. Our mixed reality user flow is something in between a storyboard and a traditional web/mobile user flow. It’s like a storyboard because it’s describing a scene and telling a story (the user’s journey), and it’s like a web/mobile user flow because it describes specific interactions and user goals.

Here’s a breakdown of the user flow:

Here’s a GIF highlighting each component of this mixed reality user flow. You can view a larger version in the template I link to below.

  1. Scene: This is what the user is seeing. This is in first person, but you could easily change it to be in third person.
  2. Input Options: These are the different input options available in any given scene. Note that some input options are available in certain scenes but not in others.
  3. Audio: This is what the user is hearing. Our app had a significant amount of audio narration, so we used this space to write out actual scripts.
  4. User Action: What does the user do to move within a scene or move from one scene to another? Does the user air tap an object? Double air tap? Bloom? Gaze at something to select it?
  5. Icon Key: Last but not least, we have a key telling us what all these icons mean.
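
If it helps to make these five components concrete for a developer, here’s one possible way a single panel could be captured as data. This is just a rough sketch in TypeScript, and every name in it (ScenePanel, InputOption, and so on) is my own invention rather than part of the template; it simply mirrors the five components listed above.

```typescript
// Rough, hypothetical data model for ONE panel of a mixed reality user flow.
// The type and field names are illustrative only; they mirror the five
// components of the panel: scene, input options, audio, user action, icon key.

type InputOption = "gaze" | "airTap" | "doubleAirTap" | "bloom" | "voice";

interface UserAction {
  input: InputOption;   // which input the user performs
  target?: string;      // what they act on, e.g. "menu hologram" (made-up example)
  result: string;       // what happens: stay in this scene, or move to another panel
}

interface ScenePanel {
  id: string;                       // e.g. "scene-03"
  sceneDescription: string;         // 1. Scene: what the user sees (first- or third-person)
  inputOptions: InputOption[];      // 2. Input options available in this scene
  audioScript?: string;             // 3. Audio: narration script or sound cues, if any
  userActions: UserAction[];        // 4. User actions that move within or between scenes
  iconKey?: Record<string, string>; // 5. Icon key: icon name mapped to its meaning
}
```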

Put a bunch of these panels together in a sequence, and you get a mixed reality user flow, or what I’ve been calling a “Reality Sequence.” As a medium, mixed reality is so fundamentally different from web and mobile that I don’t think “user flow” is the most accurate description of this document. Sure, it shows a flow, but it also tells a story and takes into account time and the physical world. I think “Reality Sequence” is a more accurate descriptor, because it implies that many things are happening in real time, and the outcome depends on the way a whole host of factors interact with one another.
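
To make the “flow” part of a Reality Sequence explicit, a minimal sketch might list the panels in their storyboard order and describe the transitions between them separately. Again, the names and the transition model here are my own assumptions; a real mixed reality experience also depends on timing, concurrent audio, and the physical space, which a flat list of transitions can’t fully capture.

```typescript
// A "Reality Sequence" sketched as an ordered list of panel ids plus the
// transitions between them. All names and values are hypothetical.

interface Transition {
  fromPanel: string; // panel id where the action happens
  onInput: string;   // e.g. "airTap", "bloom", or a voice command
  toPanel: string;   // panel id the user lands on
}

interface RealitySequence {
  title: string;
  panelOrder: string[];      // default, storyboard-style reading order
  transitions: Transition[]; // the ways a user can actually move between panels
}

// Given the current panel and an input, work out where the user ends up.
// If the input doesn't trigger a transition, the user stays where they are.
function nextPanel(seq: RealitySequence, current: string, input: string): string {
  const hit = seq.transitions.find(
    (t) => t.fromPanel === current && t.onInput === input
  );
  return hit ? hit.toPanel : current;
}

// Made-up usage example:
const demo: RealitySequence = {
  title: "Onboarding journey",
  panelOrder: ["scene-01", "scene-02", "scene-03"],
  transitions: [
    { fromPanel: "scene-01", onInput: "airTap", toPanel: "scene-02" },
    { fromPanel: "scene-02", onInput: "bloom", toPanel: "scene-01" },
  ],
};
console.log(nextPanel(demo, "scene-01", "airTap")); // "scene-02"
```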

I’ve created a mixed reality user flow template, which you can find here. Feel free to use this to get started on your own mixed reality design projects.

A mixed reality user flow template (a one-pager). You can download it here.

The same template, but with 4 panels on a page. You can download it here.

I’ve also created a Reality Sketching template. This is a third-person view template, intended to help visualize the scene design process in mixed reality. Determining how much distance there will be between the user and the digital object they’re interacting with is crucial, and a third-person sketch is helpful for envisioning this (that’s a topic for a whole separate Medium post).

Third-person sketching template to get the creative juices flowing. Download available here.

If you use either of these templates, let me know how it goes! These are living documents, and I’d love to keep refining them.

This article was originally published on Lillian's Medium page.


Lillian is a UX designer and researcher based in New York. You can find her on LinkedIn or at her website.
