I’m a User Experience (UX) designer, and I happen to have a penchant for the user onboarding side of it. All UX professionals are onboarding designers, in a way. That’s because UX design involves closing the gap between systems, services, and the humans who use them through thoughtful design of virtual and physical interfaces. Good UX designers work hard to minimize how much time people have to spend learning and using interfaces, trying to make them as intuitive as possible. This is no small feat, given that every new user brings a unique set of mental models with them, and that current technologies fall far short of truly individualized personalization.
Now let’s shift to the neurotechnology industry: specifically, brain-computer interfaces (BCIs). BCIs use sensors, ranging from non-invasive to invasive, to “measure brain activity, extract features from that activity, and convert those features into outputs that replace, restore, enhance, supplement, or improve human functions” (ScienceDirect). That definition can bring up ideas from sci-fi stories like The Matrix, Upload, or Altered Carbon, where characters can download expertise, control machines with their minds, and do other superhuman things.
But alongside these ideas, I worry that a question may also be forming in people’s minds: “Do we need the discipline of UX design to close the gap between humans and systems if we have direct-to-brain interfacing?”
In a previous post, we looked at multiple opportunities over time for user onboarding techniques to be useful in our products. If we design for those opportunities ad hoc, we risk unscalable designs and frustrating users with fragmented education. Instead, let’s see how to tackle onboarding design so that it fits into a long-term approach to guidance.
I often evangelize the importance of first-time user experiences. After all, not every user a product acquires will stick around, but all of them will experience its first-run design. To encourage return use, that first impression must be solid. But it’s also very common for designers to overemphasize the first-run experience at the expense of long-term user support.
Clippy, the Microsoft Office Assistant, failed partly because it catered to first-time users. It didn’t scale gracefully as those users became acclimated to the product. As James Fallows describes it: “…Clippy suffered the dreaded ‘optimization for first time use’ problem. That is, the very first time you were composing a letter with Word, you might possibly be grateful for advice about how to use various letter-formatting features. The next billion times you typed ‘Dear …’ and saw Clippy pop up, you wanted to scream.”
People are shaped by a mix of factors: genes, people, places, and events, both good and bad. Everyone is a unique blend. As good product designers, we consider these contexts in relation to the users we design for, with the intent of creating experiences that suit their needs and expectations. But we don’t often take as much time to understand what has shaped us. Why do we gravitate toward the problems and solutions that we do? A little self-reflection in uncertain times can help us realign with higher-quality, impactful projects, and can also remind us, as we approach the Thanksgiving holiday, to thank those who had a positive impact on who we are now.
This is the last post in the 3-part “Engaging new users” series. Part 1 covered guided interaction, the practice of educating users in a realistic context, and how it is more compelling than slideshows, videos, or static instruction. In part 2, we learned how to use free samples to demonstrate a product’s value proposition and build the trust needed to encourage sign-up.
And in today’s post, part 3, we’ll examine how giving new users a personal focus is the key to making these onboarding techniques stick.
In the first part of this series, I shared how guided interaction introduces users to the authentic context of your product with just the right amount of education to ensure they find success. Today, we investigate how the second of the three pillars of better onboarding, the use of free samples, gets those new customers using your product in the first place.
There are 3 overarching best practices when it comes to engaging and educating new users:
In the past I’ve covered patterns and anti-patterns for onboarding new users and principles for first-time user experiences. In this post and the two that follow, I’ll be digging into each of the 3 ways we can better engage new users:
Today’s post focuses on guided interaction. So let’s jump into what it means, discover patterns for making guided interaction a reality, and see a few examples.
In a recent presentation, I discussed the role that guided interaction and coaching can play in onboarding new users to a product. Playthroughs and user-guided tutorials are some examples of guided interaction. Guided interaction allows users to start playing with a new product quickly in an authentic context (instead of wading through abstracted coachmarks, instructions or intro tours), but also gives them enough coaching so that they’ll be motivated by an early success.
To help teams explore the right cadence of guided interaction for their product’s new user experience, I created a template for judging that interaction between a product and a new user. I’ve been calling it the coaching cadence worksheet. It can be used to audit an existing experience, or to explore variations for a revision or a completely new first-time UX. The worksheet follows.
After checking out the design principles of Android Wear, I found myself thinking particularly about the third principle, “Helpful”. Certainly in UX design a product needs to be helpful before anything else. But what does it mean to build helpful experiences for wearables, specifically?
To me, it seems that helpful wearable devices or wearable apps would do the following (the “6 R’s”):
I’ve been keeping an ongoing collection of first-time user experiences (FTUEs) at http://firsttimeux.tumblr.com/. In this post, I’ve distilled the most common approaches I’ve observed in use today into a list of 8 design patterns and anti-patterns.
Each pattern has a description, pros/cons list, design considerations, and an example. You may recognize a few of these because many are modern takes on well-established UX patterns. My hope is for this to serve as a helpful reference as you develop your own first time user experiences.
More than 10 million Kinects have been sold since the depth- and gesture-recognition Xbox accessory first launched in November 2010 (selling so quickly in its first 60 days that it beat out the iPhone and iPad for a Guinness World Records title). Given that it’s aimed at a much larger consumer audience than the Xbox has been, and with Microsoft’s announcement of an official Kinect SDK on April 13, it’s likely more of us will be designing for Kinect-based interfaces in the near future.
I recently partnered with two talented developers to prototype a Kinect-based experience. We had the opportunity to observe more than 30 people use the prototype, which allowed for some great, ad hoc user research.
After the jump, you can read my takeaways and design recommendations based on observations of our experiment. I’ll also try to post any new Kinect info I might gather from MIX11 next week.
Recently, I went to the dentist for my routine cleaning. Like most folks, I abhor going to the dentist, because I can count about a bajillion other things I’d rather be doing with 45 minutes of my time.
But at this last appointment, I found myself pleasantly occupied by this handwritten list taped to the overhead lamp: