An Event Apart Minneapolis: Whitney Hess, "DIY UX: Give Your Users an Upgrade"

The second speaker at An Event Apart Minneapolis was Whitney Hess, who spoke about Do It Yourself User Experience techniques in “DIY UX: Give Your Users an Upgrade.” Below are summaries of my notes. These are Whitney’s excellent thoughts, not mine. Hopefully I captured them accurately, but this is a poor substitute for the engaging, practical presentation Whitney gave. You have to go to An Event Apart for that!

Anyhow, after a delightful introduction from Jeffrey Zeldman, Whitney began by talking about how, as a user experience designer, she has worked on great projects like Boxee, and she wakes up every day loving what she’s doing. She loves fighting for users so sites give them what they need, rather than whatever the organization wants to give.

She showed a great visual from thisisindexed.com, the Pyramid of Success. On the bottom? Makes money. A bit higher? Makes an impact. At the top? Makes good. Whitney loves how her work “makes good” for users.

She also announced that we are all user experience designers. User experiences happen whether or not we are designing them.

One example she went over at length was about two developers, not designers, who came up with a time-tracking tool. It was so useful in their own work that they decided to make it available to everyone. Now it’s called Harvest. These developers were not user experience designers, and they didn’t have one on staff, and yet with Harvest, everything is in the right place and just works. That’s great user experience.

The majority of the rest of the session went over four primary techniques that UX designers can use: design research, web analytics, usability testing, and experimentation and iteration.

Design research

What do our users actually need?

Harvest has support forms that all direct to one Gmail inbox. This is more difficult for Harvest to handle, but easier for users. To manage these support emails, Harvest built a tool they called Kaizen, which means “continuous improvement” in Japanese. Kaizen has feature requests on the left side and an individual request that is being examined on the right. Each request is tagged by feature. Features can then be developed based on how requests are being tagged, with tags voted up Digg style based on the number of requests. The actual requests are retained so that developers know what individuals actually said about a feature. When a feature is actually developed, their notification system can then email the users who requested that feature, so that they can test it to see if it meets their expectations. This is pretty exceptional customer service.
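Harvest’s actual Kaizen tool isn’t public, but the workflow Whitney described can be sketched in a few lines. In this hypothetical data model, each support email is tagged by feature, tags are ranked Digg-style by request count, and the original requesters can be emailed when their feature ships:

```python
from collections import defaultdict

# Hypothetical Kaizen-style request tracker (illustrative only).
requests_by_tag = defaultdict(list)

def log_request(tag, email, text):
    # Retain the user's own words alongside the feature tag.
    requests_by_tag[tag].append({"email": email, "text": text})

def ranked_features():
    # "Voted up Digg style": order tags by number of requests.
    return sorted(requests_by_tag, key=lambda t: len(requests_by_tag[t]), reverse=True)

def requesters(tag):
    # Whom to notify once the feature is actually developed.
    return {r["email"] for r in requests_by_tag[tag]}

log_request("invoicing", "a@example.com", "I need recurring invoices.")
log_request("invoicing", "b@example.com", "Please add PDF invoices.")
log_request("timers", "c@example.com", "Let me edit a running timer.")

print(ranked_features())        # ['invoicing', 'timers']
print(requesters("invoicing"))  # {'a@example.com', 'b@example.com'}
```

The point of keeping the raw text, rather than just a vote count, is the one Whitney made: developers can read what individuals actually said about a feature.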

Harvest also does surveys, with lots of open-ended questions. Experts often say that is a no-no, because open-ended questions are more difficult to analyze, but they let people describe things in their own words.

An important principle of Iridesco (the developers of Harvest) is that “We don’t just want to patch; we want to address the core problem.” To enable that, they have email conversations with users. This takes time, but Iridesco points out that “Customers love to tell you their workflow.” Customers will go on and on, because people hardly ever ask customers what they really feel about a product. People get excited when they are treated this way.

So to summarize, the key points for design research are:

  • Make it easy for customers to reach you, log their requests, and use those requests to prioritize new features.
  • Dig deeper to discover the underlying problems.
  • Keep in touch.

Whitney recommended a great book about design research, “Observing the User Experience,” by Mike Kuniavsky.

Web analytics

What are our users actually doing?

Iridesco notes: “We don’t believe in data-driven design, but data doesn’t lie.” Whitney then noted that “sometimes it does.” You need anecdotes of what people say they need plus analytics to get the whole picture of user needs. Neither is enough on its own.

She showed us how Harvest’s analytics data showed that the highest usage was at the beginning of the week, with major dropoffs on the weekend and somewhat lower usage mid-month. Through this, Harvest could easily see that people were using their product primarily during business hours.
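Spotting a weekly cycle like that takes nothing more than bucketing raw event timestamps by weekday. Here is a toy illustration with invented timestamps (not Harvest’s data):

```python
from datetime import datetime

# Invented event log timestamps, for illustration only.
events = [
    "2010-08-02 09:15", "2010-08-02 10:40", "2010-08-03 11:05",  # Mon, Mon, Tue
    "2010-08-05 14:30", "2010-08-07 16:00",                      # Thu, Sat
]

# Count usage events per weekday to expose the weekly cycle.
counts = {}
for ts in events:
    day = datetime.strptime(ts, "%Y-%m-%d %H:%M").strftime("%a")
    counts[day] = counts.get(day, 0) + 1

print(counts)  # {'Mon': 2, 'Tue': 1, 'Thu': 1, 'Sat': 1}
```

With real logs, the same grouping over months of data is what makes the weekday-heavy, weekend-light pattern obvious at a glance.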

Harvest also uses crazyegg.com. Crazy Egg creates heat maps of where people are clicking on the site, so you can easily see what’s of most interest to them. This allowed Harvest to find out which links were useful and which people weren’t using. Nobody was clicking on one of Harvest’s links, so it could be dropped.

Another tool is the Google Website Optimizer. For example, Harvest tested the exact same page with three different free trial buttons. They discovered that green and blue buttons each had much better rates of usage than a muted gray button. In fact, using that data led to a 10% improvement in usage.
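The arithmetic behind a test like that is simple. The numbers below are hypothetical (the talk only reported the rough 10% improvement), but they show how an A/B/n comparison against a baseline variant works:

```python
# Hypothetical A/B/n results for three free-trial button variants.
variants = {
    "gray":  {"visitors": 1000, "signups": 50},
    "green": {"visitors": 1000, "signups": 55},
    "blue":  {"visitors": 1000, "signups": 56},
}

# Compare each variant's conversion rate against the gray baseline.
baseline = variants["gray"]["signups"] / variants["gray"]["visitors"]
for name, v in variants.items():
    rate = v["signups"] / v["visitors"]
    lift = (rate - baseline) / baseline * 100
    print(f"{name}: {rate:.1%} conversion, {lift:+.0f}% vs. gray")
```

Tools like Google Website Optimizer add statistical significance testing on top of this, so you know whether a lift is real or noise.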

Another example of how anybody can do user experience was an individual developer who worked for the US House of Representatives on house.gov. His position was buried deep in the bureaucracy. However, he wanted to make the site better, so he managed to get access to the logs from their search engine. He tracked those results in painstaking detail for over nine months and made some amazing findings:

  • Queries were case-sensitive. Use a different case? Get a different result.
  • A single PDF showed up in nearly all results, no matter what the search terms.
  • Titles were given no weight in determining search results. In fact, some pages had no titles at all.

Based on looking at the results, he was even able to help predict what a top news item might be the next day, based on what search terms people were using. After that, people started listening to what he had to say, and house.gov switched their search engine to Google. Search results improved dramatically.
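The case-sensitivity problem he uncovered is easy to see in a tiny sketch: a case-sensitive engine treats these queries from a hypothetical search log as four different searches, while normalizing them first shows there are really only two.

```python
from collections import Counter

# Raw queries from a hypothetical search log (invented for illustration).
queries = ["Health Care", "health care", "HEALTH CARE", "immigration"]

raw = Counter(queries)                          # case-sensitive view
normalized = Counter(q.lower() for q in queries)  # case-insensitive view

print(len(raw))    # 4 distinct raw queries
print(normalized)  # Counter({'health care': 3, 'immigration': 1})
```

The same normalize-then-count pass over real logs is also what surfaces which topics people are actually searching for, which is how he could anticipate the next day’s top news item.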

Takeaway points about web analytics:

  • Understand your site’s traffic cycles
  • Uncover usage patterns
  • Test design variations
  • Explore search logs
  • Make sure you understand what people are searching for

Good read on this subject? “Web Analytics: An Hour a Day,” by Avinash Kaushik. Avinash used to run a lot of the analytics programs for Google. I am reading this book now, and if anybody can make analytics exciting, it is Avinash! He really lays out a story about what you want to be tracking, and how you can do so.

Usability testing

How well does our stuff actually work?

Harvest has used usability testing to really find out how people are using their products. While this is often not recommended, they usually re-use the same participants over and over again; their thinking is that some data is better than none. Their best user tester? The wife of one of Harvest’s founders. She gives very honest feedback, which is key.

The key to testing early and often is to do light usability testing. Find a user and show them a build, prototype, comp, sketch, whatever. Don’t tell the participant what they should do. Instead, just ask, “What are your general feelings about this?” Then let participants talk. Listen.

Iridesco points out: “‘It looks good’ is the worst feedback we can get.” Really probe to dig deeper. What looks good? What looks bad? Hearing what people really think can be hard, but you owe it to yourself to pressure-test your work before putting it out there. This is the only way to get better. Iridesco also says: “You need to have humility and listen. Users aren’t always right, but you need to hear them.”

“Don’t make users designers and try to have them solve all your problems for you. Instead, just try to listen to what they are saying and find out what you can change based on that.”

Where do you find participants?

  • Friends and family
  • Folks in the office who don’t work on the project (HR and secretaries are good candidates)
  • Twitter followers
  • Starbucks
  • Craigslist (make sure to screen these participants!)

Tools to collect feedback:

  • Silverback captures screen activity, eye tracking and participant voice.
  • Morae is pricy, but good.
  • Windows Media Encoder and QuickTime X both capture participant mouse movement on the screen.

You do not need a formal usability testing lab. You can do this simply and for low-cost.

What about testing online? If possible, test in person. But if not, here are some options:

  • Open Hallway, although you will not be watching users live.
  • Usabilla.com asks people what they like on a page.
  • Fivesecondtest gives people five seconds, then asks them questions on what they saw.
  • Whitney does not recommend UserTesting.com. In her experience, they recruit users for you, and you don’t have access to those users, so you never feel their pain. She pointed out that this may change in the future.

The drawbacks of online tools are:

  • You don’t see participants’ expressions and body language.
  • You can’t ask probing, follow-up questions.
  • Online tools make it harder to internalize findings if you’re not there.
  • Whitney felt very strongly that essentially, online tools are cowardly: you need to feel the embarrassment of a sucky design.

Nobody does a great design on the first try. It’s a myth. Doing user testing puts you ahead of the game.

Takeaway points:

  • Test designs early and often.
  • Informal tests are just as valuable.
  • Use people in your environment not involved in the project.
  • Choose participants of different backgrounds and capabilities.
  • Do the user testing yourself and acknowledge poor design choices.

Recommended book? “Handbook of Usability Testing,” Rubin and Chisnell.

(On a personal note, I’m all for doing user testing early and often, whether that is with scraps of paper or more elaborate prototypes, but just do it. Some is better than none!)

Experimentation and iteration

How are we always getting better?

There are lots of steps to iterating on a design. The process could include:

  1. Sketch
  2. Photoshop
  3. Test
  4. Static HTML Prototype
  5. Test
  6. Working prototype
  7. Test
  8. Tweak
  9. Launch-quality release
  10. Get feedback
  11. Tweak until it is right!

One example of experimentation? Roz from Comcast was working on comcast.net. However, her user experience team was not working with the front-end engineers. So to get engineers to start thinking about design, she put out lots of fun drawing tools to get them to play. She would also put up fun drawings in her cube and put out design books that people could check out and learn from. This really got people thinking about how they could experiment with new ways of doing things.

Roz pointed out, “We aren’t always working on the most interesting stuff, but we always want to work smarter.”

What is amazing is that Whitney told this story at An Event Apart, and apparently word got back to Comcast. For a week, they brought their front-end engineers together with the user experience team to see what sort of magic they could create. They called this Engineering Lab Week. The engineers were unavailable for anything except collaborating that week. It was a great experience, and now these two Comcast teams work together much better. It really changed everyone’s attitudes.

Takeaway points on experimentation and iteration:

  • Never stop improving a product
  • Make the work environment creative
  • Encourage the team to solve problems together
  • Soak up inspiration.

Whitney’s book recommendation on experimentation and testing was “Sketching User Experiences,” by Bill Buxton.

Summary

The number one thing is that you should always be listening to your users. Also:

  • Ask good questions to get to the underlying problems.
  • Use data and anecdotes to inform design.
  • Test designs and have the humility to admit you’re wrong.
  • Complete the feedback loop.
  • Never stop trying to make things better.

Just remember to make your users happy, and they will thank you!
