If you’ve got power at your house, a flood of energy usage data is headed your way someday soon.  If it hasn’t happened already, your utility company will replace your old mechanical meter (the one someone read once a month) with a new digital “smart” meter that wirelessly transmits information about your household energy consumption on an hourly basis (more often if you are a business customer).

Today you probably get a statement that shows how you did last month and compares your usage to a few other relevant points in time, like the previous month and the same month last year.  You can learn a little bit from this comparison.  The smart meters will be accompanied by a massive increase in information, which your utility will likely give you access to online.  You’ll be able to compare this week to last week (when you were on vacation), weekdays to weekends, today with yesterday (when it was really hot, or really cold), and even this hour with last hour (when you watched a movie on that new plasma screen TV and heated up your hot tub).
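To give a sense of what that extra granularity makes possible, here is a minimal sketch in Python. The readings, timestamps, and layout are made up purely for illustration (real meter data feeds will look different); it simply slices hourly data into the kinds of comparisons described above.

```python
from datetime import datetime

# Hypothetical hourly readings as (timestamp, kWh) pairs -- the values
# are invented for illustration; real utility data formats will differ.
readings = [
    (datetime(2010, 1, 4, 18), 1.8),   # Monday evening
    (datetime(2010, 1, 4, 19), 2.4),   # Monday, movie on the new TV
    (datetime(2010, 1, 9, 12), 0.9),   # Saturday midday
    (datetime(2010, 1, 10, 20), 3.1),  # Sunday, hot tub heating up
]

def total_kwh(rows, keep):
    """Sum the usage for readings that match a filter."""
    return sum(kwh for ts, kwh in rows if keep(ts))

weekday_kwh = total_kwh(readings, lambda ts: ts.weekday() < 5)
weekend_kwh = total_kwh(readings, lambda ts: ts.weekday() >= 5)
peak_hour = max(readings, key=lambda r: r[1])

print(f"Weekdays: {weekday_kwh:.1f} kWh, weekends: {weekend_kwh:.1f} kWh")
print(f"Biggest hour: {peak_hour[0]} at {peak_hour[1]} kWh")
```

Once the data is hourly instead of monthly, slicing it by weekday, week, or single hour is trivial; the hard part is presenting those slices so the average customer actually cares.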

I’ve had the pleasure of helping utilities think through how best to present this information to their customers, so that it is understandable, useful, and, hopefully, motivating enough to get customers to conserve energy.  This is no small undertaking.  I “met” a guy in a LinkedIn group who summarized the problem beautifully:  “the average person thinks about his/her home energy costs about as much as thumb tacks and phytoplankton.” (Thank you, James Black!)  It’s like finding a way to make the tax code informative, interesting, and motivating.  No small feat.  And there can be a pretty big disconnect between the “I love data and complex Excel graphs!” utility employees and the average “Joe/Jill” customer.

In usability testing sessions where I have shown customers detailed information about their usage (wireframe mockups), one recurring question is “How am I really doing?”  Generally, customers like to believe that they are not the problem; somebody else is.  Many think they are doing OK, want to do OK, but don’t have the information they need to know whether they are really doing OK, or doing enough.  They pat themselves on the back for turning off lights, but then leave the computer and every peripheral in the office on overnight while they wander off to watch movies on their new plasma screen TVs.   To figure out whether they are really doing enough, many customers long to know how they compare to the neighbors.  This is one situation where many people don’t want to keep up with the Joneses, and if they find they are exceeding the Joneses in energy consumption, they could be motivated to do better.

So I was thrilled today to learn that OPower and Xcel Energy have teamed up for a three-year pilot program aimed at encouraging St. Paul, MN customers to cut back.  Apparently similar pilots that use social pressure to drive conservation are being conducted in CA, WA, and elsewhere in MN.  Initial reports are positive:  Connexus Energy observed 2-3% drops in energy usage, and Xcel hopes for the same.

(screenshot from the OPower website)

The customers in the Xcel pilot received their first reports in December: colorful paper statements that use the simple smiley face to indicate how a customer compares to 100 neighbors.  (Imagine that — colorful statements from your utility? With smiley faces?  That’s new!)  You get two smiley faces for doing “great,” one for doing “good.”  Feedback in California suggested that frowny faces were not well received, so heavy users get a gentle nudge instead:  “You used more than average,” along with ideas for how to save.
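As a rough illustration of the underlying idea (my own sketch, not OPower’s or Xcel’s actual methodology, and the cutoffs are assumptions), the comparison boils down to ranking a household against its neighbor pool and turning that rank into a label:

```python
def comparison_label(my_kwh, neighbor_kwh):
    """Illustrative sketch only -- not OPower's actual methodology.

    Rank a household's monthly usage against a pool of comparable
    neighbors and return the kind of label the pilot statements use.
    The 20% and 50% cutoffs are assumptions, not published values.
    """
    rank = sum(1 for kwh in neighbor_kwh if kwh < my_kwh) / len(neighbor_kwh)
    if rank <= 0.20:
        return "Great :) :)"              # among the most efficient
    if rank <= 0.50:
        return "Good :)"                  # better than the median
    return "You used more than average"   # gentle nudge, no frowny face

# Example: one household compared against a small, made-up neighbor pool.
print(comparison_label(620, [540, 580, 610, 700, 750, 900, 1100]))
```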

There are, of course, privacy issues. And in an ideal world, the comparisons would be “apples to apples.”  For example, comparing a 1972, 3,000-square-foot house with a home office, a stay-at-home mom and kids, and a pool to a brand-new 1,000-square-foot house doesn’t produce useful information.

I’m interested to see how this pilot plays out, and optimistic that this social pressure will have a positive impact. My research has suggested that this type of information could be very motivating if customers discover that they are at the top of the consumption pile.  I would love to be a fly on the wall when customers open a colorful bill from their utility company with a smiley face on it!

Posted by: bellaviaresearch | January 11, 2010

Hyundai’s New Site: UX Falls Short

As reported in MediaPost’s “Online Media Daily” newsletter, Hyundai just launched a new website.  I was curious to see what sort of customer experience it offers.

One of my neighbors bought a Hyundai last year and loves it.  I am not a Hyundai customer, and I’ve never visited their site or considered one of their vehicles.  So I approached the site wondering whether there was a vehicle that could work for my family.

The CUVs look like the right sort of vehicle for schlepping around young kids and their bikes (even though I didn’t know what a “CUV” is.  Thank you, Google!  It’s a “Crossover”).  Turns out that Hyundai has a few CUVs.  One immediate burning question for me:  Which of these is best for my family?

Which CUV is best for my family? No clue.

Sitting on the CUV page, I simply don’t have enough information to tell which of these is most likely to meet my needs.  How is the Tucson different from the Santa Fe, and the Santa Fe from the Veracruz?  Clearly price, but what are the other major differences?  Does one seat more than the others?  Have a third row?  Is one more souped up than the others?  There isn’t enough information here for me to tell at a glance.  I guess I have to futz around with the sliders to figure that out, or go elsewhere on this site.

What I want here is some clue that points me in the right direction, meaning towards the CUV that is most likely to be a good fit for me.  I’m sure the info is here somewhere, but I’m not the sort of customer who likes to dig around and waste my limited time.

And that’s not unusual. I once had a customer in a usability test get so frustrated that she stood up and stormed toward the door, spouting all sorts of colorful language that the back room found quite useful for understanding her level of frustration with their website!

Which Santa Fe? No clue.

From the CUV lineup page, I take the safe middle ground and pursue the Santa Fe. But once I’m on the Santa Fe page, I have three Santa Fes to choose from.  Again, I can’t easily figure out which one to pursue; there isn’t enough information to let me determine, at a glance, which path is most likely to be fruitful for me.

In conjunction with their agency iCrossing, Hyundai determined that “an important consideration was the number of clicks required to reach information consumers might need to make buying decisions,” and they intended the new website to provide “simple navigation that allows consumers to explore all the creative elements on the site.” In my opinion, the number of clicks to complete a goal is a shallow metric that generally misses the user experience mark. Customers don’t really care how many clicks it takes to fulfill their goals; they care about whether the site offers what they need and feels intuitive. If it takes “too many” clicks, customers don’t like that, but what they respond to best is a site that really understands their needs.

In my opinion, for someone looking to answer the question “which Hyundai is best for me and my family,” this beautiful new site falls short of offering a good user experience.  There may be many other wonderful things about the site, but for me it stumbles on this fundamental issue.  Based on my time on the new Hyundai site, I have no idea which car is best for me; so far, all I can tell is that they have different price tags.  Was there customer research behind this design?  I can’t imagine I’m the only one out there whose needs it doesn’t meet. Have a different opinion or something to add?  I’d love to hear from you.

Posted by: bellaviaresearch | January 7, 2010

Usability Testing Pays for Itself: The $1:$10:$100 Rule of Thumb

I’m passionate about the value of usability testing, and regularly tell my clients that user research pays for itself. A client recently asked me to show the data behind my claim. I’ve certainly seen it anecdotally, but where is the hard proof that user research during product design pays for itself?

In my literature review, I found two interesting references that build a case for gathering user requirements when creating products.  Neither is very recent, but I would expect they could be extrapolated to apply to any sort of customer-facing product, whether it’s a consumer electronics device, a website, or a mobile app:

– “The rule of thumb in many usability-aware organizations is that the cost-benefit ratio for usability is $1:$10-$100. Once a system is in development, correcting a problem costs 10 times as much as fixing the same problem in design. If the system has been released, it costs 100 times as much relative to fixing in design.” (Gilb, 1988)

– “A change may cost 1.5 units of project resource during conceptual design, 6 units during early development, 60 during systems testing, and 100 during post-release maintenance.”  (Pressman, 1992)
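To put those ratios in concrete terms, here is a quick back-of-the-envelope calculation in Python. The $500 baseline is my own illustrative number, not a figure from Gilb or Pressman:

```python
# Back-of-the-envelope reading of the $1:$10:$100 rule of thumb.
# The $500 baseline is illustrative, not from either source cited above.
design_fix = 500                      # cost to fix an issue caught in design
development_fix = design_fix * 10     # the same issue caught in development
post_release_fix = design_fix * 100   # the same issue caught after release

print(f"Design: ${design_fix:,}   In development: ${development_fix:,}   "
      f"Post-release: ${post_release_fix:,}")
```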

Have evidence from your own work that usability testing pays for itself?  Or a more recent study you can share?  I would love to hear it!

As recently reported in the Wall Street Journal, research is a cornerstone of successful product development.  This survey of North American and European businesses, conducted by McKinsey and MIT, reveals what does and does not work in product development.

Businesses with the best track records for product development share three best practices:

1)    Clear articulation of project goals and scope

2)    Strong project culture (so employees know what their priorities are and managers are accountable for project success/failure)

3)    Research conducted throughout the design process, so key stakeholders are continually listening to customers and using their learning to guide informed product development.

According to the authors, “the successful innovators in our study kept in close contact with customers throughout the development process.”  The study revealed that 80% of top-performing companies include research to test and validate customer needs throughout the development process, nearly twice the 43% of bottom performers who do.

In addition, this research confirms the importance of listening to customers early and often. Top-performing innovators were twice as likely to start the product design process with a solid understanding of customer needs. That foundation enables them to identify and fix design issues early, which is much more cost-effective than discovering and trying to resolve those issues later in the process.

The proof is in the pudding: companies that embraced these three best practices for product design enjoyed the following payoffs:

–       17x more likely to deliver products on time

–       5x more likely to deliver products on budget

–       2x more likely to meet ROI targets

In my experience, companies are better served by listening to customers earlier in the design process. Even if you feel you “aren’t ready,” get in front of your target customers early with rough concepts or paper prototypes before you commit staff time and money to heading in the wrong design direction. It can be much more cost-effective than waiting.

Early conceptual research focuses your team, limits scope creep, and enables you to respond to customer feedback when you are still nimble.

  • By listening to customers early, your product team is more likely to stay focused on what really matters to your customers.
  • You’ll also find less institutional resistance to change if your entire team starts out with a solid understanding of who they are designing for and what the customers want/need.
  • It’s much easier to erase lines on paper prototypes than it is to go back to the drawing board with coding.

It is surprising how many companies feel they “can’t afford” to slow the process with research.  Really, they can’t afford to skip it.

Want to talk about cost-effective ways to include research in the product development cycle?  julie@bellaviaresearch.com or 831.454.8217.

Posted by: bellaviaresearch | January 4, 2010

Happy New Year!

Ahhhh, New Year’s.  A time for reflection.  A time for champagne.

A time for good intentions with high hopes for committed follow-through.

Which brings me to one of my New Year’s Resolutions for BellaVia Research.   I’m finally (!) launching this blog.  If I follow through (and I have good intentions to do so!), this will be a place for my musings on user experience, usability, and the role of qualitative research in product development.   Got something in particular you’d like to hear about?  Let me know.  And I hope to see you back again soon.

Happy 2010!
