2012 Consumer Picks Survey

Uncovering Hidden Truths in the Data

A closer look at the results reveals even more interesting trends worth watching.

As every leading restaurant will tell you, the most successful brands deliver their fundamentals consistently, while carefully tweaking and revising the model to stay current and compelling.

WD Partners followed this advice when producing our second annual Consumer Picks survey, a comprehensive rating of more than 150 leading restaurant chains. We built on the success of last year’s effort, following a similar format so that year-to-year performance changes can be studied, while also making enhancements to the survey for even greater clarity. I love digging into the Consumer Picks data, and this year we generated more than ever: over 5,000 consumers told us what earns their dining loyalty across a wide range of categories. While the tabulations at face value are fascinating and instructive, a closer look reveals even more surprising truths. This article highlights some of our subtler findings and offers a unique way for you to make performance comparisons yourself.

Co-sponsored by Nation’s Restaurant News, the 2012 Consumer Picks report is a national reflection of how customers rate a wide variety of restaurant chains. This year’s results reflect opinions on 152 chains in total (13 more than last year), and we added a new attribute, Craveability, to the list of factors respondents were asked to consider and rank. The complete results, which were tabulated and scrutinized for two months, can be found in a recent issue of Nation’s Restaurant News. You may also request a copy of the report directly from WD Partners (see the “For a Closer Look” sidebar for details). The report is a deep dive, ranking chain performance in three categories (Limited Service, Casual/Family, and Fine Dining) and across 11 segments organized by food type (Asian/Noodles, Bakery/Café, Chicken, Frozen Treats, etc.). While the data is broadly similar to last year’s benchmark results, there were some obvious, and not-so-obvious, shifts I found very interesting. One key to uncovering these insights? The “box scores.” Understanding box scores is important, so let’s take a look.

Box Scores 101

A survey box score is, literally, the box a respondent chooses when given survey choices, such as:

▢ Excellent

▢ Good

▢ Average

▢ Poor

▢ Very Poor

In general, a positive review of a brand is drawn from the top two box scores combined. Totaling all the Excellent and Good ratings gives a good idea of how many customers rate you highly in a category. (Or, conversely, how poorly they rate you, based on the bottom two box scores.) But here’s the interesting thing about these surveys — combined top box scores can tell two very different stories.
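If you want to run this kind of tally yourself, here is a minimal sketch in Python. The response counts and the attribute named in the comment are invented for illustration only; they are not figures from the survey.

```python
from collections import Counter

# Hypothetical ratings for one brand on one attribute (say, Food Quality).
# The counts below are made up for illustration, not Consumer Picks data.
ratings = (
    ["Excellent"] * 420
    + ["Good"] * 310
    + ["Average"] * 180
    + ["Poor"] * 60
    + ["Very Poor"] * 30
)

counts = Counter(ratings)
total = sum(counts.values())

top_box = counts["Excellent"] / total                         # the raving loyalists
top_two_box = (counts["Excellent"] + counts["Good"]) / total  # fans plus loyalists
bottom_two_box = (counts["Poor"] + counts["Very Poor"]) / total

print(f"Top box:        {top_box:.1%}")
print(f"Top two box:    {top_two_box:.1%}")
print(f"Bottom two box: {bottom_two_box:.1%}")
```

Two brands can post nearly identical top-two-box totals while their top-box numbers diverge sharply, and that gap is what separates solid performers from brands with true loyalists.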

In this year’s results, this distinction made a big difference in separating brands with fans from brands with raving loyalists; in other words, in distinguishing attributes that indicate solid performance from those that signal true brand strength.

Here’s a good example: In the Limited-Service Hamburger category, out of 17 brands measured, only six points separated the two leaders, In-N-Out Burger and Culver’s, for Food Quality based on the top two box scores. But when we isolated the top box score (Excellent) from the second box (Good), the difference was far more pronounced: far more customers ranked In-N-Out as Excellent. Those are the brand loyalists talking. What do I take away from this? Culver’s, while popular, cannot rest on its laurels, and perception of food quality can never be taken for granted. Ironically, In-N-Out scores very low, as might be expected, on Menu Variety, but that’s part of what makes it an iconic brand.

Another Surprise

Sometimes strong top box performance in just a few attributes can lift an otherwise underperforming brand. Here’s my favorite example.

In the Casual Dining/Varied Menu segment, one brand that scored poorly (19th out of 23) in top-two box scores jumped all the way to 7th place when measured by its consistently high top box scores in two attributes, Service and Atmosphere. The brand? Hooters. Remove those two unique attributes, and the brand’s overall score plummets. In some respects, the company’s unusual formula is working, but there’s lots of room for improvement elsewhere. It’s a good example of top-two box scores telling only part of the story.

Other Highlights

Beyond top box analysis, there are lots of other interesting (and accessible) results in the final report.

Battle of the Subs

Firehouse Subs shook up the Sandwich category, taking first place and earning top scores for five attributes, including Food Quality and Craveability. Jason’s Deli, last year’s No. 3, climbed to the No. 2 spot, bumping last year’s top sandwich brand, McAlister’s Deli, to third place.

Listen to the Music

In the Family Dining category, Cracker Barrel Old Country Store defended its No. 1 spot, fending off competition from Marie Callender’s and Bob Evans. Cracker Barrel added to its retail revenue with a focus on country music artists, releasing a CD/DVD set by Dolly Parton that held the No. 2 spot on Billboard’s Top Music DVD chart for three weeks. The chain also focused on food, refining its menu and pricing strategies and introducing a $5.99 weekday lunch special.

Demographic University

Average scores are just that: a high-level look at all scores combined. A closer look at demographic sets, such as income, generation, and gender, reveals many contrasting favorite brands. For example, at the top of the Casual Dining category, men preferred P.F. Chang’s China Bistro, women chose The Cheesecake Factory, Millennials loved Bonefish Grill, and Baby Boomers chose BJ’s Restaurant and Brewhouse. Those with a household income over $100,000 per year? Their favorite is Mellow Mushroom Pizza Bakers. The full report, along with NRN’s breakdown of the results, includes a detailed dive into many more demographic preferences.

Key Attributes: What Consumers Were Asked to Rate

WD Partners’ 2012 Consumer Picks research asked thousands of consumers to rate brands on 10 key attributes across 15 competitive sub-segments:

  1. Atmosphere
  2. Cleanliness
  3. Craveability (NEW!)
  4. Food Quality
  5. Likely to Recommend
  6. Likely to Return
  7. Menu Variety
  8. Reputation
  9. Service
  10. Value

Segments: Types of Chains Consumers Were Asked to Rate

WD Partners’ 2012 Consumer Picks study compares the performance of more than 150 individual chains in 4 categories and 11 competitive sub-segments:

Limited-service

  • Asian/Noodles: Overall Segment Leader – Pei Wei Asian Diner
  • Bakery/Café: Overall Segment Leader – Panera Bread
  • Beverage/Snack: Overall Segment Leader – Krispy Kreme Doughnuts
  • Buffet/Steak: Overall Segment Leader – Souplantation/Sweet Tomatoes
  • Chicken: Overall Segment Leader – Chick-fil-A
  • Frozen Treats: Overall Segment Leader – Marble Slab Creamery (also the overall highest-rated chain in the category)
  • Hamburger: Overall Segment Leader – In-N-Out Burger
  • Mexican: Overall Segment Leader – Chipotle Mexican Grill
  • Italian/Pizza: Overall Segment Leader – Papa Murphy’s Take ‘N’ Bake Pizza
  • Sandwich: Overall Segment Leader – Firehouse Subs
  • Seafood: Overall Segment Leader – Captain D’s

Casual Dining

  • Italian/Pizza: Overall Segment Leader – Olive Garden
  • Seafood: Overall Segment Leader – Bonefish Grill
  • Steak: Overall Segment Leader – Texas Roadhouse
  • Varied Menu: Overall Segment Leader – The Cheesecake Factory

Fine Dining

  • Overall Category Leader – Ruth’s Chris Steak House

Family Dining

  • Overall Category Leader – Cracker Barrel Old Country Store

How To Study Your Rivals

If you only compare a brand’s scores to those in last year’s report, you run the risk of drawing incorrect conclusions. The best way to analyze subtler changes in performance is to compare a chain’s scores to the same set of rivals.

For example, at first glance in the Casual Dining category, Red Robin Gourmet Burgers’ Overall score shows almost no change from 2011 to 2012 (66.5% to 66.6%). But compared to its competitor set, where the average score dropped from 63.9% to 61.4%, Red Robin’s improvement in position becomes more apparent. A similar pattern can be seen in Red Robin’s Food Quality score. And when comparing the Service attribute, while Red Robin’s score was slightly lower than in 2011, its rivals dropped further.
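For readers who want to run the same comparison on their own numbers, here is a minimal Python sketch using the Red Robin figures quoted above. The relative_shift helper is a hypothetical illustration, not part of the report’s methodology.

```python
def relative_shift(brand_prev, brand_now, rivals_prev, rivals_now):
    """Year-over-year change of a brand relative to its competitor set's change."""
    return (brand_now - brand_prev) - (rivals_now - rivals_prev)

# Red Robin's Overall score vs. the Casual Dining competitor-set average,
# using the percentages quoted above (2011 -> 2012).
gap = relative_shift(66.5, 66.6, 63.9, 61.4)
print(f"Red Robin gained {gap:.1f} points of ground on its competitor set.")
```

A flat score against a falling competitor average is a gain in relative position, which is exactly the kind of shift a simple year-over-year comparison hides.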

Please visit wdpartners.com/research for more information.
