RADIANCE BLOG

Category: In Action

Measuring Reactions to Your Ideas

Market research can be painful sometimes.  You may have poured your heart and soul into an idea and feel it’s really good, only to put it in front of your customers and hear all the things they hate about it.  But it’s better to know in advance than to find out after you’ve spent a ton of money and risked your brand equity for your idea.

It may not be as sexy as measuring customer satisfaction, prioritizing product features, or optimizing your pricing strategy, but sometimes market research is simply necessary to make sure you haven’t overlooked something important when developing a product, service, or marketing campaign.  No matter how hard we try to put ourselves in our customers’ shoes, none of us can be 100% sure that our own background and experiences let us fully understand the perspectives of customers who come in a huge variety of shapes and sizes.

In our own work, we frequently partner with advertising agencies to help inform and evaluate ad campaigns and media before launch.  Considering the enormous amount of money required to reach a wide audience (through television, radio, online ads, etc.), it just makes sense to devote a small part of your budget to running the campaign by a variety of people in your audience to make sure you know how they might react.

In some cases, what you learn might be fairly minor: perhaps you hadn’t noticed that your ad lacks diversity, or that it makes some people feel uncomfortable.  In other cases, your own worldview may have given you a blind spot to the fact that your ad makes light of sensitive issues, such as religion, major tragedies, or even date rape.

Unfortunately, we saw an example of this issue in Denver recently, where a local coffee chain’s attempt at humor infuriated the local neighborhood with a sign that read, “Happily gentrifying the neighborhood since 2014.”  From the perspective of someone less engaged in the neighborhood, you can understand what they were getting at – that good coffee was a sign of progress in the natural development of a thriving city.

However, the statement completely misses the fact that gentrification often forces people from homes they have lived in for years and destroys relationships across an entire neighborhood.  In this particular case, the coffee shop was located directly in the middle of a neighborhood that has been struggling with gentrification for the past decade or more, and tensions were already high.  The sign was like throwing gasoline on a fire, resulting in protests, graffiti, and even the temporary closure of the store.

It’s certainly easy to blame the company, the ad agency, and anyone else who didn’t see that this campaign would be a bad idea.  However, the reality is that all of us have blind spots around sensitive issues, and no matter how well we feel we understand people of different backgrounds, there is always a chance we’ve missed something.

So, please, for the sake of your own sanity and that of your customers, do some research before you launch a marketing campaign.  At a minimum, run your ad by some of the people who might see it, just to see how they react.  And if you want a more robust evaluation of your campaign – one that can help ensure your advertising dollars have the biggest impact possible – we can probably help.


Tuft & Needle: Incredible Mattresses. Incredible research?

If you have ever received a proposal from Corona Insights regarding customer research, you may have seen this line:

“We believe that surveying customers shouldn’t lower customer satisfaction.”

We take the respondent’s experience into account, from the development of our approach through the implementation of the research (e.g., survey design, participant invites, etc.), even in our choice of incentives. We work with our clients on an overall communications plan and discuss with them whether we need to contact all customers or only a small subset, sparing the rest from another email and request. For some clients, we even program “alerts” to notify them of customers that need immediate follow-up.
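For the curious, an alert like that doesn’t have to be elaborate. Here’s a minimal sketch, assuming a simple rule that flags low satisfaction ratings or an explicit request for contact; the field names and threshold are hypothetical, not our actual system:

```python
# A minimal sketch of a survey-response "alert" rule, not our actual system.
# The field names (satisfaction, wants_contact, respondent) and the threshold
# are hypothetical; a real version would hook into the survey platform and
# the client's preferred notification channel (email, dashboard, etc.).

ALERT_THRESHOLD = 2  # flag ratings of 1-2 on a 5-point satisfaction scale


def needs_follow_up(response: dict) -> bool:
    """Return True if a completed response warrants immediate follow-up."""
    low_satisfaction = response.get("satisfaction", 5) <= ALERT_THRESHOLD
    asked_for_contact = response.get("wants_contact", False)
    return low_satisfaction or asked_for_contact


def route_alerts(responses: list) -> list:
    """Collect the responses a client contact should see right away."""
    return [r for r in responses if needs_follow_up(r)]


if __name__ == "__main__":
    sample = [
        {"respondent": "A-101", "satisfaction": 5, "wants_contact": False},
        {"respondent": "A-102", "satisfaction": 1, "wants_contact": False},
        {"respondent": "A-103", "satisfaction": 4, "wants_contact": True},
    ]
    for r in route_alerts(sample):
        print(f"Follow up with respondent {r['respondent']}")
```

The point isn’t the code itself; it’s that deciding in advance which responses deserve a human follow-up is part of respecting the respondent’s experience.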

As such, I’m always interested to see how other companies handle their interactions when it comes to requesting feedback. Is it a poorly thought out SurveyMonkey survey? Personalized phone call? Or something in between?

Recently, I was in the market for a new mattress and wanted to try one of the newer entrants shaking up the mattress industry. I went with Tuft & Needle, and while I won’t bore you with the details of the shopping experience or delivery, I found the post-purchase follow-up worth sharing (hopefully you’ll agree).

I received an email that appeared to come directly from one of the co-founders. It was a fairly stock email, but without overdone marketing content or design (and of course it’s easy enough to mask an email so it appears to come from a founder). It contained one simple request:

“If you are interested in sharing, I would love to hear about your shopping experience. What are you looking for in a mattress and did you have trouble finding it?”

The request made clear that I could simply hit reply to answer. So I did.

I assumed that was it, or that maybe I’d get another form response, but I actually got a real response – one that was clearly not stock (or at least not 100% stock, since it made specific references to my reply). It wasn’t the co-founder who responded but another employee, which was still impressive in my opinion.

So, what did they do right? What can we take away from this?

  • Make a simple request
  • Make it easy to reply to
  • Include a personalized acknowledgement of the customer’s responses

Maybe you think this is something only a start-up would (or should) do, but what if more companies took the time to demonstrate such great service, whether in their research or their everyday customer service?


How data-driven insights can reveal strategic advantages

A client recently asked us for guidance in the middle of their communication campaign.  They had already created and deployed a series of vivid ads encouraging a specific behavior. (For confidentiality reasons, I can’t state the behavior, but let’s pretend they wanted dog owners to register their dogs with the local humane society). Their desired outcome was to increase the percentage of registered dogs from the baseline, which hadn’t changed for many years. Their strategy was to use mass media messages to motivate all dog owners to register their dogs.

They hired Corona to evaluate the campaign and provide recommendations for improvement.  We found that a small percentage of the population had strong intentions not to register their dog (intention to do something is a relatively good predictor of what people will actually do).  Based on other scientific research, we know that it is difficult to change people’s strong intentions, especially through mass media.  Thus, we suggested that the client stop trying to influence all dog owners, at least in this campaign.

A better strategy was to motivate dog owners who had weak intentions or were unsure what they would do.  Our research found that people with weak or unformed intentions faced different barriers to registering their dogs and had different reasons for not yet doing so. Indeed, those with weak intentions often said they just “never got around to it.” This finding was the keystone of our research because it showed how a strategy shift aimed at influencing this sub-population – rather than all dog owners – would have the biggest impact on increasing overall registration!
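To make the logic concrete, here’s a rough sketch of how intention scores can be used to split respondents into segments and compare the barriers each group reports. The scores, cutoffs, and barrier labels are made up for illustration; the actual study’s data and measures are confidential.

```python
# A rough sketch of the segmentation logic using pandas, with entirely
# hypothetical intention scores, barriers, and cutoffs.
import pandas as pd

survey = pd.DataFrame({
    # 1 = definitely will not register ... 5 = definitely will register
    "intention": [1, 2, 4, 5, 3, 2, 5, 1],
    "barrier": ["opposed", "never got around to it", "cost", "none",
                "never got around to it", "unsure how", "none", "opposed"],
})


def segment(score: int) -> str:
    """Bucket respondents by the strength of their stated intention."""
    if score == 1:
        return "strong intention not to register"
    if score <= 3:
        return "weak or unformed intention"
    return "strong intention to register"


survey["segment"] = survey["intention"].apply(segment)

# Compare the barriers cited within each segment; the weak/unformed group
# is the one a campaign can most realistically move.
print(pd.crosstab(survey["segment"], survey["barrier"]))
```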

Shifting strategy was not easy for this client, but the data and our recommendations compelled them to make the change.  We helped them see their issue from a new perspective, and our guidance made the transition of the communication strategy easier. Quality research and thoughtful analysis can reveal strategic advantages, and every strategic advantage can have a meaningful impact on success.



Corona Summer Camp 2015: AAPOR

Ah, summer camp. For those of us who were generally allergic to the outdoors as kids, summer camp did not necessarily mean cabins, building fires, and outdoor recreation. In my case, summer camp usually meant summer orchestra, which was the best. Not only did I get to play music for hours every day, I also got to spend time with a bunch of other orchestra kids who would not even bat an eye if I wanted to discuss which Suzuki book they were on or the pros and cons of catgut strings.

In many ways, AAPOR feels similar to those years of summer orchestra—conference attendees discuss sampling issues and response rates as if they were discussing the weather. This year over one thousand people met at AAPOR to discuss all the intricacies of public opinion research. Beth and I had a really difficult time choosing which talks to go to because they all sounded so relevant and interesting.

This conference in particular gives us a good sense of the changing landscape of survey research and of the big issues in the field. One big issue was which mode to use for survey research. Phone surveys have been the gold standard for opinion research for a while, but with the increasing rate of cell phone use (especially cell-phone-only households) and declining response rates, it has become harder and more expensive to maintain phone survey quality. There was at least one presentation about a survey that has transitioned to cell phones only. Other talks discussed the quality of online panel surveys or combining multiple survey modes (e.g., a paper survey with a phone follow-up, an online survey with a paper option, etc.) to increase response rates.

Another big issue was how to incorporate all the available data into public opinion research. New research has started looking at whether social media and other big datasets can be analyzed as another way of measuring public opinion.  A final big issue had to do with transparency in survey research. Although there is a ton of survey data floating around, it is not always clear who is following sound and ethical survey methodology standards and who is not. Moreover, many survey reports do not include enough information to even judge what standards are being used. AAPOR’s Transparency Initiative encourages organizations to disclose certain survey methodology information when they publicly release data, so that people can judge the quality of the data.

Beth and Kate with Nate Silver at AAPOR 2015

On top of all the great discussions of survey methodology and of the future of public opinion research, Beth and I also got to meet (briefly) Nate Silver, who received an award from AAPOR this year. (Someone may have texted this photo to her mom.) After the awards reception, AAPOR held a casino night to raise money for student awards. Silver participated in the poker tournament and kindly posed for many photos. He has also posted his own thoughts about some of the issues raised at the conference.

PHOTO CREDIT: DAVID QUACH


The Lifesavers Conference

As with any industry, it is important in market research to keep up with the latest thinking and practices by regularly attending workshops and conferences.  For this reason, members of the Corona staff can occasionally be found at conferences put on by the American Association for Public Opinion Research (AAPOR), the Market Research Association (MRA), and the American Marketing Association (AMA).  However, here at Corona we have an additional challenge: keeping up not only with the latest and greatest research practices, but also with the issues most important to our clients.  For that reason, we also try to make occasional appearances at conferences focused on subjects such as parks and recreation, traffic safety, and more.

Kevin Presenting @ Lifesavers Conference

In March of this year, Kevin and I made the trip to Chicago to participate in this year’s Lifesavers Conference.  The conference has been held annually for decades and brings together individuals from around the country whose jobs are dedicated to keeping Americans safe on our nation’s highways.  The minds present at these conferences have been instrumental in making changes, both legislative and in communications with the public, that have dramatically reduced traffic fatalities over the years, including laws requiring child safety seats and punishing drunk drivers, and communications aimed at increasing seat belt usage, reducing impaired driving, and more recently, reducing distracted driving.

The things we learned at this year’s conference were enlightening to say the least, and it would require a whole series of blog posts to cover them all.  However, a few highlights included:

  • Learning about how state traffic safety departments are effectively using social media to reach a new generation for whom traditional television advertising simply isn’t effective.
  • Learning about how small nonprofit organizations can conduct their own evaluations to make the case to funders that their work is making a true impact.
  • Understanding the efforts being made by law enforcement in Colorado and Washington to keep drivers safe in the age of legalized marijuana.

This year’s conference was particularly special for us, as Kevin had the opportunity to present the results of research we conducted with the Minnesota Office of Traffic Safety aimed at better understanding some of the characteristics of high-risk drivers (those who exhibit a combination of risky traffic behaviors, including drinking and driving, speeding, texting while driving, and not wearing a seat belt).  A few of the key findings of that research included:

  • High-risk drivers tend to overestimate how common their behaviors are among their peers (drinking and driving isn’t nearly as common as one might think), overestimate their own driving ability (almost everyone believes they are “above average”), and underestimate the risk of their driving behaviors (those who speed regularly are considerably more likely to be in a crash than those who do not).
  • Those who text and drive know they shouldn’t and worry about being in an accident, but they do it anyway. (Another presentation at the conference suggested that texting and driving should be treated as an addiction rather than a rational decision.)  On the other hand, those who speed regularly are relatively unlikely to think their behavior is a problem and are more worried about getting a ticket than about being in a crash.

Overall, the conference was a great chance to catch up on some of the things going on in traffic safety and lend our own expertise as well, so we hope we have the opportunity to attend again in the future!


Colorado’s New Statewide Child Abuse Hotline

We were pleased yesterday to attend the unveiling of Colorado’s new statewide hotline for reporting suspected child abuse or neglect.  Governor Hickenlooper and other dignitaries spoke on the Capitol steps, and we think it’s a great step forward for Colorado.

1-844-CO-4-KIDS

Our partners at Heinrich Marketing came up with several great concepts, and we were delighted to conduct the concept testing that helped lead to the selection of the campaign announced yesterday.  We conducted focus group research in urban and rural Colorado to weigh the strengths and weaknesses of five different concepts.  We think the selected theme is a great way to convey a complex and challenging message.

You can see examples of the campaign theme put to use here.


How Researchers Hire

Corona Insights recently went through a round of hiring (look for a blog post soon about our newest team member), and while many of our hiring steps may be common, it occurred to me that our process mirrors the research process.

  • Set goals. Research without an end goal in mind will get you nowhere fast.  Hiring without knowing what you’re hiring for all but guarantees a poor match.
  • Use multiple modes.  Just as approaching research through several methodologies (e.g., quantitative, qualitative) yields a more complete picture, so does a hiring process with multiple steps: reviewing resumes (literature review), screening (exploratory research), testing (quantitative), several rounds of interviews (qualitative), and a mock presentation.
  • Consistency.  Want to compare differences over time or between different segments? Better be consistent in your approach.  Want to compare candidates? Better be consistent in your approach.

I could go on about the similarities (drawing a broad sample of applicants?), but you get the idea.  The principles of research apply to a lot more than just research.

And as with any recurring research, we reevaluate what worked and what could be improved before the next iteration.  As a result, our process changes a little each time, but the core remains the same – asking the right questions and analyzing data with our end goals in mind.  Just like any good research project.

Stay tuned for a blog post about our new hire.


Data for your sleepless nights

A few months ago, I purchased a fancy pedometer to start collecting more data about myself. For those of you fortunate enough to know my slothful self in real life, I’d like to interrupt your laughter to point out that one of the features I was most interested in was the pedometer’s ability to track my sleep. I’m not sure exactly how it tracks my sleep, nor how precise its measurements are, but it has pushed me to think a lot more about my sleep and about sleep in general. I decided I wanted to know how I compared to other people and to look for patterns in my own sleep data.

First of all, diving into the world of sleep data is like diving into a crazy rabbit hole. (Rabbits, by the way, sleep 8.4 hours per day, but only 0.9 hours of that are spent dreaming. Humans, however, sleep roughly 8 hours, of which 1.9 hours are spent dreaming.[1] Take that, rabbits.) Questions related to sleep are not necessarily where you would expect them to be. They are shockingly absent from the Behavioral Risk Factor Surveillance System (which measures many behaviors related to health) and yet appear on the American Time Use Survey (ATUS).

Even more interesting, you see different patterns in the data depending on the question format. The ATUS has people track their time use via an activity diary: basically, the diary has you record what activities you were doing, when, with whom, and where. Based on these diaries, the Bureau of Labor Statistics estimates that in 2012 Americans were getting more than eight hours of sleep per day on average. These data also show that women were getting more sleep than men, and that a woman my age was averaging almost 9 hours of sleep per night.[2]

Sadly, these numbers seemed a little high to me, and a Gallup survey seemed to agree. In a 2013 survey, when people were asked how many hours of sleep they usually got per night, they reported getting fewer than 7 hours.[3] The difference between the diary findings and the survey findings reminded me of a similar pattern in reports of what people eat. People are really bad at remembering what they ate during a week, so a daily food diary tends to be the more accurate measurement. So maybe people are also bad at recalling how much sleep they get on average during the week? On the other hand, I wonder whether filling out the ATUS diary sometimes feels embarrassing. For example, do people feel too embarrassed to admit how much TV they watch or internet browsing they do, and so end up reporting more sleep?

Another sleep data source I found was this chart of the sleeping habits of geniuses.[4] I imagine that getting enough sleep probably helps all of us reach our genius potential. Based on the chart, the average amount of sleep across this sample of geniuses is about 7.5 hours, which seems reasonable. It is super interesting, though, to see how and when geniuses spread out their sleep across the day.

Back to my own data, I noticed two important things. One, the social context can have a big impact on my sleep. Beth and I went to AAPOR in May, and every night we stayed up too late discussing nerdy things that we had learned/ideas for our own analyses. This resulted in many nights of less than 7 hours of sleep. The week after AAPOR, I went to visit my sister. My sister would readily admit that she finds it almost impossible to be a functioning human being on anything less than 8 hours of sleep. Not surprisingly, I averaged more than 8 hours of sleep during that visit. So, insight number one is that I should only share hotel rooms and/or sleep at the homes of people who value sleep. Unfortunately, I don’t group my friends and family based on their sleeping habits and often I like staying up late to debate nerdy things, like what rabbits even dream about during their roughly one hour of dreaming each day. So, I’m not sure how actionable this insight really is.

Second, I noticed that I slept better when I had walked more during the day. Apparently the last laugh is on me because I’m beginning to suspect that even for those of us who really love sleep, being more physically active might be a critical component of the sleep routine.
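If you’re curious what that kind of quick check looks like, here’s a minimal sketch using made-up numbers in place of my tracker’s data:

```python
# A quick sketch of the steps-vs-sleep check, using made-up numbers for one
# week; a real version would read the tracker's exported data instead.
from statistics import correlation  # available in Python 3.10+

daily_steps = [3200, 11500, 7800, 12400, 4100, 9800, 6100]
sleep_hours = [6.4, 8.1, 7.2, 8.3, 6.8, 7.9, 7.0]

r = correlation(daily_steps, sleep_hours)
print(f"Pearson correlation between daily steps and that night's sleep: {r:.2f}")
```

A correlation on a handful of days won’t prove anything, of course, but it’s enough to tell you whether the pattern is worth watching over a longer stretch.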


Corona wins Gold Peak Award for Market Research

Last night, the Colorado American Marketing Association (CO+AMA) celebrated Colorado’s first-class marketers at its annual Colorado Peak Awards. Corona Insights was honored to take home our 4th Gold Peak Award in the category of Market Research.  This year, we won the award for our member engagement and brand assessment research for the American College of Veterinary Internal Medicine (ACVIM).

Market research is fundamentally different from the other categories honored at the CO+AMA Peak Awards: it prepares brands and marketing campaigns for take-off. By doing proper research, companies are able to develop a sound marketing strategy that effectively reaches their target audience.

In 2013, we were recognized with a Gold Peak for the research we did for Donor Alliance, which resulted in a marketing campaign that addressed the trends we helped uncover in the data. In 2010, Corona took home a Silver Peak award for our own rebranding, and in 2011 we won a Gold Peak award for the market research that informed the University of Denver Sturm College of Law’s strategic plan.

The 26th annual gala was held at Wings Over the Rockies and featured an aerospace theme. Kevin Raines, CEO, and Kassidy Benson, Marketing and Project Assistant, accepted the award on behalf of the firm.
