Category: In Action

Tuft & Needle: Incredible Mattresses. Incredible research?

If you have ever received a proposal from Corona Insights regarding customer research, you may have seen this line:

“We believe that surveying customers shouldn’t lower customer satisfaction.”

We take the respondent’s experience into account, from the development of our approach through the implementation of the research (e.g., survey design, participant invites, etc.), even in our choice of incentives. We work with our clients on an overall communications plan and discuss with them whether we need to contact all customers or only a small subset, sparing the rest from another email and request. For some clients, we even program “alerts” to notify them of customers who need immediate follow-up.

As such, I’m always interested to see how other companies handle their interactions when it comes to requesting feedback. Is it a poorly thought-out SurveyMonkey survey? A personalized phone call? Or something in between?

Recently, I was in the market for a new mattress and wanted to try one of the newer entrants shaking up the mattress industry. I went with Tuft & Needle, and while I won’t bore you with the details of the shopping experience or delivery, I found the post-purchase follow-up worth sharing (hopefully you’ll agree).

I received an email that appeared to come directly from one of the co-founders. It was likely a stock email (it’s easy enough to mask the sender address so a message appears to come from the founder), but it wasn’t overdone with marketing content or design, and it made one simple request:

“If you are interested in sharing, I would love to hear about your shopping experience. What are you looking for in a mattress and did you have trouble finding it?”

The request made clear that I could simply hit reply to answer. So I did.

I assumed that was it, or that maybe I’d get another form response, but I actually got a real response. One that was clearly not stock (or at least not 100% stock; it made specific references to my answers). It wasn’t the co-founder who responded, but another employee; still, it was impressive in my opinion.

So, what did they do right? What can we take away from this?

  • Make a simple request
  • Make it easy to reply to
  • Include a personalized acknowledgement of the customer’s responses

Maybe you think this is something only a start-up would (or should) do, but what if more companies took the time to demonstrate such great service, whether in their research or their everyday customer service?

How data-driven insights can reveal strategic advantages

A client recently asked us for guidance in the middle of their communication campaign.  They had already created and deployed a series of vivid ads encouraging a specific behavior. (For confidentiality reasons, I can’t state the behavior, but let’s pretend they wanted dog owners to register their dogs with the local humane society). Their desired outcome was to increase the percentage of registered dogs from the baseline, which hadn’t changed for many years. Their strategy was to use mass media messages to motivate all dog owners to register their dogs.

They hired Corona to evaluate the campaign and provide recommendations for improvement.  We found that a small percentage of the population had strong intentions to not register their dog (intention to do something is a relatively good predictor of what people will do).  Based on other scientific research, we know that it is difficult to change people’s strong intentions, especially through mass media.  Thus, we suggested that the client stop trying to influence all dog owners, at least in this campaign.

A better strategy was to motivate dog owners who had weak intentions or were unsure what they would do.  Our research found that people with weak or unformed intentions had different barriers and reasons to register their dogs. Indeed, those with weak intentions often said they just “never got around to it.” This finding was the keystone of our research because it showed how a strategy shift aimed at influencing this sub-population – rather than all dog owners – would have the biggest impact on increasing overall registration!

Shifting strategy was not easy for this client, but the data and our recommendations compelled them to make the change.  We helped them see their issue from a new perspective, and our guidance made the transition of the communication strategy easier. Quality research and thoughtful analysis can reveal strategic advantages, and every strategic advantage can have a meaningful impact on success.

Corona Summer Camp 2015: AAPOR

Ah, summer camp. For those of us who were generally allergic to the outdoors as kids, summer camp did not necessarily mean cabins, building fires, and outdoor recreation. In my case, summer camp usually meant summer orchestra, which was the best. Not only did I get to play music for hours every day, I also got to spend time with a bunch of other orchestra kids who would not even bat an eye if I wanted to discuss which Suzuki book they were on or the pros and cons of catgut strings.

In many ways, AAPOR feels similar to those years of summer orchestra—conference attendees discuss sampling issues and response rates as if they were discussing the weather. This year over one thousand people met at AAPOR to discuss all the intricacies of public opinion research. Beth and I had a really difficult time choosing which talks to go to because they all sounded so relevant and interesting.

This conference in particular gives us a good sense of the changing landscape of survey research and of the big issues in the field. One big issue was the mode to use for survey research. Phone surveys have been the gold standard for opinion research for a while, but with the increasing rate of cell phone use (especially cell-phone-only households) and decreasing response rates, it has become harder and more expensive to maintain phone survey quality. There was at least one presentation about a survey that has transitioned over to cell phone only. Other talks discussed the quality of online panel surveys or using multiple survey modes (e.g., a paper survey with phone, an online survey with paper, etc.) to increase response rates.

Another big issue was incorporating all the available data into public opinion research. New research has started looking at whether social media and other big datasets can be analyzed as another way of measuring public opinion.  A final big issue had to do with transparency in survey research. Although there is a ton of survey data floating around, it is not always clear who is following correct and ethical survey methodology standards and who is not. Moreover, a lot of survey data reports do not include enough information to even judge what standards are being used. AAPOR’s Transparency Initiative encourages organizations to include certain methodological information with the public release of data so that people can judge its quality.

Beth and Kate with Nate Silver at AAPOR 2015

On top of all the great discussions of survey methodology and of the future of public opinion research, Beth and I also got to meet (briefly) Nate Silver, who received an award from AAPOR this year. (Someone may have texted this photo to her mom.) After the awards reception, AAPOR held a casino night to raise money for student awards. Silver participated in the poker tournament and kindly posed for many photos. He has also posted his own thoughts about some issues raised at the conference.


The Lifesavers Conference

As with any industry, it is important in market research to keep up with the latest thinking and practices by regularly attending workshops and conferences.  For this reason, members of the Corona staff can occasionally be found at conferences put on by the American Association of Public Opinion Research (AAPOR), Market Research Association (MRA), and American Marketing Association (AMA).  However, here at Corona, we have an additional challenge of keeping up not only on the latest and greatest research practices, but also on the issues most important to our clients.  For that reason, we also try to occasionally make an appearance at conferences focused on subject areas such as parks and recreation, traffic safety, and more.

Kevin Presenting @ Lifesavers Conference

In March of this year, Kevin and I made the trip to Chicago to participate in this year’s Lifesavers Conference.  This conference has been conducted annually for decades and brings together individuals from around the country whose jobs are dedicated to keeping Americans safe on our nation’s highways.  The minds present at these conferences have been instrumental in making changes both legislatively and in communications with the public to dramatically reduce traffic fatalities over the years, including laws aimed at requiring child safety seats and punishing drunk drivers, and communications aimed at increasing seat belt usage, reducing impaired driving, and more recently, reducing distracted driving.

The things we learned at this year’s conference were enlightening to say the least, and it would require a whole series of blog posts to cover them all.  However, a few highlights included:

  • Learning about how state traffic safety departments are effectively using social media to reach a new generation for whom traditional television advertising simply isn’t effective.
  • Learning about how small nonprofit organizations can conduct their own evaluations to make the case to funders that their work is making a true impact.
  • Understanding the efforts being made by law enforcement in Colorado and Washington to keep drivers safe in the age of legalized marijuana.

This year’s conference was particularly special for us, as Kevin had the opportunity to present the results of research we conducted with the Minnesota Office of Traffic Safety aimed at better understanding some of the characteristics of high-risk drivers (those who exhibit a combination of risky traffic behaviors, including drinking and driving, speeding, texting while driving, and not wearing a seat belt).  A few of the key findings of that research included:

  • High-risk drivers tend to overestimate how common their behaviors are among their peers (drinking and driving isn’t nearly as common as one might think), overestimate their own driving ability (almost everyone believes they are “above average”), and underestimate the risk of their driving behaviors (those who speed regularly are considerably more likely to be in a crash than those who do not).
  • Those who text and drive know they shouldn’t and worry about being in an accident, but they do it anyway. (Another presentation at the conference suggested that texting and driving should be treated as an addiction rather than a rational decision.)  On the other hand, those who speed regularly are relatively unlikely to think their behavior is a problem and are more worried about getting a ticket than being in a crash.

Overall, the conference was a great chance to catch up on some of the things going on in traffic safety and lend our own expertise as well, so we hope we have the opportunity to attend again in the future!

Colorado’s New Statewide Child Abuse Hotline

We were pleased yesterday to attend the unveiling of Colorado’s new statewide hotline to report suspected child abuse or neglect.  Governor Hickenlooper and other dignitaries spoke on the Capitol steps, and we think it’s a great step forward for Colorado.


Our partners at Heinrich Marketing came up with several great concepts, and we were delighted to conduct concept testing that helped lead to the selection of the campaign that was announced today.  We conducted focus group research in urban and rural Colorado to weigh the strengths and weaknesses of five different concepts.  We think the selected theme is a great way to convey a complex and challenging message.

You can see examples of the campaign theme put to use here.

How Researchers Hire

Corona Insights just recently went through a round of hiring (look for a blog post soon about our newest member) and, while many of our hiring steps may be common, it did occur to me that our process mirrors the research process.

  • Set goals. Research without an end goal in mind will get you nowhere fast.  Hiring without knowing what you’re hiring for will all but guarantee a poor match.
  • Use multiple modes.  Just as approaching research from several methodologies (e.g., quant, qual) yields a more complete picture, so too does a hiring process with multiple steps: reviewing resumes (literature review), screening (exploratory research), testing (quantitative), several rounds of interviews (qualitative), and a mock presentation.
  • Consistency.  Want to compare differences over time or between different segments? Better be consistent in your approach.  Want to compare candidates? Better be consistent in your approach.

I could go on about the similarities (drawing a broad sample of applicants?), but you get the idea.  The principles of research apply to a lot more than just research.

And as with any recurring research, we reevaluate what worked and what can be improved before iterating.  Therefore, our process changes a little each time, but the core of it remains the same – asking the right questions and analyzing data with our end goals in mind.  Just like any good research project.

Stay tuned for a blog post about our new hire.

Data for your sleepless nights

A few months ago, I purchased a fancy pedometer to start collecting more data about myself. For those of you fortunate enough to know my slothful self in real life, I’d like to interrupt your laughter to point out that one of the features I was most interested in was the pedometer’s ability to track my sleep. I’m not sure exactly how it tracks my sleep, nor how precise its measurements are, but it has pushed me to think a lot more about my sleep and about sleep in general. I decided I wanted to know how I compared to other people and to look for patterns in my own sleep data.

First of all, diving into the world of sleep data is like diving into a crazy rabbit hole. (Rabbits, by the way, sleep 8.4 hours per day, but only 0.9 hours of that are spent dreaming. Humans, however, sleep roughly 8 hours, of which 1.9 hours are spent dreaming.[1] Take that, rabbits.) Questions related to sleep are not necessarily where you would expect them to be. They are shockingly absent from the Behavioral Risk Factor Surveillance System (which measures many behaviors related to health) and yet appear on the American Time Use Survey (ATUS).

Even more interesting, you see different patterns in the data depending on the question format. In the ATUS, they have people track their time use via an activity diary. Basically, the diary has you input what activities you were doing when, with whom, and where. Based on these diaries, the Bureau of Labor Statistics estimates that in 2012 Americans were getting more than eight hours of sleep per day on average. These data also show that women were getting more sleep than men, and that a woman my age was averaging almost 9 hours of sleep per night.[2]

Sadly, these numbers seemed a little high to me, and a Gallup survey seemed to agree. In a 2013 survey, when people were asked how many hours of sleep they usually got per night, people reported getting fewer than 7 hours.[3] The difference between the diary findings and the survey findings reminded me of a similar pattern in reports of what people eat. Basically, people are really bad at remembering what they ate during a week, so a daily food diary tends to be the more accurate measurement. So maybe people are also bad at recalling how much sleep they get on average during the week? However, I wonder if filling out the diary for ATUS sometimes feels embarrassing. For example, do people feel too embarrassed to admit to all the T.V. they watch/internet browsing they do, so they end up reporting more sleep?

Another sleep data source I found was this chart of the sleeping habits of geniuses.[4] I imagine that getting enough sleep probably helps all of us reach our genius potential. Based on the chart, the average amount of sleep across this sample of geniuses is about 7.5 hours, which seems reasonable. It is super interesting, though, to see how and when geniuses spread out their sleep across the day.

Back to my own data, I noticed two important things. One, the social context can have a big impact on my sleep. Beth and I went to AAPOR in May, and every night we stayed up too late discussing nerdy things that we had learned/ideas for our own analyses. This resulted in many nights of less than 7 hours of sleep. The week after AAPOR, I went to visit my sister. My sister would readily admit that she finds it almost impossible to be a functioning human being on anything less than 8 hours of sleep. Not surprisingly, I averaged more than 8 hours of sleep during that visit. So, insight number one is that I should only share hotel rooms and/or sleep at the homes of people who value sleep. Unfortunately, I don’t group my friends and family based on their sleeping habits and often I like staying up late to debate nerdy things, like what rabbits even dream about during their roughly one hour of dreaming each day. So, I’m not sure how actionable this insight really is.

Second, I noticed that I slept better when I had walked more during the day. Apparently the last laugh is on me because I’m beginning to suspect that even for those of us who really love sleep, being more physically active might be a critical component of the sleep routine.

Corona wins Gold Peak Award for Market Research

Last night, the Colorado American Marketing Association (CO+AMA) celebrated Colorado’s first-class marketers at their annual Colorado Peak Awards. Corona Insights was honored to take home our 4th Gold Peak Award in the category of Market Research.  This year, we won the award for our member engagement and brand assessment for the American College of Veterinary Internal Medicine (ACVIM).

Market research is fundamentally different from other categories honored at the CO+AMA Peak Awards. Market research prepares brands and marketing campaigns for take-off. By doing proper research, companies are able to develop a sound marketing strategy that effectively reaches their target audience.

In 2013 we were recognized with a Gold Peak for the research we did for Donor Alliance which resulted in a marketing campaign that addresses the trends in the data we helped uncover. In 2010 Corona took home the Silver Peak award for our rebranding and in 2011 Corona won a Gold Peak award for our market research work to inform the University of Denver Sturm College of Law’s strategic plan.

The 26th annual gala was held at Wings Over the Rockies and featured an aerospace theme. Kevin Raines, CEO, and Kassidy Benson, Marketing and Project Assistant, accepted the award on behalf of the firm.


A dose of data for your springtime allergies

Like many people, I have “seasonal allergies.”  March and April bring sneezing fits and foggy brain days for me.  Often I get a sore throat and headaches.  One year I went through three strep throat tests and a course of antibiotics before my doctor decided my swollen throat was caused by allergies.

Knowing you’re allergic to “something” isn’t all that helpful.  Sure, you can keep antihistamines on hand and treat the symptoms as they arise, but you have no way to predict when symptoms will hit or minimize your exposure to the allergen.

A common first step in identifying the cause is to do a skin allergy test.  Typically, this involves getting pricked in the back with approximately 20 solutions containing the most common allergens.  The doctor marks off a grid pattern on your skin, each box gets pricked with one solution, and then you wait and see whether any of the pricked areas swell up or show other signs of allergic reaction.

I’ve had this done, but unfortunately (though not uncommonly) I didn’t react to any of the items tested.  That doesn’t mean you’re not allergic to anything, just that you’re not allergic to any of the things tested.

Research on myself hadn’t provided any usable information, so recently I turned to external data instead.  Where I live, the city provides daily pollen counts for the highest pollen sources from about February through November.  They don’t provide aggregated data, however, so I had to build my own database of their daily postings.  In the part of town where I live, Ash, Juniper, and Mulberry are the most prevalent allergens during the time when my symptoms are greatest.
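Building that database is mostly bookkeeping: each day’s posting gets appended to a running log. A minimal sketch (the allergen names and counts below are placeholders; the city’s actual posting format will differ):

```python
import csv
from datetime import date

# Hypothetical daily posting, keyed by allergen, to illustrate the logging step.
today_counts = {"Ash": 412, "Juniper": 188, "Mulberry": 95}

# Append today's counts to a long-format CSV: one row per (date, allergen, count).
with open("pollen_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    for allergen, count in today_counts.items():
        writer.writerow([date.today().isoformat(), allergen, count])
```

Run once per day, this accumulates the time series that the city itself never aggregates.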

Last year, my worst day was April 1.  Even with my allergy pills, I sneezed the entire day.  Here’s what the pollen count showed for my area of town during that time:


Ash pollen counts peaked on April 1.  Juniper and Cottonwood were also relatively high, but Juniper had been fairly high for weeks without me having corresponding symptoms.

This year, my allergies were not so bad at all.  I was out of town for a week in mid-March and for two separate weeks in early and mid-April, which certainly helped, but I only had a few foggy-brain days in late March and mid-April.  The pollen counts for this year:

allergies 2014

Ash was lower overall compared to the previous year, and once again seemed to line up best with my symptoms.  This is a correlational analysis, so it doesn’t provide a definitive diagnosis, but because different allergens peak at different times, it offers some ability to rule out other things.  And it’s more efficient (and painless!) compared to the skin test.

Armed with this information, I did some additional research on the predominant types of Ash trees where I live (Modesto and Green Ash), and the geographic range for those species.  If I’m planning to travel to Ash-free zones, I can try to schedule those trips for the spring.  And otherwise, I can keep an eye on the pollen counts and try to stay inside with the windows closed when Ash counts are particularly high.

It’s not perfect data, but as with most tough decisions, we have to do the best we can with limited data and our powers of educated inference.  Hopefully less sneezing awaits!