RADIANCE BLOG

Category: Qualitative Research

How do you measure the value of an experience?

When I think about the professional development I did last week, I would summarize it simply: an unexpected, profound experience.

I was given the opportunity to attend RIVA moderator training and I walked away with more than I ever could have dreamed I would get. Do you know that experience where you think back to your original expectations and you realize just how much you truly didn’t understand what you would get out of something? That was me, as I sat on a mostly-empty Southwest plane (156 seats and yet only 15 passengers) flying home. While you can expect a RIVA blog to follow, I was struck by the following thought:

What does it mean to understand the impact your company, product, or service has on your customers?

I feel like I was born and raised to think quantitatively. I approach what I do with as much logic as I can (sometimes this isn’t saying much…). When I think about measuring the impact a company, product, or service has on its customers, my mind immediately jumps to numbers – e.g., who the customers are (demographically) and how satisfied they are with it. But am I really measuring impact? I think yes and no. I’m measuring an impersonal impact, one that turns people into consumers and percentages. The other kind of impact, largely missed in quantitative research, is the impact on the person.

If I were to fill out a satisfaction or brand loyalty survey for RIVA, I would almost be unhappy that I couldn’t convey my thoughts and feelings about the experience. I don’t want them to know just that I was satisfied. I want them to understand how profound this experience was for me. When they talk to potential customers about this RIVA moderator class, I want them to be equipped with my personal story. If they listen and understand what I say to them, I believe they would be better equipped to sell their product.

This is one of the undeniable and extremely powerful strengths of qualitative research. Interviews, focus groups, anything that allows a researcher to sit down and talk to people creates some of the most valuable data there is. We can all think of a time when a friend or family member had such a positive experience with some company, product, or service that they just couldn’t help but gush about it. Qualitative research ensures that valuable feedback like that is captured and preserved. If you want to truly understand who is buying your product or using your service, I cannot stress the importance of qualitative research enough.


Breaking down the wall between quant and qual

Recently we had a project involving a large survey with numerous open-end questions. Taking a divide-and-conquer approach, it was all hands on deck to quickly code the thousands of responses. To a qualitative researcher, coding survey responses can feel like a foreign process, and I often found myself overthinking both my codes and the nuanced content of responses. When I had finished, I typed up a summary of my findings and even pulled out a few ‘rock star’ quotes that illustrated key trends and takeaways. The experience left me wondering—why is content analysis of survey open-ends not more common? It is qualitative data, after all.

Simply put, the purpose of content analysis is to elicit themes or content from a body of written or other recorded media. Like many qualitative approaches, it does not produce numerical measurements; rather, content analysis identifies patterns and trends in the data. Incorporating qualitative analysis techniques such as content analysis into traditionally quantitative studies better contextualizes survey results and produces greater insights.
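To make the tallying step concrete, here is a minimal Python sketch of counting themes across open-end responses. The responses, the keyword-to-theme codebook, and the theme names are all hypothetical; in real content analysis, codes are developed iteratively by reading the data rather than by keyword matching, but the final tally looks much the same.

```python
from collections import Counter

# Hypothetical open-end responses to "What about Brand X do you find appealing?"
responses = [
    "I love the customer service, always friendly.",
    "Great prices and friendly staff.",
    "The quality is excellent and it lasts.",
    "Prices are fair; service is quick.",
    "High quality products.",
]

# Hypothetical codebook mapping keywords to themes.
codebook = {
    "service": "customer service",
    "staff": "customer service",
    "friendly": "customer service",
    "price": "price/value",
    "quality": "product quality",
    "lasts": "product quality",
}

theme_counts = Counter()
for response in responses:
    # Each theme is counted at most once per response.
    themes = {theme for word, theme in codebook.items()
              if word in response.lower()}
    theme_counts.update(themes)

for theme, count in theme_counts.most_common():
    print(f"{theme}: {count}")
```

Even this toy version shows the payoff: the unstructured verbatims collapse into a ranked list of themes that can sit alongside the survey’s numerical results.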

Imagine a classic branding survey where participants are asked sentiment questions such as ‘What is your impression of Brand X?’ Often, these questions are designed as Likert scales with defined categories (e.g., very positive, somewhat positive, neutral, etc.). While this provides general insight into attitudes and impressions of the brand, it does not necessarily highlight the broader insights or implications of the research findings. When Corona does a brand survey, we regularly ask an open-end question for qualitative content analysis as a follow-up, such as ‘What specifically about Brand X do you find appealing?’ or, conversely, ‘What specifically about Brand X do you find unappealing?’ Including a qualitative follow-up provides additional framing to the quantitatively designed Likert scale question and augments insights. Additionally, if the survey shows sizeable negative sentiment towards a brand, incorporating qualitatively designed open-ends can uncover issues or problems that were unknown prior to the research, and perhaps outside of the original research scope.
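As a sketch of how the closed-ended and open-ended pieces fit together, the snippet below pairs each Likert rating with its open-end follow-up and pulls out the verbatims from negative respondents for group reading and coding. All record values are invented for illustration.

```python
# Hypothetical paired survey records: a Likert rating plus an open-end follow-up.
records = [
    {"rating": "very positive", "open_end": "Their customer service is wonderful."},
    {"rating": "somewhat negative", "open_end": "Shipping always takes too long."},
    {"rating": "very negative", "open_end": "Slow shipping and poor packaging."},
    {"rating": "neutral", "open_end": "It's fine, nothing special."},
]

# Pull the open-ends for just the negative respondents so those
# verbatims can be read and coded together.
negative = [r["open_end"] for r in records
            if "negative" in r["rating"]]

for verbatim in negative:
    print(verbatim)
```

The same filter-then-read pattern extends to any scale point, so a researcher can ask what the "somewhat positive" group says differently from the "very positive" group.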

Historically, quantitative and qualitative research have been bifurcated, both in design and in analysis. However, hybrid approaches such as the one described above are quickly gaining ground, and their true value is being realized. Based on our experience here at Corona, for content analysis to be done effectively in a quantitative-dominant survey, it is best to decide on it early in the research design phase.

A few things to keep in mind when designing open-ended questions for content analysis:

  • Clearly define research objectives and goals for the open-end questions that will be qualitatively analyzed.
  • Construct questions with these objectives in mind and incorporate phrasing that invites nuanced responses.
  • Plainly state your expectations for responses and, if possible, institute character minimums or maximums as needed.

In addition to the points mentioned above, it is important to note that there are some avoidable pitfalls. First off, this method is best suited for surveys with a smaller sample size, preferably under 1,000 respondents. Also, the survey itself must not be too time intensive. It is well known that surveys which extend beyond 15 to 20 minutes often lead to participants dropping out or not fully completing the survey. Keep these time limits in mind and be selective about the number of open-ends to be included. Lastly, it is important to keep the participant engaged in the survey. If multiple open-ends are incorporated into the survey, phrase the questions differently or ask them about different topics in an effort to keep participants from feeling as though they are repeating themselves.

In an ideal world, quantitative and qualitative approaches could meld together seamlessly, but we all know this isn’t an ideal world. Time constraints, budgets, and research objectives are just a handful of reasons why a hybrid approach such as the one discussed here may not be the right choice. When it is, though, hybrid approaches give participants an opportunity to think more deeply about the topic at hand and can also create a sense of active engagement between the participant and the end-client. In other words—participants feel like their voice is being heard, and the end-client gains a better understanding of their customer.


Human Experience (HX) Research

About a year ago, I stumbled upon a TEDx Talk by Tricia Wang titled “The Human Insights Missing from Big Data”. She eloquently unfurls a story about her experience working at Nokia around the time smartphones were becoming a formidable emerging market. Over the course of several months, Wang conducted ethnographic research with around 100 young people in China, and her conclusion was simple—everyone wanted a smartphone, and they would do just about anything to acquire one. Despite her exhaustive research, when she relayed her findings to Nokia, they were unimpressed and responded that big data trends did not indicate there would be a large market for smartphones. Hindsight is 20/20.

One line in particular stuck out to me as I watched her talk— “[r]elying on big data alone increases the chances we’ll miss something, while giving us the illusion we know everything”. Big data offers companies and organizations plentiful data points that haphazardly paint a picture of human behavior and consumption patterns. What big data does not account for is the inherent ever-shifting, fickle nature of humans themselves. While big data continues to dominate quantitative research, qualitative research methods are increasingly shifting to account for the human experience. Often referred to as HX, human experience research aims to capture the singularity of humans and forces researchers to stop looking at customers exclusively as consumers. In human experience research, questions are asked to get at a respondent’s identity and emotions; for instance, asking how respondents relate to an advertising campaign instead of just how they react to the campaign.

The rise of HX research in the industry raises the question: what are the larger implications for qualitative research? Perhaps the most obvious answer is that moderators and qualitative researchers need to rethink how research goals are framed and how questions are posed to respondents to capture their unique experiences. There are also implications for the recruiting process. The need for quality respondents is paramount in human experience research and will necessitate a shift in recruiting and screening practices. Additionally, qualitative researchers need to ensure that the best methodology is chosen so that respondents feel comfortable enough to be vulnerable and share valuable insights with researchers.

Human experience research may just now be gaining widespread traction, but the eventual effects will ultimately reshape the industry and provide another tool for qualitative researchers to answer increasingly complex research questions for clients. At Corona, adoption of emerging methodologies and frameworks such as HX means we can increasingly fill knowledge gaps and help our clients better understand the humans behind the research.


Based on my experience…

Born from a conversation I had with a coworker earlier this week, I wanted to talk about research methodology and design and how a client relying solely on what they know – their own experience and expertise – might result in subpar research.

Quantitative and qualitative methods have different strengths and weaknesses, many of which we have blogged about before. The survey is the most frequently used quantitative methodology here at Corona, and it’s an incredibly powerful tool when used appropriately. However, one of the hallmarks of a quantitative, closed-ended survey is that there is little room for respondents to tell us their thoughts – we must anticipate possible answers in question design. When designing these questions, we rely on our client’s and our own experience and expertise.

We know how much clients appreciate the value of a statistically valid survey – being able to see what percentage of customers believe or perceive or do something, compare results across different subpopulations, or precisely identify what “clicks” with customers is very satisfying and can drive the success of a campaign.  But the survey isn’t always the right choice.

Sometimes, relying on experience and expertise is not enough, perhaps because you can’t predict exactly what responses customers might have or, despite the phrase being somewhat cliché, sometimes you don’t know what you don’t know. This is why I advocate for qualitative research.

Qualitative research is not statistically valid. Its results are not really meant to be applied to your entire population of customers, or even a segment of them. However, it is incredibly powerful when you’re trying to learn more about how your customers are thinking about X, Y, or Z. Feeling stuck trying to brainstorm marketing materials, find a way to better meet your customers’ needs, or come up with a solution to a problem you’re having? Your customers probably won’t hand you a solution (though you never know), but what they say will spark creativity, aid in brainstorming, and become a valuable part of developing one.

In the end, both serve a significant role in research as a whole. Personally, I’m a big supporter of integrating both into the process. Qualitative research can bridge the gap between your customers and the “colder” quantitative research. It can help you better understand what your customers are doing or thinking and therefore help you better design a quantitative survey that enables you to capture robust data. Alternatively, qualitative research can follow a quantitative survey, allowing you to explore more of the “why” behind certain survey results. In the end, I simply urge you not to underestimate the value of qualitative research.


Phenomenology: One way to Understand the Lived Experience

How do workers experience returning to work after an on-the-job injury? How does a single mother experience taking her child to the doctor? What is a tourist’s experience on his first visit to Colorado?

These research questions could all be answered by phenomenology, a research approach that describes the lived experience. While not a specific method of research, phenomenology is a series of assumptions that guide research tactics and decisions. Phenomenological research is uncommon in traditional market research, but that may be due to little awareness of it rather than a lack of utility. (However, UX research, which follows many phenomenological assumptions, is quickly gaining popularity.) If you have been conducting research but feel like you are no longer discovering anything new, then a phenomenological approach may shed some fresh insights.

Phenomenology is a qualitative research approach. It derives perspectives defined by experience and context, and the benefit of this research is a deeper and/or broader understanding of these perspectives. To ensure perspectives are revealed, rather than prescribed, phenomenology avoids abstract concepts. The research doesn’t ask participants to justify their opinions or defend their behaviors. Rather, it investigates on the respondents’ own terms in an organic way, assuming that people do not share the same interpretation of words or labels.

In market research, phenomenology is usually explored through unstructured, conversational interviews. Additional data, such as observations of behavior (e.g., following a visitor’s path through a welcome center), can supplement the interviews. Interview questions typically do not ask participants to explain why they do, feel, or think something. These “why” questions can cause research participants to respond in ways they think the researcher wants to hear, which may not be what’s in their head or heart. Instead, phenomenology researchers elicit stories from research participants by asking questions like “Can you give me an example of when you…?” or “What was it like when…?” This way, the researcher seeks and values context equally with the action of the experience.

The utility of this type of research may not be obvious at first. Project managers and decision makers may conclude the research project with a frustrating feeling of “now what?” This is a valid downside of phenomenological research. On the other hand, this approach has the power to make decision makers and organization leaders rethink basic assumptions and fundamental beliefs. It can reveal how reality manifests in very different ways.

A phenomenology study is not appropriate in all instances. But it is a niche option in our research arsenal that might best answer the question you are asking. As always, which research approach you use depends on your research question and how you want to use the results.


Does This Survey Make Sense?

It’s pretty common for Corona to combine qualitative and quantitative research in our projects.  We will often use qualitative work to inform what we need to ask about in quantitative phases of the research, or use qualitative research to better understand the nuances of what we learned in the quantitative phase.  But did you know that we can also use qualitative research to help design quantitative research instruments through something called cognitive testing?

The process of cognitive testing is actually pretty simple, and we treat it a lot like a one-on-one interview.  To start, we recruit a random sample of participants who would fit the target demographic for the survey.  Then, we meet with the participants one-on-one and have them go through the process of taking the survey.  We then walk through the survey with them and ask specific follow-up questions to learn how they are interpreting the questions and find out if there is anything confusing or unclear about the questions.

In a nutshell, the purpose of cognitive testing is to understand how respondents interpret survey questions and to ultimately write better survey questions.  Cognitive testing can be an effective tool for any survey, but it is particularly important for surveys on topics that are complicated or controversial, or when the survey is distributed to a wide and diverse audience.  For example, you may learn through cognitive testing that the terminology you use internally to describe your services is not widely used or understood by the community.  In that case, we will need to simplify the language that we are using in the survey.  Or, you may find that the questions you are asking are too specific for most people to know how to answer, in which case the survey may need to ask higher-level questions or include a “Don’t Know” response option on many questions.  It’s also always good to make sure that the survey questions don’t seem leading or biased in any way, particularly when asking about sensitive or controversial topics.

Not only does cognitive testing allow us to write better survey questions, but it can also help with analysis.  If we have an idea of how people are interpreting our questions, we have a deeper level of understanding of what the survey results mean.  Of course, our goal is to always provide our clients with the most meaningful insights possible, and cognitive testing is just one of the many ways we work to deliver on that promise.


How representative is that qualitative data anyway?

When we do qualitative research, our clients often wonder how representative the qualitative data is of the target population they are working with.  It’s a valid question.  To answer, I have to go back to the purpose of conducting qualitative research in the first place.

The purpose of qualitative research is to understand people’s perceptions, opinions, and beliefs, as well as what is causing them to think in this way.  Unlike quantitative research, the purpose is not to generalize the results to the population of interest.  If eight out of ten participants in a focus group share the same opinion, can we say that 80% of people believe that particular opinion?  No, definitely not, but you can be pretty confident that it will be a prevalent opinion in the population.

While qualitative data is not statistically representative of a population, we still have guidelines that we follow to make sure we are capturing reliable data.  For example, we suggest conducting at least three focus groups per unique segment.  Qualitative research is fluid by nature, so data gathered from across three groups allows us to see consistent themes and patterns across groups, and assess if there are any outliers or themes exclusive to one group that may not be representative of the unique segment as a whole.

Still not sure which methodology will best be able to answer your research questions?  We can help you choose!


Turning Passion into Actionable Data

Nonprofits are among my favorite clients that we work with here at Corona for a variety of reasons, but one of the things that I love most is the passion that surrounds nonprofits.  That passion shines through the most in our work when we do research with internal stakeholders for the nonprofit.  This could include donors, board members, volunteers, staff, and program participants.  These groups of people, who are already invested in the organization, are passionate about helping to improve it, which is good news when conducting research, as it often makes them more likely to participate and increases response rates.

Prior to joining the Corona team, I worked in the volunteer department of a local animal shelter.  As a data nerd even then, I wanted to know more about who our volunteers were, and how they felt about the volunteer program.  I put together an informal survey, and while I still dream about what nuggets could have been uncovered if we had gone through a more formal Corona-style process, the data we uncovered was still valuable in helping us determine what we were doing well and what we needed to improve on.

That’s just one example, but the possibilities are endless.  Maybe you want to understand what motivated your donors to contribute to your cause, how likely they are to continue donating in the future, and what would motivate them to donate more.  Perhaps you want to evaluate the effectiveness of your programs.  Or, maybe you want to know how satisfied employees are with working at your organization and brainstorm ideas on how to decrease stress and create a better workplace.

While you want to be careful about being too internally focused and ignoring the environment in which your nonprofit operates, there is huge value in leveraging passion by looking internally at your stakeholders to help move your organization forward.



Informal Research for Nonprofit Organizations

While Corona serves all three sectors (private, public, and nonprofit) in our work, we have always had a soft spot for our nonprofit clients.  No other type of organization is asked to do more with less, so we love working with nonprofits to help them refine their strategies to be both more effective at fulfilling their missions and more financially stable at the same time.

However, while we are thrilled for the opportunities to work with dozens of nonprofits every year, we know that there are hundreds of other organizations that we don’t work with, many of which simply don’t have the resources to devote to a formal marketing research effort. I’m a huge fan of the Discovery Channel show MythBusters, so I’ll share one of my favorite quotes:

“Remember kids, the only difference between screwing around and science is writing it down.” (Image from Tested courtesy of DCL: http://www.tested.com/art/makers/557288-origin-only-difference-between-screwing-around-and-science-writing-it-down/)

While few would argue that the results found in an episode of MythBusters would qualify as academically rigorous research, I think most would agree that trying a few things out and seeing what happens is at least better than just trusting your gut instinct alone.  Likewise, here are a few ideas for ways that nonprofits can gather at least some basic information to help guide their strategies through informal “market research.”

Informal Interviews

One-on-one interviews are one of the easiest ways to gather feedback from a wide variety of individuals.  Formal interview research involves a third-party randomly recruiting individuals to participate from the entire universe of people you are trying to understand, but simply talking to people one-on-one about the issues or strategies that you are considering can be very insightful.  Here are a few pointers on getting the most out of informal interviews:

  • Dedicate time for the interview. It may seem easy to just chat with someone informally at dinner or at an event, but the multitude of distractions will reduce the value you get out of the conversation.  Find a time that both parties can really focus on the discussion, and you’ll get much better results.
  • Write your questions down in advance. It’s easy to go down a rabbit hole when having a conversation about something you are passionate about, so be sure to think through the questions you need to answer so that you can keep the conversation on track.
  • Record the conversation (or at least take notes). Take the MythBusters’ advice, and document the conversation.  If you’ve talked to a dozen people about your idea, it will be impossible for you to remember it all.  By having documentation of the conversations, you can look back later and have a better understanding of what your interviewees said.

Informal focus groups

Similar to interviews, in an ideal world focus groups should be conducted by a neutral third party with an experienced moderator who can effectively guide the group discussion to meet your goals.  However, as with interviews, you can still get a lot of value out of just sitting down with a group and talking through the issues.  In particular, if you have an event or conference where many people are together already, grabbing a few to talk through your ideas can be very informative.  Our suggestions for this type of “research” are similar to those for informal interviews, with slight differences in their implications:

  • Dedicate time for the discussion. As mentioned before, it may be tempting to just say “We’ll talk about this over dinner” or “Maybe if we have time at the end of the day we can get together.”  You’ll get far better results if everyone can plan for the conversation in advance and participate without distractions.
  • Write your questions down in advance. Even more so than for interviews, having a formal plan about what questions you want to ask is imperative.  Group discussions have a tendency of taking on a life of their own, so having a plan can help you to guide the discussion back on topic.
  • Document the results. Again, you may think you can remember everything that was said during a conversation, but a few months down the road, you will be very thankful that you took the time to either record the conversation or take notes about what was said.

Informal Surveys

Surveys are perhaps the most difficult of these ideas to implement on an informal basis, but they can nevertheless be very useful.  If you just need some guidance on how members of an organization feel about a topic, asking for a show of hands at a conference is a perfectly viable way of at least getting a general idea of how members feel.  Similarly, if you have a list of email addresses for your constituents, you could simply pose your question in an email and ask people to respond with their “vote.”

The trickiest part is making sure that you understand what the results actually represent.  If your conference is only attended by one type of member, don’t assume that their opinions are the same as other member types.  Likewise, if you only have email addresses for 10 percent of your constituents, be careful with assuming that their opinions reflect those of the other 90 percent.  Even so, these informal types of polling can help you to at least get an idea of how groups feel on the whole.

~

Hopefully these ideas can give your nonprofit organization a place to start when trying to understand reactions to your ideas or strategies.  While these informal ways of gathering data will never be as valuable as going through a formal research process, they can provide at least some guidance as you move forward that you wouldn’t have had otherwise.

And if your issues are complex enough that having true, formal research is necessary to ensure that you are making the best possible decisions for your organization, we’ve got a pretty good recommendation on who you can call…


Who’s Excited for 2016?

Oh man, where did 2015 even go? Sometimes the end of the year makes me anxious because I start thinking about all the things that need to be done between now and December 31st. And then I start thinking about things that I need to do in the upcoming year, like figuring out how to be smarter than robots so that they don’t steal my job and learning a programming language since I’ll probably need it to talk to the robots that I work with in the future. Ugh.

Feeling anxious and feeling excited share many of the same physical features (e.g., sweaty palms, racing heart, etc.), and research has shown that it is possible to shift feelings of anxiety to feelings of excitement even by doing something as simple as telling yourself you are excited. So, let me put these clammy hands to use and share some of the things that I am excited about for 2016:

  • Technological advancements for data collection. Changes in phone survey sampling are improving the cell phone component of a survey. Also, we have been looking at so many new, cool ways of collecting data, especially qualitative data. Cell phones, which are super annoying for phone surveys, are simultaneously super exciting for qualitative research. I’m excited to try some of these new techniques in 2016.
  • Improvements in the technology that allows us to more easily connect with both clients and people who work remotely. We use this more and more in our office. I’m not sure if in 2016 we will finally have robots with iPads for heads that allow people to Skype their faces into the office, but I can dream.
  • Work trips! I realize that work trips might be the stuff of nightmares at other jobs. But Coronerds understand the importance of finding humor, delicious food, and sometimes a cocktail during a work trip.
  • New research for clients old and new. This year I’ve learned all sorts of interesting facts about deck contractors, the future of museums, teenage relationships, people’s health behaviors, motorcyclists, business patterns in certain states, how arts can transform a city, and many more! I can’t wait to see what projects we work on next year.
  • Retreat. For people who really love data and planning, there is nothing as soothing as getting together as a firm to pore over a year’s worth of data about our own company and draw insights and plans from it.

Alright, I feel a lot better about 2016. Now I’m off to remind myself that these clammy hands also mean that I’m very excited about holiday travel, last minute shopping, and holiday political discussions with the extended family…