RADIANCE BLOG

Category: Qualitative Research

An Anthropologist’s Guide to Ethnographic Observation

Photo by Jordan Madrid on Unsplash

I want to give you an assignment. Take ten minutes to look around and make observations of the room you are in while reading this. Chances are you are in a room or a spot you are familiar with. Make a list of everything you see, and then I challenge you to observe five things you have never noticed before. It could be a stain on the ceiling, a rug corner curling up, or even a background noise you’ve become accustomed to.

This was an activity I first came across in Keri Smith’s book How to Be an Explorer of the World. We tend not to pay close attention to the world around us: our minds fill in the gaps, and we walk through life making assumptions at every turn. Ethnographic observation asks us to purposefully look past these blinders and observe the world around us with pure curiosity.


What is Qualitative Data?

Photo by Jason Leung on Unsplash

According to the Qualitative Research Consultants Association, qualitative research “uses in-depth studies of small groups of people to guide and support the construction of hypotheses.” While qualitative research methodologies are rooted in social and behavioral sciences like anthropology and psychology, many of their approaches and techniques can be applied to market research.

Qualitative data differs from quantitative data in numerous ways. Conceptually, qualitative data helps to:

  • Uncover the values and beliefs of individual life experiences,
  • Share how reality depends on a person’s point of view, and
  • Understand the nuances of individual experiences.

This differs from quantitative data, which aims to:

  • Measure values and beliefs of a population of people,
  • Make relative and absolute comparisons between groups, and
  • Reveal broad patterns among large groups of individuals.

What is data storytelling?

Photo by Nong Vang on Unsplash

While researchers hope that everyone is as fascinated with their research findings as they are, most people do not have time to read through long reports with dense paragraphs of complex findings, graphs, and charts. Leaders and executives need to know the most important research findings so they can implement data-informed, strategic next steps. Here is where effective data storytelling comes into play. Data storytelling is sometimes treated as a synonym for data visualization, but it goes a step beyond: it transforms research findings into a visually appealing summary that paints a picture of what the findings are, why they are important, and what they can be used for. You might have heard of the dashboard, one common and effective data storytelling format.

A data story typically involves a visual presentation of data that can easily be created with visualization tools like those available with the simple Microsoft Office package or with a more advanced tool like Tableau. Overall, data storytellers are often charged with the task of revealing useful and hard-hitting data in a manner that is cohesive and captivating. Quality data visualizations that effectively tell a story do so by removing the noise, making sense of the data in a coherent way, and highlighting trends.
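The storyteller’s core move, reducing many data points to the one takeaway a reader should remember, can be sketched in a few lines. This is a toy illustration with made-up satisfaction scores, not a Corona tool or a real dataset:

```python
# Hypothetical satisfaction scores by year (illustrative data only).
scores = {2019: 62, 2020: 64, 2021: 71, 2022: 78}

def headline(data):
    """Reduce a series to the single trend a reader should remember."""
    years = sorted(data)
    first, last = data[years[0]], data[years[-1]]
    change = last - first
    direction = "rose" if change > 0 else "fell"
    return (f"Satisfaction {direction} {abs(change)} points "
            f"between {years[0]} and {years[-1]} ({first}% to {last}%).")

print(headline(scores))
# → Satisfaction rose 16 points between 2019 and 2022 (62% to 78%).
```

The headline, not the underlying table, is what leads the dashboard or summary sheet; the full data sits behind it for readers who want detail.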


The Future of Qual

Photo by Ross Findon on Unsplash

For years, qualitative market research has been dominated by focus groups and one-on-one interviews. Each methodology offers benefits. However, as timelines shrink and research objectives expand, it may be time to rethink our “go-to” qualitative methods. This blog will discuss the shortcomings of traditional methods and how refreshed methodological approaches can overcome these pitfalls.

Co-creation Activities

We all know the stereotype—a focus group room with a two-way mirror providing a thin veil between participants and researchers. It has long been argued that this barrier maximizes participants’ comfort and allows for authentic, unbiased discussions. The two-hour focus group is typically designed in a question-and-answer format, with activities peppered throughout to break up the monotony. Historically, these activities ask participants to complete a task individually and then share their thoughts and responses with the group. This approach is especially effective when delving into emotions or personal reactions. However, researchers are increasingly designing focus group activities that are group-based and co-created.

At Corona, we have eagerly implemented co-creation activities for marketing campaign and message testing projects. In 2019, we were working with a local association developing messaging for a new nonprofit giving-based program called Refund What Matters. Participants were asked to get into groups of 2-3 and create a print advertisement for the program. More specifically, they were asked to come up with a visual for the print ad, a tagline, and a hashtag. The client observed as the participants worked together to create their advertisements. The advertisements revealed participants’ inner thoughts and motivations around nonprofit giving, and the discussion while creating them provided insight into how Colorado residents might pitch this giving program to their family members, peers, and coworkers.


How do you measure the value of an experience?

When I think about the professional development I did last week, I would summarize it this way: an unexpected, profound experience.

I was given the opportunity to attend RIVA moderator training and I walked away with more than I ever could have dreamed I would get. Do you know that experience where you think back to your original expectations and you realize just how much you truly didn’t understand what you would get out of something? That was me, as I sat on a mostly-empty Southwest plane (156 seats and yet only 15 passengers) flying home. While you can expect a RIVA blog to follow, I was struck by the following thought:

What does it mean to understand the impact your company, product, or service has on your customers?

I feel like I was born and raised to think quantitatively. I approach what I do with as much logic as I can (sometimes this isn’t saying much…). When I think about measuring the impact a company, product, or service has on its customers, my mind immediately jumps to numbers – e.g., who (demographically) is using it and how satisfied they are with it. But am I really measuring impact? I think yes and no. I’m measuring an impersonal impact, one that turns people into consumers and percentages. The other kind of impact, largely missed in quantitative research, is the impact on the person.

If I were to fill out a satisfaction or brand loyalty survey for RIVA, I would almost be unhappy that I couldn’t convey my thoughts and feelings about the experience. I don’t want them to know just that I was satisfied. I want them to understand how profound this experience was for me. When they talk to potential customers about this RIVA moderator class, I want them to be equipped with my personal story. If they listen and understand what I say to them, I believe they would be better equipped to sell their product.

This is one of the undeniable and extremely powerful strengths of qualitative research. Interviews, focus groups, and anything else that allows a researcher to sit down and talk to people produce some of the most valuable data a study can yield. We can all think of a time when a friend or family member had such a positive experience with a company, product, or service that they just couldn’t help but gush about it. Qualitative research ensures that this valuable feedback is captured and preserved. If you want to truly understand who is buying your product or using your service, I cannot stress the importance of qualitative research enough.


Breaking down the wall between quant and qual

Recently we had a project involving a large survey with numerous open-end questions. Taking the divide-and-conquer approach, it was all hands on deck to quickly code the thousands of responses. As a qualitative researcher, I found coding survey responses a somewhat foreign process, and I often caught myself overthinking both my codes and the nuanced content of the responses. When I had finished, I typed up a summary of my findings and even pulled out a few ‘rock star’ quotes that illustrated key trends and takeaways. The experience left me wondering—why is content analysis of survey open-ends not more common? It is qualitative data, after all.

Simply put, the purpose of content analysis is to elicit themes or content from a body of written or other recorded media. Like many qualitative approaches, it does not produce numerical measurements; rather, it identifies patterns and trends in the data. Incorporating qualitative analysis techniques such as content analysis into traditionally quantitative studies better contextualizes survey results and produces greater insights.
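To make the mechanics concrete, here is a minimal sketch of theme coding in Python. The codebook, its themes, and the keyword lists are all invented for illustration; in practice a codebook is developed iteratively and coding is done, or at least reviewed, by trained researchers rather than keyword matching:

```python
# Hypothetical codebook mapping themes to trigger keywords.
codebook = {
    "price":   ["expensive", "cost", "afford", "cheap"],
    "service": ["staff", "friendly", "rude", "helpful"],
    "quality": ["broke", "durable", "well-made", "quality"],
}

def code_response(text):
    """Return every theme whose keywords appear in an open-end response."""
    lowered = text.lower()
    return [theme for theme, words in codebook.items()
            if any(word in lowered for word in words)]

responses = [
    "The staff were friendly but it was too expensive.",
    "Great quality, very durable.",
]
coded = [code_response(r) for r in responses]
print(coded)  # → [['price', 'service'], ['quality']]
```

Once responses carry theme codes, tallying them alongside the survey’s quantitative results is straightforward, which is exactly the contextualizing role described above.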

Imagine a classic branding survey where participants are asked sentiment questions such as ‘What is your impression of Brand X?’ Often, these questions are designed as Likert scales with defined categories (e.g., very positive, somewhat positive, neutral, etc.). While this provides general insight into attitudes and impressions of the brand, it does not necessarily highlight the broader insights or implications of the research findings. When Corona does a brand survey, we regularly ask an open-end follow-up question for qualitative content analysis, such as ‘What specifically about Brand X do you find appealing?’ or, conversely, ‘What specifically about Brand X do you find unappealing?’ Inclusion of a qualitative follow-up provides additional framing for the quantitatively designed Likert scale question and augments insights. Additionally, if the survey shows sizeable negative sentiment toward a brand, incorporating qualitatively designed open-ends can uncover issues or problems that were unknown prior to the research, and perhaps outside of the original research scope.

Historically, quantitative and qualitative research have been bifurcated, both in design and in analysis. However, hybrid approaches such as the one described above are quickly gaining ground, and their true value is being realized. Based on our experience here at Corona, for content analysis to be done effectively in a quantitative-dominant survey, the decision is best made early in the research design phase.

A few things to keep in mind when designing open-ended questions for content analysis:

  • Clearly define research objectives and goals for the open-end questions that will be qualitatively analyzed.
  • Construct questions with these objectives in mind and incorporate phrasing that invites nuanced responses.
  • Plainly state your expectations for responses and, if possible, institute character minimums or maximums as needed.
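The character-minimum idea in the last point is easy to sketch as a survey platform might apply it. The 20-character threshold below is an arbitrary assumption for illustration, not a recommendation:

```python
# Minimum length for an open-end response to be kept for analysis
# (the threshold is an assumption chosen for this example).
MIN_CHARS = 20

def accept_response(text):
    """Reject blank or too-short open-end responses before coding."""
    return len(text.strip()) >= MIN_CHARS

print(accept_response("n/a"))                                # False: too short
print(accept_response("The staff were helpful and patient."))  # True
```

A check like this filters out throwaway answers (“n/a”, “idk”) before coding begins, though overly strict minimums can frustrate respondents, which ties into the engagement concerns below.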

In addition to the points mentioned above, there are some avoidable pitfalls to note. First, this method is best suited for surveys with a smaller sample size, preferably under 1,000 respondents. The survey itself must also not be too time intensive: surveys that extend beyond 15 to 20 minutes often lead participants to drop out or leave the survey incomplete. Keep these time limits in mind and be selective about the number of open-ends to include. Lastly, it is important to keep participants engaged. If multiple open-ends are incorporated into the survey, phrase the questions differently or ask about different topics so that participants do not feel as though they are repeating themselves.

In an ideal world, quantitative and qualitative approaches could meld together seamlessly, but we all know this isn’t an ideal world. Time constraints, budgets, and research objectives are just a handful of reasons why a hybrid approach such as the one discussed here may not be the right choice. When it is, though, hybrid approaches give participants an opportunity to think more deeply about the topic at hand and can create a sense of active engagement between the participant and the end client. In other words, participants feel their voice is being heard, and the end client gains a better understanding of their customers.


Human Experience (HX) Research

About a year ago, I stumbled upon a TEDx Talk by Tricia Wang titled “The Human Insights Missing from Big Data”. She eloquently unfurls a story about her experience working at Nokia around the time smartphones were becoming a formidable emerging market. Over the course of several months, she conducted ethnographic research with around 100 young people in China, and her conclusion was simple: everyone wanted a smartphone, and they would do just about anything to acquire one. Despite her exhaustive research, when she relayed her findings to Nokia, they were unimpressed; big data trends, they said, did not indicate a large market for smartphones. Hindsight is 20/20.

One line in particular stuck out to me as I watched her talk— “[r]elying on big data alone increases the chances we’ll miss something, while giving us the illusion we know everything”. Big data offers companies and organizations plentiful data points that haphazardly paint a picture of human behavior and consumption patterns. What big data does not account for is the inherently ever-shifting, fickle nature of humans themselves. While big data continues to dominate quantitative research, qualitative research methods are increasingly shifting to account for the human experience. Often referred to as HX, human experience research aims to capture the singularity of humans and forces researchers to stop looking at customers exclusively as consumers. In human experience research, questions are asked to get at a respondent’s identity and emotions; for instance, asking how respondents relate to an advertising campaign instead of just how they react to it.

The rise of HX research in the industry raises the question: what are the larger implications for qualitative research? Perhaps the most obvious answer is that moderators and qualitative researchers need to rethink how research goals are framed and how questions are posed to respondents to capture their unique experience. There are also implications for the recruiting process. The need for quality respondents is paramount in human experience research and will necessitate a shift in recruiting and screening practices. Additionally, qualitative researchers need to ensure that the methodology chosen makes respondents comfortable enough to be vulnerable and share valuable insights with researchers.

Human experience research may just now be gaining widespread traction, but the eventual effects will ultimately reshape the industry and provide another tool for qualitative researchers to answer increasingly complex research questions for clients. At Corona, adoption of emerging methodologies and frameworks such as HX means we can increasingly fill knowledge gaps and help our clients better understand the humans behind the research.


Based on my experience…

Born from a conversation I had with a coworker earlier this week, this post is about research methodology and design, and how a client relying solely on what they know – their own experience and expertise – might end up with subpar research.

Quantitative and qualitative methods have different strengths and weaknesses, many of which we have blogged about before. The survey is the most frequently used quantitative methodology here at Corona, and it’s an incredibly powerful tool when used appropriately. However, one of the hallmarks of a quantitative, closed-ended survey is that there is little room for respondents to tell us their thoughts – we must anticipate possible answers in question design. When designing these questions, we rely on our client’s and our own experience and expertise.

We know how much clients appreciate the value of a statistically valid survey – being able to see what percentage of customers believe or perceive or do something, compare results across subpopulations, or precisely identify what “clicks” with customers is very satisfying and can drive the success of a campaign. But the survey isn’t always the right choice.

Sometimes, relying on experience and expertise is not enough, perhaps because you can’t predict exactly what responses customers might have or, despite the phrase being somewhat cliché, sometimes you don’t know what you don’t know. This is why I advocate for qualitative research.

Qualitative research is not statistically valid. Its results are not really meant to be generalized to your entire population of customers, or even a segment of them. However, it is incredibly powerful when you’re trying to learn more about how your customers are thinking about X, Y, or Z. Feeling stuck trying to brainstorm marketing materials, find a way to better meet your customers’ needs, or come up with a solution to a problem you’re having? Your customers probably won’t hand you a solution (though you never know), but what they say will spark creativity, aid in brainstorming, and become a valuable input to whatever you develop.

In the end, both serve a significant role in research as a whole. Personally, I’m a big supporter of integrating both into the process. Qualitative research can bridge the gap between your customers and the “colder” quantitative research. It can help you better understand what your customers are doing or thinking, and therefore help you design a quantitative survey that captures more robust data. Alternatively, qualitative research can follow a quantitative survey, allowing you to explore more of the “why” behind certain survey results. Whatever the order, I simply urge you not to underestimate the value of qualitative research.


Phenomenology: One way to Understand the Lived Experience

How do workers experience returning to work after an on-the-job injury? How does a single-mother experience taking her child to the doctor? What is a tourist’s experience on his first visit to Colorado?

These research questions could all be answered through phenomenology, a research approach that describes the lived experience. While not a specific method of research, phenomenology is a set of assumptions that guide research tactics and decisions. Phenomenological research is uncommon in traditional market research, but that may be due to little awareness of it rather than a lack of utility. (However, UX research, which follows many phenomenological assumptions, is quickly gaining popularity.) If you have been conducting research but feel like you are no longer discovering anything new, a phenomenological approach may yield some fresh insights.

Phenomenology is a qualitative research approach. It derives perspectives defined by experience and context, and the benefit of this research is a deeper and/or broader understanding of those perspectives. To ensure perspectives are revealed rather than prescribed, phenomenology avoids abstract concepts. The research doesn’t ask participants to justify their opinions or defend their behaviors. Rather, it investigates experience on the respondents’ own terms in an organic way, assuming that people do not share the same interpretation of words or labels.

In market research, phenomenology is usually explored through unstructured, conversational interviews. Additional data, such as observed behavior (e.g., following a visitor’s path through a welcome center), can supplement the interviews. Interview questions typically do not ask participants to explain why they do, feel, or think something; these “why” questions can cause participants to respond in ways they think the researcher wants to hear, which may not reflect what’s in their head or heart. Instead, phenomenological researchers elicit stories by asking questions like “Can you give me an example of when you…?” or “What was it like when…?” This way, the researcher seeks and values context equally with the action of the experience.

The utility of this type of research may not be obvious at first. Project managers and decision makers may conclude the research project with a frustrating feeling of “now what?” This is a valid downside of phenomenological research. On the other hand, this approach has the power to make decision makers and organization leaders rethink basic assumptions and fundamental beliefs. It can reveal how reality manifests in very different ways.

A phenomenology study is not appropriate in all instances. But it is a niche option in our research arsenal that might best answer the question you are asking. As always, which research approach you use depends on your research question and how you want to use the results.


Does This Survey Make Sense?

It’s pretty common for Corona to combine qualitative and quantitative research in our projects. We will often use qualitative work to inform what we need to ask about in quantitative phases of the research, or use qualitative research to better understand the nuances of what we learned in the quantitative phase. But did you know that we can also use qualitative research to help design quantitative research instruments, through something called cognitive testing?

The process of cognitive testing is actually pretty simple, and we treat it a lot like a one-on-one interview. To start, we recruit a sample of participants who fit the target demographic for the survey. Then we meet with each participant one-on-one and have them go through the process of taking the survey. Finally, we walk through the survey with them and ask specific follow-up questions to learn how they interpreted each question and whether anything was confusing or unclear.

In a nutshell, the purpose of cognitive testing is to understand how respondents interpret survey questions and, ultimately, to write better ones. Cognitive testing can be an effective tool for any survey, but it is particularly important for surveys on topics that are complicated or controversial, or when the survey is distributed to a wide and diverse audience. For example, you may learn through cognitive testing that the terminology you use internally to describe your services is not widely used or understood by the community. In that case, we will need to simplify the language used in the survey. Or, you may find that the questions you are asking are too specific for most people to answer, in which case the survey may need to ask higher-level questions or include a “Don’t Know” response option on many questions. It’s also always good to make sure that the survey questions don’t seem leading or biased in any way, particularly when asking about sensitive or controversial topics.

Not only does cognitive testing allow us to write better survey questions, but it can also help with analysis.  If we have an idea of how people are interpreting our questions, we have a deeper level of understanding of what the survey results mean.  Of course, our goal is to always provide our clients with the most meaningful insights possible, and cognitive testing is just one of the many ways we work to deliver on that promise.