
Radiance Blog

All posts by Kate Darwent

Higher Education

Cousins with a Lot in Common: Culturals and Colleges—Part 2

April 19, 2018 Kate Darwent Comment

In this blog post, we’re exploring how some of the trends in arts & culture also apply to higher education. Previously, we focused on how arts & culture and higher education are both facing a shifting paradigm that is forcing them to rethink what it means to be a cultural or a college or university. Today, we’re going to dive deeper into changes in how people interact with these organizations and what they expect of them.

User-defined experiences

Technology has allowed a high level of personalization across many domains, and as a result, there are higher expectations for being able to personalize products and experiences. One of the big findings from CultureTrack ’14 was that people were interested in self-curating their experiences when going to an arts & cultural event or space. Further, CultureTrack ’17 found that different groups of people were interested in using technology to enhance their arts & cultural experiences.

What might self-curation mean for higher education? Many colleges and universities have already realized that the experience of getting a degree can be as important as, or even more important than, the degree itself. Offering students ways to customize this experience is vital. This doesn’t necessarily mean changing the content of your offerings. It does call for engaging, relevant messaging that resonates with students and their big goals for the future. If they don’t find what they are looking for, they’ll self-curate their education someplace else.

Additionally, convenient formats are increasingly important for the higher education consumer. Instead of declaring online courses and degrees as inferior, colleges and universities should be figuring out how to make the quality of these options as high as the on-campus options because for some students, these may be their only options.

Another major finding from CultureTrack ’14 was that cultural consumers were “promiscuous” when it came to experiencing arts & culture—they wanted to experience a little bit of everything. And there is a similar pattern in education. As STEM transforms into STEAM and then STREAM (science, technology, reading, engineering, art, and math), we can see how the focus on a well-rounded education has become popular again. Given the growth of the knowledge economy, students are interested in experiencing a broader array of education opportunities, even after they have a degree, for both personal growth and career growth.

Wanted: Civic leaders

CultureTrack ’17 also examined people’s philanthropy for arts & culture. Most of the major reasons for giving or not giving to arts & cultural organizations involved social impact. That is, cultural consumers are now more interested in the type of impact that arts & cultural organizations are having on society and their community. They expect arts & cultural organizations to act as civic leaders.

This pattern seems to hold true for institutions beyond cultural ones, such as colleges and universities. Both donors and students are expecting colleges and universities to be interested in having a social impact. Academia rewards faculty for scholarship and knowledge creation, but colleges and universities also need to encourage faculty to figure out ways to apply that knowledge in the community in an impactful way.

One of the other outcomes of recent technological advances is that people spend less time engaging with others in person. However, people crave contact, connection, and a civic commons. Plus, we know that loneliness is bad for our health. Community institutions, like culturals and colleges or universities, are some of the remaining spaces for in-person engagement, and as civic leaders they can provide opportunities for people in the community to connect.

Where to next?

While change is often stressful, it is exciting to see the ways that higher education and arts & culture are evolving to meet the needs and expectations of the future. Some colleges and universities and some culturals have been adapting to meet the needs of their students, patrons, and community. For example, the San Francisco Opera has been using pop-up events to reach new patrons. Importantly, the opera is using these pop-up events to test an idea quickly, learn from the outcome, and move forward. Similarly, Georgetown University is testing different ways of delivering higher education. In both these examples, the organizations are giving themselves permission and space to experiment.

If you are a higher education institution or a cultural that needs to adapt to this shifting paradigm, there are a few things to consider:

  • What parts of your identity as an organization are critical (like your mission and vision) and what parts could be adapted? Where is there room to experiment?
  • How can your organization bring people together? How can you have a greater impact on the community? How are you sharing the story of your impact?
  • How are you addressing issues of access and inclusion?
  • Can people create an experience at your organization that aligns with their needs and interests?

Quantitative Research, Strategy & Tactics, Surveying Surveys

How to measure what people want

December 18, 2017 Kate Darwent Comment

Recently, after an interview for a project, some of us at Corona had a discussion about whether it would be useful to use a survey for the project. Like many potential clients, this one was interested in what new changes the public might want from their organization. At first, this seems like a great area for a survey: just ask people what they want. However, directly asking people what they might want can sometimes backfire, for a number of reasons:

  1. Various psychologists have found that people are not always great at predicting their emotional response to something (e.g., will X make me happy?). Part of the reason is that people don’t always do a good job of imagining what it will actually be like.
  2. People often think that they want more choices, but this is generally not the case.
  3. Depending on the topic, people might feel like there is a “socially correct” option and might choose that one instead of what they really want.
  4. I think that in general, we don’t always know what we want, especially when the possibilities are vast. And sometimes what we want may not come through in the survey questions. Experiencing a change is often very different from reading about one, especially if you’re trying to gauge whether you will like the change or not.

In some situations, it may be more useful to try to measure behavior instead of opinions when trying to determine what people want. While it is sometimes difficult to do this, the data can be very rich and useful. One interesting approach is to temporarily make a change and record what happens. For example, New York City first made Times Square pedestrian only as a test to see what the impact might be. It was initially a hard sell because people were thinking about what the city would lose—one of the main thoroughfares. But there were lots of positives to making it pedestrian only—enough to make the change permanent. When you survey people about potential changes, sometimes it is easier to think about what you lose in the change, as opposed to what you might gain. And that can impact how people respond to the survey.

A pop-up shop is another example of this. A shop can appear temporarily, for a few days or a month, to see whether a more permanent location is a good idea. Even if your online shoppers say in a survey that they would visit a physical location, a pop-up store will let you know whether that actually happens.

So the next time your organization is considering making a change, it might be useful to think about whether a survey is going to be the most useful way to decide what to change or whether measuring behaviors as part of a test might be a better approach.


Market Research

When Data Collection Changes the Experience

September 22, 2017 Kate Darwent 1 Comment

One of the ongoing issues in any research that involves people is whether the data collection process is changing the respondents’ experience. That is, sometimes when you measure an attitude or a behavior, you may inadvertently change the attitude or behavior. For example, asking questions in a certain way may change how people would have naturally thought about a topic. Or if people feel like they are being observed by other people, they may modify their responses and behaviors.

We often think about this issue when asking about sensitive topics or designing research that is done face-to-face. Will people modify their responses if they are in front of a group of people? Or even just one person? For example, asking parents about how they discipline their children may be a sensitive topic, and they might omit some details when they are talking in front of other parents. Even in a survey, respondents may modify their responses to what they think is socially desirable (i.e., say what they think will make them seem like a good person to other people) or may modify them based on who specifically they think will read the responses. They may modify their responses depending on whether they trust whoever is collecting the data.

But beyond social desirability concerns and concerns about being observed, the research experience itself may not match the real-life experience. With surveys, the question order may not match how people naturally experience thinking about a topic. If a survey asks people whether they would be interested in a new service, their answer may change depending on what questions they have already answered. Did the survey ask people to think about competing services before or after this question? Did the survey have people list obstacles to using the new service before or after? Moreover, which of the question orders is most similar to the real-life experience of making a decision about the new service?

As discussed in a previous post, making a research experience as similar as possible to the real-life event can make it more likely that the results of the research will generalize to the real event. Collecting observational data or collecting data from third-party observers can also maintain the natural experience. For example, if you want to determine whether a program is improving classroom behavior in school, you might collect teachers’ reports of their students’ behavior (instead of, or in addition to, asking students to self-report). You could also record students’ behavior in the classroom and then code the behavior in the video. Technology has also made it easier to collect some data without changing the experience at all. For example, organizations can test different ad campaigns by doing an A/B test. Perhaps two groups of donors receive one of two possible emails soliciting donations. If two separate URLs are set up to track the response to each ad, then differences in response can be compared without changing the experience of receiving a request for donations.
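An A/B comparison like the email test above can be checked with a simple two-proportion z-test. Here is a minimal sketch in Python; the counts (72 and 51 responses out of 1,000 emails each) are invented purely for illustration:

```python
from math import sqrt, erf

def two_proportion_ztest(resp_a, sent_a, resp_b, sent_b):
    """Compare the response rates of two email variants with a two-proportion z-test."""
    p_a = resp_a / sent_a
    p_b = resp_b / sent_b
    p_pool = (resp_a + resp_b) / (sent_a + sent_b)  # pooled rate under the null hypothesis
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_a - p_b) / se
    # Two-tailed p-value via the normal CDF, expressed with the error function
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Invented example: 1,000 donors per variant, tracked via the two separate URLs
p_a, p_b, z, p = two_proportion_ztest(72, 1000, 51, 1000)
print(f"A: {p_a:.1%}, B: {p_b:.1%}, z = {z:.2f}, p = {p:.3f}")
```

With counts like these, the test tells you whether the difference in response rates is larger than you would expect from chance alone, without ever interrupting the donor’s natural experience.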

One of my statistics professors used to say that no data are bad data. We just need to think carefully about what questions the data answer, which is impacted by how the data were collected. Recognizing that sometimes collecting data changes a natural experience can be an important step to understanding what questions your data are actually answering.


Quantitative Research, Surveying Surveys

Thinking strategically about benchmarks

May 31, 2017 Kate Darwent Comment

When our clients are thinking about data that they would like to collect to answer a question, we are sometimes asked about external benchmarking data. Basically, when you benchmark your data, you are asking how you compare to other organizations or competitors. While external benchmarks can be useful, there are a few points to consider when deciding whether benchmarking your data will be useful:

  1. Context is key. Comparing yourself to other organizations or competitors can encourage some big-picture thinking about your organization. But it is important to remember the context of the benchmark data. Are the benchmark organizations similar to you? Are they serving similar populations? How do they compare in size and budget? Additionally, external benchmark data may only be available in aggregated form. For example, nonprofit and government organizations may be grouped together. Sometimes these differences are not important, but other times they are an important lens through which you should examine the data.
  2. Benchmark data is inherently past-focused. When you compare your data to that of other organizations, you are comparing yourself to the past. There is a time lag for any data collection, and the data reflect the impacts of changes or policies that have already been implemented. While this can be useful, if your organization is trying to adapt to changes that you see on the horizon, it may not be as useful to compare yourself to the past.
  3. Benchmark data is generally more useful as part of a larger research project. For example, if your organization differs significantly from other external benchmarks, it can be helpful to have data that suggest why that is.
  4. What you can benchmark on may not be the most useful. Often, you are limited in the types of data available about other organizations. These may be certain financial data or visitor data. Sometimes the exact same set of questions is administered to many organizations, and you are limited to those questions for benchmarking.

Like most research, external benchmarking can be useful—it is just a matter of thinking carefully about how and when to best use it.


Market Research

Listening isn’t enough

March 10, 2017 Kate Darwent Comment

Recently, we’ve been having a few conversations at work about engagement processes, in part because we’ve seen a few requests for proposals that have some focus on engagement with a particular audience. Often, this engagement takes the form of listening in some way to the audience of interest. While hearing from a group of people that you are interested in engaging with is critical, I would argue that it’s just one part of an engagement process.

In fact, if you look at various types of research on engagement with different groups (employees, customers, etc.), a few similarities stick out. Based on this research and some of our own experiences at Corona, I identified a few themes of successful engagement:

  1. Listening. Listening is a critical part of engagement. It is important to think carefully about which methods of listening will produce the type of information that is most useful. Are you making an attempt to hear from less engaged people? Are you interested in what kinds of ideas/concerns/problems/etc. people are having, or in how common those are? Have you ensured that people feel comfortable being honest? Do people need additional information before giving input?
  2. Reflection. Often, it is easy to get so wrapped up in translating what you hear from the group into action that you forget to reflect what you heard back to the group. Telling people what you have heard from them is an important part of the engagement process. It ensures that everyone is working with the same base of information and helps people understand why different decisions or changes are being made. Also, demonstrating that you understand what people were telling you can make later criticism less harsh.
  3. Expectations and Accountability. Finally, clarifying expectations and how accountability will be incorporated into the relationship is important. People generally like knowing what is expected of them and why. Initially, this can be as simple as clearly explaining the goals of an engagement process and why the group of interest is so vital to it. Later in the process, this might mean aligning expectations and goals with what you heard when listening to the group. It’s also important to think about how you will evaluate whether those expectations and goals are being met.

While there are definitely unique components to engagement processes with certain audiences (e.g., employees, stakeholders, community, etc.), the three components above stood out as common themes to all types of engagement.


Evaluations, Government, Higher Education, Market Research, Nonprofit

Writing an RFP

November 30, 2016 Kate Darwent Comment

So you’ve finally reached a point where you feel like you need more information to move forward as an organization, and, even more importantly, you’ve been able to secure some amount of funding to do so. Suddenly you find yourself elbow-deep in old requests for proposals (RFPs), both from your organization and others, trying to craft an RFP for your project. Where do you start?

We write a lot of proposals in response to RFPs at Corona, and based on what we’ve seen, here are a few suggestions for what to include in your next RFP:

  • Decision to be made or problem being faced. One of the most important pieces of information, and one that is often difficult to find in an RFP (if it is there at all), is what decision an organization is trying to make or what problem it is trying to overcome. Instead, we often see RFPs asking for a specific methodology while not describing what the organization plans to do with the information. While specifying the methodology can sometimes be important (e.g., you want to replicate an online survey of donors, or you need to perform an evaluation as part of a grant), sometimes specifying it limits what bidders suggest in their proposals.

Part of the reason why you hire a consultant is to have them suggest the best way to gather the information that your organization needs. With that in mind, it might be most useful to describe the decision or problem that your organization is facing in layman’s terms and let bidders propose different ways to address it.

  • Other sources of data/contacts. Do you have data that might be relevant to the proposals? Did your organization conduct similar research in the past that you want to replicate or build upon? Do you have contact information for people who you might want to gather information from for this project? All these might be useful pieces of information to include in an RFP.
  • Important deadlines. If you have key deadlines that will shape this project, be sure to include them in the RFP. Timelines can impact proposals in many ways. For example, if a bidder wants to propose a survey, a timeline can determine whether to do a mail survey, which takes longer, or a phone survey, which is often more expensive but quicker.
  • Include a budget, even a rough one. Questions about the budget are the most common questions I see asked about RFPs. While a budget might scare off a more expensive firm, it is more likely that including a budget in an RFP helps firms propose tasks that are financially feasible.

Requesting proposals can be a useful way to get a sense of what a project might cost, which can help if you are trying to figure out how much funding to secure. If so, it’s often helpful to state in your RFP that you’re considering different options and would like pricing for each recommended task, along with the argument for why each might be useful.

  • Stakeholders. Who has a stake in the results of the project, and who will be involved in decisions about it? Do you have a single internal person that the contractor will report to, or perhaps a small team? Are there others in the organization who will be using the results of the project? Do you have external funders with goals or reporting needs that you hope the project will meet? Clarifying who has a stake in the project and what role they will play, whether providing input on goals or approving questionnaire design, is very helpful. It is useful for the consultant to know who will need to be involved so they can make sure everyone’s needs are addressed.

Writing RFPs can be daunting, but they can also be a good opportunity for thinking about and synthesizing an important decision or problem into words. Hopefully these suggestions can help with that process!


Market Research, Trends and News

What are your organization’s demographics?

August 19, 2016 Kate Darwent Comment

It’s probably no secret that we at Corona like to think about demographics a lot (we posted this cool quiz to test your demographic knowledge a few months ago). A few weeks ago, it took all of my self-control to not turn to whoever was sitting next to me on a plane and start discussing this Atlantic article about China’s changing demographics. Not only is the article interesting for political nerds, it is also a great example of how a dramatic shift in demographics over a relatively short amount of time is going to have large effects on China’s status as a world power.

At Corona, we often see this same pattern on a much smaller scale at the organizations we work with. These demographic changes are apparent in both the work force and populations that many of our clients serve. Understanding demographic trends can help organizations plan for the future. If your organization works with children, you may have already had to plan for some significant changes in demographics, given that big changes have already started for younger generations.

There are two key steps that an organization can take to anticipate and grow with changing demographics. First, a demographic analysis can give your organization a good idea of how your customers/supporters compare to the broader population. For example, if your organization provides support for low-income children in Denver, it is important to know whether you are serving that population well. A demographic analysis could show you what the population of low-income children in Denver looks like (e.g., what is their living situation, who are their parents, etc.). Then, your organization could compare the demographics of the children you are currently serving with the broader population to identify any gaps. For example, you may not be serving as many young parents as you would expect given the population in Denver.

Second, an organization can then do targeted research with a specific demographic to better understand how to reach and serve them. In the previous example, the organization might conduct focus groups with young, low-income parents in Denver to get an idea of the barriers to serving that population. These steps can help an organization both better meet the needs of its current population and plan for population shifts in the future.
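The gap-analysis step above can be sketched in a few lines of code. All of the group names and shares below are invented for illustration (not actual Denver data); the idea is simply to subtract population shares from served shares and surface the largest shortfall:

```python
# Hypothetical shares: the population of low-income children in Denver
# (e.g., from census data) vs. the children an organization currently serves.
population = {"young parents": 0.30, "single parents": 0.45, "two-parent": 0.25}
served     = {"young parents": 0.18, "single parents": 0.50, "two-parent": 0.32}

def demographic_gaps(population, served):
    """Return served-minus-population share for each group, largest shortfall first."""
    gaps = {group: served[group] - population[group] for group in population}
    return sorted(gaps.items(), key=lambda item: item[1])

for group, gap in demographic_gaps(population, served):
    print(f"{group:15s} {gap:+.0%}")
```

A negative gap flags a group that is underserved relative to its share of the population, which is exactly the kind of finding that would motivate the focus groups described above.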


Market Research, Nonprofit

Building Empathy

April 7, 2016 Kate Darwent 1 Comment

In the past year I’ve been involved with a few projects at Corona that involve evaluating programming for teenagers. One commonality across these projects is that the organizations have been interested in building empathy in teenagers. As I’ve been reading through the literature on empathy, I’ve been thinking about how building empathy should be a goal of most nonprofits.

Perhaps not surprisingly, there’s research demonstrating that people are more likely to donate when they feel empathy for the recipient. This research builds upon the classic psychology research demonstrating that empathy increases the likelihood of altruism, especially when there are costs to being altruistic. It’s clear that empathy can play an important role in motivating people to give altruistically, but how can we build empathy especially for others who are not very similar to ourselves?

One useful way to build empathy in marketing materials is to create stories that allow people to connect to those who need help or to those who are helping. The idea that organizations should engage in storytelling to attract and engage stakeholders has recently gained traction. Stories are most powerful when people are able to lose themselves in a character, which is why reading or seeing a story from the first-person perspective can be so powerful.

While you don’t necessarily need research to write an empathy-building story to use in marketing materials, research can provide useful information for creating those stories. Any data or information that you have collected about your donors or your recipients can provide a great foundation for creating a story. And if you develop new, empathy-building marketing materials, you might consider testing the impact of those materials.


Chronicling Corona, Market Research, Qualitative Research, Quantitative Research, Surveying Surveys

Who’s Excited for 2016?

December 18, 2015 Kate Darwent Comment

Oh man, where did 2015 even go? Sometimes the end of the year makes me anxious because I start thinking about all the things that need to be done between now and December 31st. And then I start thinking about things that I need to do in the upcoming year, like figuring out how to be smarter than robots so that they don’t steal my job and learning a programming language since I’ll probably need it to talk to the robots that I work with in the future. Ugh.

Feeling anxious and feeling excited share many of the same physical features (e.g., sweaty palms, racing heart, etc.),  and research has shown that it is possible to shift feelings of anxiety to feelings of excitement even by doing something as simple as telling yourself you are excited. So, let me put these clammy hands to use and share some of the things that I am excited about for 2016:

  • Technological advancements for data collection. Changes in phone survey sampling are improving the cell phone component of a survey. Also, we have been looking at so many new, cool ways of collecting data, especially qualitative data. Cell phones, which are super annoying for phone surveys, are simultaneously super exciting for qualitative research. I’m excited to try some of these new techniques in 2016.
  • Improvements in the technology that allows us to more easily connect with both clients and people who work remotely. We use this more and more in our office. I’m not sure if in 2016 we will finally have robots with iPads for heads that allow people to Skype their faces into the office, but I can dream.
  • Work trips! I realize that work trips might be the stuff of nightmares at other jobs. But Coronerds understand the importance of finding humor, delicious food, and sometimes a cocktail during a work trip.
  • New research for clients old and new. This year I’ve learned all sorts of interesting facts about deck contractors, the future of museums, teenage relationships, people’s health behaviors, motorcyclists, business patterns in certain states, how arts can transform a city, and many more! I can’t wait to see what projects we work on next year.
  • Retreat. For people who really love data and planning, there is nothing as soothing as getting together as a firm to pore over a year’s worth of data about our own company and draw insights and plans from it.

Alright, I feel a lot better about 2016. Now I’m off to remind myself that these clammy hands also mean that I’m very excited about holiday travel, last minute shopping, and holiday political discussions with the extended family…


Analytics, Chronicling Corona, Market Research, Qualitative Research, Quantitative Research, Surveying Surveys

How to Choose your own Adventure when it comes to Research

October 28, 2015 Kate Darwent Comment

One of the things we’ve been doing at Corona this year that I’ve really enjoyed is resurrecting our book club. I enjoy it because it’s one way to think about the things we are doing from a bigger picture point of view, which is a welcome contrast to the project-specific thinking we are normally doing. One topic that’s come up repeatedly during our book club meetings is the pros and cons of different types of research methodology.

Knowing what kind of research you need to answer a question can be difficult if you have little experience with research. Understanding the different strengths and weaknesses of different methodologies can make the process a little easier and help ensure that you’re getting the most out of your research. Below I discuss some of the key differences between qualitative and quantitative research.

Qualitative Research

Qualitative research usually consists of interviews or focus groups, although other methodologies exist. The main benefit of qualitative research is that it is so open. Instead of constraining people in their responses, qualitative research generally allows for free-flowing, more natural responses. Focus group moderators and interviewers can respond in the moment to what participants are saying to draw out even deeper thinking about a topic. Qualitative research is great for brainstorming or finding key themes and language.

Qualitative data tend to be very rich, and you can explore many different themes within the data. One nice feature of qualitative research is that you can ask about topics you have very little information about. For example, a survey might ask, “Which of the following best describes this organization? X, Y, Z, or none of the above.” This quantitative question assumes that X, Y, and Z are the three ways people describe the organization, which requires at least some prior knowledge. A qualitative version would simply ask, “How would you describe this organization?” This is one of the reasons why qualitative research is great for exploratory research.

The primary weakness of qualitative research is that you can’t generate a valid population statistic from it. For example, although you could calculate what percent of focus group participants said that Y was a barrier to working with your organization, you couldn’t generalize that estimate to the larger population. So even if 30% of focus group participants reported a barrier, we don’t know what percent of people overall would report that same barrier; we can only say that it is a potential barrier. If you just want to identify the main barriers, however, qualitative research can do that. It’s important to think carefully about whether or not this would be a weakness for your research project.

Quantitative Research

The main goals of quantitative research are to estimate population quantities (e.g., 61% of your donors are in Colorado) and test for statistical difference between groups (e.g., donors in Colorado gave more money than those in other states). With quantitative research, you’re often sacrificing depth of understanding for precision.

One of the benefits to quantitative research, aside from being able to estimate population values, is that you can do a lot of interesting statistical analyses. Unlike a small sample of 30 people from focus groups, a large sample of 500 survey respondents allows for all sorts of analyses. You can look for statistical differences between groups, identify key clusters of respondents based on their responses, see if you can predict people’s responses from certain variables, etc.
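One way to make the group-difference idea above concrete is Welch’s t statistic, which compares two group means while allowing for unequal group sizes. The sketch below uses invented donation amounts (Colorado donors vs. donors elsewhere, echoing the earlier example); the numbers are simulated, not real client data:

```python
import random
import statistics as st

def welch_t(sample_a, sample_b):
    """Welch's t statistic for a difference in means between two groups."""
    mean_a, mean_b = st.mean(sample_a), st.mean(sample_b)
    var_a, var_b = st.variance(sample_a), st.variance(sample_b)  # sample variances
    se = (var_a / len(sample_a) + var_b / len(sample_b)) ** 0.5
    return (mean_a - mean_b) / se

# Invented data: simulated donation amounts for two donor groups
random.seed(0)
colorado = [random.gauss(120, 40) for _ in range(300)]
elsewhere = [random.gauss(100, 40) for _ in range(200)]

t = welch_t(colorado, elsewhere)
print(f"t = {t:.2f}")  # with large samples, |t| above roughly 1.96 suggests a real difference
```

This is the kind of analysis a 500-respondent survey supports that a 30-person set of focus groups cannot: a precise, testable estimate of whether two groups actually differ.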

There usually is not one single best way to answer a question with data, so thinking through your options and the benefits afforded by those options is important. And as always, we’re here to help you make these decisions if the project is complicated.



