We are thrilled to celebrate the creation of the College of Arts, Humanities & Social Sciences at the University of Denver.
The creation of the College is a direct result of the college’s strategic planning process. The exciting Keystone Strategic Plan commits the college to nothing less than the transformation of liberal and creative arts education, in alignment with the University’s transformation under DU IMPACT 2025.
What role will the social sciences, arts, and humanities play in a world that increasingly operates through artificial intelligence, the internet of things, and big data? A very important one. The careers and lives of tomorrow will be defined by distinctly human qualities such as ethical judgment, creativity, adaptability, agility, and storytelling.
“This plan represents the best of our strategy work at Corona. We are thrilled with the resulting plan and look forward to the momentum and positive changes it creates for the students, staff, faculty, and alumni of the college,” said Karla Raines.
Read the full press release here. See the video that DU produced about the plan below.
While that statement might seem far-fetched (after all, it wasn’t that long ago that we were getting used to the idea of a digital native, a person familiar with computers and the Internet from a very young age), it isn’t too early to plan for a future defined by virtual reality. Children born over the next two decades will grow up with it. Not only will these children grapple with developing their own imaginations, they’ll also be discerning what is really real from what is virtually real. Might the 2-year-old of 2030 have a virtual imagination?
It’s time to get ready. Parents, caregivers, educators, and children’s museums will all be shaped by this new reality.
You read it here first. Virtual native – a person familiar with virtual reality from a very young age.
Recently, after an interview for a project, some of us at Corona discussed whether it would be useful to use a survey for the project. As with a lot of projects, this potential client was interested in what changes the public might want in their organization. At first, this seems like a great fit for a survey: just ask people what they want. However, directly asking people what they might want can backfire for a number of reasons:
Various psychologists have found that people are not always great at predicting their emotional response to something (e.g., will X make me happy?). Part of the reason is that people don’t always do a good job of imagining what it will actually be like.
People often think that they want more choices, but research on choice overload suggests that more options can actually make decisions harder and less satisfying.
Depending on the topic, people might feel there is a “socially correct” option and choose it instead of what they really want (a tendency known as social desirability bias).
In general, we don’t always know what we want, especially when the possibilities are vast. And sometimes what we want may not come through in the survey questions. Experiencing a change is very different from reading about a change, especially if you’re trying to gauge whether you will like it or not.
In some situations, it may be more useful to measure behavior instead of opinions when trying to determine what people want. While this is sometimes difficult, the resulting data can be very rich and useful. One interesting approach is to temporarily make a change and record what happens. For example, New York City first made Times Square pedestrian-only as a test to see what the impact might be. It was initially a hard sell because people were thinking about what the city would lose—one of its main thoroughfares. But there were lots of positives to making it pedestrian-only—enough to make the change permanent. When you survey people about potential changes, it is often easier for them to think about what they would lose than what they might gain, and that can shape how they respond.
A pop-up shop is another example of this. A shop can temporarily appear for a few days or a month to see whether a more permanent location is a good idea. Even if your online shoppers say in a survey that they would visit a physical location, a pop-up store will let you know whether that actually happens.
So the next time your organization is considering making a change, it might be useful to think about whether a survey is going to be the most useful way to decide what to change or whether measuring behaviors as part of a test might be a better approach.
Every week from August through January, millions of Americans tune in three nights a week to watch modern-day gladiators battle over a single objective – strategically progressing an oblong ball down the field in 10-yard increments towards the finish line at the 100-yard mark (the end zone, for the growing number of people not tuning in). We can use the word “strategically” here because the offensive team’s endeavor involves an objective (moving the ball into the end zone); a scope or domain (inside the touch lines and based on field position); and an advantage the offense will try to utilize (a dual-threat QB, for instance).
In the case of measuring the strategic progress of an NFL team advancing (or not) towards the end zone, the engagement comes in the drama of the struggle between the two teams and the theatrics that go into measuring each team’s progress. With all of its resources as an organization, why else would the NFL continue to use measurement techniques—like having a referee “eyeball” the spot of the ball and then trotting out crews of men with 10-yard-long chains to verify said spot—that are technically neither precise nor accurate if not to engage viewers in the theater of strategy execution, of collectively progressing towards a clearly defined objective?
Though this might be an off year for the NFL, now is an ideal time for organizations to consider taking a page out of the NFL’s playbook and make an effort to engage customers and employees in the drama of measuring strategic progress.
When people think of doing market research on a new idea, many assume it works like this: show people the idea, ask whether they would buy it, and take their answers at face value.
The problem with this mentality is that humans are notoriously awful at forecasting their own behavior. It’s easy to say “Sure, I would buy that!” when clicking a button while taking a survey or when sitting in a focus group. When it comes down to the actual experience of standing in a store aisle and comparing one product to a dozen others, though, the decision gets considerably more difficult. There have been plenty of market research failures over the years, and marketers’ failure to put themselves in the shoes of their customers is often a key reason why.
So how do you get around this issue? Here are a few possible solutions:
Replicate the purchasing decision as closely as possible. Rather than putting your product (or service) in front of people and asking for feedback in a vacuum, ask participants to compare your offering with those of your competitors. Or better yet, don’t even tell them which is yours at first and see which they pick out and why.
Approach the problem from a variety of perspectives. Interest in a product or service has many dimensions. While the overall reaction you get may be positive, you may be able to identify areas for improvement if you break interest down into key components, such as look and feel, usability, price, and so on.
Get abstract. We at Corona will sometimes make use of questions like “If [this product or service] were an animal, what would it be and why?” While questions like this seem a little silly, it can be extremely informative to know if your offering is more of a cheetah or a walrus. And the explanations of why participants chose their animal can be even more informative. (Fair warning, though: some participants hate this question and will just refuse to answer. That’s OK.)
Consider advanced techniques. There are statistical techniques that have been developed over the years to help evaluate the relative weight that survey respondents place on various attributes of a product or service, such as conjoint or MaxDiff analysis. The details of how these work are outside the scope of our humble little blog, but each of these asks participants to make a decision that requires comparing sets of choices to one another rather than just saying “Yes, I’m interested in A.”
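To give a flavor of the comparison-based logic behind MaxDiff (this is a toy sketch with made-up data, not a description of any firm’s actual methodology), the simplest analysis just counts how often each attribute is picked as “best” and subtracts how often it is picked as “worst”:

```python
# Toy best-worst (MaxDiff) count analysis. Each task shows a respondent a
# subset of attributes and records which one they picked as best and worst.
from collections import Counter

# Hypothetical responses: (best pick, worst pick) from a series of tasks.
tasks = [
    ("price", "packaging"),
    ("quality", "packaging"),
    ("price", "brand"),
    ("quality", "price"),
]

best = Counter(b for b, _ in tasks)
worst = Counter(w for _, w in tasks)
attributes = set(best) | set(worst)

# Score = times chosen best minus times chosen worst; higher = more valued.
scores = {a: best[a] - worst[a] for a in attributes}
for attr, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(attr, score)
```

Real MaxDiff studies typically use more sophisticated scoring (such as hierarchical Bayes estimation), but the counting version illustrates why forced trade-offs reveal more than a row of “Yes, I’m interested” checkboxes.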
Market research, as valuable as it is, will never be a silver bullet that will absolutely guarantee the success of a product or service launch. However, by considering the experience of making a decision when designing your approach, you’ll have a very strong chance of making the best decisions possible to make your launch a success.
Is anyone else tired of talking about millennials? Millennials have seemingly been on everyone’s mind, with many worrying over their spending habits, charitable giving, large debt, voting behaviors, and other things. Why do we care so much about this generation? Don’t they already have a problem with entitlement and being all about “me me me”? We probably shouldn’t feed into that, right?
Pictured: Gregory (myself), the millennial. Fun fact: depending on where you draw the line, 70% of Corona staff are classified as millennials.
Given the above, it’s unsurprising that millennials attend about 1.75 cultural activities per month, more than any other generation (Culture Track ’14).
… and the facts don’t end there. If you haven’t already, I highly encourage you to pore over some of the linked materials to familiarize yourself with this impactful generation. If they haven’t yet, millennials will be disrupting your organization sometime in the near future, and it’s inescapable that we all need to adapt.
Research that just sits on the shelf (or these days, in a digital folder) is research that probably should not have been conducted. If it is not going to be used, then why do it?
Effective research takes many things, from the beginning through the end. We’ve blogged before about the need to start with the end in mind, but what happens when you get to the end? Then what?
Sharing results internally, with the right audiences, and in an effective medium, is key. Here are several ideas of how to do that beyond the common report or PowerPoint deck.
Make it interactive. Can the data (in part or whole) be made to allow for manipulation by users? This could be a fully interactive dashboard where the user gets to select variables to look at, or it could simply be a predefined analysis that users can pull up, filter, and review. For example, Corona often delivers open-ended verbatim responses with a series of filters built in so users can quickly drill down, rather than just reading hundreds or thousands of verbatim comments.
Video summaries. Can you tell the story through video for greater engagement? We have found that video works best in short clips to convey the primary findings and are often best accompanied by more detailed reporting (if users need more). Longer videos can be harder to digest and cause people to disengage. Corona has created short videos to communicate general findings to larger groups of employees who may need to know the general gist of the research, but do not need to know as much detail as core decision makers.
Initial readouts and workshops. Can you involve the users in designing reporting, such as holding a workshop to help build their dashboard so it includes the metrics they want? This not only helps create a more effective dashboard for them, but also creates buy-in since they were involved in its creation. Similarly, sharing preliminary findings can help focus additional analysis and ensure their questions are being addressed in the final report.
Also, consider the following to make any of the above more effective:
Who needs what? Who in the organization needs what information? Share what is most important with each audience so critical points don’t get lost in the larger report.
How much? Consider the level of detail any one person or team needs. Executives may want top-level metrics with key points and recommendations; analysts may want every tabulation and verbatim response.
Who has questions? When people read a report or finding, they often think that’s it. Encourage questions and allow for follow-up to make sure everyone has what they need to move forward.
What challenges have you had making use of research? What have you done to try to overcome them? We’d love to hear below.
There is a multitude of tools available these days that allow organizations to easily ask questions of their customers. It is certainly not uncommon, when Corona begins an engagement, for the client to have made internal attempts at conducting surveys in the past. In some cases, these studies have been relatively sophisticated and have yielded great results. In others, however, the survey’s results were met with a resounding “Why does this matter?”
The challenge is that conducting a good survey requires a much more strategic view than most realize. This starts with designing the survey questions themselves. We always begin our engagements by asking our clients to think through the decisions that will be made, the opportunities to improve, and the possible challenges to be addressed based on the results. By keeping the answers to these questions in mind as you design your survey questions, you can minimize the amount of “trivia” questions in your survey that might be interesting to know, but won’t really have any influence on your future decisions.
Even after having questions designed, you have to consider how you will get people to participate in the survey. If you have a database of 100,000 customers, it may be tempting to just send invitations to all of them. But what if you plan to send out a plea for donations in the next few weeks? Consider the impact of asking for 15 minutes of time from people who might be asked to support you very soon. Being careful to appropriately time the survey and perhaps only send it out to a small segment of customers might help to minimize fatigue that could negatively impact your overall business strategy in the near future.
Finally, once you’ve collected the results, simple tabulations will only tell a small part of the story. Every result should be examined through the lens of its actual strategic impact. A good question to ask throughout the analysis is, “So what?” If you keep the focus on the implications of the results rather than the results themselves, your final report will have a much better chance of making a meaningful impact on your organization moving forward.
Obviously, we at Corona are here to help walk you through this process in order to ensure the highest-quality result possible, but even if you choose to go it alone, keeping a strategic view of what you need to learn and how it will influence your decisions will help to avoid a lot of wasted effort.
A client recently asked us for guidance in the middle of their communication campaign. They had already created and deployed a series of vivid ads encouraging a specific behavior. (For confidentiality reasons, I can’t state the behavior, but let’s pretend they wanted dog owners to register their dogs with the local humane society). Their desired outcome was to increase the percentage of registered dogs from the baseline, which hadn’t changed for many years. Their strategy was to use mass media messages to motivate all dog owners to register their dogs.
They hired Corona to evaluate the campaign and provide recommendations for improvement. We found that a small percentage of the population had strong intentions not to register their dog (intention to do something is a relatively good predictor of what people will do). Based on other scientific research, we know that it is difficult to change people’s strong intentions, especially through mass media. Thus, we suggested that the client stop trying to influence all dog owners, at least in this campaign.
A better strategy was to motivate dog owners who had weak intentions or were unsure what they would do. Our research found that people with weak or unformed intentions had different barriers and reasons to register their dogs. Indeed, those with weak intentions often said they just “never got around to it.” This finding was the keystone of our research because it showed how a strategy shift aimed at influencing this sub-population – rather than all dog owners – would have the biggest impact on increasing overall registration!
Shifting strategy was not easy for this client, but the data and our recommendations compelled them to make the change. We helped them see their issue from a new perspective, and our guidance made the transition of the communication strategy easier. Quality research and thoughtful analysis can reveal strategic advantages, and every strategic advantage can have a meaningful impact on success.
We all know that anxiety and stress can bring out the worst in people. They go to extremes in behavior as they attempt to navigate unknowns and escalating risks. An inability to cope leads to three behaviors:
The flame thrower – this person may initially show up as the conversation dominator who simply must have the last word. What if this person is in fact a bully, or maybe worse? The flame thrower is the person who torches people and ideas they perceive as threatening to their preferred role and view of the world. They make incendiary comments, attempting to belittle, berate, or anger those in roles of positional power. Rather than recognizing the strengths and talents in others, they prefer puffery and falling on the sword.
The feral cat – in the blink of an eye this person will lash out with claws and fangs. You never even saw the paw sweep through the air towards you. They appear like an average cat, perhaps with a bit of a different manner, but wowza, who would have thought they could pounce like that? After they harm they tend to slink away.
The poser – hey, let’s go along to get along. Why take a risk – and share an opinion or recommendation – that may rankle someone, especially if there are flame throwers and feral cats in the room? This person is a survivor. They appear to be a team player, but you can’t assume true buy-in or cooperation.
What do all three of these people have in common? Self-preservation. Regrettably these behaviors and habits run counter to creativity, collaboration, and change. It’s difficult to have a meaningful design session when group members are committed to a scorched earth policy.