Thanks to Pure’s blog for posting this article. It’s an excellent summary of who Gen Y is, how they connect, work, use technology, and the marketing implications of all of the above. Definitely worth a read.
As we mentioned in a previous post, digital natives have been a topic of study for us before at Corona, and the demand for research into this segment is only going to increase.
A comment on yesterday’s post brought up how telemarketers have impacted the credibility of market research. Being able to conduct valid research is our lifeblood; if we could no longer get people to participate in our research, we would be unable to provide accurate results to our clients – at least not in a cost-efficient manner. That is why Sugging and Frugging are so frustrating.
In case you are unfamiliar with the acronyms, Sugging and Frugging stand for Selling Under the Guise of research and Fundraising Under the Guise of research, respectively. They occur when a company or organization uses research (e.g., a survey) as cover for a sales pitch or fundraising appeal. Needless to say, the practice is manipulative, disrespectful of the intended audience, and, if not strictly fraudulent, highly unethical.
So what can we do about it? Simply discussing it and informing marketers of its destructiveness is one way; ironically, the marketers performing this deception are the very ones who would benefit most from solid research. To learn more, be on the lookout for the soon-to-be-published Encyclopedia of Survey Research — Corona CEO Kevin Raines and Senior Analyst Geoff Urland wrote the entries on Sugging and Frugging! For those who don’t want to wait for the book to come out, the Marketing Research Association (MRA) has a resource for helping fight these practices. Maybe it’s not ironclad, but it’s a good start.
Ford’s recent commercial depicts real people test driving Fords under the premise of “market research.”
Does this have an impact on our industry? Does it discredit true research? Will people be suspicious the next time they’re invited to participate in research (especially for an automaker)?
I personally think the effect will be short lived as the ad campaign runs its course, but anyone doing similar research in the short term should be aware of the possible perceptions respondents may bring with them into the research as a result of this ad campaign.
While this video has been making the rounds for a while, I recently ran across it again. It’s a clean presentation, it gets to the point, and it’s more motivating than daunting. That’s one of the reasons I like research – here are the questions, so now what are the answers?
Occasionally, we get to provide some of those answers. We have done many education-related projects here at Corona. Recently, one of our clients posted our findings on digital natives’ needs and desires related to library services. Like any business, libraries must change and adapt to remain relevant and fulfill their mission with the next generation. It’s just one example of how research is helping determine the needs of today’s generation.
The survey we recently completed for the Denver Office of Cultural Affairs has received two nice write-ups in the local press after a great public presentation of the results by DOCA director Dr. Erin Trapp.
This extensive survey of Denver residents consisted of 814 interviews with residents, including 205 with self-identified African Americans and 204 with self-identified Latino Denverites. The final survey answers for the entire city were demographically weighted to ensure they are representative of the population of Denver.
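For readers curious what demographic weighting involves, here is a minimal post-stratification sketch. The group labels and shares below are purely illustrative, not the actual survey figures: each respondent receives a weight equal to their group’s share of the population divided by that group’s share of the sample, so the weighted sample matches known population proportions.

```python
# Illustrative post-stratification weighting sketch.
# Group labels and shares are made up for demonstration only.
from collections import Counter

def poststratification_weights(sample_groups, population_shares):
    """Return one weight per respondent so that the weighted sample
    matches the known population shares for each group."""
    n = len(sample_groups)
    sample_share = {g: c / n for g, c in Counter(sample_groups).items()}
    return [population_shares[g] / sample_share[g] for g in sample_groups]

# Example: the sample over-represents group "B" relative to the city.
sample = ["A"] * 6 + ["B"] * 4           # 60% A, 40% B in the sample
population = {"A": 0.7, "B": 0.3}        # 70% A, 30% B in the population
weights = poststratification_weights(sample, population)
```

After weighting, group "A" respondents count for 70 percent of the weighted total, matching the population, even though they were only 60 percent of the raw sample.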
The results show both positives and negatives for the performing arts in Denver. Primary among the positives is that 80 percent of residents are interested in the live performing arts, and a good number actually attend: within the past year, 58 percent attended live theater, 41 percent went to a festival, 34 percent saw a live musical concert, and 11 percent attended a dance performance.
To read more about the results and their implications for performing arts in Denver click over to the Denver Post article* or to the article in the Rocky Mountain News. In addition, DOCA has released selected findings from the survey into a report available on the Denver City website.
*The Denver Post article begins “You can’t always trust surveys commissioned by people with a vested interest in the results.” We completely agree! When consuming data and survey results, you always need to be aware of who commissioned the research, who completed the research, and how they carried it out. And when you conduct research, this is why it is important to have someone (like us!) who is aggressively neutral, ethically unimpeachable, and methodologically sound.
When I recently bought my new car, I was informed that I would shortly be receiving a satisfaction survey in the mail asking about my buying experience. I thought, “Fair enough.” Then I was told that they really like to see top scores for everything, and that if I felt something wasn’t top notch, they would appreciate the chance to fix it first. Again, “Sounds fair.” But wait, will people actually come back and ask them to make it right? As I once read in The Ultimate Question, people will give a high score because they feel guilty not giving the company the chance to correct the problem. So now no one wins: the dealer doesn’t get honest feedback and the consumer is left unhappy.
This seems to be a trend in customer service research, from retail stores to a recent call to one of my credit card providers (agent at the end of the call: “Would you say I provided you with great service today?”).
Obviously, the research findings produced are faulty. So why do they do it? I think much of it comes from energetic employees and managers who have a large incentive to show good results. Taking a longer-term view would help these companies immensely (perhaps by providing both short- and long-term incentives), as would better policing by those analyzing the research. Companies should be using customer research to evaluate their policies and practices in addition to employees’ performance. When the outcome of a customer service experience is unsatisfactory, it may be because the customer service representative wasn’t helpful when he or she could have been, or it may be because the representative was perfectly helpful but handcuffed by a problematic company policy. If the survey only asks whether the employee was helpful, and there’s no response category for “as helpful as they could have been given a stupid policy,” how do you respond? Ideally, companies should measure satisfaction with the interpersonal aspects of the experience separately from satisfaction with the outcome of the experience (“Do you feel the employee did everything they could to address your problem?” and “How satisfied are you with the outcome of your experience?”).
Have you witnessed this as well? What was your reaction?
As any home handyman (or handywoman) who frequents Home Depot has probably experienced, the receipt you receive can be quite disproportionate to the actual purchase. Why is my receipt two feet long even for a handful of items? Their satisfaction survey, of course.
I first received one of these survey requests probably years ago. At the time, I thought, “Great, take a survey and maybe win something!” Then I went again and got another request. And another. And…
How many times will someone take the time to go online and fill out that survey? Anyone who has knows it isn’t necessarily a short survey, either. I didn’t fill one out for a long time, until I had a particularly bad experience recently.
I imagine most people who get online to take the survey only do so when they have had a particularly good or bad experience (the same could be said about customer comment cards at checkouts). That may be all Home Depot is after, but considering how many upset customers never tell the company (only their friends), they’re still probably missing valuable feedback.
A better bet? Proactively survey a random sample of customers instead of hoping they will take the time to contact you. This would produce stronger results that capture the good, the bad, and the indifferent, and a true measure of overall satisfaction could then be developed.
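To make the idea concrete, here is a minimal sketch of drawing a simple random sample of customers to invite, rather than relying on self-selected responses. The customer IDs and sample size are hypothetical stand-ins; in practice you would draw from a real transaction or customer list.

```python
# Sketch of a proactive simple random sample of customers to survey.
# Customer IDs and sample size are illustrative assumptions.
import random

def draw_survey_sample(customer_ids, k, seed=None):
    """Return k customers chosen uniformly at random, without replacement."""
    rng = random.Random(seed)
    return rng.sample(customer_ids, k)

customers = list(range(1, 1001))   # e.g., IDs from last month's transactions
invitees = draw_survey_sample(customers, 100, seed=42)
```

Because every customer has an equal chance of being invited, the resulting responses reflect the indifferent majority as well as the delighted and the angry, which a purely opt-in receipt survey cannot do.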
Welcome to Radiance, a blog brought to you by Corona Research. This is a blog that speaks about more than just market research and strategy. Within this corner of the blogosphere, innovations in research will be discussed, poor methods will be condemned, and – if nothing else – some great trivia conveyed. Radiance provides the crème de la crème of insight and a few useless (but still thought-provoking) facts about the world around you.
We hope you enjoy reading and don’t hesitate to offer up your comments – we’ll look forward to hearing from you.