Sigh, not only are the colours horrid, this 3D pie chart completely misrepresents the values it is trying to show. Would you have guessed the three slices are identical?
As much as we’d like to think otherwise, market research is not a commodity. Anyone can technically carry out the processes that pass for market research, but not everyone can carry them out and actually produce valid, reliable research.
The trouble is that so many problems can and do crop up along the way. Unless you can 1) actually recognize that there is a problem and 2) fix it, the problem will only compound. Here are the top problems I see.
- Not starting with a research objective. If you don’t know the questions you are trying to answer, you will waste many hours wandering in circles, playing with numbers, and accomplishing absolutely nothing. Coming up with cool results does you no good if they don’t solve the problem you set out to solve.
- Using insufficient sample sizes. Forget the fact that insufficient sample sizes won’t generate any statistically significant results. I’m not concerned with significance here. I’m concerned with trying to solve major problems based on only 100 responses. How the heck are 100 responses reflective of any group of people, unless the target population is only 500 to begin with? How the heck can you analyze subgroups of men and women, or older and younger people, if you’re only starting with 100 people? Did you not anticipate wanting to look at subgroups of people? (See the sample-size sketch after this list.)
- Being bound by statistical significance, or lack thereof. We often forget about type 1 and type 2 errors. Any time you run a batch of statistical tests, some will be falsely significant and others will be falsely insignificant. What that means is that statistics will help guide you, but they aren’t the be-all and end-all of what is important in your dataset. You absolutely must depend on your brain to determine what the important results are. (See the significance simulation after this list.)
- Generalizing beyond your sample. In its worst form, this means gathering results from 100 women and assuming the results will apply to men as well. Or generalizing results from your subsample of 5 men to the entire male population. How about generalizing results from people who complete a two-hour survey to people who’ve never answered a survey in their entire life? Again, what were you thinking? You must realize ahead of time that you care about what men think or what non-robots would think.
- Creating something out of nothing. Surprise, surprise, the business world is indeed a publish-or-perish world. If you don’t publish surprising and interesting results in your research report, clients may be less likely to consider you as a vendor, since you clearly don’t have the skills to find the surprising and interesting results. Alas, this philosophy should never lead you to make a mountain out of a molehill simply so you have something cool to show your client. This is just another form of falsifying data. You will get caught. You will be horribly embarrassed.
- Focusing on entertaining, not educating. I’ll say it. Storytelling is a huge fad right now. If you don’t turn your research results into a story and delight and amaze your audience, your client may be hugely disappointed. But if your focus is on telling a pretty story instead of discovering whether there actually is a story, you are once again, in effect, falsifying data.
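
To put the sample-size point in rough numbers, here is a minimal back-of-the-envelope sketch (my own illustration, not part of the original post) of the 95% margin of error for a simple proportion. The subgroup counts of 50 and 15 are assumptions purely for illustration; the point is how quickly precision evaporates once you start slicing 100 responses.

```python
# Rough 95% margin of error for a proportion, assuming simple random sampling
# and the worst case p = 0.5. The subgroup sizes below are hypothetical.
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion p estimated from n responses."""
    return z * math.sqrt(p * (1 - p) / n)

for label, n in [("Full sample", 100), ("Women (say ~50)", 50), ("Men 55+ (say ~15)", 15)]:
    print(f"{label:<18} n={n:>3}  margin of error ~ ±{margin_of_error(n):.1%}")
```

At n = 100 you are already looking at roughly ±10 points on any percentage; for a 15-person subgroup, roughly ±25. That is the practical cost of not anticipating subgroups.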
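
And on the false-significance point, a quick simulation (again my own sketch, with made-up numbers) shows how often tests come up “significant” when there is truly nothing to find: run 200 tests at alpha = 0.05 on identical populations and roughly ten will flag a difference purely by chance.

```python
# Simulate Type I errors: run many t-tests on two groups drawn from the SAME
# population, so any "significant" result is a false positive by construction.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_tests, alpha = 200, 0.05
false_positives = 0

for _ in range(n_tests):
    a = rng.normal(loc=5.0, scale=1.0, size=100)  # group A, no real difference
    b = rng.normal(loc=5.0, scale=1.0, size=100)  # group B, same distribution
    _, p_value = stats.ttest_ind(a, b)
    if p_value < alpha:
        false_positives += 1

print(f"{false_positives} of {n_tests} tests were 'significant' with no real effect "
      f"(expect about {n_tests * alpha:.0f}).")
```

Which is exactly why the statistics can only guide you; the judgment about which results actually matter is still yours.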
Remember, the research must come first. Decide on your research objective, build a great research methodology using the right sample sizes, the right scales, the right instruments, the right techniques. Analyze the data properly, thoughtfully, logically. If indeed there is a story worth telling, it will be done with integrity and validity. That’s the kind of story I want to hear.