What Market Research Trends Scare You or Excite You?

I know – I know – it’s hard to start thinking about the planning cycle for 2011 during these summer days.  But summers aren’t lazy anymore since we can’t stay away from our mobile and computer screens, so we shouldn’t be lazy either.

Here at QuestionPro and Survey Analytics we’re starting to think about the Market Research Trends that scare us and excite us.  The really fun part about this conversation is that part of our audience is scared and the other part of our audience is excited — and all about the same trends!

One huge trend that fits that bill is the whole DIY marketing movement.  Of course, having the DIYMarketers brand, I’m a little partial to and excited about DIY marketing as it pertains to market research – but you might not be.

So tell me – which DIY marketing trends scare you and which ones excite you?!

Pass it on!  Leave your comments here and we’ll take a look, maybe get back to you for more info and possibly feature them in a future webinar we’re planning. (So you could be a star!)

The Next Generation of Website Usability Testing

Not long ago, having people “test” your website meant going to a facility and literally watching people navigate your site.  A few years ago, I was recruited to be a tester for a retail site.

The process was very cool.  I received an email with instructions telling me that I was looking for a “red shirt as a Christmas present for my Uncle Bob.”  With very little other instruction, I was left to find my way through the site.  Every few clicks, a survey window would pop up and ask me about my experience — did I find what I was looking for?  Was it clear?  What would work better?  Those kinds of things.

Overall, it was a fascinating experience, and it left me wondering if this was going to be the next evolution of website usability testing.  It certainly beat the expensive alternative of recruiting people, having them sit in a room, recording and watching their navigation, and asking them questions.

You’d think that these apps would pop up all over the place and that more companies would be using them.  And maybe they were — but I had never seen another survey opportunity like that.

TryMyUI Makes Usability Testing Easy and Accessible

Today there is a new application called TryMyUI.  You can read a product intro and review over at Research Access.  If you’ve been considering doing usability testing, you should check this out today because it’s FREE to try for now.

Q & A session from Webinar: How to Use Conjoint Analysis in the Innovation Process

On Thursday, July 22nd, 2010, over 118 participants signed up for the presentation by Survey Analytics and Dorian Simpson of Planning Innovations: How to Use Conjoint Analysis in the Innovation Process.

Conjoint Analysis is a powerful and often under-utilized marketing research tool that can provide deep insight into how your customers actually think. The resulting information can be used to prioritize features, develop pricing strategies, and estimate market share… all before you develop your product or spend valuable marketing dollars.

Participants posted the following questions and both presenters, Dorian Simpson of Planning Innovations and Esther LaVielle of Survey Analytics, responded to each one.

1) What are new innovative ways to gather data and analyze it using Conjoint Analysis? What kinds of tools are available in market to perform conjoint analysis?

SA – At Survey Analytics we offer a robust yet easy to use Discrete Choice Conjoint Analysis tool. Guidelines are provided to ensure data is concise and accurate. We also provide a market segmentation tool, which offers you an opportunity to “test” new product ideas against your current data to help predict possible market share.

http://www.surveyanalytics.com/conjoint/index.html

2) Many times consumers don’t take surveys seriously and just complete them for the sake of it. How can we take that into account when applying conjoint analysis?

DS – It’s important in the lead-in that you let the respondent know that YOU are taking the survey seriously and that you would appreciate it if they do, too. This is less of a problem if you’re using your own databases. You should also try to screen out responses that are obviously completed just to finish, such as those that never vary.

SA – I agree with Dorian. Respondents always appreciate an introduction that is upfront about your intentions. Be honest about how long the survey will take, and provide an incentive that appeals to your targeted sample. In my experience working with internal databases, you become “familiar” with those who are not truthful or do not take your surveys seriously, and you can remove them from future surveys.
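
If you collect responses electronically, that last kind of screening can be automated. Below is a minimal sketch, assuming responses live in a CSV with one row per respondent; the file name, the respondent_id column, and the q1-q5 rating columns are all hypothetical:

```python
# A rough screen for "straight-liners": respondents who give the identical
# rating to every question. File name and column names are hypothetical.
import csv

RATING_COLUMNS = ["q1", "q2", "q3", "q4", "q5"]  # hypothetical question IDs

def is_straight_liner(row):
    """True if the respondent never varied their rating."""
    ratings = [row[col] for col in RATING_COLUMNS]
    return len(set(ratings)) == 1

with open("responses.csv", newline="") as f:
    for row in csv.DictReader(f):
        if is_straight_liner(row):
            print(f"Flag respondent {row['respondent_id']} for review")
```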

3) What should the minimum sample size be to conduct conjoint analysis? Are there lower and upper limits, and what are the implications of sample size for calculating utility values?

SA – This depends on your target market. The larger your target market, the larger your sample should be for statistically significant data.  The general rule of thumb for Conjoint Analysis is a minimum of 200-300 completed surveys. That said, you can go down to 100 completed surveys if your target market is relatively small.
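
For a rough sense of why a few hundred completes is a common target, here is a small sketch using the standard margin-of-error formula for a proportion at 95% confidence (worst case p = 0.5). Note this is a general survey-sampling heuristic, not a conjoint-specific utility calculation:

```python
# Margin of error for a proportion at 95% confidence, worst case p = 0.5.
import math

def margin_of_error(n, p=0.5, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

for n in (100, 200, 300):
    print(f"n={n}: +/-{margin_of_error(n):.1%}")
# n=100: +/-9.8%, n=200: +/-6.9%, n=300: +/-5.7%
```
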
4) Most use cases of conjoint focus on consumer electronics/durable goods. Is there a case for using conjoint in the FMCG/CPG industry?

SA – There is an example of a packaged-goods study, Trail Mix: http://surveyanalytics.com/t/ADvnXZIA2S

As you can see in the results, Dry Fruit had the highest relative importance compared to other ingredients, whereas Nuts Type 1 (sesame seeds and sunflower seeds) did not make an impact on choice.

5) As attributes and levels are important in conjoint, when is it appropriate to use “no” as an attribute level?

SA – It depends on whether the feature is something you may or may not want to include.

For example, if you wanted Trail Mix with/without Crackers, you would set it up as follows:

Feature: Crackers >> Levels: Yes, No
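
To make the setup concrete, here is a minimal sketch of one way to represent attributes and levels; the trail-mix attributes and levels beyond Crackers are made up for illustration:

```python
# Represent attributes/levels and enumerate every possible profile.
# All attribute names and levels here are illustrative.
from itertools import product

attributes = {
    "Crackers": ["Yes", "No"],
    "Dry Fruit": ["Raisins", "Cranberries", "None"],
    "Price": ["$3.99", "$4.99"],
}

profiles = [dict(zip(attributes, combo)) for combo in product(*attributes.values())]
print(f"{len(profiles)} possible profiles")  # 2 * 3 * 2 = 12
print(profiles[0])  # {'Crackers': 'Yes', 'Dry Fruit': 'Raisins', 'Price': '$3.99'}
```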

6) Did you ask the “why” questions such as frequency and power questions in a study after the conjoint study?

DS/SA – This has been investigated in other research and will be tested further.


7) How is conjoint used in the launch of a service business versus a product launch?

DS – It can be used similarly. For example, instead of price, it may be price per month. You must identify attributes and levels just as you would for a product.

SA – A fun example is a hair salon. What kinds of services will you offer to your clients, and at what price do you think they would pay for them? As Dorian said, you must identify attributes and levels just as you would for a product.

8) With 6 attributes and multiple levels, how long was the [example] survey? I assume that you used experimental design to shorten the length of the survey?

SA – The case study survey that was used during the presentation took respondents on average 15 minutes to complete.

9) Do we see conjoint analysis used often in the food industry…specifically for product development?

SA – Yes. Conjoint Analysis can be used in any industry that is interested in doing a trade-off analysis of some type. Whether it is on a medication a pharmaceutical company is trying to develop or a new kayak model that would appeal to families with young children, Conjoint Analysis can be used to provide guidance in those industries.

10) Is there a maximum number of levels or options to ask in a choice task?

SA – The minimum is 2 levels per feature/attribute. The standard is to stick to no more than 3-4 levels per feature/attribute. Every once in a while, going up to 5 may be needed, depending on the feature being tested.

11) Can I use conjoint in B2B surveys where the sample size is usually low, around 100 customers? In what cases can I use it?

DS – You’ll want to keep the number of attributes and levels reasonably low.

SA – The fewer respondents you survey, the fewer attributes and levels you should use. At this point in your research you should have highly defined features and levels that fit your targeted sample size.

12) Is there any relation between the number of attributes and minimum number of respondents required to get the results from Conjoint Analysis?

SA – From a technical standpoint, the system does NOT impose any limitations. You can have unlimited attributes and unlimited levels within each attribute.

However, from a practical standpoint, it is unreasonable to have more than 4-6 attributes and about 3-4 levels per attribute; the number of possible profiles grows multiplicatively, so 5 attributes with 3 levels each already yields 3^5 = 243 profiles. Our suggestion would be to keep the number of attributes under 5 and aim for about 3 levels per attribute.

13) What is the ideal task count?

SA – Our experience has shown that there is a precipitous dropout rate after about 15 tasks. Unless there is a strong personal incentive for the end-users to complete the survey, we would suggest keeping the number of tasks to fewer than 15, especially in cases where users are volunteering to take surveys. Please keep in mind that conjoint product selection is a little more involved than simply “answering a survey question” — users have to comprehend each of the attributes/concepts and then make a choice.

On the lower side, we would suggest a minimum of 5-8 tasks for a conjoint model with 3 attributes. The more attributes you have, the more tasks users have to complete.


14) In the case study in the webinar, price was one of the key features. I didn’t get how the results are interpreted. Can you explain it again?

SA – See screen shot below:

15) When you say “market share”, you mean “share of preference”, right?

SA – Yes, that is correct.

16) Don’t we need any intelligence in the tool when designing the conjoint study? The tool may generate a profile that has the worst features but the highest price.

DS – This is true, but that is part of conjoint analysis: understanding which attributes and levels your customers deem the worst. I don’t think you want to exclude profiles that pair a high price with low attribute levels.

SA – We have built intelligence into our conjoint tool, such as the prohibited-pairs feature, which ensures that combinations that are not possible never show up.  We must be careful in using this tool, because the idea is not to limit the profiles based on what the client will not do, but to find out what resonates with your audience. We also provide a concept simulator that calculates the number of times an attribute will be shown, given the approximate number of people who will complete the survey.

17) I have a QuestionPro enterprise account.  Do I have access to the conjoint analysis tool?

SA – No, you do not have access to this tool. To access the Discrete Choice Conjoint Analysis tool, you must upgrade to Survey Analytics.

http://www.surveyanalytics.com

18) Is it possible to get a copy of the slides from the presentation?

Slides from Survey Analytics:

http://docs.google.com/present/edit?id=0ARLS1YfnuC-fZGhzYzVnamdfODhnYjJ3bjRoYw&hl=en&authkey=CIzY9psK

Slides from Planning Innovations:

http://docs.google.com/present/edit?id=0ARLS1YfnuC-fZGhzYzVnamdfMTIxaGI2cDZ2aGI&hl=en&authkey=CJeq3dgN

19) When you changed the price for the subwoofer [in market segmentation simulator], you used $800 (a level that existed in the study).  Can you use a price that is not in the study, such as $900?

SA – No, you cannot. Before starting a conjoint study, it is critical to have well-defined attributes and levels. In the case of our example in the presentation, we established that $600, $700, and $800 were the price points to test. Any other price point would require another conjoint study to be run before using the market segmentation simulator.

20) What kind of conjoint analysis is better: adaptive or choice based?

SA – This depends on what you are interested in retrieving data for. If you are looking for data that mimics the purchase process, then Choice Based (Discrete Choice) Conjoint Analysis is the better bet. Adaptive Conjoint Analysis (ACA) is a computer-administered, interactive conjoint method designed for situations in which the number of attributes exceeds what can reasonably be done with Choice Based Conjoint Analysis.

Survey Analytics specializes in Discrete Choice Conjoint Analysis.

21) If we have a product with 15 attributes and 5+ levels, is there a way to figure it out?

DS – It’s important to focus on the most important attributes that really drive decisions. You’ll probably want to do preliminary research, such as interviews, focus groups, or a short quantitative study, to narrow it down.

22) You have said market share is a result from the utilities obtained, but this isn’t correct. Is it?

SA – Market share here is the percentage of times a given profile (e.g., Profile 1) is chosen over all profiles in the simulator, based on relative importance and the number of responses seen.

The market simulator uses aggregate utility values to project the probability of choice, and hence the market share.
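
To illustrate the mechanics, here is a minimal sketch of a share-of-preference calculation. The profile utilities are made-up numbers, and the multinomial-logit rule shown is one common way simulators turn aggregate utilities into choice probabilities; the exact rule a particular tool uses may differ:

```python
# Convert aggregate profile utilities into shares of preference
# using a multinomial-logit rule. Utility values are hypothetical.
import math

profile_utilities = {"Profile 1": 1.2, "Profile 2": 0.4, "Profile 3": -0.3}

denom = sum(math.exp(u) for u in profile_utilities.values())
for name, utility in profile_utilities.items():
    share = math.exp(utility) / denom
    print(f"{name}: {share:.1%} share of preference")
```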

23) What other kinds of conjoint analysis exist? How can I compare their advantages and disadvantages with the method seen today?

SA – Here are other conjoint methods that you can review and compare:

  • Adaptive Conjoint Analysis
  • Choice Based Conjoint Analysis
  • Discrete Choice Conjoint Analysis (Survey Analytics specializes in this method)
  • Full-Profile Conjoint Analysis
  • Adaptive Choice Based Conjoint Analysis

24) In your experience, what was your worst result in a conjoint study? I mean when results were not logical or were useless?

DS – Can’t really say. Most studies have had some “interesting” results that needed further investigation, but they’ve never been useless unless the company didn’t do the upfront work to understand the right attributes and levels.

SA – With Survey Analytics we appoint a dedicated account manager who will help with conjoint studies to ensure statistically significant data.

About the Presenters:

Dorian Simpson founded Planning Innovations in 2002 to help technology-driven companies launch successful products and services through focused innovation management and planning. He has significant experience in both engineering and marketing to help build the bridge between these two critical innovation functions.

http://www.planninginnovations.com

Esther LaVielle is a Senior Account Manager at QuestionPro and Survey Analytics, which was started in 2002 in Seattle and is now one of the fastest growing private companies in the US. Prior to her adventure at QuestionPro she spent 3 years as a Qualitative Project Manager at the Gilmore Research Group.

http://www.surveyanalytics.com

Coding Social Media “Responses”

There is a whole new and improved research segment blooming out there — social media research.  There are experts out there who will scan the social media sites for chatter about your company and your brand.  They will make sense of and filter seemingly meaningless chatter, and mine it for marketing intelligence you can use in your next campaign.

But what if you don’t need that high a level of research but still want to listen to your target audience?

Here are some ideas for analyzing social media comments that I found:

Basic Categories to Listen For:

  • Praise
  • Criticism
  • Recommendation

Next, tag the social media channel:

  • Blog
  • Facebook
  • LinkedIn
  • Twitter
  • Other

The next level of coding might include:

  • Brands mentioned
  • People mentioned
  • Subsidiaries/Locations

Finally, you can start breaking out specific items or concepts that appear in the comments.
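
If your comments are already exported somewhere, a first pass at this coding scheme can be scripted. Here is a minimal sketch; the keyword lists are illustrative only, and a real project would still need human review:

```python
# Rough first-pass coder for social media comments.
# Keyword lists are illustrative; treat the output as a starting point
# for human review, not a finished analysis.

CATEGORY_KEYWORDS = {
    "Praise": ["love", "great", "awesome"],
    "Criticism": ["hate", "terrible", "disappointed"],
    "Recommendation": ["recommend", "should try", "check out"],
}

def code_comment(text, channel):
    """Tag one comment with its channel and any matching categories."""
    lowered = text.lower()
    categories = [cat for cat, words in CATEGORY_KEYWORDS.items()
                  if any(word in lowered for word in words)]
    return {"channel": channel, "categories": categories or ["Uncoded"]}

print(code_comment("I love this brand - you should try it!", "Twitter"))
# {'channel': 'Twitter', 'categories': ['Praise', 'Recommendation']}
```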

What to Listen For

Coding social media comments is a little different from coding survey data in that these aren’t responses to specific questions.  It’s actually a lot like Jeopardy (the game show), where the comments are answers to questions that you will have to intuit.

Each comment or statement gives you more information than you might expect.  You will have a user — the person who made the comment.  You can look at their profile and learn which demographic or psychographic group they fall into.  Instead of asking questions YOU want answers to, you’ll be able to see what’s really most important to your audience and the hot-button issues they have around your brand.

You can ultimately use this kind of information to develop survey questions that are more targeted and in tune with where your audience is.

Response Style Bias and How to Overcome It

A few weeks ago, Research Access had an article that talked about cleaning data from panels.  One of the methods was to remove all the “extreme” responses that occurred in rating scales.  If you’ve ever received a survey with too many matrix rating questions and then mindlessly checked all the lowest, middle, or highest ratings — you know who you are and you know what I’m talking about.

This put a bug in my mind about those people who never give the highest rating — or those who consistently give a high rating.  In many ways, they are no better than the panelists who mindlessly answer their surveys.  In other words, their responses are suspect as valid answers to help us make a decision.

Then I ran into this article by Jeff Henning and realized that I wasn’t the only one thinking about this.  In fact, people much smarter than I’ll ever be have done loads of research on the topic and have come up with several solutions that I’m going to show you here.

Henning’s article actually gets into the discussion about cultural response bias, and you can read more about that in his article.  The following are question types that Henning listed that will help you overcome general response style bias:

  • Binary scales – As far back as 1946, to minimize response style bias, Lee Joseph Cronbach (of Cronbach’s alpha fame) advocated using two-item scales: yes/no, agree/disagree, dissatisfied/satisfied, describes/does not describe. Essentially this treats everyone as Extreme Response Style responders.
  • Choose-many questions – Presenting a list of choices and instructing the respondent to “select all that apply” is an economical form of binary scale, prompting respondents to choose the items they agree with, find important or are dissatisfied with.
  • Ranking questions – Another way to avoid traditional response bias is to use ranking scales, where each choice on the scale may be used only once: most important to least important, most satisfactory to least satisfactory, most likely to least likely.
  • MaxDiff scaling — Maximum-difference discrete-choice models are a more sophisticated type of ranking question, typically showing attributes four at a time and asking the respondent to select the best and worst attributes from each set: the attributes with the maximum difference. Research has demonstrated that MaxDiff scaling is superior to rating scales for cross-cultural analysis. (Steve Cohen & Leopoldo Neira, 2003, “Measuring Preference for Product Benefits Across Countries: Overcoming scale usage bias with Maximum Difference Scaling”.) A small scoring sketch follows this list.
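
To show how MaxDiff answers become scores, here is a minimal counting sketch on made-up choice data. Real studies typically estimate utilities with hierarchical Bayes or logit models, but simple best-minus-worst counts convey the idea:

```python
# Best-minus-worst counting for MaxDiff tasks.
# Each task records which attribute the respondent picked as best and worst.
# The task data are made up for illustration.
from collections import Counter

tasks = [
    {"best": "price", "worst": "brand"},
    {"best": "quality", "worst": "packaging"},
    {"best": "price", "worst": "packaging"},
]

best = Counter(t["best"] for t in tasks)
worst = Counter(t["worst"] for t in tasks)
attributes = sorted(set(best) | set(worst),
                    key=lambda a: best[a] - worst[a], reverse=True)

for attr in attributes:
    print(f"{attr}: best-worst score {best[attr] - worst[attr]:+d}")
```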

Lessons Learned: Surveys should provide additional data for your decision — not MAKE your decision

So it isn’t just panels that can “dirty” your data enough to misrepresent how respondents really feel.  Response bias comes in as many shapes and forms as there are human opinions.  The key point here is that decisions are ultimately made by human beings.  Managers and business owners can use survey data as an additional information resource to help them decide, sort of like the “Ask the Audience” option in “Who Wants to Be a Millionaire,” but that doesn’t mean you shouldn’t use your own judgement.

What’s the Intersection of Neuromarketing and Market Research?

If you haven’t done much reading on Neuromarketing, I’d recommend that you pick up any or all of these books — because you’ll need them.

There are tons of other books out there, but these three are a great start.  The short skippy is that the advancement of fMRI technology has allowed science to literally map how our brains work and react when we are exposed to a variety of pictures, products, or other stimuli.  It’s almost like a shopping lie detector in a way.  The market research we’ve been used to has huge swings of error based on who the respondent is and how they perceive the question.  The other inherent danger is just plain “lying” — even though it may be unintentional.  Human beings will think one thing and do another.  We already know that.  But Neuromarketing has sort of blown back some of the smoke and mirrors and allowed us to see some of the seedy side of our human nature.

In a recent article in Research Access, Tim O’Connor does a wonderful job of explaining the practice of “priming” or asking leading questions in a survey.  He brings out how critical good clean question design is to actually getting valid survey results.  For example — phrasing a question with “Is it important” will lead the respondent toward a YES answer.

Still more interesting Neuromarketing stuff for you to chew on – are women more perceptive than men?

This video clip was featured in the blog “Neuromarketing” and it shows the actual biometric readings from both men and women and their reaction to Tony Hayward’s speech after the oil spill.  Please note that this was NOT a reading of men and women turning knobs — these “respondents” were actually hooked up to biometric sensors that measured their heart rate and sweat and other biometric readings.

While having respondents hooked up to equipment doesn’t quite qualify as market research the way we’ve known it in the past — how long before this idea of asking questions becomes obsolete?

How to use Conjoint Analysis in the Innovation Process

Webinar Presentation
Thursday July 22nd, 2010
9:00am PST

Ever thought about using Conjoint Analysis as part of your research strategy?

Your customers are constantly making trade-offs when making purchase decisions between you and your competitors. Traditional research questions, such as ranking features and asking pricing-sensitivity questions, are valuable tools, but they often leave you wondering which features are really important and how you should price against real competition. So how can you simulate a real-world purchase decision before you go to market?

Conjoint Analysis is a powerful and often under-utilized marketing research tool that can provide deep insight into how your customers actually think. The resulting information can be used to prioritize features, develop pricing strategies, and estimate market share… all before you develop your product or spend valuable marketing dollars.

Join Survey Analytics and Planning Innovations for this one-hour webinar on how to effectively use Conjoint Analysis in the innovation process to prioritize needs, explore pricing options, and validate your product and service concepts.

We’ll answer:

1) What is Conjoint Analysis and how does it work to simulate real world trade-off decisions?

2) How can you develop Conjoint Studies that provide guidance in innovation planning?

3) How can Conjoint Studies help you predict potential market share for new product concepts?

This webinar will answer these questions and more as well as provide a forum to discuss specific challenges.

Click Here To Sign Up: https://www2.gotomeeting.com/register/447044739

About the Presenters:

Dorian Simpson founded Planning Innovations in 2002 to help technology-driven companies launch successful products and services through focused innovation management and planning. He has significant experience in both engineering and marketing to help build the bridge between these two critical innovation functions.

http://www.planninginnovations.com

Esther LaVielle is a Senior Account Manager at QuestionPro and Survey Analytics, which was started in 2002 in Seattle and is now one of the fastest growing private companies in the US. Prior to her adventure at QuestionPro she spent 3 years as a Qualitative Project Manager at the Gilmore Research Group.

http://www.surveyanalytics.com

Can a Blog Help You Groom Your Audience for Brand Research?

A blog is such a wonderful marketing tool.  I can’t help but see your company’s blog space as a “computer billboard.”  In so many ways it’s like FREE advertising and education space where you can groom your audience to see you as you want to be seen.

Too often, companies treat their blog as a chore.  That’s too bad because it’s an opportunity to really build your brand and establish your company as THE one to choose when your customers are looking for what you are selling.

Not only that, but you can actually bring your audience into the conversation, ask questions, and read their responses.  It’s not exactly a focus group, but it really does give you more control over your message and your interaction with your audience than you could hope to have through direct marketing or advertising.

Because you have blog registrants and subscribers, you can post articles about products or services and then direct your audience to a survey afterwards.

In what creative ways have you used your blog?

Mobile Panels and Surveys are the Latest Trend

It was just a matter of time before surveys went mobile.  Welcome the latest entrant into the mobile survey sphere — Thumbspeak.  This is a terrific tool that allows businesses to recruit a mobile panel of respondents and collect feedback via phone.

Research Access has a terrific industry news write-up on this trend.  I’m going to give you my impressions of Thumbspeak in this article.

First Impressions

When you land on the website, you’ll see right away that they are recruiting panels of mobile users AND business customers.  Right away, it’s clear that if I’m going to use this application and the panel of mobile users, I have to remember that my sample is “self-selected” to include people who use apps on their mobile phone.  I know a lot of people do — but I still think that the kind of person who’s going to take the time to do a survey on their iPhone is psychologically different from a person who uses their mobile device in a purely utilitarian way.

Registering is easy.  It takes less than a minute to fill out the form and start your free trial of questions.  You get to create a survey with 5 questions.

I created a quick mobile survey questionnaire and got this email as a response.  I sort of wish they had given me this FIRST – I think I would have asked better questions.

Here is what happens immediately after you click on the submit button. (1) Thumbspeak staff will quickly approve the content of the questions just to make sure it is appropriate for our audience. (2) After approval, our system will deliver your questions to our mobile opinion network for a quick 100 answers. (3) We then package your results with an incredible amount of data so you can gain insight from on-the-go mobile users.

We also include, at no additional charge, the following about each respondent:
– age
– gender
– location
– education
– household income
– race
– marital status
– employment status

They also give you a link to give feedback — I’m going to do that right now.

Click over to Thumbspeak and explore it for yourself.  What’s been YOUR experience?

From Crowd “Sourcing” to Crowd “Solving”: There’s a Better Way to Use the Wisdom of Crowds

On the game show “Who Wants to Be a Millionaire?”, players have the option to “ask the audience” if they are stumped on a question.  When they ask the audience, a chart shows how many audience members voted for each answer.  Usually (I’ve never seen it go otherwise, but I’m not an avid fan), the player picks the answer that the majority of the audience chose.  The idea is that a group of people is smarter than the individual.

This is also called “crowdsourcing,” and it’s the principle behind IdeaScale and a lot of other research.  But it isn’t always the best resource to use.

Check out Andrew Jeavons’ latest article in Research Access; it brings up a fascinating discussion about more strategic uses of your crowd to actually improve your process and solve real problems.