How to do Ad Testing on Tablets and Smartphones: A Webinar Replay


In our latest webinar, we demonstrated how to conduct ad testing, and specifically how you can do ad testing on tablets and smartphones. Learn the traditional ways of testing ads and get introduced to new ways to test ads on your mobile devices. Advertising today is more innovative and risk-taking than ever before. It is important to collect data and make sure that you are sending the right message to the right demographic. Taking your ad testing mobile can help you achieve this and gain better insight into how to advertise effectively.

Until recently there were limitations to testing ads on mobile devices. You may be asking yourself questions such as: Why test ads on mobile devices? Why not test ads the way it has always been done? A lot has changed in the past several years, and mobile devices are a huge part of our lives day in and day out. With ad testing on tablets and smartphones, you can test reactions to advertising in the moment and collect feedback from your target audience. You can verify who is responding to your ads by actually seeing respondents in person and confirming that they are who they say they are. Most importantly, you can see the return on investment from your advertising and from the methods used to test your various types of media.

If your ads are sensitive material, there is no need to worry. Your information will not fall into the wrong hands, because you can instantly delete the content from the mobile device – something you can’t do with computers.

We invite you to watch a replay of our webinar and access the slides from the presentation below.


Are We Practicing “Infographics” Like the Cavemen?

If you haven’t had the opportunity to check out the new Research Access Blog,  I encourage you to visit.

There, you’ll find brain-massaging articles, essays and information on research from some different perspectives.

Last week, Alex Gofman, Vice President of Moskowitz Jacobs Inc., wrote an interesting essay on early “infographics” – visual representations of information.

After reading this article, I found it interesting that we seem to be coming full circle as we strive to take quantitative data and make it more consumable through presentations, video, slides and other visual “infographics”.

Read more about infographics at Research Access…


Successful Survey Tips: How to Get “All of It” From Your Surveys

Successful surveys don’t just happen.  They are a function of doing some very logical, simple things really, really well.  The first element is knowing what you want to know: defining objectives, laying out what decisions you’re making and planning out the infrastructure of the survey.  Another element of the series is actually writing the questions and making it easy for the respondent to participate.  Next, we’re going to focus on leaving our respondents happy and preparing ourselves to analyze the data.

My favorite part of this clip is actually the very end when Mr. Miyagi informs Daniel that he will be using his new technique to paint the WHOLE fence.  Daniel says “All of it?” and the camera sssslllloooowwwllly pans around the garden.  All you hear in the background is Mr. Miyagi saying “up….down….up….down”

Yes.  All of it.  If we really want to get all the benefit from the work we’ve done in defining our objectives and creating engaging questions, then the very least we can do is finish the job and get as much information and future cooperation from our respondents as possible.  This next set of tips is designed to reduce the amount of work for you and to reduce the need to go back to your respondents for information you might have missed.

  • Segment your sample.  If you are using an existing customer list, pre-segment your sample using the “custom variable” feature in QuestionPro.  You can use as many as 255 custom variables.  If you already know specific demographics about your respondents, this is the ideal place to program them in.  In addition, you can place up to 5 custom variables in an e-mail invitation to personalize it for each respondent.  You can also compare as many as 10 segments at a time on the specific questions that you ask.
  • Pre-test your survey.  The easiest way to test your survey is to give it internally to your company or to a trusted group of respondents.  Be sure to tell your test group who the audience or the respondents are and to act as if they were the target respondent when answering the questions.  Look for two specific types of feedback: first, check the clarity of the questions.  Did the respondent perceive each question as it was intended?  Next, check the test data and see if you can make the decision that was the core of your objective.  If you don’t have enough information to make the decision, you will have to go back and tweak the questions.
  • Use a Thank You.  QuestionPro gives you a variety of ways to say “Thank You” to your respondents.  There is, of course, a Thank You page.  This is a wonderful piece of promotional real estate where you can give your customers a downloadable thank-you gift.  Another use of the Thank You page is to send your respondents to another page on your web site where they can get more information about the topic they’ve been surveyed about – maybe even a blog post where they can provide more feedback.  You can also send your respondents a Thank You e-mail in addition to the Thank You page.  I would recommend using BOTH the Thank You page AND a Thank You e-mail, especially if you are providing a downloadable gift.
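As a rough illustration of the segmentation tip above, here is a hypothetical sketch, in plain Python rather than QuestionPro’s actual interface, of attaching known demographics to each respondent as custom variables and rendering a personalized invitation from them.  All field names and the template text are invented:

```python
# Hypothetical sketch: pre-segmenting a customer list and personalizing
# each e-mail invitation from its custom variables. Field names and the
# template are illustrative only.
from string import Template

INVITE = Template("Hi $first_name, as a $segment customer in $region, "
                  "we'd love two minutes of your feedback.")

customers = [
    {"first_name": "Ana", "segment": "premium", "region": "West"},
    {"first_name": "Raj", "segment": "trial",   "region": "East"},
]

def build_invitations(records):
    """Attach each record's known demographics as custom variables
    and render a personalized invitation from them."""
    out = []
    for rec in records:
        out.append({"custom_vars": dict(rec),   # travels with the response
                    "invitation": INVITE.substitute(rec)})
    return out

for inv in build_invitations(customers):
    print(inv["invitation"])
```

Because the demographics ride along with each response, segment comparisons later need no extra questions in the survey itself.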

What are some of the ways that you get the most out of your surveys?  And what tips do you have for rewarding respondents and/or saying thanks?


Successful Survey Tips: Setting Your Survey Up For Success

I’ve been thinking about the “zen” of doing a successful survey.  As with many things, it’s taking the time to perfect specific techniques that ultimately leads to not only high response rates, but high quality feedback that actually means something.

I’ve pulled together a series of successful survey tips that I’ll be sharing with you over the next few days.  Take them in, and feel free to add your own successful survey tips.  When the series closes, I’ll include your tips and put out a best-practices list!

As I was thinking about this series, it dawned on me that none of these tips are actually new.  Yet, it’s our skill at implementing each of these elements that ultimately determines our success.  The next thing that popped into my mind was the “Wax on, Wax off” scene from “The Karate Kid” and how learning to do basic, mundane actions can yield winning results.  Enjoy.

  • Focus on what decision you’re making. This is a twist on setting a survey objective.  Often the reason we do surveys or gather feedback is to collect data so that we can make a decision.  State the decision that you are making and include the criteria of the decision.  For example, “Should we launch product X?”  You might say that if more than 100 people are very likely to purchase product X at price Y, then you will go forward.  This puts a laser focus on the questions that you will include in the survey.
  • Use an invitation with a well-written subject line that grabs the respondent’s attention.  It’s no secret that respondents are focused on what’s important to THEM, not to you.  Write your invitation in a way that points out the potential benefits to respondents of filling out the survey.  The invitation is actually a PR opportunity to communicate to your respondents that you are engaged in creating a product or service that will benefit them.  It’s an opportunity to differentiate your organization from others and highlight some potential improvements that your competition may not be offering.  Don’t let this opportunity go to waste.
  • Use an introduction that makes the respondent feel important. Just because you’ve sent an invitation doesn’t mean that you should ignore the introduction to the survey.  Today’s respondents want to know what you’re up to.  Use the introduction to the survey as an opportunity to make them part of your team and include them in the development of something new and beneficial that will bring them value.  This will put them in a mindset to provide honest and valuable feedback.
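The decision criterion in the first tip – “if more than 100 people are very likely to purchase product X at price Y, then you will go forward” – can be sketched as a toy rule.  The threshold and counts are the example’s own numbers, purely illustrative:

```python
# Toy version of the launch decision rule from the first tip above.
# The threshold (100) comes from the example; it is not a recommendation.
def should_launch(very_likely_count, threshold=100):
    """Go forward with product X only if MORE than `threshold`
    respondents say they are very likely to purchase at price Y."""
    return very_likely_count > threshold

print(should_launch(127))  # True
print(should_launch(63))   # False
```

Stating the rule this concretely before fielding the survey is what keeps the question set laser-focused.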

What are your successful survey tips BEFORE the survey even starts?


How to Develop Survey Questions That Help You Make Good Decisions

I’ve been involved in a back-and-forth with a client around creating survey questions that measure the effectiveness of a team.  I was most concerned about the nature and quality of the questions we would ask.  We all know the old saying “Garbage in, garbage out.”  And this is certainly true as it relates to constructing good survey questions.

QuestionPro already gives us many options and question types to choose from.  But if you’re not asking the right questions, you’re not going to get actionable results.

The most basic reason we conduct surveys is to help us make decisions.   In fact, surveys aren’t just used for marketing decisions.  They are often most useful for making improvements in our operations.

How Do You Know If Your Question Will Yield Actionable Results?

The best way to test your questions for good results is to literally run your survey internally as a test and ask people to answer the questions.  Then look at your results.

Gather a small team in the room and set your objective as reviewing the test results for specific action items.

For example:

“We received a score of 6.7 out of 10 in response to ‘Overall, how would you rate the training you received?’  Our objective is to raise this score to an 8 out of 10.  What are some specific action items we could take to improve this score?”

Chances are your team may take in this information and come up with more questions such as:

  • Which aspects of the training were rated low?
  • Which aspects were rated high?
  • What specific parts of the training drive the respondents’ experience?  Is it the trainer, the materials, the exercises, the venue?

If your group is tasked with coming up with specific changes to drive improvement, and they are asking these kinds of questions, then the original survey question is too broad.

The good news is that the questions you’re asking yourselves in order to come up with actions that will drive your score up are the key to creating more specific questions that will help you take action.

Focus on What Decision You’re Making And The Objectives

What is the objective of your survey and what decisions will you make with the results?  Are you looking for ways to improve your training process and system so that your trainees retain more information?   In that case, you can focus the question on your respondents’ ability to actually do a specific task.

With that in mind, you can create a more focused question such as:

“After completing the training, I can process an online order easily.”

This question is focused on the area of online order processing.  If you get a low rating, your team will know to focus on that specific area for improvement.

Managing Survey Length

The downside of focused questions is that they tend to proliferate and make your survey too long.  This is why it’s critical to be clear about how you will use the survey to make decisions.  It’s worth the time and effort to discuss and agree on what decision you are making and what information you will need to make it.  That alone will eliminate the “nice to know” questions that yield interesting results but don’t really help you make improvements.

Developing a Survey Strategy

If you’re using your survey results to make decisions, and you don’t want to overwhelm your respondents with a long survey, the best thing to do is develop a survey plan.

  1. Identify the decision that you are making.
  2. Identify the data you will need to support your decision.
  3. Develop survey objectives around your decision.
  4. Create ALL the questions that will give you the information you need.
  5. Break those questions out into separate surveys that your respondents can take at different stages of the process.
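As a minimal sketch, the five steps above could be captured in a simple plan structure.  The decision, data points and question wording here are invented for illustration:

```python
# Illustrative survey plan following the five steps above.
# All content (decision, data, questions, stages) is made up.
survey_plan = {
    "decision": "Should we launch product X at price Y?",              # step 1
    "data_needed": ["purchase intent at price Y",                      # step 2
                    "target segment size"],
    "objectives": ["Measure likelihood to purchase product X"],        # step 3
    "questions": [                                                     # step 4
        {"text": "How likely are you to purchase product X at price Y?",
         "stage": "concept test"},
        {"text": "After using the trial, how likely are you to subscribe?",
         "stage": "post-trial"},
    ],
}

def split_by_stage(plan):
    """Step 5: break the full question list into separate surveys,
    keyed by the stage at which respondents should see them."""
    surveys = {}
    for q in plan["questions"]:
        surveys.setdefault(q["stage"], []).append(q["text"])
    return surveys

print(split_by_stage(survey_plan))
```

Splitting by stage keeps each individual survey short while still collecting everything the decision requires.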

Focusing your surveys on decisions and actionable steps will not only yield better results; it will also make your team more efficient and your respondents more responsive, because they can see their feedback in action.


Questionnaire Length: The Long and Short of Participant Engagement

Survey length in online interviews continues to be a bone of contention among people all along the marketing research continuum. While some within the industry recommend keeping questionnaires short to promote participant engagement, others often dismiss this advice. Short questionnaires, they assert, won’t unearth the depth and breadth of information that is needed.

As a sample provider offering guidance based on research, SSI suggests keeping online interview length to 20 minutes or less. Generally, as interview length increases, fatigue also increases and attention span decreases, potentially damaging data integrity. In effect, researchers who insist on longer questionnaires, sincerely believing they’ll get more information, may in actuality be sabotaging their efforts.

Research Design

In order to more fully understand the possible effects of survey length and fatigue on response quality, SSI recently fielded two surveys: one long and one short. This study replicated a ground-breaking study conducted in 2004 by Sandra Rathod and Andrea la Bruna, which concluded, among other things, that data quality suffers as interview length increases.

The surveys, both then and now, used a block design: four blocks of questions, each covering a different subject matter. The blocks were randomized for each respondent, so the effect of survey length on response quality could be investigated by checking whether response patterns changed as a block’s position in the survey varied.
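A randomized block design of this kind can be sketched in a few lines.  The four block names are placeholders, and seeding by respondent id is one simple way to make the assignment reproducible (an assumption for illustration, not necessarily what SSI did):

```python
# Sketch of a randomized block design: four subject-matter blocks
# shuffled independently for each respondent, so block POSITION
# (not content) can later be analyzed for fatigue effects.
import random

BLOCKS = ["autos", "groceries", "travel", "media"]  # placeholder names

def assign_block_order(respondent_id, blocks=BLOCKS):
    """Return a per-respondent random ordering of the question blocks.
    Seeding by respondent id makes the assignment reproducible."""
    rng = random.Random(respondent_id)
    order = list(blocks)
    rng.shuffle(order)
    return order

# Each respondent sees the same four blocks, in a different order.
print(assign_block_order(1))
print(assign_block_order(2))
```

With enough respondents, every block appears in every position, which is what lets completion time be compared by position rather than by topic.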

Fatigue Effects

One of the hypotheses of the 2004 study was that respondents would take less time and exert less effort later in the questionnaire than earlier, due to fatigue. This hypothesis was confirmed in 2004: as the same block of questions was moved further back in the survey, the time taken to complete it gradually decreased.

It could be argued that the decrease in block completion time was due to increased familiarity with the question set. It is true that the question blocks were similar in their construction and contained somewhat similar questions. However, additional evidence on panelist fatigue shows that at least some of the increased speed was due to fatigue.

Panelist Fatigue and Satisficing

One of the behavioral outcomes of cognitive fatigue is satisficing – doing just enough work to satisfy the task. To see if this behavior was present in the 2004 study, researchers looked at a question in each block that could be skipped. This question offered a set of scales presented as sliders. Each slider bar started at the mid-point, so it was possible to click “next” without moving the slider and still leave some data behind. The likelihood of skipping the question rose the further into the questionnaire it was encountered.
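The satisficing signal described here – a slider never moved off its mid-point default – is straightforward to measure once responses are collected.  The data layout and numbers below are hypothetical:

```python
# Illustrative check for satisficing: a slider left at its default
# mid-point suggests the respondent clicked "next" without engaging.
# Response records and values are made up for the example.
MIDPOINT = 50  # slider default position

def skip_rate(responses, question_key):
    """Share of respondents whose slider for `question_key` was
    never moved off the mid-point default."""
    skipped = sum(1 for r in responses if r[question_key] == MIDPOINT)
    return skipped / len(responses)

# Hypothetical data: same slider question, seen early vs. late.
block1 = [{"q_slider": 72}, {"q_slider": 50}, {"q_slider": 31}, {"q_slider": 50}]
block4 = [{"q_slider": 50}, {"q_slider": 50}, {"q_slider": 50}, {"q_slider": 44}]

print(skip_rate(block1, "q_slider"))  # earlier position: fewer defaults
print(skip_rate(block4, "q_slider"))  # later position: more defaults
```

Comparing this rate across block positions is the kind of evidence the studies used to separate fatigue from mere familiarity.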

In 2009, SSI found precisely the same pattern. The first time the skippable question was encountered, it was more likely to be completed than on subsequent occasions. This was particularly true for the long survey. A reduction in elapsed survey time did not mitigate the effect. The long survey, at nearly 25 minutes, was still too long.


In both 2004 and 2009, the long survey proved itself too long. It fatigued respondents and led to satisficing behavior. When questions could legitimately be skipped, they were. Perhaps the most unsettling finding was that instances of cheating – deliberately giving a false answer in order to skip an entire section – also increased as the survey progressed.

Following the 2004 study, researchers indicated that there is a “critical point in online survey response when the fatigue effects become significantly more pronounced. That critical time is around the 20 minute mark. If researchers work to keep surveys shorter, it will not only help ensure response quality, but it will also make for more motivated and responsive respondents.”

Today’s research confirms that interview lengths of 20 minutes or less can produce wonderful and engaged responses if well designed. The fact that there was much less satisficing and cheating in the short survey attests to this. If researchers work to keep surveys shorter, it will not only help ensure response quality, but it will also make for more motivated and engaged participants.

Note: For a copy of the white paper Questionnaire Length, Fatigue Effects and Response Quality Revisited, with the results of the studies cited in this article and to share your comments, go to

About the Author: Pete Cape is Global Knowledge Director for Survey Sampling International. SSI provides access to more than 6 million research respondents in 72 countries. Sources include SSI proprietary panel communities in 27 countries and a portfolio of managed affiliates. SSI can potentially access anyone online to give their opinions via a network of relationships with websites, panels, communities and social media groups.
