Loyola Marymount University

A well-designed survey can be a powerful tool. This resource highlights five common steps in good survey design and can serve as a guide when designing your own surveys.
Step 1: Identify the objectives of your survey

Before designing your survey, it is essential that you first identify your objectives, or the reason why you are conducting the survey. The following questions will help you to clarify the objectives of your survey. Consider and make note of the answers to these questions as you begin to develop your survey:
1. What are you trying to learn from the survey results? Having a clear understanding of the purpose of your survey will help you identify the type of information you must collect in order to meet your objectives.
2. Who is the target population, i.e., who will you be surveying? Identifying the characteristics of your population (e.g., education level, age range) will help you to determine the type of information that you will be able to collect from them. It will also help you to understand what factors, if any, may influence their responses to your questions.
3. Who is your audience, i.e., who will use the information from your survey? Determining the type of information your audience (e.g., decision-makers, the academic community) is looking for will help you to identify the questions that will address their needs.
4. How will the information be used? Understanding the way in which the information you collect will be used also helps to identify what questions to include. For instance, if you need to make comparisons between different groups of people, you must include a question that will help you group them appropriately.
Step 2: Write high-quality questions based on the objectives

The questions that you include on your survey should always be guided by your objectives. This will help to ensure that you gather quality data and are able to address both your needs and the needs of your audience. Gathering quality data also depends on the quality of the questions that you have constructed. Good questions have the following characteristics:
1. Clear and unambiguous. Here are some ways to achieve this:
a. Use simple language. Not all respondents will be familiar with complex terminology. To reduce confusion or misunderstanding, use the simplest language available to you. If your objectives for the survey require you to use a more complex term, define it for the respondent. Consider the following:
Poor Example: What is the frequency of your use of the computers in the William H. Hannon Library in the past 7 days?
“Frequency” is not a term that is commonly used by the average respondent. The question is better phrased as:
Better Example: How many times did you use the computers in the William H. Hannon Library in the past 7 days?
b. Be specific. Your questions should be precise enough that the respondent is able to identify what the question is referring to, without being overly wordy. Consider the following example:
Poor Example: Did you vote in the last election?
This question doesn’t specify the type or date of the election. A more specific question can be phrased as:
Better Example: Did you vote in the November 2008 presidential election?
c. Avoid double-barreled questions. Double-barreled questions include multiple parts but ask for a single answer. Survey questions should focus on one issue at a time. Consider the following opinion question:
Poor Example: To what extent did your instructor address your concerns and questions in class?
Use of the term “and” can be an indication of a double-barreled question. In this case, addressing a student’s concerns and answering a student’s questions are two separate things, and they should be developed into two separate questions:
Better Example: To what extent did your instructor address your concerns in class?
To what extent did your instructor answer your questions in class?
d. Avoid double negatives. Respondents can become confused when reading questions and responses that contain negative words, such as “not,” “no,” or “didn’t.” Reduce confusion by minimizing the use of negative words in the body of the question, as in the following examples:
Poor Example: I did not participate in the community service program last semester.
A negative response to this statement would indicate that the respondent did participate, while a positive response would indicate that the respondent did not participate. It is less confusing to remove the negative phrasing from the question:
Better Example: Did you participate in the community service program last semester?
2. Concise and to the point. A respondent’s time is precious. Long questions with unnecessary details take up too much time and can cause respondents to lose interest. On the other hand, your questions should also be specific enough to be clear and unambiguous. To achieve this, write questions that get to the point as quickly as possible in as few words as possible. Consider the following:
Poor Example: During an average week of the semester, what amount of time, in hours, do you devote to preparation for the next class, whether in reviewing notes, reading course material, or discussion with other students?
This question is so long and detailed that a respondent may miss its point entirely or become frustrated. It can be shortened while still capturing the objective of the overall question:
Better Example: In a typical week this semester, how many hours did you spend preparing for class?
3. Free of bias or leading statements. If questions are biased or leading in any way, they will steer respondents toward the response that is considered socially desirable. This can occur either in the question itself or by limiting the types of response options that you provide for the respondent. Consider the following example:
www.lmu.edu/surveys | Survey Design
Poor Example: Tutoring services have been shown to improve college GPA. Do you plan on participating in LMU’s new tutoring program this semester?
This question pushes the respondent toward a positive response by implying that participating in the program would be the responsible choice. Develop questions that are neutral or that present both sides of an argument equally. In this case, simply remove the first sentence of the question:
Better Example: Do you plan on participating in LMU’s new tutoring program this semester?
4. Avoid or minimize sensitive topics. Asking respondents about sensitive topics can make them uncomfortable or embarrassed and should be avoided. Certain types of questions, however, are sensitive to some respondents but necessary for the objectives of your survey. For instance, many demographic questions, such as race, income, and education level, are of a private nature, and not all respondents will wish to divulge this information. In these cases, place the questions at the end of the survey and provide a response option that allows refusal, such as “Decline to state.” There may also be times when the very nature of your research question is a sensitive topic, for example, substance abuse. In these cases, to minimize the impact your questions may have on a respondent, provide information on relevant resources and seek the approval of the IRB.
Step 3: Determine the format of the response options

There are a wide variety of response options available to you. The response option you choose, however, should always be based on the objectives of the question and survey. Below is a list of common response options and things to consider when using them in your survey.
1. Yes/No options. These options are quick and easy to answer and allow you to generate simple comparisons. Very few answers, however, are ever truly dichotomous, containing no grey area in between. This type of question also provides you with very little depth of understanding. Consider the following:
Have you used ITS services in the last month? __ YES __ NO
A student may have used the service in the last month, but a simple yes/no response will not provide an understanding of the extent of that use. For this reason, yes/no options should only be used on rare occasions. One such occasion is for contingency questions. This type of question first prompts the respondent with a yes/no question to determine whether subsequent questions apply.
2. Multiple choice options. These options provide a fixed set of answers to choose from and can be designed to allow the respondent to select either one or multiple response options. They are often quick and easy to answer and can be good for collecting factual information. One disadvantage, however, is that by providing fixed answers, you may miss an important factor that you failed to otherwise consider. If you decide to use multiple choice options, keep the following in mind:
a. Response options should be mutually exclusive. All of the options you include should be clearly distinct from one another, with no overlap. If the options are not distinct, respondents will not know which option to select. Consider the following:
Poor Example: How many hours did you study for this class in the last week?
__ 0 hours __ 1-3 hours __ 3-6 hours __ More than 6 hours
Which option would a student select if they studied 3 hours? Recategorize the options to be mutually exclusive:
Better Example: __ Less than an hour __ 1-3 hours __ 4-6 hours __ More than 6 hours
b. Capture all possible responses. Strive to include all responses while using the smallest number of categories possible. The “other” category is useful to capture options that you may have overlooked. Include it at the end of the list of response options. Consider the following:
Poor Example: What is the highest level of education your mother completed?
__ Graduated from high school
__ Completed bachelor’s degree (B.A., B.S., etc.)
__ Completed master’s degree (M.A., M.S., etc.)
__ Completed doctorate degree (M.D., J.D., Ph.D., etc.)
This question provides no option for those who did not complete high school, those who attended college but did not complete a degree, or those who completed an associate’s degree. These options are common enough that they should be included in the response options themselves and not simply through the use of an “other” category:
Better Example: __ Did not complete high school
__ Graduated from high school
__ Attended college but did not complete degree
__ Completed associate’s degree (A.A., A.S., etc.)
__ Completed bachelor’s degree (B.A., B.S., etc.)
__ Completed master’s degree (M.A., M.S., etc.)
__ Completed doctorate degree (M.D., J.D., Ph.D., etc.)
c. Arrange the options logically. Arranging the options logically will reduce confusion and the chance that respondents may overlook an option. For instance, when using ordered categories, place the options in increasing order. Consider the following:
Poor Example: What is your current class year?
__ Graduate __ Freshman __ Senior __ Junior __ Sophomore
It can take a respondent longer to navigate through this question because the response options are not in a logical sequence. Place the options in increasing order:
Better Example: __ Freshman __ Sophomore __ Junior __ Senior __ Graduate
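The rules above for multiple choice options can even be checked mechanically before a survey goes out. The following Python sketch is purely illustrative (the bin boundaries and function name are not part of this guide); it verifies that a set of numeric answer bins, mirroring the "hours studied" Better Example, is mutually exclusive and covers every plausible answer:

```python
# Illustrative check that "hours studied" answer bins are mutually exclusive
# and exhaustive. Bins are half-open [low, high) ranges, mirroring the
# Better Example: less than 1 hour, 1-3 hours, 4-6 hours, more than 6 hours.
BINS = [(0, 1), (1, 4), (4, 7), (7, float("inf"))]

def find_bin(hours, bins=BINS):
    """Return the index of the single bin containing `hours` (None if uncovered)."""
    matches = [i for i, (low, high) in enumerate(bins) if low <= hours < high]
    if len(matches) > 1:
        raise ValueError(f"{hours} falls in more than one bin: options overlap")
    return matches[0] if matches else None

# Every plausible answer should land in exactly one bin.
for hours in (0, 0.5, 1, 3, 3.5, 6, 6.9, 40):
    assert find_bin(hours) is not None, f"{hours} hours is not covered by any bin"
```

A check like this would flag the Poor Example above, where an answer of 3 hours falls in two overlapping bins.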
3. Likert scales. These response options are used when measuring opinion-based questions. Respondents are asked to rate their preferences, attitudes, or subjective feelings on a scale. One benefit of using this type of scale is that many respondents are familiar with the format and will find it easy to complete. It can also provide you with a good deal of information. If you decide to use Likert scale response options, keep the following in mind:
a. Use a 5-6 point scale. The typical Likert scale includes 5-6 points, with the 6th point reserved for the “don’t know” response. A smaller scale runs the risk of not capturing the respondent’s choice, while a larger scale can lose meaning as the difference between one point and another is minimized. Consider the following:
Poor Example: On a scale from 1-100, 1 being “strongly disagree” and 100 being “strongly agree,” rate your level of agreement or disagreement.
This scale is so large that the respondent will not consider the difference between adjacent points to be significant. Here is a good example of a 6-point Likert scale:
Better Example:
1 – Strongly Disagree
2 – Disagree
3 – Neither Agree nor Disagree
4 – Agree
5 – Strongly Agree
6 – Don’t Know
b. Describe each point. Provide a description for each point on the scale. Unlabeled numbers on a scale can have different meanings to respondents. Labeling the scale eliminates the possibility of misunderstanding, as the choices are explicitly stated.
c. Consider a Neutral Category. Neutral types of responses are placed in the middle of the scale and indicate familiarity with the topic but no opinion one way or the other. Keep in mind that by not including this option, you will force respondents to have an opinion on the topic. Note that the “don’t know” option is not equivalent to neutral options. “Don’t know” indicates that the respondent is unfamiliar with the topic or refuses to answer.
d. Arrange the options logically. As with multiple choice options, the options for Likert scales should be presented in a logical order. After you select the ordering of response options, be consistent in your ordering for all similar questions in the survey.
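When Likert responses are later analyzed, the “don’t know” point must be kept out of any averaging, since it is not part of the agreement continuum. Here is a minimal Python sketch of that idea; the function name and sample data are hypothetical, and the labels follow the 6-point scale described above:

```python
# Illustrative coding of a 6-point Likert item, where point 6 is "Don't Know".
LABELS = {
    1: "Strongly Disagree",
    2: "Disagree",
    3: "Neither Agree nor Disagree",
    4: "Agree",
    5: "Strongly Agree",
    6: "Don't Know",
}

def mean_agreement(responses):
    """Average the substantive 1-5 points, excluding "Don't Know" (6)."""
    scored = [r for r in responses if 1 <= r <= 5]
    return sum(scored) / len(scored) if scored else None

sample = [5, 4, 6, 3, 4, 6, 5]   # two respondents answered "Don't Know"
print(mean_agreement(sample))    # 4.2 -- averages only the five 1-5 answers
```

Treating “don’t know” as a 6 in the average would badly inflate the result, which is one practical reason to keep it distinct from the neutral midpoint.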
4. Open-ended responses. Open-ended responses allow the respondent to provide their own free-form answer. For example:
What aspect of this course was most beneficial to your learning? ________________________________________________
These types of responses can provide you with a rich source of information. They are most useful when there are many possible responses to a question or when you wish to probe deeper into an issue. Despite these advantages, open-ended responses are time-consuming. It will take longer for the respondent to answer these types of questions, and it will take you longer to read and analyze the responses. For this reason, open-ended questions are typically used sparingly in surveys.
5. Alternative responses. Several alternative response options are commonly used in surveys to allow the respondent to essentially “opt out” of answering or provide their own answer to a question. There is no one right answer when deciding whether to include these responses. Use your best judgment, based on your objectives and knowledge of the survey population. Here are some issues to consider when deciding whether to use some of the most common alternative responses:
a. “Don’t Know” and “Not Applicable.” These alternatives allow a respondent to opt out of answering a question if they are not familiar with the topic or the topic does not apply to them. Consider the following when deciding to use these alternatives:
i. If a respondent is not familiar with the topic or if the topic does not apply to them, they should not be forced to provide a response. Including these alternatives will allow you to capture their response accurately.
ii. On the other hand, you can never be certain why a respondent selected these alternatives. It could be that the respondent was not familiar with the topic of interest, did not understand the question, or simply selected this option to quickly finish the survey. The only thing that you can be certain of is that the respondent decided to opt out of the question.
b. “Decline to State.” Providing this alternative allows respondents to formally refuse a question. Consider the following when deciding to use it:
i. Certain question topics can be sensitive to some individuals. For instance, not all respondents will be comfortable providing their income on a survey. In these cases, using the “decline to state” option will help to reduce the chance that a respondent will become upset by the question and abandon the survey altogether.
ii. As with the “don’t know” and “not applicable” alternatives, you can never be certain why a respondent selected this option, only that they decided to opt out of the question.
c. “Other.” This alternative response allows the respondent to provide their own answer to a question if their answer does not fall under one of the provided response options. It can be extremely beneficial when you are uncertain as to whether you have captured all possible responses. Similar to open-ended responses, this alternative can be time-consuming. Consider limiting its use to those instances where you believe important information would otherwise be missed. Here is an example of one such instance:
What types of professional communications organizations do you belong to? (Mark all that apply)
__ International Communication Association
__ National Communication Association
__ Public Relations Society of America
__ Society of Professional Journalists
__ Other(s), please specify ________________________________________________
In this example, a Communication Studies program would like to know how their alumni stay connected to the field after graduation. They included a list of common associations but provided an “other” response for less common associations. This enables them to provide respondents with common options, while not overloading them with a long list of choices.
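Tallying a “mark all that apply” question with an “other” write-in is straightforward: write-ins are simply counted under their own labels. The Python sketch below is hypothetical (the response rows are invented; the organization names follow the example above):

```python
# Illustrative tally of a "mark all that apply" question with an "Other" write-in.
from collections import Counter

responses = [
    {"choices": ["National Communication Association"], "other": ""},
    {"choices": ["Public Relations Society of America",
                 "Society of Professional Journalists"], "other": ""},
    {"choices": ["Other(s)"], "other": "Broadcast Education Association"},
]

tally = Counter()
for response in responses:
    for choice in response["choices"]:
        if choice != "Other(s)":          # the marker itself is not an answer
            tally[choice] += 1
    if response["other"]:
        tally[response["other"]] += 1     # count the write-in under its own label

print(tally)
```

If a particular write-in appears frequently in the tally, that is a signal it should be promoted to a fixed response option the next time the survey runs.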
Step 4: Format the survey

The format of your survey can have a great impact on your response rate. Poorly organized surveys run the risk of respondents losing interest, becoming confused, or refusing to participate. Here are some tips for formatting an effective survey:
1. Begin with an introduction. Provide the respondent with a brief introduction to the survey. This is your opportunity to convince respondents that participating in your survey is worth their time and effort. To do this, include the following in the introduction:
a. Title. This seems obvious, but it can be easily forgotten when developing the survey.
b. Topic. Inform the respondent of the topic of the survey, unless this is contrary to the objectives of your study. This can be as short as a sentence. You may also want to provide a short explanation of how their participation will help.
c. Voluntary & Confidentiality. Assure respondents that their participation is completely voluntary and that their responses will remain confidential. If your survey is also anonymous, inform respondents. This will address many of their privacy-related concerns.
d. Sponsor & Contact. Inform respondents of who is conducting the survey and how to contact you with questions or concerns.
2. Logically order & group the questions. Group questions by topic and place these groupings in a logical order. Your initial questions are critical to ensuring continued participation. Questions that are intriguing, easy to answer, and impersonal are best. Be sure that the grouping and order of the remaining questions have a natural flow. Reserve the last questions in your survey for demographic and sensitive topics. Placing these at the end and providing an alternative response option, for example “decline to state,” reduces the chance of respondents refusing to participate in the entire survey.
3. Keep it short. Shorter surveys are more likely to be completed by respondents, as they require less of a time commitment. The ideal survey is short while still capturing all of the information necessary to meet its objectives. After drafting your survey, review it for unnecessary words, questions that duplicate information, and questions that measure topics not relevant to your objectives. Once you have removed the unnecessary elements, consider the following:
a. Use multiple pages. Listing all questions of an online survey on one page gives the appearance of an excessively long survey, increasing the chances that the respondent will abandon it altogether.
b. Use contingency questions. Contingency questions prompt the respondent with a preliminary question to determine whether subsequent questions will apply. Respondents will not have to weed through questions that do not apply to them, reducing the amount of time they must devote to the survey. For instance, when surveying alumni on employment experience, first ask a contingency question similar to the following:
What is your current employment status?
__ Employed full-time
__ Employed part-time
__ Not employed, seeking employment
__ Not employed, not seeking employment
Asking a contingency question like this one allows you to design specific follow-up questions for particular response options. In this case, you may want to ask additional employment questions of those respondents who are employed full-time or part-time. Respondents who are not currently employed will be able to skip this series of questions, as the initial contingency question determined that they do not apply.
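In online survey tools this routing is usually configured as “skip logic,” and conceptually it is a simple branch. Here is a hypothetical Python sketch of the employment example; the section names are illustrative, not part of any particular survey tool:

```python
# Illustrative skip logic for the employment contingency question above.
EMPLOYED = {"Employed full-time", "Employed part-time"}

def next_section(employment_status):
    """Route respondents: only employed respondents see the follow-up questions."""
    if employment_status in EMPLOYED:
        return "employment details"    # e.g., job field, hours, satisfaction
    return "closing section"           # everyone else skips the employment block

print(next_section("Employed part-time"))                  # employment details
print(next_section("Not employed, seeking employment"))    # closing section
```

Note that the branch works only because the contingency question's options are mutually exclusive and exhaustive; every possible answer maps to exactly one path.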
c. Use a progress bar. Including a progress bar to indicate the number or percentage of questions remaining will give the respondent a sense of the length of the survey and the amount of time remaining after each page.
4. Close with a thank you. Always remember to thank respondents for their participation at the end of the survey. You may also wish to provide them with your website and remind them of your contact information for their questions and concerns.
Step 5: Pilot test the survey and make any necessary changes

Even the best survey designer is bound to write a question that seems clear to them but may be confusing to others. Testing your survey before implementing it on a large scale will help you to improve it so that you are better able to obtain informative and useful data. Here are a few things to keep in mind when conducting a pilot test:
1. Keep it manageable by testing your survey on a small group of people.
2. It is always best to test your survey on individuals who are similar to your actual survey population (e.g., for student surveys, test on a small group of students).
3. After participants complete the survey, ask them to provide feedback on the clarity of questions and response options, as well as the length of time it took them to complete the survey.
4. Review the test responses to the survey, looking for any inconsistencies or unexpected answers.
5. Make any necessary changes to the survey before implementing it on a large scale. Consider conducting a second pilot test if extensive changes have been made to the original survey.
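Part of the pilot review, scanning responses for inconsistencies, can be automated by checking each answer against its allowed options. The Python sketch below is hypothetical; the field names, option sets, and sample response are invented for illustration:

```python
# Illustrative scan of pilot-test responses for out-of-range answers.
VALID_OPTIONS = {
    "class_year": {"Freshman", "Sophomore", "Junior", "Senior", "Graduate"},
    "hours_studied": {"Less than an hour", "1-3 hours", "4-6 hours",
                      "More than 6 hours"},
}

def find_problems(response):
    """Return (field, value) pairs where the answer is not an allowed option."""
    return [(field, value) for field, value in response.items()
            if field in VALID_OPTIONS and value not in VALID_OPTIONS[field]]

pilot_response = {"class_year": "Grad student", "hours_studied": "1-3 hours"}
print(find_problems(pilot_response))   # [('class_year', 'Grad student')]
```

A flagged answer like "Grad student" may indicate that a pilot respondent typed a free-form answer where a fixed option was expected, exactly the kind of inconsistency the pilot test is meant to surface.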
Reference

Suskie, L. A. (1996). Questionnaire survey research: What works (2nd ed.). Tallahassee, FL: The Association for Institutional Research.
Questions?

Christine Chavez, M.A.
Associate Director of Survey Research
Xavier Hall, Room 117
310.568.6691
Christine.Chavez@lmu.edu