Guide to Writing Survey Questions

Things to think about before you start

Writing survey questions is easier if you think through a few things before you begin constructing questions. In general, you should first clearly outline the information you need to collect. Don't worry about the how until you have defined the what.

First, think about:

  • What information is necessary? At the end of the process, what data do you hope to have? What information is necessary for a decision? Separate need-to-know information from nice-to-know information.
  • What will you do with the information? Who will see it? How much detail will they want?
  • Who will receive the survey? Define the population.
  • How much time is available to analyze the results? How will you code the information you receive?
  • How will you distribute the survey? How can you reach the most people at one time?
  • How will respondents benefit from completing the survey?

Design in relation to analysis

In addition to the actual construction of questions, it is important to consider options for question formatting. In general, scales should be balanced; there should be an equal number of good and bad options. Avoid open-ended questions if you cannot clearly identify their usefulness (that is, what you will do with the information you collect), and minimize ranking of individual items. Instead of leaving a blank space for the reply, consider listing possible choices. For example, if you ask respondents what type of car they drive, give them a list of possible options plus an "other" choice with a blank, rather than just a blank line. This will speed up analysis and tabulation.
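
To illustrate why listed choices speed up tabulation, here is a minimal sketch in Python (purely illustrative, since the survey itself involves no code). It assumes the car-type answers have already been collected into a list of strings; the choice labels and the "Other" entries shown are hypothetical.

    from collections import Counter

    # Hypothetical answers to "What type of car do you drive?" collected from
    # a checklist with an "Other: ___" blank. Labels are illustrative only.
    responses = ["Sedan", "SUV", "Sedan", "Truck", "Other: electric scooter", "SUV"]

    # Checklist answers tabulate directly; free-text "Other" entries are
    # collapsed into one category and kept aside for manual review.
    tallies = Counter("Other" if r.startswith("Other:") else r for r in responses)
    other_comments = [r.split(":", 1)[1].strip() for r in responses if r.startswith("Other:")]

    for choice, count in tallies.most_common():
        print(f"{choice}: {count}")
    print("Other comments to review:", other_comments)

If the answers had come back as free text instead, each one would have to be read and categorized by hand before any counting could start.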

Some argue that all questions should be unforced; that is, on a scale from agree to disagree, a neutral middle option is provided. There may be cases, however, where a forced choice is necessary, particularly in policy areas where you want opinions. To determine whether a forced or unforced scale should be used, ask yourself: if all the respondents chose "neutral," what would that tell me, and could I make a decision based on that information?

The following are general principles to keep in mind:

  • It is easier to compile survey data when checklists of answers are provided on the form. However, provide space for other comments.
  • Checklist surveys are easier to fill out and raise response rates.
  • Open-ended questions take longer to answer and analyze, but they usually provide richer information.
  • Consider the use of a hybrid - mostly checklists, with one or two open-ended questions. This way you can get clear-cut answers along with the beliefs and feelings behind them.
  • Don't get carried away with scales, particularly those that are sets of numbers with end labels. Usually a five-point scale is sufficient. Beyond that, people have trouble defining the points. For example, what is a 5 on a scale of 1 to 12?
  • Scales should be meaningful. For example, don't ask respondents to differentiate between small increments of time (for example, 1-2 minutes or 2-3 minutes).
  • You can make safer generalizations from closed-ended data.
  • Consider whether you want to develop a coding system before the survey goes out or whether you want to develop one based on the responses you receive (see the sketch after this list).
  • In situations where you are measuring quality, satisfaction, value, or any other subjective characteristic, do not rely on the respondent for definitions. Rather, ask about specifics. For example, rather than asking a respondent to rate the value of a product, you might ask about factors that constitute value, such as reliability, performance, ease of use, and price.
  • Try to end with an uplifting question, perhaps an open-ended question asking for their view (but only if you will use the data they provide).
  • Consider whether a more or less personalized style will produce better results.
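
As a companion to the coding-system bullet above, the following sketch (again in Python, and again purely illustrative) shows one way to code open-ended responses after they come back. The codebook categories, keywords, and sample answers are hypothetical; in practice you would build the codebook by reading a sample of the actual responses.

    # Hypothetical codebook drafted after reading a sample of open-ended answers.
    # A code is assigned whenever one of its keywords appears in the response.
    codebook = {
        1: ("Timeliness", ["slow", "wait", "delay"]),
        2: ("Staff courtesy", ["rude", "friendly", "helpful"]),
        3: ("Cost", ["expensive", "price", "fee"]),
    }

    def code_response(text):
        """Return the list of codes whose keywords appear in a response."""
        lowered = text.lower()
        codes = [code for code, (_, keywords) in codebook.items()
                 if any(word in lowered for word in keywords)]
        return codes or [0]  # 0 = uncoded; flag for manual review

    responses = [
        "The staff were friendly but I had to wait a long time.",
        "Fees are too expensive for what you get.",
    ]

    for text in responses:
        print(code_response(text), "-", text)

Deciding the codes in advance simplifies this step but risks missing themes; coding from the responses captures more but takes longer, which is the trade-off the bullet above asks you to weigh.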

Question construction

Once you have defined your data and analysis needs, you may begin writing questions. Every question should have a purpose. If you cannot clearly identify how the information will be used, don't bother the respondent with the request. Keep things simple, concise, and clear.

General guidelines include:

  • Don't ask a question if the answer is obvious. For example, "How would no change in the cost of raw materials affect your production?"
  • Avoid abbreviations and jargon. If they must be used, clearly define them.
  • Ask yourself whether several questions are actually necessary or if you can get the information in one question. Don't try to cram too much into one question.
  • Make your questions easy to understand. Make sure your sample population understands them.
  • Consider whether respondents will have the information to answer your questions. Is it readily available? Will they know the answers? Will they have to research? Remember, if they have to look it up, they'll probably skip the question or throw the survey out.
  • Avoid misleading or biased questions. (More information is in the section "Red flag words and value-laden questions" below.)
  • Consider whether respondents will willingly provide the information. How personal is it? In cases where you need to collect very personal information, for example, HIV infection status, repeat your policy on data practices.
  • If a list of answers is provided, make sure all possible answers are present. Even with yes and no questions, it may be necessary to include a neutral "undecided" or "don't know" option.
  • Start a sequence with the question that is most comfortable to answer. This focuses the respondent.
  • Don't mix "I feel" or "I think" questions with questions regarding facts. Keep factual and perception questions in separate groupings.
  • Place sensitive demographic questions (such as age or income) at the end of the survey.

Red flag words and value-laden questions

Sometimes it takes just one word to bias a question. Avoid using inflammatory words in surveys, such as: allege, allude, arbitrary, blame, claim, demand, error, failure, fault, ignore, ill-advised, ill-informed, incompetence, ineptness, insist, just, maintain, misinformed, must, neglected, one-sided, only, overreact, peremptory, purport, questionable, rejection, rigid, so-called, unfortunately, unilateral, unreasonable.

Value-laden questions, especially those that attempt to be global in scope, tend to overwhelm respondents. For example, making respondents choose between a healthy environment and a vital economy will probably bias results. Don't distill complex issues into black or white scenarios. Rather, explore the gray areas.

Questions to ask about questions

After you have designed your questions, take another look at them and think about the following:

  • Is the question relevant? Is it consistent with survey goals?
  • Does the question ask for need-to-know or nice-to-know information?
  • What will be the value of a response? If 95 percent say "yes," would this affect decision making?
  • Might the question elicit a vague answer? Make sure you ask directly for the information.
  • Will respondents be able to answer the question? Will they have the information?
  • Does the question lead to a particular response? (Is it a leading question?)
  • If a set of answers is provided, are all possible answers listed? Is one side of the issue represented more than another?
  • Does the question use negative phrases or words?
  • Are positive adjectives or phrases used?
  • If a scale is used for responses, is it balanced (for example, 1 to 5, with 3 being neutral)?
  • Might the question antagonize the respondent?
  • Are dead giveaway words used, such as all, every, or always?
  • Are too many demographic questions asked?
  • Is potentially offensive language used?
  • Is the question wordy?
  • Are ambiguous words used - words with more than one meaning?
  • Is the question worded simply?
  • Are abbreviations used?
  • Does the question contain technical terms or jargon?
  • Have adjectives been quantified and/or clearly defined?
  • Does each question ask for one piece of information?
  • Have multiple negatives been used?
  • Does the question presume a previous situation or state of affairs?
  • If responses are provided, are they mutually exclusive?

Don't forget clear instructions and explanations

Always provide a good explanation of why you are conducting the survey and how the information will be used, either at the top of the survey, in a cover letter or e-mail invitation, or both. Make your instructions short and to the point. Let respondents know why it is important that they complete the survey and thank them for their cooperation. Be friendly in your appeal.

Provide directions for completing the survey, including a Tennessen warning explaining how the data will be protected and used, and an informed consent form if needed.

If you use jargon, abbreviations, or acronyms (which should be limited, if possible), you may want to define the terms in a separate section at the front of the survey.

Provide a due date. Give yourself about a week's leeway (that is, have respondents complete the survey a week before you need to analyze the data). Fridays make good due dates.

An ounce of prevention …

Always have someone else look at your work. Even a small pre-test can help. The best option is to let part of your sample population take the survey. If nothing else, show it to a couple of people and get their input. Proofread a second time.

Increasing response rates

There's more to a good survey than the phrasing of its questions. To maximize response rates, consider the following:

  • Make the survey brief (no more than 15 minutes to complete).
  • Make the survey look professional and esthetically pleasing.
  • Provide reasonable incentives or rewards if possible (and appropriate).
  • Review the survey population list (e-mail addresses/mailing addresses, etc.) for accuracy.
  • Assure privacy or confidentiality if you can, but carefully consider data practices laws (most government data is considered public information).
  • Sometimes it helps to have the survey endorsed by someone, such as an agency commissioner.
  • Describe how the survey results will directly affect respondents (improving services they will receive, etc.).
  • Don't try to make the survey more than it is. If it's not scientific research, don't intimidate respondents by making it seem so.
  • Use pre-letters to announce the survey.
  • Develop a follow-up system for non-respondents, such as phone calls, reminder postcards or follow-up e-mails.

Surveys are not focus groups

Surveys allow you to collect information in a consistent manner from a large number of people, but are limited in terms of collecting in-depth data or information that requires follow-up. With that in mind, you may want to start your analysis with a survey to achieve breadth and follow up with focus groups. You may also want to use a focus group before conducting a survey to ensure you are asking questions that make sense to your target population. For information about writing focus group questions, go to Guide to Writing Focus Group Questions.
