
Minimizing Bot Responses


Various settings and general techniques can be applied to reduce bot responses. Always start by reviewing the features available in your tool for preventing automated responses; this information can be found in the Survey Design & Functionality section of each tool. The sections below offer settings and strategies for avoiding bot responses. The right combination of platform settings and strategies will help safeguard the integrity of your data.

Configure survey settings

  • Enable CAPTCHA or reCAPTCHA: Use built-in CAPTCHA tools (text-based challenges or image-based reCAPTCHA) to verify that respondents are human.
  • Set time-based completion rules: Set a minimum time required to complete the survey. Bots usually complete surveys much faster than humans.
  • Capture timestamps so you can remove bot responses: look for odd dates and times, or participants who begin and complete the survey at the same moment. If a survey is designed to take at least 5 minutes, responses completed in much less time can be removed.
  • Screener survey: Set up a screener survey prior to recruitment. On the consent page that begins your survey, include a code of conduct, a description of monitoring procedures, and the penalties for fraudulent or untruthful reporting.
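The time-based rules above can be sketched with a short filter over exported timestamps. The record layout, timestamp format, and 5-minute threshold here are assumptions for illustration; adapt them to your platform's export.

```python
from datetime import datetime, timedelta

# Hypothetical export: (respondent_id, start, end) timestamps.
responses = [
    ("r1", "2025-05-01 10:00:00", "2025-05-01 10:07:30"),
    ("r2", "2025-05-01 10:00:00", "2025-05-01 10:00:45"),  # suspiciously fast
    ("r3", "2025-05-01 10:02:00", "2025-05-01 10:02:00"),  # start == end
]

MIN_DURATION = timedelta(minutes=5)  # survey designed to take at least 5 minutes
FMT = "%Y-%m-%d %H:%M:%S"

def keep(row):
    """Keep a response only if it took at least the minimum plausible time."""
    _, start, end = row
    duration = datetime.strptime(end, FMT) - datetime.strptime(start, FMT)
    return duration >= MIN_DURATION

kept = [r[0] for r in responses if keep(r)]
print(kept)  # only r1 survives the time-based rule
```

A rule like this flags candidates for removal; review borderline cases manually rather than discarding them automatically.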

Restrict access

  • Password Protection: Protect the survey with a password, which you can distribute to your intended audience.
  • Unique Links: Provide unique links to each respondent, ensuring each link can only be used once.
  • Limit Access: Limit users based on group/organization/email.
  • Verify information: Require respondents to verify their email addresses or phone/contact before they can complete the survey.
  • Unique invitation link: Send a unique link to their email that they must click to access the survey.

Add questions for ‘humans’

  • Add a Simple Math CAPTCHA:  A simple math problem (e.g., “What is 3 + 4?”) can be an effective deterrent. This can be implemented in most platforms by adding it as a question and filtering responses.
  • Add an absurd attention-check question.
    • For example, ‘Have you ever visited Pluto?’ Any answer other than “No” could be removed.
  • Repeat questions and remove any entries with a different answer to the same question.
    • For example, Q3: “How many hours per week do you spend on research?” and Q15: “On average, how many hours per week do you dedicate to research tasks?” (Check for consistency between answers.)
  • Add an open-ended question that requires a response; bot answers to these questions tend to be similar or generic.
    • For example, ‘What has your experience been accessing ABC?’
  • Require respondents to answer questions that demonstrate insider knowledge. These are useful for closed populations (e.g., researchers, students in a particular department, employees in an organization). Examples:
    • Familiarity-based question: ‘Which of the following is located in the McMaster University Student Centre (MUSC)?’ Options: Pharmasave, The Underground, Chattime, I don’t know
    • Referral source: “Where did you hear about this study?” Include a few insider channels, like specific department emails or research bulletin boards, alongside more general options.
    • Study context clue: “What is the main topic of this study, in your own words?” Open-ended question; bot responses will typically be irrelevant or repetitive.
  • Add an instructional attention check.
    • For example, ‘To show you’re paying attention, please select ‘Strongly Disagree’ for this question.’ Then provide a standard Likert scale with unrelated content.
  • Hidden Fields: Add hidden fields to your survey that real users won’t see or fill out, but bots will. If these fields are filled out, you can discard those responses.
    • For example, add a question that is present in the HTML but hidden from the visual layout (e.g., via CSS). If this field is filled in, you know the response came from a bot.
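The attention-check and hidden-field screens above can be combined into one filtering pass over exported data. The column names (`attention`, `honeypot`) and the expected attention-check answer are assumptions for illustration.

```python
# Hypothetical exported responses with an instructional attention-check
# column and a hidden "honeypot" field that humans never see.
responses = [
    {"id": 1, "attention": "Strongly Disagree", "honeypot": ""},
    {"id": 2, "attention": "Agree", "honeypot": ""},               # failed attention check
    {"id": 3, "attention": "Strongly Disagree", "honeypot": "x"},  # bot filled hidden field
]

def passes_screens(r):
    """A response passes only if the attention check was answered as
    instructed AND the hidden field was left empty."""
    return r["attention"] == "Strongly Disagree" and r["honeypot"] == ""

clean = [r["id"] for r in responses if passes_screens(r)]
print(clean)  # [1]
```

Keeping each screen as a separate condition makes it easy to report how many responses each rule rejected.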

Monitor responses

  • Behavioral Analysis: Analyze patterns in responses (e.g., speed of completion, response consistency) to identify and filter out suspicious entries.
  • Regularly monitor survey responses for suspicious activity and clean your data by removing responses that appear to be from bots.
  • Regularly review submissions for duplicates: addresses, email addresses, names, social media handles, repeated responses, etc.
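A minimal sketch of the duplicate check described above, assuming submissions are exported as records with a contact field (the field names and sample data are hypothetical):

```python
from collections import Counter

# Hypothetical exported submissions.
submissions = [
    {"id": 1, "email": "jo@example.com", "handle": "@jo"},
    {"id": 2, "email": "pat@example.com", "handle": "@pat"},
    {"id": 3, "email": "jo@example.com", "handle": "@jo2"},  # reused email
]

def flag_duplicates(rows, field):
    """Return the ids of all submissions whose value in `field`
    appears more than once (case-insensitive)."""
    counts = Counter(r[field].lower() for r in rows)
    return [r["id"] for r in rows if counts[r[field].lower()] > 1]

print(flag_duplicates(submissions, "email"))  # [1, 3]
```

The same function can be run against names, handles, or any other column; flagged entries should be reviewed by hand before removal.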

A note about using IP addresses to prevent bot responses

Collecting IP addresses in research studies involving human participants raises significant privacy and ethics concerns. By default, IP address collection should be disabled and is not permitted by MREB or HiREB unless explicitly justified and approved in the research ethics protocol.

Using IP addresses to prevent bots is also unreliable—bots can easily bypass IP restrictions using VPNs or other tools. Additionally, IP-based restrictions can create unfair barriers (e.g., households or public spaces with shared devices) and even pose risks in sensitive contexts, such as studies involving intimate partner violence.

If you believe IP collection is necessary for your study, you must consult and obtain approval from the research ethics board to which you are applying.

This resource is produced and maintained by AskResearch, a collaborative network of support units providing accessible and effective digital research services to the McMaster research community. If you can’t find what you are looking for or aren’t sure where to start, contact AskResearch.

Updated: May 2025