
Crawl Times

Learn about the variables that affect a site's crawl time.


The Crawler typically inventories between 750 and 1000 pages per site daily.

Please note that this can vary greatly depending on:

  • the time of day the crawl runs (crawls during business hours are often slower),

  • the size of the documents and files on the site, and

  • the complexity of the site.

A site's web server can also limit DubBot to a certain level of traffic. Because crawl speed depends heavily on how quickly the server serves pages, server-side rate limiting directly affects crawl time.

Crawl settings affect this as well. For example:

  • Processing time or delays may be added to accommodate the site's server settings.

  • Additional dictionaries for multilingual sites add time to spell checks.

  • Crawls that use scripted logins take longer.
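To see how per-page delays compound, here is a back-of-the-envelope estimate. The per-page fetch time and delay values are illustrative assumptions, not DubBot's actual settings; only the page counts come from this article.

```python
def estimated_crawl_hours(pages, fetch_seconds=2.0, delay_seconds=1.0):
    """Estimate raw fetch duration for `pages` pages, given an assumed
    average fetch time plus an added per-page politeness delay."""
    return pages * (fetch_seconds + delay_seconds) / 3600

# At the article's upper bound of 1,000 pages per day and an assumed
# ~3 seconds per page, fetching alone takes under an hour; checks,
# dashboards, and reports add time on top of that.
print(round(estimated_crawl_hours(1000), 2))
```

Doubling the per-page delay roughly doubles the fetch portion of the crawl, which is why server-imposed throttling has such an outsized effect on total crawl time.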

It is important to remember that a Crawl involves more than gathering inventory. Before a Site shows (done) next to its Latest Crawl date, the following steps must be completed:

  • HTML pages, including their images and links, are gathered in the site inventory.

  • PDF files (if enabled) are gathered in the inventory.

  • All configured checks are run on each page and file.

  • The Site’s Dashboard information is built.

  • Standard and custom reports are generated.

  • Information is shared with relevant Page Sets.
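The steps above run in sequence, which is why the (done) marker can appear well after page gathering finishes. A minimal sketch of that idea, with stage names paraphrased from this article (they are not DubBot's internal API):

```python
# Illustrative stages of a crawl, paraphrased from the list above.
CRAWL_STAGES = [
    "inventory HTML pages, images, and links",
    "inventory PDF files (if enabled)",
    "run all configured checks on each page and file",
    "build the Site's Dashboard",
    "generate standard and custom reports",
    "share results with relevant Page Sets",
]

def run_crawl(stages=CRAWL_STAGES):
    """A Site is only marked done after every stage completes."""
    completed = []
    for stage in stages:
        completed.append(stage)  # each stage finishes before the next begins
    return "done" if len(completed) == len(stages) else "in progress"

print(run_crawl())
```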

Learn more about Site and Account Settings.


If you have questions, please contact our DubBot Support team via email at help@dubbot.com or through the blue chat bubble in the lower-right corner of your screen. We are here to help!
