Advice and answers from the DubBot Team
Search results for: regular expressions
Exclude webpages from your site inventory using regular expressions
DubBot now allows administrators to ignore webpages utilizing a regular expression… Webpages or folders will be excluded from site crawl based on the regular expression added
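As a minimal sketch of how a crawl-exclusion regex might behave (the pattern and URLs below are hypothetical examples, not from DubBot's documentation):

```python
import re

# Hypothetical exclusion pattern: skip anything under an /archive/ folder.
exclude = re.compile(r"/archive/")

urls = [
    "https://example.edu/archive/2019/news.html",
    "https://example.edu/about/contact.html",
]

# Pages whose URL matches the pattern are dropped from the inventory.
kept = [u for u in urls if not exclude.search(u)]
```

Here only the `/about/contact.html` page would remain in the inventory.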
Regex and XPath for Custom Policies
Resources for learning how to use Regular Expressions and XPath to write Custom Policies in DubBot… In this article we will focus on Regular Expressions and XPath… Regular Expression Learn Regex Resources XPath Learn XPath Resources
Ignored Paths for Specific Sites
Ignored Paths by Regular Expressions… Ignored Paths by Regular Expressions vs… These regular expressions will be evaluated on the crawl URLs without looking at the domain
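A small sketch of what "evaluated on the crawl URLs without looking at the domain" could mean in practice: matching the ignored-path pattern against the path portion of a URL only. The pattern and URL are assumptions for illustration:

```python
import re
from urllib.parse import urlparse

# Hypothetical ignored-path pattern: anything under /drafts/.
ignored = re.compile(r"^/drafts/")

url = "https://www.example.edu/drafts/page.html"
path = urlparse(url).path  # "/drafts/page.html" -- the domain is stripped

# The pattern is tested against the path alone, not the full URL.
is_ignored = bool(ignored.match(path))
```

Because the domain is dropped before matching, the same pattern would apply regardless of which site the path appears on.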
Remove page from Crawl (from Preview Page display)
See also: Exclude Web pages from your site inventory using regular expressions
Understanding Custom Policy Rules
The options include Word or Phrase, Link Text, Link URL, CSS Selector, XPath, and Regular Expression… Page Content Check Behavior For the Search Criteria Word or Phrase and Regular Expression users are
Create a Work Plan
your Account Checks Settings, you can ignore link checking from particular sites using simple text or regular expressions, if you think that would be useful in your situation
Create a Custom Policy
dropdown, choose from these options: Word or phrase, Link Text, Link URL, CSS Selector, XPath, and Regular… Page Content Check Behavior is a dropdown only present for the Word or Phrase and Regular Expression
Fine-tune Account Checks
You can also select the Regular Expression tab for more complicated URL exclusions
Custom Policy Examples
Rules: Search By > Regular Expression Rule > \(?\d{3}\)?-?\s?\d{3}-?… Rules: Search By > Regular Expression Rule > update CityName with your appropriate name(s
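To illustrate a phone-number rule in the spirit of the truncated pattern above: the trailing `\d{4}` here is a hypothetical completion added for this sketch, not DubBot's exact rule.

```python
import re

# Hypothetical US phone-number pattern, e.g. "(555) 123-4567" or "555-987-6543".
phone = re.compile(r"\(?\d{3}\)?-?\s?\d{3}-?\d{4}")

text = "Call (555) 123-4567 or 555-987-6543 for help."
matches = phone.findall(text)
```

A custom policy built on a pattern like this would flag pages where raw phone numbers appear in the content.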
View Issues on Your Webpage
notice an issue with a link or set of links that are not actually broken, the specific broken link or a regular