Spider trap

A spider trap is a structure on a website designed to prevent unwanted web crawlers from collecting the site's content.

The goal is to exclude unwanted web crawlers, such as those that spread spam or probe for vulnerabilities, from capturing the content of a website, while desirable crawlers such as the bots of search engines are not hindered in their work and human visitors are not impaired in their experience.

A spider trap exploits the fact that desirable bots adhere to the rules defined by the site operator (for example in a robots.txt file) and therefore ignore certain parts of a website. Unwanted crawlers usually do not follow such rules. This makes it possible for the developer to place a link that is invisible to human visitors and off-limits to desirable crawlers; following it leads to the blocking of the IP address used by the unwanted crawler.
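The mechanism can be illustrated with a short, self-contained sketch. The following Python example, built only on the standard library, is an assumed minimal implementation rather than any specific product: the trap path /trap/, the port, and the in-memory blocklist are illustrative choices. It serves a robots.txt that disallows the trap path, embeds a link to that path which is hidden from human visitors, and blocks the IP address of any client that requests it anyway.

```python
# Minimal spider-trap sketch using only the Python standard library.
# TRAP_PATH, the port, and the in-memory blocklist are illustrative assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

TRAP_PATH = "/trap/"     # hidden link target, disallowed in robots.txt
BLOCKED_IPS = set()      # IP addresses of crawlers that followed the trap link

ROBOTS_TXT = b"User-agent: *\nDisallow: /trap/\n"

class TrapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        ip = self.client_address[0]

        # Already blocked: refuse every further request from this address.
        if ip in BLOCKED_IPS:
            self.send_response(403)
            self.end_headers()
            self.wfile.write(b"Blocked. Solve a CAPTCHA to be unblocked.")
            return

        # Well-behaved crawlers read robots.txt and skip the trap path.
        if self.path == "/robots.txt":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(ROBOTS_TXT)
            return

        # Any client requesting the trap path has ignored robots.txt: block its IP.
        if self.path.startswith(TRAP_PATH):
            BLOCKED_IPS.add(ip)
            self.send_response(403)
            self.end_headers()
            self.wfile.write(b"Blocked.")
            return

        # Normal page containing a link that is invisible to human visitors.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(
            b"<html><body>Welcome!"
            b'<a href="/trap/" style="display:none">do not follow</a>'
            b"</body></html>"
        )

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8000), TrapHandler).serve_forever()
```

In a real deployment the blocklist would typically live in the web server or firewall configuration rather than in application memory, but the logic is the same: only a client that ignores the robots.txt rules ever reaches the trap URL.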

In the event that a human visitor ends up on this blocking page, the block can be lifted by solving a CAPTCHA.
