What is Bot Reconnaissance, and how do Web Scanners work?
A bot reconnaissance mission is the use of automated bots to systematically discover weaknesses and vulnerabilities in a target's infrastructure while remaining undetected. The mission uses web scanning bots to covertly observe the platform's defences and extract data and intelligence from any weaknesses they uncover.
It’s a hard life being a hacker. You constantly have to think of new ways of scamming people. Why not be proactive? Constantly run crawler bots to check for vulnerabilities, and simply exploit the weak sites that are going to be easy. Target the vulnerable, the old and those in poor health. Crawl millions of websites and let the opportunities come to you.
Crawler or spider bots are the bot army that carries out the basic reconnaissance mission. They fake user agent strings to look like “normal” scrapers and visit each page on your site, just like ‘regular’ search engines do. They are begging you to whitelist them. Once whitelisted, they can roam with impunity behind ‘enemy’ lines and set up all sorts of further attacks. See our bot database for more information.
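Before whitelisting any crawler, it is worth confirming that it is what it claims to be. As a minimal sketch, assuming Python and the reverse-then-forward DNS check that the major search engines document for their crawlers, the following rejects a visitor claiming to be Googlebot whose IP address does not resolve back to Google (the example IP is illustrative):

```python
# Minimal sketch: confirm a visitor claiming to be Googlebot really is one,
# rather than whitelisting on the user agent string alone. Implements the
# documented reverse-then-forward DNS check; the example IP is illustrative.
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Return True only if reverse and forward DNS both confirm the IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)   # e.g. crawl-66-249-66-1.googlebot.com
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]
        return ip in forward_ips                    # forward lookup must point back to the same IP
    except (socket.herror, socket.gaierror):
        return False                                # no or failed DNS record: treat as unverified

print(is_verified_googlebot("66.249.66.1"))
```

Anything that fails the check keeps zero-trust, bot-level treatment regardless of what its user agent string says.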
These reconnaissance missions are analogous to the army's use of special reconnaissance units, which conduct covert operations, often behind enemy lines, to build up intelligence for future strikes, and more recently to the use of drones to gather specific intelligence on targets behind enemy lines.
These types of reconnaissance bots focus on three main areas: the site's pages and structure, the underlying tech stack, and the login and account paths.
Typically, these are not custom-scripted bots; they are just the standard scripts used across millions of sites. Most web admins will ignore them, on the assumption that the bots aren't doing any actual harm.
Many sites allow these crawler bots to systematically access all the pages on the website. Nothing happens for a while - you probably don’t even notice.
However, ignoring them is not wise. If they do discover any potential vulnerabilities, these bots will typically report their findings back to the hackers, who can then decide on the next course of action to take.
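One early warning sign is a single client sweeping an unusually wide spread of your URLs. As a rough sketch, assuming a standard web server access log (the log path and the 200-path threshold are illustrative assumptions), the following flags clients that have requested an improbable number of distinct paths:

```python
# Rough sketch: flag clients that request an unusually wide spread of distinct
# URLs -- the signature of a reconnaissance crawl rather than normal browsing.
# The log path and the 200-path threshold are illustrative assumptions.
from collections import defaultdict

DISTINCT_PATH_THRESHOLD = 200

def suspicious_crawlers(log_path):
    paths_by_ip = defaultdict(set)
    with open(log_path) as log:
        for line in log:
            parts = line.split('"')
            if len(parts) < 2:
                continue
            ip = parts[0].split()[0]          # client IP is the first field
            request = parts[1].split()        # e.g. 'GET /some/path HTTP/1.1'
            if len(request) >= 2:
                paths_by_ip[ip].add(request[1])
    return {ip: len(paths) for ip, paths in paths_by_ip.items()
            if len(paths) > DISTINCT_PATH_THRESHOLD}

for ip, count in suspicious_crawlers("/var/log/nginx/access.log").items():
    print(f"{ip} requested {count} distinct paths -- likely a scanner or crawler")
```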
If the account login paths are seen as vulnerable and don't have sufficient protections, the next stage may well be a credential stuffing or even full-blown account takeover attack, depending on the perceived value of the target property.
These footprint-probing missions work to identify the different properties of the web service or API, such as platform configuration, tech stack, and security defences, in order to map the underlying logic, structures, and methods of the security defence architecture.
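To see what a footprint-probing bot would learn about your own site, a minimal sketch like the one below prints the response headers and cookie names that most readily leak the tech stack. It assumes the third-party requests library, and the URL is a placeholder:

```python
# Minimal sketch of what footprint probing looks for: response headers and
# cookie names that leak the platform and tech stack. Point it at your own
# site to see what a scanner would learn. Assumes the third-party 'requests'
# library; the URL is a placeholder.
import requests

REVEALING_HEADERS = ["Server", "X-Powered-By", "X-AspNet-Version", "X-Generator"]

def fingerprint(url):
    response = requests.get(url, timeout=10)
    for header in REVEALING_HEADERS:
        if header in response.headers:
            print(f"{header}: {response.headers[header]}")  # e.g. 'Server: nginx/1.18.0'
    for cookie in response.cookies:
        print(f"Cookie set: {cookie.name}")                 # e.g. PHPSESSID hints at a PHP backend

fingerprint("https://example.com")
```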
Known exploits are usually found in older versions of the software, and this is where version control is really necessary, to ensure the stack is up to date and has the latest patches for any known vulnerabilities. Although this is basic cybersecurity best practice and routine hygiene, the bots scan millions of sites every day, so it's just a matter of running the numbers until they find a target website that is vulnerable.
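As a minimal sketch of that hygiene check, assuming a Python stack, the snippet below compares installed package versions against a hand-maintained table of minimum patched versions. The package names and version numbers in the table are illustrative; in practice a dedicated tool such as pip-audit checks against real advisory databases:

```python
# Minimal sketch: compare installed package versions against minimum versions
# known to contain security fixes. The table is illustrative -- a dedicated
# tool such as pip-audit checks against real advisory databases instead.
from importlib.metadata import PackageNotFoundError, version
from packaging.version import parse  # third-party 'packaging' library

MINIMUM_PATCHED_VERSIONS = {
    "django": "4.2.11",      # hypothetical patched versions, for illustration only
    "requests": "2.31.0",
}

for package, minimum in MINIMUM_PATCHED_VERSIONS.items():
    try:
        installed = version(package)
    except PackageNotFoundError:
        continue                                  # not installed in this environment
    if parse(installed) < parse(minimum):
        print(f"{package} {installed} is older than the patched release {minimum}")
```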
Ignore them at your peril. Applying zero trust to bots at the edge of the network reduces the chance of further, more targeted attacks.
The bots are trying to detect weaknesses in your login protection, or may already be launching credential stuffing or account takeover attacks.
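As a final illustration, a basic per-IP throttle on the login path makes credential stuffing markedly less attractive. This is only a sketch, assuming an application-level check in Python with illustrative limits; in production this protection usually sits at the edge, in a WAF or bot-management layer, rather than in application code:

```python
# Sketch of a per-IP sliding-window throttle for a login endpoint -- the kind
# of basic protection that blunts credential stuffing. The attempt limit and
# window size are illustrative; production deployments usually enforce this at
# the edge (WAF / bot management) rather than in application code.
import time
from collections import defaultdict, deque

MAX_ATTEMPTS = 5        # login attempts allowed per window
WINDOW_SECONDS = 300    # 5-minute sliding window

_attempts = defaultdict(deque)

def allow_login_attempt(ip):
    """Return False once an IP has used up its attempt budget for the window."""
    now = time.monotonic()
    window = _attempts[ip]
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()                 # drop attempts that have aged out
    if len(window) >= MAX_ATTEMPTS:
        return False                     # throttled: require a CAPTCHA or back off
    window.append(now)
    return True

# Usage: call allow_login_attempt(client_ip) before validating credentials.
```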