Fast One-Shot Passive Recon Script with Netlas
Apr 10


In the article “Building a Fast One-Shot Recon Script for Bug Bounty” by ProjectDiscovery you can find a great guide on developing a recon script for a bug hunter or pentester. The author did an excellent job; many thanks to @pry0cc. However, that guide requires working with many utilities. Let’s rewrite it point by point to see whether it is possible to get by with Netlas alone.

So we will be working with the Netlas CLI tool and web application. To install the CLI tool, use “pip install netlas” (Python 3 must be installed). You will also need an API key, which can be found in your user profile.
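For reference, the setup boils down to two commands. The `netlas savekey` subcommand stores the key for later use; treat the exact subcommand name as an assumption if your CLI version differs:

```shell
# Install the Netlas CLI (requires Python 3)
pip install netlas

# Save your API key so later commands can use it
# (YOUR_API_KEY is a placeholder)
netlas savekey YOUR_API_KEY
```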

Have you already chosen an organization that will pay you for the vulnerabilities found?

1. Root domains

The first step is to find as many root domains as possible. This is easily done with the Domain Whois Search tool. Look up the domain you are interested in and find the Registrant data, as shown below.

Finding registrant data with Domain Whois Search tool

This search result gives us an organization name and some other properties that we can use to build further search queries. Let’s search for domains registered to the same organization with the next query:

registrant.organization:"Organization name"

Finding root domains with Domain Whois Search tool

Wow! More than 3 thousand domains were found. Nice result! Sometimes it is possible to use another registrant field instead.

And this is how these queries will look on the command line:

Finding root domains with CLI tool

Let’s take a look at the command-line queries in more detail. The key “-d whois-domain” specifies the source from which the data will be obtained, in this case the Domain Whois Search tool. The option “-i registrant” removes all fields from the output except the “Registrant” section. The download command returns results as a stream (without pagination), and the option “-c NUMBER” indicates the number of results to download; use “netlas count” to get the exact count of results beforehand. Finally, the “jq” command helps us format the JSON output into the list of domains we need. These details will be omitted from further queries, but their meaning remains the same.
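Putting the flags described above together, the whole pipeline might look like the following sketch. The netlas flags come straight from the explanation above; the `.data.domain` jq path is an assumption about the JSON layout:

```shell
# Sketch: pull root domains registered to one organization.
# Assumes the netlas CLI is installed and an API key is saved.
fetch_root_domains() {
  local org="$1" count="$2"
  # -d whois-domain: query the Domain Whois index
  # -c: number of results to stream via the download command
  netlas download -d whois-domain -c "$count" \
      "registrant.organization:\"${org}\"" \
    | jq -r '.data.domain' | sort -u
}

# Usage (commented out to avoid spending API credits by accident):
# fetch_root_domains "Organization name" 3000 > target_root_domains.txt
```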

Finally, it is possible to extend the list of domains and subdomains using the Netlas Certificates Search tool, as shown in the next screenshot.

Root domains and subdomain search using Certificate Search tool

Most of these domains should already be listed. But since this search uses the organization name, there is a chance to find additional root domains and subdomains.

Be careful at this step: the certificate names section usually contains wildcard domains. Wildcard domains are not a problem when you work with the Netlas DNS Search tool, since there are no wildcard domains there. But a significant part of SSL certificates are issued to wildcard domains, so we have to filter them out with the “sed” command.
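As a concrete illustration, here is one way the wildcard cleanup could work. The exact `sed` expression is an assumption; the article only states that wildcards must be filtered out:

```shell
# Certificate SAN lists often contain entries like "*.example.com".
# Strip the leading "*." so they become plain domains, then dedupe.
printf '%s\n' '*.example.com' 'www.example.com' 'example.com' \
  | sed 's/^\*\.//' \
  | sort -u
# → example.com
# → www.example.com
```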

Domains and subdomains search with Certificate Search tool using CLI

2. Bonus step: IP ranges

This step is not present in the original article, but I believe it’s necessary, especially for large companies.

Host Summary view

For example, the IP address in the screenshot is related to our company of interest. More interestingly, this IP address is included in an IP range that is definitely operated by that company. So we use the IP Whois Search tool to find networks related to our target.

Finding IP ranges with IP Whois Search tool
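On the CLI this step might look like the sketch below. The `-d whois-ip` datatype follows the same pattern as the whois-domain query earlier; the full-text query and the `.data.cidr` jq path are assumptions about the index and output layout:

```shell
# Sketch: list CIDRs from the IP Whois index for an organization.
# Review the results by hand, as recommended for this step.
find_ip_ranges() {
  local org="$1"
  netlas download -d whois-ip -c 100 "\"${org}\"" \
    | jq -r '.data.cidr // empty' | sort -u   # .data.cidr is an assumed field path
}

# find_ip_ranges "Organization name" > target_ip_ranges.txt
```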

It is recommended to perform Steps 1 and 2 without automation. Unfortunately, whois databases are often filled in incorrectly or incompletely, so it is easy to make a mistake when automating these steps. We assume that the selected IP ranges are saved in “target_ip_ranges.txt” in the form of CIDRs (e.g. 192.0.2.0/24 or 203.0.113.0/24).

So at this point, there should be three files:
– target_root_domains.txt
– target_ip_ranges.txt
– target_domains_from_certs.txt

You can create them using CLI as it is shown above, or you can search and download these lists from Netlas web application, which is sometimes a more convenient way. These files will be used as input for our recon script.

3. Subdomain enumeration

So we have an initial list of targets. Let’s start with subdomains. A lookup like

domain:*.example.com

in the DNS Search tool will instantly give us the results we need. Another way is to use a regex, which is a bit more flexible. For example, we can query a root domain and its subdomains with one simple search:

domain:/(.*\.)?example\.com/ AND a:*

You can find detailed information and examples of regex usage in the Netlas help pages. The “a:*” part means to return only records where at least one A record exists.
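On the command line the same regex query might look like this sketch. `-d domain` selects the DNS Search index, per the CLI conventions shown earlier; the `.data.domain` jq path is an assumption:

```shell
# Sketch: enumerate a root domain and all subdomains that have an A record.
enum_subdomains() {
  local root="$1" count="$2"
  # The regex (.*\.)? matches the root domain itself and any subdomain
  netlas download -d domain -c "$count" \
      "domain:/(.*\\.)?${root}/ AND a:*" \
    | jq -r '.data.domain' | sort -u
}

# Pass the root with its dots pre-escaped for the regex:
# enum_subdomains 'example\.com' 1000 >> target_subdomains.txt
```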

Subdomain search using DNS Search tool
Subdomain search with Netlas CLI tool

The saved CIDR list is another option. We can do forward DNS lookups for a whole subnet using queries like:

a:"192.0.2.0/24"

Forward DNS search by CIDR

You can do the same using the command line. By the way, you can use IP addresses instead of CIDRs. Netlas will behave in the same way.

Forward DNS search by CIDR using CLI tools
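A CLI version of the same lookup might be sketched as follows; the jq path is again an assumption about the output layout:

```shell
# Sketch: forward DNS for every address in a CIDR via the DNS Search index.
domains_in_cidr() {
  local cidr="$1"
  # A plain IP address works here too; Netlas treats it the same way
  netlas download -d domain -c 1000 "a:\"${cidr}\"" \
    | jq -r '.data.domain' | sort -u
}

# domains_in_cidr '192.0.2.0/24' >> target_subdomains.txt
```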

Let’s put all these requests together in one script and run it:

user@host:~$ ./ target_root_domains.txt target_ip_ranges.txt target_domains_from_certs.txt

Netlas domains and IPs recon script
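Since the script itself is shown only as a screenshot, here is a minimal sketch of how the pieces above could be tied together. The output file name and all jq field paths are assumptions; the netlas invocations follow the flags described earlier:

```shell
#!/usr/bin/env bash
# Minimal recon sketch: subdomains from root domains, CIDRs, and cert names.
# Usage: ./netlas_recon.sh roots.txt ranges.txt cert_domains.txt
set -euo pipefail

recon() {
  local roots="$1" ranges="$2" certs="$3"
  local out_domains="found_domains.txt"

  # 1. Subdomains for every root domain (regex query, A records only)
  while read -r root; do
    [ -n "$root" ] || continue
    netlas download -d domain -c 1000 \
        "domain:/(.*\\.)?${root//./\\.}/ AND a:*" \
      | jq -r '.data.domain'
  done < "$roots" >> "$out_domains"

  # 2. Forward DNS for every saved CIDR
  while read -r cidr; do
    [ -n "$cidr" ] || continue
    netlas download -d domain -c 1000 "a:\"${cidr}\"" \
      | jq -r '.data.domain'
  done < "$ranges" >> "$out_domains"

  # 3. Merge in the certificate-derived names, stripping wildcards
  sed 's/^\*\.//' "$certs" >> "$out_domains"

  sort -u -o "$out_domains" "$out_domains"
  echo "Domains and subdomains written to ${out_domains}"
}

# Run only when all three input files are supplied
if [ "$#" -eq 3 ]; then
  recon "$1" "$2" "$3"
fi
```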

I got almost 1.5K IP addresses and 4.5K domains and subdomains when I tested this script. About 15K Netlas coins were spent and about 10K requests were sent: the target’s initial data set is quite voluminous, so it took a lot of requests.

4. Downloading HTTP responses

In the original article, the author ends the data-gathering process by downloading index pages from HTTP servers hosted on the attack surface of interest. The Responses Search tool is what we need to achieve this.

You can download index pages using the “host:” filter, the “ip:” filter, and the “prot7:” filter. A query like “ip:target_ip” addresses all services on target_ip, while “host:target_domain” addresses a specific web service. The “prot7:” filter narrows responses by OSI layer 7 protocol, like HTTP, FTP, SSH, and so on.

Netlas Responses Search tool

Sometimes Netlas returns more than just index pages. This happens because of 301 and 302 redirects, so if you want only root pages, use the ‘path:"/"’ filter.
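Putting those filters together on the CLI might look like this sketch. `-d response` selects the Responses index; the `.data.http.body` jq path is an assumption about where the page body lives in the record:

```shell
# Sketch: fetch index pages for one host over HTTP(S), root path only.
fetch_index_pages() {
  local target="$1"
  netlas download -d response -c 20 \
      "host:${target} AND prot7:http AND path:\"/\"" \
    | jq -r '.data.http.body'   # assumed field path for the response body
}

# fetch_index_pages example.com > example.com_index.html
```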

So we have found several services of interest on the http/https protocols. What’s next? It’s just JSON/YAML parsing: the response records already contain the body of the pages being studied, so we can easily extract the necessary information. Moreover, if we are looking for a specific title, favicon, or text, this can be added as an extra condition to the search query. For example:

http.title:"Admin panel" host:*.example.com

Here is the script I used to collect html pages from Netlas.

Netlas HTTP responses download script

If you execute it, passing a file with domains or IPs as an argument, you will get a set of HTML files in the “response” subfolder, named like this:

The file names consist of a domain/IP, a port number, a path (with all “/” replaced by “_”), and an HTTP status code after an exclamation mark.
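To make that naming scheme concrete, here is how such a file name could be assembled from one response record with jq. The sample record and its field paths are invented for illustration:

```shell
# Build the output file name from one response record, following the scheme
# above: host, port, path ("/" replaced by "_"), then "!" and the status code.
record='{"data":{"host":"example.com","port":443,"path":"/","http":{"status_code":200}}}'
echo "$record" | jq -r \
  '.data | "\(.host)_\(.port)\(.path | gsub("/"; "_"))!\(.http.status_code).html"'
# → example.com_443_!200.html
```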


Further points of the original article are about the processing of the results: their tokenization, sorting, and parsing. All those questions are out of the scope of this short note.

So we’ve seen some of what Netlas is capable of by looking at the most basic commands, which can make scripting much easier for pentesters and bug hunters.

You can find the described scripts in the GitHub repo. Give this article some likes and we will make improvements, or maybe rewrite it as full-fledged Python scripts.

Arthur Kotylevskiy, Vyacheslav Makhrov


