# Table of Contents
1. **[Information Gathering](#information-gathering)**
    - [What are we looking for?](#what-are-we-looking-for)
    - [DNS Records](#dns-records)
2. **[Passive Information Gathering](#passive-information-gathering)**
    - [WHOIS](#whois)
    - [DNS Enumeration](#dns-enumeration)
    - [Passive Subdomain Enumeration](#passive-subdomain-enumeration)
    - [Passive Infrastructure Identification](#passive-infrastructure-identification)
3. **[Active Information Gathering](#active-information-gathering)**
    - [Looking for a real IP](#looking-for-a-real-ip)
    - [Active Infrastructure Identification](#active-infrastructure-identification)
    - [Active Subdomain Enumeration](#active-subdomain-enumeration)
    - [Virtual Hosts](#virtual-hosts)
    - [Crawling](#crawling)


---
# Information Gathering
## What are we looking for?
* IP addresses
* Directories hidden from search engines
* Names
* Email addresses
* Phone Numbers
* Physical Addresses
* Web technologies being used


---
## DNS Records

| Type | Description |
| ---- | ----------- |
| **A**| Resolves a hostname or domain to an IPv4 address |
| **AAAA**| Resolves a hostname or domain to an IPv6 address |
| **NS**| Reference to the domain's nameserver |
| **MX**| Resolves a domain to a mail server |
| **CNAME**| Used for domain aliases |
| **TXT**| Text record |
| **HINFO**| Host information |
| **SOA**| Start of Authority; administrative information about the zone |
| **PTR**| Resolves an IP address to a hostname |


---
# Passive Information Gathering
## WHOIS
| Command | Description |
| ------- | ----------- |
| **export TARGET="domain.tld"**| Assign target to an environment variable |
| **whois $TARGET**| WHOIS lookup for the target |


---
## DNS Enumeration
| Command | Description |
| ------- | ----------- |
| `nslookup $TARGET` | Identify the `A` record for the target domain. |
| `nslookup -query=A $TARGET` | Identify the `A` record for the target domain. |
| `dig $TARGET @<nameserver/IP>` | Identify the `A` record for the target domain.  |
| `dig a $TARGET @<nameserver/IP>` | Identify the `A` record for the target domain.  |
| `nslookup -query=PTR <IP>` | Identify the `PTR` record for the target IP address. |
| `dig -x <IP> @<nameserver/IP>` | Identify the `PTR` record for the target IP address.  |
| `nslookup -query=ANY $TARGET` | Identify `ANY` records for the target domain. |
| `dig any $TARGET @<nameserver/IP>` | Identify `ANY` records for the target domain. |
| `nslookup -query=TXT $TARGET` | Identify the `TXT` records for the target domain. |
| `dig txt $TARGET @<nameserver/IP>` | Identify the `TXT` records for the target domain. |
| `nslookup -query=MX $TARGET` | Identify the `MX` records for the target domain. |
| `dig mx $TARGET @<nameserver/IP>` | Identify the `MX` records for the target domain. |


---
## Passive Subdomain Enumeration
| **Resource/Command** | **Description** |
| -------------------- | --------------- |
| `VirusTotal` | [https://www.virustotal.com/gui/home/url](https://www.virustotal.com/gui/home/url) |
| `Censys` | [https://censys.io/](https://censys.io/) |
| `Crt.sh` | [https://crt.sh/](https://crt.sh/) |
| `curl -s https://sonar.omnisint.io/subdomains/{domain} \| jq -r '.[]' \| sort -u` | All subdomains for a given domain. |
| `curl -s https://sonar.omnisint.io/tlds/{domain} \| jq -r '.[]' \| sort -u` | All TLDs found for a given domain. |
| `curl -s https://sonar.omnisint.io/all/{domain} \| jq -r '.[]' \| sort -u` | All results across all TLDs for a given domain. |
| `curl -s https://sonar.omnisint.io/reverse/{ip} \| jq -r '.[]' \| sort -u` | Reverse DNS lookup on IP address. |
| `curl -s https://sonar.omnisint.io/reverse/{ip}/{mask} \| jq -r '.[]' \| sort -u` | Reverse DNS lookup of a CIDR range. |
| `curl -s "https://crt.sh/?q=${TARGET}&output=json" \| jq -r '.[] \| "\(.name_value)\n\(.common_name)"' \| sort -u` | Certificate Transparency. |
| `cat sources.txt \| while read source; do theHarvester -d "${TARGET}" -b $source -f "${source}-${TARGET}";done` | Searches for subdomains and other information using the sources listed in sources.txt. |


---
#### Sources.txt
```txt
baidu
bufferoverun
crtsh
hackertarget
otx
projecdiscovery
rapiddns
sublist3r
threatcrowd
trello
urlscan
vhost
virustotal
zoomeye
```


---
#### TheHarvester
[TheHarvester](https://github.com/laramies/theHarvester) is a simple-to-use yet powerful and effective tool for early-stage penetration testing and red team engagements. We can use it to gather information to help identify a company's attack surface. The tool collects emails, names, subdomains, IP addresses, and URLs from various public data sources for passive information gathering.

```bash
[pakhom:~]$ export TARGET="facebook.com"
[pakhom:~]$ cat sources.txt | while read source; do theHarvester -d "${TARGET}" -b $source -f "${source}_${TARGET}";done

<SNIP>
*******************************************************************
*  _   _                                            _             *
* | |_| |__   ___    /\  /\__ _ _ ____   _____  ___| |_ ___ _ __  *
* | __|  _ \ / _ \  / /_/ / _` | '__\ \ / / _ \/ __| __/ _ \ '__| *
* | |_| | | |  __/ / __  / (_| | |   \ V /  __/\__ \ ||  __/ |    *
*  \__|_| |_|\___| \/ /_/ \__,_|_|    \_/ \___||___/\__\___|_|    *
*                                                                 *
* theHarvester 4.0.0                                              *
* Coded by Christian Martorella                                   *
* Edge-Security Research                                          *
* cmartorella@edge-security.com                                   *
*                                                                 *
*******************************************************************


[*] Target: facebook.com

[*] Searching Urlscan.

[*] ASNS found: 29
--------------------
AS12578
AS13335
AS13535
AS136023
AS14061
AS14618
AS15169
AS15817

<SNIP>
```
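After the loop finishes, the per-source output files can be merged into one deduplicated subdomain list. A rough sketch: the grep pattern below is my approximation (anything that looks like a host under the target domain), not theHarvester's exact output schema.

```shell
# Rough sketch: merge per-source theHarvester output files in the current
# directory into one deduplicated subdomain list. The pattern matches
# anything that looks like a host under the target domain (approximation).
merge_subdomains() {
    domain="$1"
    grep -rhoE "[a-zA-Z0-9._-]+\.${domain}" . 2>/dev/null | sort -u
}

# merge_subdomains "${TARGET}" > "${TARGET}_subdomains.txt"
```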


---
## Passive Infrastructure Identification

| Resource/Command | Description |
| ---------------- | ----------- |
| `Netcraft` | [https://www.netcraft.com/](https://www.netcraft.com/) |
| `WayBackMachine` | [http://web.archive.org/](http://web.archive.org/) |
| `WayBackURLs` | [https://github.com/tomnomnom/waybackurls](https://github.com/tomnomnom/waybackurls) |
| `waybackurls -dates https://$TARGET > waybackurls.txt` | Crawling URLs from a domain with the date it was obtained. |
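The saved `waybackurls.txt` can then be filtered for file types that often expose sensitive data. A hypothetical helper (the function name and extension list are mine; extend them to taste):

```shell
# Hypothetical helper: keep only URLs pointing at file types that often
# leak data (backups, dumps, archives, configs), deduplicated.
interesting_urls() {
    grep -E '\.(bak|old|sql|zip|tar\.gz|conf|config|env)([?#]|$)' "$1" | sort -u
}

# interesting_urls waybackurls.txt
```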


---
# Active Information Gathering
## Looking for a real IP
Check the IP history of the domain. Several sites provide this, and they are easy to find by googling "**domain ip history**". In this case I used [https://viewdns.info/iphistory/](https://viewdns.info/iphistory/).

![](https://i.imgur.com/UYBJjVU.png)

Here we see that the domain has previously resolved to IP addresses belonging to **OVH**, **CONTABO**, **SEDO**, and **GOOGLE**. NameCheap we disregard and will not check, since the probability that the site has stayed hosted there from 2016 to this day is extremely small. Next, we need a list of all subnets belonging to these providers/hosters, so we go, for example, [here](https://suip.biz/ru/?act=ipintpr) and enter the addresses from the history; in response we get a list of the provider's subnets.

![](https://i.imgur.com/tXXkhSm.png)

Copy and save to a text file, keeping only the IPv4 ranges. We need the subnet list of every ISP/hoster on which the domain has ever been hosted. Save all the subnets into one text file.

Run masscan to scan these subnets for ports 80 and 443.

```bash
[pakhom:~]$ masscan -iL /etc/ips.txt -p443,80 --rate=2000 --open-only -oH /etc/result.txt
```

The output is a clean list of IPs belonging to these providers on which port 80 or 443 is open. In this case I got **1,485,398** IPs.
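If the masscan output contains extra columns rather than bare IPs, it can be reduced to one IP per line before feeding it to `httpx -l`. A hypothetical helper, assuming masscan's list output format (`-oL`), where open-port lines look like `open tcp 443 203.0.113.5 1700000000`:

```shell
# Hypothetical helper: pull the unique IPs out of masscan -oL output
# (field 4 of lines starting with "open").
extract_ips() {
    awk '/^open/ {print $4}' "$1" | sort -u
}

# extract_ips /etc/result.txt > alive.txt   # one IP per line, ready for httpx -l
```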

Now we need to try to manually determine the name, and ideally the version, of the target's web server. The simplest options are to look at the response headers, trigger an error on the site, check the 404 page, or apply a light DoS. In our case it was LiteSpeed Web Server; we identified it by forcing a 404 error - [https://target.com/sukablyat.html](https://target.com/sukablyat.html). Headers that leak the real server through CloudFlare can also come in handy. This is all optional, but it can simplify the search itself; below I explain how and why it is needed at all.

Next, we will need the [httpx](https://github.com/projectdiscovery/httpx) utility, which we will use to request each IP over HTTP and HTTPS with the Host header set to our domain.

```bash
[pakhom:~]$ httpx -l '/etc/result.txt' -web-server -location -match-string SPOOFER -title -nf -nc -H 'Host: target.com' -t 250 -rl 750 -o '/etc/temp.txt'
```

| Flags | Description |
| ---------------- | ----------- |
| `-l '/etc/result.txt'` | List of scanned IPs. |
| `-web-server` | Define the name of the web server (Apache, Nginx, LiteSpeed, etc.) |
| `-location` | Show the redirect location, if the response is a redirect. |
| `-match-string SPOOFER` | Display only those IPs whose pages contain a given string, in our case the word SPOOFER. |
| `-title` | Show the page title. |
| `-nf` | Display both the HTTP and HTTPS protocols in the results. |
| `-nc` | Disable colored output in the console. |
| `-H 'Host: target.com'` | Host header. Here we specify the domain/subdomain whose real IP we are trying to find. |
| `-t 250` | Number of threads. Depends entirely on the machine you are running from; choose it yourself. |
| `-rl 750` | Maximum number of requests per second the utility may make. A figure of about three times the thread count is recommended. |
| `-o '/etc/temp.txt'` | Where we save the result. |

```bash
    __    __  __       _  __
   / /_  / /_/ /_____ | |/ /
  / __ \/ __/ __/ __ \|   /
 / / / / /_/ /_/ /_/ /   |
/_/ /_/\__/\__/ .___/_/|_|
             /_/              v1.2.7

https://144.217.15.70 [] [Evolution Store | Home] [nginx/1.18.0 (Ubuntu)]
http://147.135.210.135 [] [Index of /] [Apache/2.4.54 (Debian)]
https://164.68.107.38 [] [Index of /] [Apache]
https://167.114.170.186 [] [Build Your Own Service Plan] [Apache]
http://167.86.70.149 [] [Amibot-Cheats] [nginx/1.14.2]
https://167.86.70.149 [] [Amibot-Cheats] [nginx/1.14.2]
http://173.212.201.141 [] [Kiwi Technology Limited] [Apache/2.4.41 (FreeBSD) OpenSSL/1.1.1d-freebsd PHP/7.4.0RC3]
https://173.212.201.141 [] [Kiwi Technology Limited] [Apache/2.4.41 (FreeBSD) OpenSSL/1.1.1d-freebsd PHP/7.4.0RC3]
https://198.244.249.33 [] [SMS SPOOFER] [LiteSpeed]
https://198.27.78.195 [] [Web Hosting Email Hosting Australian Domain Registration] [Apache/2.4.54 (CentOS)]
https://35.213.165.197 [] [SMS SPOOFER] [nginx/1.22.1]
https://51.79.255.170 [] [Map] [nginx]
https://51.81.138.126 [] [PPD EZ LiPo RC products, Custom billet products for Toyota Tundra, custom CNC products design, modeling, and programming consultant.] [Apache/2.4.38 (Debian)]
https://54.38.210.117 [] [Web Hosting, Email Hosting and UK Domain Registration] [Apache/2.4.54 (CentOS)]
https://54.38.210.119 [] [Email Hosting NZ Domain Registration] [Apache/2.4.54 (CentOS)]
http://92.222.232.208 [] [Undetected No Recoil Macros 🥇 Top Macros/Scripts Since 2016 🥇 Legit-Helpers.com] [Apache/2.4.54 (Debian)]
https://94.23.218.135 [] [SMS SPOOFER] [Apache/2.2.16 (Debian)]
```

The output shows every IP address on which **httpx**, when requesting with our Host header, finds the word **SPOOFER** on the page. In this case we have found three IP addresses that serve our site when requested with the Host header target.com, as can be seen from the page title - **SMS SPOOFER**.

```bash
https://198.244.249.33 [] [SMS SPOOFER] [LiteSpeed]
https://35.213.165.197 [] [SMS SPOOFER] [nginx/1.22.1]
https://94.23.218.135 [] [SMS SPOOFER] [Apache/2.2.16 (Debian)]
```

Now it should be clear why it was worth determining the web server name and, if possible, its version through CloudFlare. It often happens that the same site is hosted on several servers with different configurations and web servers; this is done for a variety of reasons I will not go into here. As a result, you may end up with ten identical sites hanging on different hosts with different configurations, and to figure out which of them is actually proxied through CloudFlare there are two options.

1. Compare by web server name/version, or by unusual headers/cookies.
2. DDoS! Hit each IP with the target's Host header and see whether the site goes down, or flood TCP/UDP on port 443/80 and again check whether the site is down.
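Option 1 can be sketched like this: pull the fingerprint-relevant headers from each candidate IP (requested with the target's Host header) and compare them against what the CloudFlare-fronted site returns. The helper name is mine, not from any tool:

```shell
# Extract the headers that usually identify a server build (stdin -> stdout).
fingerprint() {
    grep -iE '^(server|x-powered-by|set-cookie):' | sort
}

# Illustrative use against each candidate IP (network call):
# curl -skI "https://${ip}/" -H "Host: target.com" | fingerprint
```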

![](https://i.imgur.com/NfIu1UN.png)

![](https://i.imgur.com/KhOc7AD.png)

![](https://i.imgur.com/w6h64N7.png)

`In my case, the target never exposed its IP anywhere, Cloudflare protection was put in place right away, and there simply are no subdomains >.<)`

In that case, you can try scanning the subnets of all large and popular hosting providers, or even the entire IPv4 space, and apply the method above.


---
## Active Infrastructure Identification

| Resource/Command | Description |
| ---------------- | ----------- |
| `curl -I "http://${TARGET}"` | Display HTTP headers of the target webserver. |
| `whatweb -a https://www.facebook.com -v` | Technology identification. |
| `Wappalyzer` | [https://www.wappalyzer.com/](https://www.wappalyzer.com/) |
| `wafw00f -v https://$TARGET` | WAF Fingerprinting. |
| `Aquatone` | [https://github.com/michenriksen/aquatone](https://github.com/michenriksen/aquatone) |
| `cat subdomain.list \| aquatone -out ./aquatone -screenshot-timeout 1000` | Makes screenshots of all subdomains in the subdomain.list. |
| `webhttrack` | Website copier. |


---
## Active Subdomain Enumeration

| Resource/Command | Description |
| ---------------- | ----------- |
| `HackerTarget` | [https://hackertarget.com/zone-transfer/](https://hackertarget.com/zone-transfer/) |
| `SecLists` | [https://github.com/danielmiessler/SecLists](https://github.com/danielmiessler/SecLists) |
| `nslookup -type=any -query=AXFR $TARGET nameserver.target.domain` | Zone Transfer using Nslookup against the target domain and its nameserver. |
| `gobuster dns -q -r "${NS}" -d "${TARGET}" -w "${WORDLIST}" -p ./patterns.txt -o "gobuster_${TARGET}.txt"` | Bruteforcing subdomains. |
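The zone-transfer attempt above can also be scripted with `dig`, trying every authoritative nameserver of the target in turn. A sketch (the function name is mine; `dig axfr` is the standard syntax):

```shell
# Sketch: attempt a zone transfer (AXFR) against each authoritative
# nameserver of the target domain.
try_axfr() {
    domain="$1"
    for ns in $(dig +short ns "$domain"); do
        echo "== trying ${ns} =="
        dig axfr "$domain" "@${ns}"
    done
}

# try_axfr "${TARGET}"
```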


---
## Virtual Hosts

| Resource/Command | Description |
| ---------------- | ----------- |
| `curl -s http://192.168.10.10 -H "Host: randomtarget.com"` | Changing the HOST HTTP header to request a specific domain. |
| `cat ./vhosts.list \| while read vhost;do echo -e "\n********\nFUZZING: ${vhost}\n********";curl -s -I http://<IP address> -H "HOST: ${vhost}.target.domain" \| grep "Content-Length: ";done` | Bruteforcing for possible virtual hosts on the target domain. |
| `ffuf -w ./vhosts -u http://<IP address> -H "HOST: FUZZ.target.domain" -fs 612` | Bruteforcing for possible virtual hosts on the target domain using `ffuf`. |
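The `-fs 612` value filters out the server's catch-all response: it is the body size returned for a vhost that does not exist. One way to measure it (a hypothetical helper; the IP, domain, and bogus vhost name are placeholders):

```shell
# Measure the response size for a vhost that certainly does not exist,
# to use as the value for ffuf's -fs filter.
baseline_size() {
    curl -s "http://$1" -H "Host: thisvhostdoesnotexist.$2" | wc -c
}

# baseline_size <IP address> target.domain
```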


---
## Crawling

| Resource/Command | Description |
| ---------------- | ----------- |
| `ZAP` | [https://www.zaproxy.org/](https://www.zaproxy.org/) |
| `ffuf -recursion -recursion-depth 1 -u http://192.168.10.10/FUZZ -w /opt/useful/SecLists/Discovery/Web-Content/raft-small-directories-lowercase.txt` | Discovering files and folders that cannot be spotted by browsing the website. |
| `ffuf -w ./folders.txt:FOLDERS,./wordlist.txt:WORDLIST,./extensions.txt:EXTENSIONS -u http://www.target.domain/FOLDERS/WORDLISTEXTENSIONS` | Mutated bruteforcing against the target web server. |