- The Chaining Concept
- Basic Chaining – Pipes and Redirects
- Extract Open Ports from nmap and Feed to Tools
- Chain Subdomain Discovery into Live Host Check
- Full Enumeration Chain – CTF Workflow
- httpx – Check Which Hosts Have Web Servers
- Tool Output Formats – Feeding One Into Another
- CTF vs Professional Use
Running tools one at a time manually is fine when you're learning. But once you understand what each tool does and why, the next level is chaining them together: using the output of one tool as the input for the next, automating the repetitive parts, and building workflows that cover your entire enumeration methodology without you having to babysit every step.
This is how professionals work. Not faster clicking – smarter sequencing.
Every enumeration tool produces output. That output is data. Data is input for the next tool.
nmap finds open ports
↓
gobuster targets the web server on port 80
↓
ffuf fuzzes the API endpoints gobuster found
↓
arjun finds hidden parameters on those endpoints
↓
sqlmap tests those parameters for injection
Each tool hands off to the next. The output of step 1 defines what step 2 does. You're building a pipeline, not running random commands and hoping something sticks.
The most fundamental chaining technique is using the terminal's built-in pipe (|) and redirect (>) operators.
# Pipe – send output of one command directly into another
nmap -sV <target> | grep "open"
# Redirect – save output to a file
nmap -sV <target> > nmap-output.txt
# Append – add to existing file without overwriting
echo "10.10.10.1" >> targets.txt
# Pipe into grep to filter
gobuster dir -u http://<target> -w wordlist.txt | grep "Status: 200"
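If you haven't seen gobuster's output yet, you can sanity-check that grep filter on a couple of canned lines first (the sample lines below are invented, but follow gobuster's usual `path (Status: code) [Size: n]` shape):

```shell
# Two fake gobuster result lines piped through the same filter
printf '/admin (Status: 200) [Size: 1234]\n/old (Status: 403) [Size: 277]\n' | grep "Status: 200"
# → /admin (Status: 200) [Size: 1234]
```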
# Chain multiple commands – run second only if first succeeds
nmap -sV <target> && gobuster dir -u http://<target> -w wordlist.txt
# Run second command regardless of first result
nmap -sV <target> ; gobuster dir -u http://<target> -w wordlist.txt

One of the most useful chains: run a fast port scan, extract the open ports, then run a targeted deep scan on only those ports.
# Step 1 – fast full port scan, save grepable output
nmap -p- --min-rate 5000 -oG ports.gnmap <target>
# Step 2 – extract open ports into a variable
ports=$(grep "open" ports.gnmap | grep -oP '\d+/open' | cut -d'/' -f1 | tr '\n' ',' | sed 's/,$//')
# Step 3 – confirm what you got
echo $ports
# Output: 22,80,443,8080
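Before pointing the extraction pipeline at a real scan, you can verify the logic against a canned grepable line (the host and ports below are made up):

```shell
# One fake -oG line with two open ports and one closed; only 22 and 80 should survive
sample='Host: 10.10.10.1 () Ports: 22/open/tcp//ssh///, 80/open/tcp//http///, 8080/closed/tcp//http-proxy///'
echo "$sample" | grep "open" | grep -oP '\d+/open' | cut -d'/' -f1 | tr '\n' ',' | sed 's/,$//'
# → 22,80
```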
# Step 4 – deep scan only open ports
nmap -sV -sC -p $ports -oN deep-scan.txt <target>

Find subdomains, then automatically check which ones are actually responding.
# Step 1 – find subdomains with amass
amass enum -passive -d example.com -o amass-subs.txt
# Step 2 – resolve which ones are live with dnsx
cat amass-subs.txt | dnsx -a -silent > live-subs.txt
# Step 3 – check which live subdomains have web servers
cat live-subs.txt | httpx -silent > live-web.txt
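The loop in step 4 below writes one output file per host; the sed substitution it uses to make each URL filesystem-safe can be checked on its own (the hostname here is invented):

```shell
# Every character that isn't a letter or digit becomes an underscore
echo "http://admin.example.com:8080" | sed 's/[^a-zA-Z0-9]/_/g'
# → http___admin_example_com_8080
```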
# Step 4 – run gobuster against every live web subdomain
while read -r url; do
  gobuster dir -u "$url" \
    -w /usr/share/seclists/Discovery/Web-Content/raft-medium-directories.txt \
    -t 30 \
    -o "gobuster-$(echo "$url" | sed 's/[^a-zA-Z0-9]/_/g').txt"
done < live-web.txt

This is a complete enumeration pipeline for a CTF box. Run it in order and you'll have thorough coverage across every common attack surface.
#!/bin/bash
# Usage: ./enum.sh <target-ip> <domain>
# Example: ./enum.sh 10.10.10.1 example.htb
TARGET=$1
DOMAIN=$2
OUTDIR="./enum-$TARGET"
mkdir -p $OUTDIR
echo "[*] Starting enumeration of $TARGET"
# Step 1 – fast port scan
echo "[*] Running fast port scan..."
nmap -p- --min-rate 5000 -oG $OUTDIR/ports.gnmap $TARGET
PORTS=$(grep "open" $OUTDIR/ports.gnmap | grep -oP '\d+/open' | cut -d'/' -f1 | tr '\n' ',' | sed 's/,$//')
echo "[+] Open ports: $PORTS"
# Step 2 – deep scan on open ports
echo "[*] Running deep scan on open ports..."
nmap -sV -sC -p $PORTS -oN $OUTDIR/deep-scan.txt $TARGET
# Step 3 – web enumeration if a common web port (80/443/8080/8443) is open
if echo $PORTS | grep -qE "(^|,)(80|443|8080|8443)(,|$)"; then
echo "[*] Web server detected – running gobuster..."
gobuster dir \
-u http://$TARGET \
-w /usr/share/seclists/Discovery/Web-Content/raft-medium-directories.txt \
-x php,html,txt,bak \
-t 50 \
-o $OUTDIR/gobuster-dir.txt
echo "[*] Running vhost fuzzing..."
ffuf \
-u http://$TARGET \
-H "Host: FUZZ.$DOMAIN" \
-w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt \
-fs $(curl -s http://$TARGET | wc -c) \
-o $OUTDIR/vhosts.txt \
-of md \
-t 50
fi
# Step 4 – SMB enumeration if port 445 open
if echo $PORTS | grep -qE "(^|,)445(,|$)"; then
echo "[*] SMB detected – running enum4linux-ng..."
enum4linux-ng -A $TARGET -oY $OUTDIR/smb-enum
fi
# Step 5 – DNS enumeration if port 53 open
if echo $PORTS | grep -qE "(^|,)53(,|$)"; then
echo "[*] DNS detected – attempting zone transfer..."
dig axfr @$TARGET $DOMAIN > $OUTDIR/zone-transfer.txt
fi
# Step 6 – SNMP check (UDP scan, usually needs root)
echo "[*] Checking SNMP..."
nmap -sU -p 161 $TARGET | grep "161/udp open " && \
snmpwalk -v2c -c public $TARGET > $OUTDIR/snmp-walk.txt
echo "[+] Enumeration complete. Results in $OUTDIR/"

Save this as enum.sh, make it executable, and run it:
chmod +x enum.sh
./enum.sh 10.10.10.1 example.htb

httpx is a fast HTTP toolkit that checks which hosts/URLs are actually running web servers. Essential for filtering large lists down to targets worth enumerating.
Install:
# Linux (Kali) – the ProjectDiscovery tool is packaged as httpx-toolkit;
# a plain "httpx" package is an unrelated Python HTTP client
sudo apt install httpx-toolkit
# Any platform with Go installed
go install github.com/projectdiscovery/httpx/cmd/httpx@latest
# macOS
brew install httpx
# Windows
# Download from: https://github.com/projectdiscovery/httpx/releases

# Check a list of subdomains for live web servers
cat subdomains.txt | httpx -silent
# Get status codes
cat subdomains.txt | httpx -silent -status-code
# Get titles – useful for quickly identifying what each site is
cat subdomains.txt | httpx -silent -title
# Get full info
cat subdomains.txt | httpx -silent -status-code -title -tech-detect
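Because httpx prints one annotated line per host, its output chains into grep like everything else. A canned example (the exact line format varies by httpx version, so treat this sample as illustrative):

```shell
# Keep only hosts that answered 200 from a fake -status-code listing
printf 'http://a.example.com [200]\nhttp://b.example.com [403]\n' | grep '\[200\]'
# → http://a.example.com [200]
```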
# Save results
cat subdomains.txt | httpx -silent -o live-hosts.txt

Different tools output in different formats. Here's how to extract what you need:
# Extract IPs from nmap output
grep -oP '\b(?:[0-9]{1,3}\.){3}[0-9]{1,3}\b' nmap-output.txt
# Extract open ports from nmap grepable output
grep "open" scan.gnmap | grep -oP '\d+/open' | cut -d'/' -f1
# Extract discovered paths from gobuster output
grep "Status: 200" gobuster-output.txt | awk '{print $1}'
# Extract found subdomains from amass output
cat amass-output.txt | sort -u
# Extract URLs from ffuf JSON output
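The one-liner that follows can be sanity-checked on a minimal stand-in for ffuf's JSON before parsing real output (the sample only mimics the results/url structure):

```shell
# Fake ffuf output: a results array of objects with a url field
printf '{"results":[{"url":"http://example.com/admin"},{"url":"http://example.com/api"}]}' | \
  python3 -c "import sys,json; [print(x['url']) for x in json.load(sys.stdin)['results']]"
# → http://example.com/admin
# → http://example.com/api
```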
cat ffuf-output.json | python3 -c "import sys,json; [print(x['url']) for x in json.load(sys.stdin)['results']]"

| Situation | CTF | Professional Engagement |
|---|---|---|
| Automation scripts | Use freely | Use, but document every command run |
| Aggressive timing | T4, high threads | T2-T3, lower threads |
| Full auto pipeline | Great for speed | Run in stages; review output at each step |
| Save all output | Good habit | Required β everything goes in the report |
by SudoChef · Part of the SudoCode Pentesting Methodology Guide