Advanced Subdomain Enumeration Script

A high-performance bash script that combines multiple tools and techniques to discover subdomains efficiently.

Features

  • Passive Enumeration: Collects subdomains from 15+ sources (APIs, Certificate Transparency, Web Archives)
  • Active Enumeration: DNS bruteforcing with puredns, massdns, and shuffledns
  • Permutation Generation: Creates subdomain variations using alterx and dnsgen
  • Bulk Processing: Scan multiple domains in parallel
  • Fast Resolution: High-speed DNS validation with configurable threads
  • Technology Detection: Identifies web technologies using httpx
  • Port Scanning: Discovers open ports with naabu
  • Comprehensive Reports: Detailed statistics and consolidated results

Prerequisites

Required Tools

# Core tools (required)
go install -v github.com/projectdiscovery/subfinder/v2/cmd/subfinder@latest
go install -v github.com/tomnomnom/assetfinder@latest
go install -v github.com/projectdiscovery/dnsx/cmd/dnsx@latest
go install -v github.com/projectdiscovery/httpx/cmd/httpx@latest
sudo apt install jq curl -y

# Enhanced tools (recommended)
go install -v github.com/d3mondev/puredns/v2@latest
go install -v github.com/projectdiscovery/shuffledns/cmd/shuffledns@latest
go install -v github.com/projectdiscovery/alterx/cmd/alterx@latest
go install -v github.com/projectdiscovery/naabu/v2/cmd/naabu@latest
go install -v github.com/OWASP/Amass/v3/...@master
pip3 install dnsgen
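Before sourcing the script, it can help to confirm everything landed on your PATH. A minimal check (`check_tools` is a hypothetical helper, not part of the repo):

```shell
#!/bin/sh
# check_tools: print any of the given commands that are not on PATH.
check_tools() {
  missing=""
  for tool in "$@"; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
  done
  if [ -n "$missing" ]; then
    echo "Missing:$missing"
    return 1
  fi
  echo "All tools found"
}

# Core tools used by the script; non-fatal here so you see the full list.
check_tools subfinder assetfinder dnsx httpx jq curl || true
```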

Wordlists

mkdir -p ~/wordlists
git clone https://github.com/danielmiessler/SecLists.git ~/wordlists/SecLists
wget https://raw.githubusercontent.com/trickest/resolvers/main/resolvers.txt -O ~/wordlists/resolvers.txt

Installation

git clone https://github.com/MrRockettt/Rocket-Enum.git
cd Rocket-Enum
chmod +x rocket_enum.sh

Configuration

Edit the script and update these sections:

1. API Keys (Lines 7-15)

Replace "replace_with_your_key" with your actual API keys:

export VIRUSTOTAL_API_KEY="your_actual_key_here"
export SECURITYTRAILS_API_KEY="your_actual_key_here"
export GITHUB_TOKEN="your_actual_token_here"
# ... etc
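A quick way to confirm no key was missed is to search for the placeholder string shown above (`check_placeholders` is a hypothetical helper, not part of the repo):

```shell
#!/bin/sh
# check_placeholders FILE: report whether any key is still the placeholder.
check_placeholders() {
  if grep -q 'replace_with_your_key' "$1"; then
    echo "placeholders remain"
  else
    echo "all keys set"
  fi
}

# Run against the script if it is present in the current directory.
[ -f rocket_enum.sh ] && check_placeholders rocket_enum.sh || true
```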

2. Paths (Lines 17-22)

Update these paths to match your system:

export RESOLVERS_PATH="$HOME/wordlists/resolvers.txt"
export WORDLIST_PATH="$HOME/wordlists/SecLists/Discovery/DNS/bitquark-subdomains-top100000.txt"
# ... etc
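Missing or empty wordlist files are a common cause of silent failures, so it is worth validating the paths before a run (`check_paths` is a hypothetical helper, not part of the repo):

```shell
#!/bin/sh
# check_paths: warn about each given file that is missing or empty.
check_paths() {
  for f in "$@"; do
    [ -s "$f" ] || echo "Missing or empty: $f"
  done
}

# Paths as configured above.
check_paths "$HOME/wordlists/resolvers.txt" \
            "$HOME/wordlists/SecLists/Discovery/DNS/bitquark-subdomains-top100000.txt"
```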

Getting API Keys

Sign up with each provider referenced in the configuration above (VirusTotal, SecurityTrails, GitHub, etc.) and generate a key or token for the sources you want to use.

Usage

Single Domain

# Source the script
source rocket_enum.sh

# Basic scan (100 threads by default)
subenum example.com

# Custom thread count
subenum example.com 200

# Or run directly
./rocket_enum.sh subenum example.com 100

Bulk Domains

Create a file with domains (one per line):

# domains.txt
example.com
target1.com
target2.com

Run bulk scan:

source rocket_enum.sh

# Process 5 domains in parallel, 50 threads each
sublist domains.txt 5 50

# Process 10 domains in parallel, 30 threads each
sublist domains.txt 10 30

Output

Single Domain Output

example.com-results/
├── passive/                        # Results from each passive source
├── active/                         # Bruteforce and permutation results
├── resolved/                       # DNS resolution results
├── final/
│   ├── example.com_final_resolved.txt    # All resolved subdomains ✓
│   ├── example.com_ips.txt               # Unique IP addresses
│   ├── example.com_httpx_results.txt     # HTTP info + technologies
│   └── example.com_open_ports.txt        # Open ports
├── all_subdomains_example.com.txt        # All discovered subdomains
└── SUMMARY_example.com.txt               # Statistics and summary

Bulk Output

bulk_enum_20241013_143022/
├── example.com-results/
├── target1.com-results/
├── ALL_SUBDOMAINS_CONSOLIDATED.txt       # All subdomains from all domains
├── ALL_RESOLVED_CONSOLIDATED.txt         # All resolved subdomains
├── ALL_IPS_CONSOLIDATED.txt              # All unique IPs
├── BULK_SUMMARY.txt                      # Complete statistics
├── completed_domains.txt
└── failed_domains.txt

How It Works

  1. Passive Enumeration: Queries APIs and public sources (parallel execution)
  2. API Sources: Calls rate-limited APIs sequentially with delays
  3. Active Enumeration: DNS bruteforcing with wordlists (parallel)
  4. Permutation: Generates subdomain variations from discovered subdomains
  5. Resolution: Fast DNS validation using multiple resolvers
  6. Information Gathering: Extracts IPs, technologies, and open ports
  7. Reporting: Consolidates results and generates statistics
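The steps above can be sketched as a stripped-down pipeline for a single domain. This is not the script itself, just a minimal illustration of the flow; the tools and flags are real, but the output layout is simplified and each stage is skipped if its tool is not installed:

```shell
#!/bin/sh
# Minimal passive -> resolve -> probe flow for one domain (illustrative only).
domain="${1:-example.com}"
out="$domain-sketch"
mkdir -p "$out"

# Steps 1-2: passive collection from two sources, merged and deduplicated.
command -v subfinder   >/dev/null && subfinder -silent -d "$domain" > "$out/passive_subfinder.txt"
command -v assetfinder >/dev/null && assetfinder --subs-only "$domain" > "$out/passive_assetfinder.txt"
cat "$out"/passive_*.txt 2>/dev/null | sort -u > "$out/all_subdomains.txt"

# Step 5: resolve candidates against trusted resolvers.
command -v dnsx >/dev/null && dnsx -silent -l "$out/all_subdomains.txt" \
  -r "$HOME/wordlists/resolvers.txt" > "$out/resolved.txt"

# Step 6: probe live hosts for HTTP services and technologies.
command -v httpx >/dev/null && httpx -silent -l "$out/resolved.txt" -tech-detect \
  > "$out/httpx_results.txt"

echo "Sketch results in $out/"
```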

Performance Tips

# Conservative (slow systems)
subenum example.com 50

# Balanced (recommended)
subenum example.com 100

# Aggressive (powerful systems)
subenum example.com 300

# Bulk processing
sublist domains.txt 5 50    # Safe approach
sublist domains.txt 10 100  # Faster, needs good system

Troubleshooting

No results found?

  • Check API keys are configured correctly
  • Verify wordlist and resolver paths exist
  • Test DNS resolution: dig @8.8.8.8 example.com

Tool not found errors?

  • Ensure all tools are installed: which subfinder dnsx httpx
  • Add Go bin to PATH: export PATH=$PATH:$HOME/go/bin

Rate limit errors?

  • Reduce thread count
  • Check API key rate limits
  • Increase delay between API calls (edit script line ~200)

Script hangs?

  • Check system resources: htop
  • Kill zombie processes: pkill -f subdomain_enum
  • Reduce parallel jobs in bulk mode

Legal Disclaimer

This tool is for educational and authorized testing only. Only scan domains you own or have explicit permission to test. Unauthorized access to systems is illegal.

Credits

Built with tools from:

  • ProjectDiscovery (subfinder, dnsx, httpx, shuffledns, alterx, naabu)
  • tomnomnom (assetfinder)
  • d3mondev (puredns)
  • OWASP (Amass)
  • dnsgen

Star ⭐ this repo if you find it useful!
