Nokizaru is a CLI tool purpose-built for enumerating the core web recon surface. Its goal is to provide an expansive, high-signal overview of a target quickly, obviating the need to reach for heavier OSINT suites. Instead of running several tools in sequence, Nokizaru aims to produce comparable recon results with a single full-scan command. The ideal use case is collecting relevant information on a web target during the recon phase of a bug bounty or web app pentest engagement. As such, the primary audience is security researchers (not CTI analysts, who may still prefer larger, more comprehensive OSINT suites).
Nokizaru started as a Ruby reimplementation of FinalRecon by thewhiteh4t. The original goal was straightforward: keep the familiar reconnaissance workflow while rebuilding it with Ruby-first design choices.
Over time, the project expanded beyond a direct rewrite. Nokizaru now includes structured findings output, broader provider coverage (with additional integrations planned), Ronin-powered workspaces for persistent target profiling, and targeted performance improvements oriented around stable runtime behavior.
Nokizaru keeps the same high-level recon module flow as FinalRecon, but the Ruby implementation is tuned around bounded concurrency, continuity under hostile targets, and operator-friendly runtime behavior.
- Bounded concurrency + module isolation: modules run with explicit budgets and resilient error handling so one degraded module does not derail full recon completion.
- Reusable networking: shared HTTP clients are used where practical to reduce handshake/setup overhead.
- Adaptive runtime control: modules (especially Directory Enum) can downshift mode and request policy when sustained hostile pressure is detected.
- Stability-first execution: runtime policy prioritizes continuity and signal preservation over brute-force persistence when targets become aggressively defensive.
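The isolation model can be sketched as follows. This is a minimal illustration of bounded concurrency with per-module error containment, not Nokizaru's actual internals; the method and structure names are hypothetical:

```ruby
# Minimal sketch: modules run concurrently against a fixed budget, and a
# module that raises is recorded as :degraded instead of aborting the scan.
def run_modules(modules, max_concurrent: 3)
  gate    = SizedQueue.new(max_concurrent) # push blocks when budget is spent
  results = Queue.new

  threads = modules.map do |name, work|
    Thread.new do
      gate.push(:slot) # acquire a concurrency slot
      begin
        results.push([name, :ok, work.call])
      rescue StandardError => e
        results.push([name, :degraded, e.message]) # isolate the failure
      ensure
        gate.pop # release the slot
      end
    end
  end

  threads.each(&:join)
  Array.new(modules.size) { results.pop }.to_h { |n, s, v| [n, [s, v]] }
end
```

A degraded module surfaces as a `[:degraded, reason]` entry in the results rather than an exception that halts the remaining modules.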
Nokizaru treats early recon results (especially initial headers) as shared target context that can influence later modules. The goal is simple: stay fast while avoiding low-signal brute forcing on targets that intentionally normalize responses (CDNs/WAFs, redirect-heavy setups, etc.).
- Headers -> Target Profile: the Headers module collects response headers and derives a lightweight target profile (redirect behavior, canonical scheme/host hints)
- Custom Request Headers: repeatable `-H`/`--header` values are applied across in-scope web requests so authenticated or role-specific scans can observe the target as that session sees it
- Re-Anchor: Crawler and Directory Enum consume that profile and may automatically "re-anchor" to the most appropriate in-scope URL (for example, HTTP -> HTTPS canonicalization). You'll see this as:

```
[+] Re-Anchor... [ https://target.tld (http->https) ]
[+] Re-Anchor... [ https://target.tld (same-scope) ]
```
- Crawler Feeds Dir Enum: Directory Enum uses crawler artifacts (robots, internal links, sitemap URLs, URLs found inside JavaScript) as high-signal seed paths, instead of blindly relying on a large wordlist for every target
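The re-anchor decision can be sketched like this. It is an illustrative reduction of the behavior described above (canonicalize HTTP to HTTPS when the headers profile says so, never leave the original host); the function and profile shape are assumptions, not Nokizaru's API:

```ruby
require "uri"

# Given the original target and the canonical URL derived by the Headers
# module, pick the base URL later modules should anchor to.
def re_anchor(original, profile)
  orig      = URI.parse(original)
  canonical = URI.parse(profile[:canonical]) # e.g. observed via a 301

  # Never re-anchor across hosts: that would leave scope.
  return [original, "same-scope"] if canonical.host != orig.host

  if orig.scheme == "http" && canonical.scheme == "https"
    [canonical.to_s, "http->https"] # upgrade to the canonical scheme
  else
    [original, "same-scope"]
  end
end
```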
Directory Enum selects an initial mode from preflight and can adapt during runtime using rolling pressure heuristics (transport errors, throughput, and prioritized-yield behavior).
- full: normal targets; full wordlist strategy with adaptive safety rails
- seeded: mixed/challenged targets; prioritizes crawler-derived and high-signal paths
- hostile: sustained defensive targets; tighter budgets and lighter request policy
Additional runtime behavior in current Dir Enum:
- Windowed adaptation: pressure is evaluated over rolling request windows (not one-off anomalies)
- Yield-aware escalation: downshift decisions account for prioritized finding growth, not just raw error volume
- HEAD-first hostile policy: hostile mode uses lighter probing with selective GET confirmation for finding-candidate statuses
- Workspace-only short-term memory: when using `--project`, recent host hostility posture can warm-start future runs; ephemeral scans remain stateless
- Persistent live status line: progress rail and average `r/s` stay active while scanning, including under hostile pressure
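The windowed downshift logic can be sketched as follows. This is a simplified model of rolling-window pressure evaluation, not Nokizaru's implementation; the class name, window size, and threshold are illustrative:

```ruby
# Mode downshifts only when the error rate over a full rolling window of
# recent requests stays high, so one-off anomalies do not trigger escalation.
class PressureWindow
  MODES = %i[full seeded hostile].freeze

  attr_reader :mode

  def initialize(size: 50, error_threshold: 0.4)
    @size = size
    @error_threshold = error_threshold
    @outcomes = [] # rolling window of true (error) / false (ok)
    @mode = :full
  end

  def record(error:)
    @outcomes << error
    @outcomes.shift if @outcomes.size > @size
    maybe_downshift
  end

  private

  def maybe_downshift
    return if @outcomes.size < @size # require a full window first
    return if @outcomes.count(true).fdiv(@size) < @error_threshold

    idx = MODES.index(@mode)
    @mode = MODES[idx + 1] if idx < MODES.size - 1
    @outcomes.clear # restart the window after each downshift
  end
end
```

Clearing the window after a downshift gives the lighter mode a fresh measurement period before any further escalation.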
Directory Enum keeps stdout focused on bug bounty pivot statuses (200/204/401/403/405/500) and reports notable 3xx redirect signals only when they look meaningful. Exported data remains raw and unfiltered so you can inspect every discovered status when needed.
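The stdout filter reduces to a small predicate. The pivot status list matches the README; the "meaningful redirect" test here is a simplified stand-in for whatever heuristic Nokizaru actually applies:

```ruby
# Print pivot statuses always; print 3xx only when the redirect goes
# somewhere other than the same path (e.g. not a trailing-slash bounce).
PIVOT_STATUSES = [200, 204, 401, 403, 405, 500].freeze

def show_on_stdout?(status, location: nil, path: nil)
  return true if PIVOT_STATUSES.include?(status)
  if (300..399).cover?(status) && location && path
    return location.chomp("/") != path.chomp("/")
  end
  false
end
```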
Homebrew is the primary install method for Linux/macOS:
```
brew tap hakkuri01/nokizaru https://github.com/hakkuri01/nokizaru
brew install nokizaru
nokizaru --help
man nokizaru
```

For updates:

```
brew update
brew upgrade nokizaru
```

Nokizaru Homebrew releases are pinned to stable git tags. `brew upgrade nokizaru` will update your install whenever a newer stable formula version is published.
```
git clone https://github.com/hakkuri01/nokizaru.git
cd nokizaru
gem build nokizaru.gemspec
gem install nokizaru-*.gem
nokizaru --help
```

Or from a tarball:

```
curl -L -o nokizaru.tar.gz https://github.com/hakkuri01/nokizaru/archive/refs/heads/main.tar.gz
tar -xzf nokizaru.tar.gz
cd nokizaru-main
gem build nokizaru.gemspec
gem install nokizaru-*.gem
nokizaru --help
```

Some modules use API keys to fetch data from different resources. These are optional; if you do not provide an API key, the corresponding module is skipped.
Keys are read from environment variables if they are set; otherwise they are loaded from the user data directory (~/.local/share/nokizaru/keys.json).
```
NK_BEVIGIL_KEY, NK_BINEDGE_KEY, NK_CENSYS_API_ID, NK_CENSYS_API_SECRET,
NK_CHAOS_KEY, NK_FB_KEY, NK_HUNTER_KEY, NK_NETLAS_KEY,
NK_SHODAN_KEY, NK_VT_KEY, NK_WAPPALYZER_KEY, NK_ZOOMEYE_KEY
```

```
# Example :
export NK_SHODAN_KEY="kl32lcdqwcdfv"
```

You can use `-k` to add keys, which will be saved automatically in the config directory.

```
# Usage
nokizaru -k '<API NAME>@<API KEY>'

# Example :
nokizaru -k 'shodan@kl32lcdqwcdfv'
```

Valid Keys : 'bevigil', 'binedge', 'censys_api_id', 'censys_api_secret', 'chaos', 'facebook', 'hunter', 'netlas', 'shodan', 'virustotal', 'wappalyzer', 'zoomeye'

Path = $HOME/.local/share/nokizaru/keys.json
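The documented lookup order (environment variable first, then `keys.json`) can be sketched like this. The helper name is hypothetical, and the env-var mapping shown is simplified: some real variables (e.g. `NK_VT_KEY` for VirusTotal, `NK_FB_KEY` for Facebook) do not follow a direct `NK_<name>_KEY` pattern:

```ruby
require "json"

# Path from the README; the env var wins over the stored key.
KEYS_PATH = File.expand_path("~/.local/share/nokizaru/keys.json")

def api_key(name, env: ENV, keys_file: KEYS_PATH)
  env_var = "NK_#{name.upcase}_KEY" # simplified mapping (see note above)
  return env[env_var] if env[env_var] && !env[env_var].empty?

  return nil unless File.exist?(keys_file)
  JSON.parse(File.read(keys_file))[name] # nil => module is skipped
end
```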
| Source | Module | Link |
|---|---|---|
| Facebook | Sub Domain Enum | https://developers.facebook.com/docs/facebook-login/access-tokens |
| VirusTotal | Sub Domain Enum | https://www.virustotal.com/gui/my-apikey |
| Shodan | Sub Domain Enum | https://developer.shodan.io/api/requirements |
| BeVigil | Sub Domain Enum | https://bevigil.com/osint-api |
| BinaryEdge | Sub Domain Enum | https://app.binaryedge.io/ |
| Netlas | Sub Domain Enum | https://docs.netlas.io/getting_started/ |
| ZoomEye | Sub Domain Enum | https://www.zoomeye.hk/ |
| Hunter | Sub Domain Enum | https://hunter.how/search-api |
| Chaos | Sub Domain Enum | https://docs.projectdiscovery.io/tools/chaos |
| Censys | Sub Domain Enum | https://search.censys.io/api |
| Wappalyzer | Architecture Fingerprinting | https://www.wappalyzer.com/api/ |
Default config file is available at ~/.config/nokizaru/config.json
```json
{
  "common": {
    "timeout": 30,
    "dns_servers": "8.8.8.8, 8.8.4.4, 1.1.1.1, 1.0.0.1"
  },
  "ssl_cert": {
    "ssl_port": 443
  },
  "port_scan": {
    "threads": 50
  },
  "dir_enum": {
    "threads": 50,
    "redirect": false,
    "verify_ssl": true,
    "extension": ""
  },
  "export": {
    "format": "txt"
  }
}
```

Nokizaru - Recon Refined
Arguments:
-h, --help Show this help message and exit
-v, --version Show version number and exit
--target TARGET Target (http[s]://host[:port])
--headers Header Information
--sslinfo SSL Certificate Information
--whois Whois Lookup
--crawl Crawl Target
--dns DNS Enumeration
--sub Sub-Domain Enumeration
--arch Architecture Fingerprinting
--dir Directory Search
--wayback Wayback URLs
--wb-raw Wayback raw URL output (no quality filtering)
--ps Fast Port Scan
--full Full Recon
--no-[MODULE] Skip specified modules above during full scan (e.g. --no-dir)
--export Write results to export directory
Persistence / Enrichment:
--project [NAME] Enable a persistent workspace (profiles, caching, diffing)
--cache Enable caching even without a project
--no-cache Disable caching (even in a project)
--diff last / [ID] Diff this run against the last (or another run ID in the workspace)
Extra Options:
-nb Hide Banner
-dt DT Number of threads for directory enum [ Default : 30 ]
-pt PT Number of threads for port scan [ Default : 50 ]
-T T Request Timeout [ Default : 30.0 ]
-w W Path to Wordlist [ Default : wordlists/raft_med-dir_5k.txt ]
-H HEADER Add custom request header (repeatable)
-r Follow redirects during directory enum [ Default : False ]
-s Toggle SSL Verification [ Default : True ]
-sp SP Specify SSL Port [ Default : 443 ]
-d D Custom DNS Servers [ Default : 1.1.1.1 ]
-e E File Extension(s) (comma separated) [ Example : txt,xml,php,etc. ]
-o O Export Format(s) (comma-separated) [ Default : txt,json,html ]
-cd CD Change export directory [ Default : ~/.local/share/nokizaru/dumps/nk_<domain> ]
-of OF Change export folder name [ Default : YYYY-MM-DD_HH-MM-SS ]
-k K Add API key [ Example : shodan@key ]

```
# Full scan
nokizaru --full --target https://example.com

# Check headers
nokizaru --headers --target https://example.com

# Crawl target
nokizaru --crawl --target https://example.com

# Directory enumeration
nokizaru --dir --target https://example.com -e txt,php -w /path/to/wordlist

# Authenticated crawl + dir enum with a session cookie
nokizaru --crawl --dir --target https://example.com \
  -H 'Cookie: PHPSESSID=abc123; uid=52' \
  -H 'X-Role: admin'
```

Custom headers are applied only to in-scope target requests. Nokizaru does not echo supplied header values back in module banners.
Nokizaru is ephemeral by default (stdout). If you specify --export, it will write TXT, JSON, and HTML reports (unless you narrow formats with -o).
By default, exports are written to:
```
~/.local/share/nokizaru/dumps/nk_<domain>/
├── YYYY-MM-DD_HH-MM-SS.txt
├── YYYY-MM-DD_HH-MM-SS.json
└── YYYY-MM-DD_HH-MM-SS.html
```

Each target gets its own directory, and each run is timestamped for easy organization and sorting. You can override the directory with -cd or the basename with -of.
If you specify --project <name>, Nokizaru can create a persistent workspace for a target using the Ronin Framework:
- stores run metadata and results internally (so you can build a target profile over time)
- enables caching (speeding up repeated runs)
- enables diffing between runs: `--diff last` (or `--diff <Run ID>`)
The following providers are planned for integration to enhance recon coverage and signal quality:
- GreyNoise: Internet noise classification to filter out mass-scanning activity and focus on targeted reconnaissance
All providers will follow Nokizaru's existing integration pattern: optional API keys, graceful degradation on failure, and consistent error reporting. These additions prioritize breadth of coverage and actionable intelligence to support the bug bounty/pentest recon workflow.
- Nokizaru is intended for authorized security testing and research. Always ensure you have explicit permission to scan targets you do not own.
- Nokizaru is licensed under the MIT License. If you reuse Nokizaru or redistribute derived work, ensure you preserve applicable license notices.
