
find HIDDEN urls!! (subdomain enumeration hacking) // ft. HakLuke

NetworkChuck · 5 min read

Based on NetworkChuck's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

Subdomain enumeration helps uncover hostnames that may point to entirely different servers and applications, expanding the attack surface under a domain.

Briefing

Hidden URLs aren’t just a curiosity—they’re often the difference between a secure website and one with exposed endpoints. Subdomain enumeration and URL discovery techniques help map those “extra” locations under a domain, such as learn.networkchuck.com, which may point to entirely different servers and applications. That matters because unused or misconfigured subdomains can become security liabilities, while bug bounty hunters use the same process to find attack surfaces worth reporting.
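
To see this in practice on a domain you control, a quick DNS lookup shows whether a subdomain resolves to a different host than the apex. This sketch is not from the video; learn.networkchuck.com is simply the example hostname the summary uses:

```bash
# Compare where the apex domain and a subdomain resolve;
# different answers usually mean different servers (and more attack surface).
dig +short networkchuck.com
dig +short learn.networkchuck.com
```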

The transcript frames this as a legitimate recon workflow when done with permission. In bug bounty programs, companies define a scope of domains and rules for what testers can probe; platforms like HackerOne then pay for validated vulnerabilities. Within that allowed scope, automated discovery can surface candidate targets—like hypothetical vulnerable subdomains—faster than manual browsing. The practical goal is to identify endpoints that might host vulnerabilities, then report findings through the program’s process.

Two tools anchor the workflow. hakrawler (by HakLuke) is presented as a web crawler that performs active enumeration: it takes one or more target websites, follows links recursively to a configurable depth, and discovers additional resources such as JavaScript files. Because it actually navigates the target’s web surface, it can be powerful enough to stress systems or violate rules if the program’s scope doesn’t permit active crawling. The transcript emphasizes caution: not every scope allows this kind of direct probing.
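
The summary doesn’t show exact commands, but a minimal sketch of an authorized crawl might look like this, assuming hakrawler is on the PATH; flag names vary between versions, so check `hakrawler -h` before relying on them:

```bash
# Crawl an in-scope target, following links two levels deep.
# -d sets crawl depth, -subs includes subdomains, -u de-duplicates output.
echo "https://example.com" | hakrawler -d 2 -subs -u | tee crawl-results.txt
```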

Installation is demonstrated using Docker, with Linux as the setup baseline. The approach uses Docker to pull or build the tool’s container image, then run it with parameters such as the target domain and optional flags (including a subdomain-focused mode). The output is described as fast and broad, including links discovered through external references like social profiles.
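
As a hedged reconstruction of that setup on a Debian-based system (the repository URL is hakrawler’s public GitHub repo; the image tag below is just a local name chosen for this sketch):

```bash
# Install Docker from the distribution repositories (one of several methods).
sudo apt update && sudo apt install -y docker.io

# Build the crawler's image from its repository, then run it against stdin.
git clone https://github.com/hakluke/hakrawler.git && cd hakrawler
docker build -t hakrawler .
echo "https://example.com" | docker run --rm -i hakrawler -subs
```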

The second tool, GAU (Get All URLs), is positioned as more passive. Instead of crawling a live site, GAU aggregates known URLs from existing datasets and archives. The transcript lists sources such as AlienVault’s Open Threat Exchange (OTX), the Wayback Machine, and Common Crawl (an open repository hosted on AWS infrastructure). That design reduces direct interaction with the target server while still producing large URL lists that can reveal hidden paths and resources.
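
A minimal sketch of that passive lookup, assuming the gau binary is installed (the flags follow gau’s documented usage, but verify against `gau --help` for your version):

```bash
# Pull known URLs for a domain from public archives; nothing here
# touches the target's own servers. --subs includes subdomain results.
gau --subs example.com > known-urls.txt
wc -l known-urls.txt  # count how many historical URLs the archives returned
```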

Beyond tool usage, the transcript highlights why these projects exist: developers build automation when existing tools don’t meet a specific need. HakLuke’s choice of Go is attributed to speed and native concurrency, while the broader community lesson is to tinker, fork, and extend open-source tools rather than waiting for a perfect solution. The closing guidance ties the workflow back to defensive and offensive value (discover what a domain exposes, fix what shouldn’t be public, or report what is vulnerable) while repeatedly stressing permission and ethical boundaries. A sponsor segment promotes using a VPN (Private Internet Access) for privacy during online activity, paired with claims of encryption and a no-log policy.

Cornell Notes

Subdomain enumeration and hidden-URL discovery help map the “extra” parts of a domain (often separate servers or applications) so security teams and bug bounty hunters can find exposed endpoints faster. hakrawler performs active enumeration by crawling a site recursively and discovering links and resources like JavaScript files; it can be powerful enough to require strict scope permission. GAU (Get All URLs) performs more passive enumeration by pulling known URLs from archives and threat-intel datasets such as the Wayback Machine, Common Crawl, and AlienVault’s Open Threat Exchange (OTX), without directly crawling the live site. Both tools are demonstrated via Docker, and the transcript emphasizes ethical use: only test targets with explicit permission. The broader takeaway is to learn from tool authors and build or extend your own automation when gaps exist.

What problem does subdomain enumeration solve, and why can it reveal security risk?

Subdomain enumeration identifies additional hostnames under a domain that aren’t obvious from the main site. A subdomain like learn.networkchuck.com can point to a completely different system or server than the primary domain, meaning it may run different applications, configurations, or legacy code. Unused or forgotten subdomains can become dangerous if they expose services or vulnerable endpoints. In bug bounty contexts, enumerating these endpoints helps testers find where vulnerabilities might exist within the program’s allowed scope.

How does hakrawler’s approach differ from GAU’s, and what does that mean for scope and risk?

hakrawler is active enumeration: it crawls the target by navigating through pages and following links to a specified depth. That means it generates real traffic to the target and can discover resources like JavaScript files, which is valuable for finding hidden functionality. GAU is more passive: it queries existing databases and archives for known URLs (e.g., the Wayback Machine, Common Crawl, AlienVault’s OTX) rather than crawling the live server. Active crawling typically demands stricter scope permission than passive aggregation.

What sources does GAU rely on to produce URL lists without crawling the target directly?

GAU aggregates URLs from known datasets and web archives. The transcript specifically mentions AlienVault’s Open Threat Exchange (OTX) as a source of known URLs, the Wayback Machine for historical snapshots, and Common Crawl as another dataset. Common Crawl is described as an open repository hosted on AWS servers, enabling GAU to retrieve large amounts of previously collected web data.
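
To see which archive contributes what, gau can be restricted to specific providers. The provider names below match the sources the transcript lists, but treat the exact flag syntax as an assumption to verify against `gau --help`:

```bash
# Query only selected data sources: the Wayback Machine, Common Crawl,
# and AlienVault OTX.
gau --providers wayback,commoncrawl,otx example.com
```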

Why is Docker recommended for installing and running these tools?

Docker provides a repeatable environment for running security tools without complex local dependency setup. The transcript shows installing Docker on a Linux (Debian-based) system, then building or pulling the tool’s container image and running it with command-line parameters. This keeps the workflow consistent across machines and reduces friction when trying multiple tools.

What practical workflow do these tools support for bug bounty or defensive testing?

A typical workflow is: (1) enumerate subdomains and hidden URLs for a domain within authorized scope, (2) collect candidate endpoints (paths, resources, and sometimes JavaScript files), and (3) investigate for vulnerabilities and report validated issues through the program’s process. The transcript notes that bug bounty programs define what’s in scope and what isn’t, and platforms like HackerOne provide the submission and payout mechanism once a vulnerability is confirmed.
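
Tying those three steps together, a hypothetical recon pipeline for an in-scope domain (example.com is a placeholder) might look like:

```bash
# 1) Gather candidate URLs passively, 2) filter for interesting endpoints,
# 3) keep a de-duplicated shortlist for manual vulnerability testing.
gau --subs example.com > urls.txt
grep -Ei '\.js(\?|$)' urls.txt | sort -u > js-files.txt       # JavaScript files
grep -Ei '(admin|login|api)' urls.txt | sort -u > review.txt  # likely endpoints
```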

What’s the community lesson about tool-building mentioned in the transcript?

When existing tools don’t match a specific need, developers can build their own automation. The transcript highlights the hacking/community mindset of tinkering and creating rather than waiting for perfect tooling. It also encourages learning from open-source authors, forking repositories, and adding features—while sharing improvements back to the community. The underlying message: start building even if the first version isn’t polished.

Review Questions

  1. When would active enumeration (like crawling) be inappropriate even if a tool can do it?
  2. Compare how hakrawler and GAU obtain URLs. Which one interacts with the live target more directly, and why does that matter?
  3. What kinds of security outcomes can come from discovering JavaScript files and historical URLs for a domain?

Key Points

  1. Subdomain enumeration helps uncover hostnames that may point to entirely different servers and applications, expanding the attack surface under a domain.
  2. Hidden URL discovery can support both defensive audits and bug bounty hunting, but only within explicitly permitted scope.
  3. hakrawler performs active enumeration by crawling pages recursively and extracting links and resources such as JavaScript files, which can generate real traffic to targets.
  4. GAU (Get All URLs) performs more passive enumeration by aggregating known URLs from archives and datasets like the Wayback Machine, Common Crawl, and AlienVault’s Open Threat Exchange (OTX).
  5. Docker is used to install and run both tools in a consistent environment, reducing local setup friction.
  6. Tool authors choose implementation details (like Go for concurrency) to improve performance and usability for large-scale discovery.
  7. The transcript emphasizes learning-by-building: fork existing tools, add features, and share improvements rather than waiting for a perfect solution.

Highlights

  • A subdomain can quietly expose a completely different system than the main domain, making enumeration a practical first step for finding vulnerable endpoints.
  • hakrawler’s active crawling can be powerful but may require stricter scope permission because it navigates the target directly.
  • GAU avoids live crawling by pulling known URLs from archives and threat-intel datasets, producing large results with less direct interaction.
  • Both tools are run via Docker, turning complex setup into a repeatable command-line workflow.
  • The core community message: automate repetitive recon tasks, then extend open-source tools when gaps remain.

Topics

  • Subdomain Enumeration
  • Hidden URLs
  • Active vs Passive Recon
  • Docker Tooling
  • Bug Bounty Scope