How to Use the Command 'feroxbuster' (with examples)

Feroxbuster is a robust and efficient tool for discovering hidden web content. Written in Rust, it is ideal for security professionals and testers who aim to uncover directories and files on web servers, effectively aiding in penetration testing operations. Its powerful features, such as multi-threading and recursive searching, make it a popular choice for fast and comprehensive web content discovery.

Use Case 1: Discover Specific Directories and Files with Extensions and 100 Threads Using a Random User-Agent

Code:

feroxbuster --url "https://example.com" --wordlist path/to/file --threads 100 --extensions "php,txt" --random-agent

Motivation:

This command is perfect when there is a need to discover potentially sensitive directories or files with specific extensions on a web server, such as .php for scripts and .txt for text files. Using a large number of threads speeds up the discovery process, especially when dealing with vast wordlists. The use of a random user-agent further helps avoid detection by monitoring that flags suspicious activity patterns in web server logs.

Explanation:

  • --url "https://example.com": Specifies the target URL where exploration is performed.
  • --wordlist path/to/file: Points to a file containing the wordlist used to brute-force directory and file names.
  • --threads 100: Utilizes 100 threads to execute the search concurrently, drastically reducing the time needed for discovery, especially for extensive wordlists.
  • --extensions "php,txt": Limits the search to paths ending with these specified file extensions.
  • --random-agent: Sends requests with a randomly chosen user-agent string, making the traffic harder to fingerprint as automated scanning in server logs.

Example Output:

200      1l       32w       220c https://example.com/hidden/admin.php
200      0l        0w         0c https://example.com/hidden/notes.txt

The output indicates HTTP status codes (200 for success), number of lines, words, and characters in the response, alongside discovered paths.
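A long scan is easier to review afterwards if the results are also written to disk. As a variation on the command above (same target and wordlist placeholders as the original example), feroxbuster's `--output` flag persists findings to a file:

```shell
# Same scan as above, but save results for later review.
# "path/to/file" is a placeholder wordlist, as in the original example;
# "ferox-results.txt" is an arbitrary output filename.
feroxbuster --url "https://example.com" \
            --wordlist path/to/file \
            --threads 100 \
            --extensions "php,txt" \
            --random-agent \
            --output ferox-results.txt
```

The file can then be grepped or diffed between scans without re-running the discovery.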

Use Case 2: Enumerate Directories Without Recursion Through a Specific Proxy

Code:

feroxbuster --url "https://example.com" --wordlist path/to/file --no-recursion --proxy "http://127.0.0.1:8080"

Motivation:

This command is suitable for scenarios where one is interested in the top-level directories only, without diving deeper into subdirectories, which is particularly useful when the focus is on analyzing the root web directory structure. Employing a proxy in the request path adds a level of indirection, allowing network traffic analysis or consistent IP use for requests.

Explanation:

  • --url "https://example.com": The target URL for discovery.
  • --wordlist path/to/file: Specifies the wordlist to use for discovering directory names.
  • --no-recursion: Halts the process from exploring subdirectories, confining the search to the initial set of provided directories.
  • --proxy "http://127.0.0.1:8080": Routes the requests through the specified proxy server, ideal for testing environments where traffic routing is critical.

Example Output:

301      9l       28w       184c https://example.com/hidden -> /hidden/
403      0l        0w         0c https://example.com/private

The results show HTTP responses with status codes, suggesting some directories might be accessible (301 for redirects) while others might be restricted (403 for Forbidden).
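When the proxy is an intercepting tool such as Burp Suite (which commonly listens on 127.0.0.1:8080), its self-signed TLS certificate will normally cause HTTPS requests to fail validation. A sketch of the same non-recursive scan, adding feroxbuster's `--insecure` flag for that situation:

```shell
# Non-recursive scan routed through a local intercepting proxy.
# --insecure disables TLS certificate verification, which is usually
# needed when the proxy re-signs HTTPS traffic with its own CA.
feroxbuster --url "https://example.com" \
            --wordlist path/to/file \
            --no-recursion \
            --proxy "http://127.0.0.1:8080" \
            --insecure
```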

Use Case 3: Extract Links from Response Bodies

Code:

feroxbuster --url "https://example.com" --extract-links

Motivation:

Web scraping or gathering intelligence about a website’s structure can necessitate collecting hyperlinks embedded within pages. This command allows users to extract potential pathways for exploration, particularly useful in reconnaissance phases where understanding site navigation paths is key.

Explanation:

  • --url "https://example.com": The website whose links you want to analyze.
  • --extract-links: Instructs Feroxbuster to parse pages for discoverable links, providing insight into where a site might lead from a given page.

Example Output:

https://example.com/about-us
https://example.com/contact
https://example.com/legal/privacy-policy

This output includes a list of discovered URLs on the given web page, outlining additional paths to further explore.
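Link extraction pairs naturally with a bounded recursion depth, so discovered links are followed without the scan fanning out indefinitely. A sketch using feroxbuster's `--depth` flag (the depth value here is an arbitrary choice):

```shell
# Parse responses for links and recurse into what is found,
# but stop two directory levels below the starting URL.
feroxbuster --url "https://example.com" \
            --extract-links \
            --depth 2
```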

Use Case 4: Filter Out Responses by Status Code and Size

Code:

feroxbuster --url "https://example.com" --filter-status 301 --filter-size 4092

Motivation:

Filtering responses by status code and content size is essential when scanning large sites, as it lets security testers ignore irrelevant or redundant responses. For example, blanket redirects or identically sized catch-all pages can be suppressed, streamlining the task of finding useful information.

Explanation:

  • --url "https://example.com": The website to target for scanning.
  • --filter-status 301: Excludes responses with an HTTP 301 status code, which typically indicates a redirect.
  • --filter-size 4092: Omits responses with a content length of 4092, handy for suppressing a server's uniform default or error pages.

Example Output:

302      8l       29w       402c https://example.com/account/login
404      0l        0w         0c https://example.com/old-page

The listed output excludes responses matching the defined filters, leaving results that are more likely to be genuinely interesting rather than redirects or boilerplate content.
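Filters can also be stacked: feroxbuster additionally supports filtering on word count (`--filter-words`) and line count (`--filter-lines`), which helps when a server's error pages vary slightly in byte size but share the same shape. A sketch combining several filters:

```shell
# Drop redirects, one specific body size, and any zero-word responses.
feroxbuster --url "https://example.com" \
            --filter-status 301 \
            --filter-size 4092 \
            --filter-words 0
```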

Conclusion:

Feroxbuster is a versatile tool that empowers security professionals to discover hidden web paths efficiently. By leveraging various command options like multi-threading, random user-agents, proxies, content filtering, and link extraction, Feroxbuster becomes an indispensable asset in penetration testing and reconnaissance, offering rich insights into web server architecture and accessibility.
