Robofinder

Robofinder is a Python script that allows you to search for and retrieve historical robots.txt files for any given website using Archive.org. This tool is particularly useful for security researchers and web archivists to discover previously accessible paths or directories that were once listed in a site's robots.txt.

Features

  • Fetch historical robots.txt files from Archive.org.
  • Extract and display old paths or directories that were once disallowed or listed.
  • Option to save the output to a file.
  • Silent mode for unobtrusive execution.
  • Multi-threading support to speed up the search process.
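The workflow behind these features can be sketched in a few lines of Python: query Archive.org's Wayback CDX API for archived captures of a site's robots.txt, then parse the Allow/Disallow directives out of each snapshot. This is an illustrative sketch, not Robofinder's actual source; the CDX endpoint and its parameters are the standard Wayback Machine API, but the helper names (`snapshot_urls`, `extract_paths`) are hypothetical.

```python
import re
import urllib.request

CDX_API = "https://web.archive.org/cdx/search/cdx"

def snapshot_urls(domain):
    # Ask the Wayback CDX API for every successful capture of the
    # site's robots.txt, deduplicated by content digest.
    query = (f"{CDX_API}?url={domain}/robots.txt"
             "&output=text&fl=timestamp&filter=statuscode:200&collapse=digest")
    with urllib.request.urlopen(query) as resp:
        stamps = resp.read().decode().split()
    # Rebuild a fetchable Wayback URL for each capture timestamp.
    return [f"https://web.archive.org/web/{ts}/https://{domain}/robots.txt"
            for ts in stamps]

def extract_paths(robots_txt):
    # Pull the path portion of every Allow/Disallow directive.
    paths = []
    for line in robots_txt.splitlines():
        m = re.match(r"(?i)\s*(?:dis)?allow\s*:\s*(\S+)", line)
        if m:
            paths.append(m.group(1))
    return paths
```

Fetching each snapshot URL and running its body through `extract_paths` yields the historical path list; deduplicating across snapshots and doing the fetches from a thread pool corresponds to the tool's multi-threading option.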

Installation

Clone the repository and install the required dependencies:

git clone https://github.com/Spix0r/robofinder.git
cd robofinder
pip install -r requirements.txt

Usage

Run the program by providing a URL with the -u flag:

python3 robofinder.py -u https://example.com

Additional Options

  • Save output to file:
    python3 robofinder.py -u https://example.com -o results.txt
  • Concatenate paths with site URL:
    python3 robofinder.py -u https://example.com -c
  • Run in silent mode (no console output):
    python3 robofinder.py -u https://example.com --silent
  • Multi-threading (10 threads is a reasonable default):
    python3 robofinder.py -u https://example.com -t 10 -c -o results.txt

Example

Running Robofinder on example.com with 10 threads and saving the results to results.txt:

python3 robofinder.py -u https://example.com -t 10 -o results.txt
