This script performs automated searches on Searx using dorks, extracts URLs, and saves them to a file. It leverages Tor for anonymity and includes features like IP rotation and user-agent randomization.
Ensure you have Python 3.x and pip
installed on your system.
Installation: clone the repository and enter the directory:
git clone https://github.com/bitkeyhash/Dorkx-Tor.git && cd Dorkx-Tor
Run the following command to install the required Python libraries:
pip install -r requirements.txt
Install Tor using the following command:
sudo apt update && sudo apt install tor -y
Make sure the Tor service is running:
sudo service tor start
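To confirm Tor is actually reachable before running the script, a quick check like the one below can help. It assumes Tor's default SOCKS port 9050; adjust it if your torrc uses something else.

```bash
# Check the service and confirm requests actually exit through Tor
# (9050 is Tor's default SOCKS port; change it if your torrc differs).
sudo service tor status
curl --socks5-hostname 127.0.0.1:9050 https://check.torproject.org/api/ip
```

The second command should report "IsTor": true if traffic is being routed through the Tor network.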
- Prepare Dork File: Use the default dorks.txt or create a dorks.txt file in the same directory as the script, with one dork per line (see the example after these steps).
- Run the Script: Execute the script using:
chmod +x main.sh && bash main.sh
Note: a filtering step runs at the end of the script; the final output file is sqlurl.txt.
- Follow the prompts to specify the number of pages to scrape per dork.
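For reference, a dorks.txt is just a plain text list with one dork per line. The entries below are illustrative examples only, not the repository's default list:

```text
inurl:product.php?id=
inurl:news.php?id=
inurl:category.php?id=
```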
- dorks.txt: Input file containing all search dorks.
- urls.txt: Output file where extracted URLs are saved.
- sqlurl.txt: Output file where extracted URLs filtered for the ?id= format are saved.
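The exact filtering step is implemented in the repository's own script; conceptually it amounts to something like the following sketch (not the script's actual code), which keeps only URLs carrying an ?id= parameter and removes duplicates:

```bash
# Illustrative equivalent of the final filtering step (assumed, not the repo's code):
# keep only URLs containing "?id=" and de-duplicate them.
grep -F '?id=' urls.txt | sort -u > sqlurl.txt
```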
- Tor Integration: Ensures anonymity by routing traffic through Tor.
- IP Rotation: Automatically changes IP when rate-limited (see the sketch after this list).
- Customizable Headers and Cookies: Mimics real browser behavior.
- Output Filter (Final URL File): Saves unique filtered URLs to sqlurl.txt.
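The actual rotation and header logic lives in main.sh and its Python helpers; the sketch below only illustrates the general idea, assuming Tor's default SOCKS port 9050 and a control port on 9051 with password authentication enabled in /etc/tor/torrc. All names, ports, and values here are assumptions, not code taken from this repository.

```bash
#!/usr/bin/env bash
# Sketch of Tor-based IP rotation with a randomized User-Agent.
# Assumes: SocksPort 9050, ControlPort 9051, and HashedControlPassword set in torrc.

USER_AGENTS=(
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
  "Mozilla/5.0 (X11; Linux x86_64; rv:115.0) Gecko/20100101 Firefox/115.0"
)

rotate_ip() {
  # Ask Tor for a fresh circuit (new exit IP). Replace "yourpassword"
  # with the control-port password you configured.
  printf 'AUTHENTICATE "yourpassword"\r\nSIGNAL NEWNYM\r\nQUIT\r\n' \
    | nc 127.0.0.1 9051 > /dev/null
  sleep 5   # give Tor time to build the new circuit
}

fetch_page() {
  local url="$1"
  local ua="${USER_AGENTS[RANDOM % ${#USER_AGENTS[@]}]}"
  # Route the request through the Tor SOCKS proxy with a random User-Agent.
  curl -s --socks5-hostname 127.0.0.1:9050 -A "$ua" "$url"
}

# Example: fetch once, rotate, fetch again from a new exit node.
fetch_page "https://check.torproject.org/api/ip"
rotate_ip
fetch_page "https://check.torproject.org/api/ip"
```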
- Ensure Tor is installed and running for proper functionality.
- Use responsibly and adhere to legal and ethical guidelines.