
Zeus is an advanced reconnaissance utility designed to make web application reconnaissance simple. Zeus comes complete with a powerful built-in URL parsing engine, multiple search engine compatibility, the ability to extract target URLs from both Google ban URLs and webcache URLs, the ability to run multiple vulnerability assessments on the target, and the ability to bypass search engine captchas.

Features
  • A powerful built in URL parsing engine
  • Multiple search engine compatibility (DuckDuckGo, AOL, Bing, and Google; the default is Google)
  • Ability to extract the URL from Google's ban URL thus bypassing IP blocks
  • Ability to extract from Google's webcache URL
  • Proxy compatibility (http, https, socks4, socks5)
  • Tor proxy compatibility and Tor browser emulation
  • Parse robots.txt/sitemap.xml and save them to a file
  • Multiple vulnerability assessments (XSS, SQLi, clickjacking, port scanning, admin panel finding, whois lookups, and more)
  • Tamper scripts to obfuscate XSS payloads
  • Can run with a custom default user-agent, one of over 4000 random user-agents, or a personal user-agent
  • Automatic issue creation when an unexpected error arises
  • Ability to crawl a webpage and pull all the links
  • Can run a singular dork, multiple dorks in a given file, or a random dork from a list of over 5000 carefully researched dorks
  • Dork blacklisting: when no sites are found with the search query, the query is saved to a blacklist file
  • Identify WAF/IPS/IDS protection of over 20 different firewalls
  • Header protection enumeration to check what kind of protection is provided via HTTP headers (see the sketch after this list)
  • Saving cookies, headers, and other vital information to log files
  • and much more...
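
As an illustration of the header-protection check mentioned above, here is a minimal sketch, assuming the requests package and a placeholder target URL; it is not Zeus's actual code, it simply reports which common security headers a site sends:

import requests

# Common protective headers a hardened site is expected to send.
SECURITY_HEADERS = [
    "X-Frame-Options",            # clickjacking protection
    "Content-Security-Policy",    # XSS mitigation
    "Strict-Transport-Security",  # HTTPS enforcement
    "X-Content-Type-Options",     # MIME-sniffing protection
]

response = requests.get("http://example.com")  # placeholder target
for header in SECURITY_HEADERS:
    print(header, "->", response.headers.get(header, "not set"))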

Screenshots

Running without mandatory options, or running with the --help flag, will output Zeus's help menu:


A basic dork scan with the -d flag will launch an automated browser for the given dork and pull the Google page results:


Calling the -s flag will prompt you to start the sqlmap API server (python sqlmapapi.py -s) from sqlmap; Zeus will then connect to the API and perform a sqlmap scan on the found URLs.
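
For context, the sqlmap REST-JSON API started by python sqlmapapi.py -s listens on 127.0.0.1:8775 by default. The snippet below is a minimal sketch of that hand-off, not Zeus's actual code; the target URL is a placeholder:

import requests

API = "http://127.0.0.1:8775"  # default address of `python sqlmapapi.py -s`

# Create a new task, start a scan against a placeholder URL, then poll its status.
task_id = requests.get(API + "/task/new").json()["taskid"]
requests.post(API + "/scan/{0}/start".format(task_id),
              json={"url": "http://example.com/page.php?id=1"})
print(requests.get(API + "/scan/{0}/status".format(task_id)).json())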


You can see more screenshots here

Demo


Requirements
There are some requirements for Zeus to run successfully.

Basic requirements
  • libxml2-dev, libxslt1-dev, python-dev are required for the installation process
  • The Firefox web browser is required as of now; you will need a Firefox version between 51 and 57 (>=51, <=57). Full functionality for other browsers will eventually be added.
  • If you want to run sqlmap through the URLs, you will need sqlmap somewhere on your system.
  • If you want to run a port scan using nmap on the URLs' IP addresses, you will need nmap on your system.
  • Geckodriver is required to run the Firefox web browser and will be installed the first time you run Zeus. It will be added to /usr/bin so that it can be found in your PATH.
  • You must run as sudo the first time so that the driver can be added to your PATH; depending on your permissions, you may also need sudo for later runs. NOTE: Depending on permissions, you may need sudo for any run involving the geckodriver.
  • xvfb is required by pyvirtualdisplay; it will be installed on your first run if it is not already present.

Python package requirements
  • The selenium-webdriver package is required to automate the web browser and bypass API calls (the sketch after this list shows how these packages fit together)
  • The requests package is required to connect to the URL and to the sqlmap API
  • The python-nmap package is required to run nmap on the URLs' IP addresses
  • The whichcraft package is required to check whether nmap and sqlmap are on your system if you want to use them
  • The pyvirtualdisplay package is required to hide the browser display while finding the search URL
  • lxml is required to parse XML data for the sitemap and save it as such
  • psutil is required to search for running sqlmap API sessions
  • beautifulsoup is required to pull all the HREF descriptor tags and parse the HTML into an easily workable syntax
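
To show how these packages fit together, here is a minimal sketch, not Zeus's actual code, that hides Firefox with pyvirtualdisplay, drives it through Selenium and geckodriver, and pulls the result links with BeautifulSoup and lxml; the query is a placeholder dork:

from pyvirtualdisplay import Display
from selenium import webdriver
from bs4 import BeautifulSoup

display = Display(visible=0, size=(1024, 768))  # needs xvfb installed
display.start()
browser = webdriver.Firefox()                   # needs geckodriver in your PATH
browser.get("https://www.google.com/search?q=inurl:show.php?id=")  # placeholder dork
soup = BeautifulSoup(browser.page_source, "lxml")
links = [a["href"] for a in soup.find_all("a", href=True)]
browser.quit()
display.stop()
print("\n".join(links))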

Installation
You can download the latest tar.gz, the latest zip, or you can find the current stable release here. Alternatively you can install the latest development version by following the instructions that best match your operating system:
NOTE: (optional but highly advised) add sqlmap and nmap to your environment PATH by moving them to /usr/bin or by adding them to the PATH via terminal

Ubuntu/Debian
sudo apt-get install libxml2-dev libxslt1-dev python-dev && git clone https://github.com/ekultek/zeus-scanner.git && cd zeus-scanner && sudo pip2 install -r requirements.txt && sudo python zeus.py

CentOS
sudo yum install gcc python-devel libxml2-devel libxslt-devel && git clone https://github.com/ekultek/zeus-scanner.git && cd zeus-scanner && sudo pip2 install -r requirements.txt && sudo python zeus.py

Others
Install gcc, libxml2, libxslt, and the Python development headers with your system's package manager, then run: git clone https://github.com/ekultek/zeus-scanner.git && cd zeus-scanner && sudo pip2 install -r requirements.txt && sudo python zeus.py
This will install all the package requirements along with the geckodriver


Zeus-Scanner - Advanced Reconnaissance Utility


Striker is an offensive information and vulnerability scanner.

Features
Just supply a domain name to Striker and it will automatically do the following for you:
  • Check and Bypass Cloudflare
  • Retrieve Server and Powered by Headers
  • Fingerprint the operating system of the web server
  • Detect CMS (197+ CMSs are supported)
  • Launch WPScan if target is using WordPress
  • Retrieve robots.txt
  • Check if the target is a honeypot
  • Port scan with banner grabbing (see the sketch after this list)
  • Dump all kinds of DNS records
  • Generate a map for visualizing the attack surface
  • Gather Emails related to the target
  • Find websites hosted on the same web server
  • Find hosts using Google
  • Crawl the website for URLs having parameters
  • SQLi scan using an online implementation of SQLMap (takes < 3 min.)
  • Basic XSS scanning
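
As an illustration of the banner-grabbing step mentioned in the list, here is a minimal sketch using only Python's socket module; it is not Striker's actual code, and the host and port are placeholders:

import socket

def grab_banner(host, port, timeout=3):
    # Connect and return whatever greeting the service sends first, if any.
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
    except OSError:
        return None
    try:
        sock.settimeout(timeout)
        return sock.recv(1024).decode("utf-8", "replace").strip()
    except (socket.timeout, OSError):
        return None
    finally:
        sock.close()

print(grab_banner("example.com", 21))  # placeholder host/port; FTP servers usually greet first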

Screenshots





Striker - Offensive Information And Vulnerability Scanner


There are some features that we think SQLMap should have, like finding the admin panel of the target, better hash cracking, etc. If you think the same, SQLMate is for you.

What it does
  • Feed it a SQL injection dork via the --dork option and it will find vulnerable sites for you. After that, it will try to find their admin panels and also try to bypass them with SQL queries.
  • It can do very fast hash lookups for MD5, SHA1 and SHA2. You can supply a hash with the --hash option; the average lookup takes less than 2 seconds (a sketch of the hash-type check follows below).
  • You can also supply a txt file containing hashes to be cracked with the --list option.
  • The first mode just checks the 13 most common admin panel locations, but if you feed a website through the --admin option, you can do a full scan using 482 paths.
  • SQLMate has the ability to scrape dorks as well. Specify the dumping level via the --dump option. Using --dump 1 will dump nearly 20 dorks, so set the level anywhere between 1 and 184 as per your needs. SQLMate automatically saves the dorks into a txt file so you can use them later.
Scroll down for more.
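
Before a hash can be looked up, its type has to be recognised. The sketch below shows one common way to guess the type from the digest length; it is an illustration, not SQLMate's actual code:

import re

# Hex digest lengths for the hash families the lookup supports (SHA2 = SHA256/SHA512).
HASH_LENGTHS = {32: "MD5", 40: "SHA1", 64: "SHA256", 128: "SHA512"}

def guess_hash_type(digest):
    digest = digest.strip().lower()
    if not re.match(r"^[0-9a-f]+$", digest):
        return None
    return HASH_LENGTHS.get(len(digest))

print(guess_hash_type("d41d8cd98f00b204e9800998ecf8427e"))  # prints "MD5"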

Screenshots




Running SQLMate
Enter the following command in terminal to download SQLMate
git clone https://github.com/UltimateHackers/sqlmate
Then navigate to the sqlmate directory by entering this command
cd sqlmate
Now install the required modules
pip install -r requirements.txt
Now run sqlmate
python sqlmate

Available command line options
usage: sqlmate [-h] [--dork DORK] [--hash HASH] [--list <path>]
               [--dump 1-184] [--admin URL] [--type PHP,ASP,HTML]

optional arguments:
  -h, --help           show this help message and exit
  --dork DORK          Supply a dork and let SQLMate do its thing
  --hash HASH          'Crack' a hash in 5 secs
  --list <path>        Import and crack hashes from a txt file
  --dump 1-184         Get dorks. Specify dumping level. Level 1 = 20 dorks
  --admin URL          Find admin panel of website
  --type PHP,ASP,HTML  Choose extension to scan (Use with --admin option,
                       Default is all)


sqlmate - Tool which will do what you always expected from SQLmap


Selenium-powered Python script to automate searching the web for vulnerable applications.
DorkNet can take a single dork or a list of dorks as arguments. After the proper command line arguments have been passed, the script will use Selenium and Geckodriver to find the results we want and save them to a text file for further processing with SQLmap or similar utilities.
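
Once the results file exists, the saved URLs can be handed to SQLmap. The snippet below is a minimal sketch of that hand-off, not part of DorkNet itself; the file name is a placeholder and it assumes the sqlmap executable is on your PATH (sqlmap's own -m option can also read a file of targets directly):

import subprocess

# Placeholder file name; DorkNet saves its findings to a text file of URLs.
with open("results.txt") as handle:
    urls = [line.strip() for line in handle if line.strip()]

for url in urls:
    # --batch makes sqlmap accept its default answers non-interactively.
    subprocess.call(["sqlmap", "-u", url, "--batch"])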

Usage
git clone https://github.com/NullArray/DorkNet.git
cd DorkNet
python dorknet.py
The options for the program are as follows.
-h, --help              show this help message and exit
-d DORK, --dork DORK    specify the dork you wish to use
-l LIST, --list LIST    specify path to list with dorks
-v, --verbose           toggle verbosity
Some examples for clarity.
DorkNet.py -h
DorkNet.py -d inurl:show.php?id= -v
DorkNet.py -l /path/to/list.txt --verbose

Dependencies
You will need the Mozilla Geckodriver for this to work. After it has been installed, feel free to use the requirements file I made for this program:
pip install -r requirements.txt

Known Issue
By using Selenium and Geckodriver, DorkNet is effective at emulating a regular browser. In this manner the program is able to avoid captchas most of the time. However, on limited occasions, Google throws one regardless. The same sometimes happens when manually searching for strings that look like a dork. Should you encounter one, you can just fill out the captcha in the Geckodriver window and DorkNet will continue its normal operation.


DorkNet - Selenium Powered Python Script To Automate Searching For Vulnerable Web Apps