
Tool to identify if a domain is a CMS such as Wordpress, Moodle, Joomla, Drupal or Prestashop.

Use
python cmssc4n.py -h 
_____ __ __ _____ _ _
/ ____| \/ |/ ____| | || |
| | | \ / | (___ ___ ___| || |_ _ __
| | | |\/| |\___ \/ __|/ __|__ _| '_ \
| |____| | | |____) \__ \ (__ | | | | | |
\_____|_| |_|_____/|___/\___| |_| |_| |_|

** Tool to scan if a domain is a CMS (Wordpress , Drupal, Joomla, Prestashop or Moodle) and return the version
** Author: Ignacio Brihuega Rodriguez a.k.a N4xh4ck5
** Version 1.0
** DISCLAIMER: This tool was developed for educational purposes.
** The author is not responsible for any other use.
** With great power comes great responsibility!
usage: cmssc4n.py [-h] -e EXPORT -i INPUT

This tool verifies if the domain is a CMS (Wordpress , Drupal, Joomla, Prestashop or Moodle) and returns the version

optional arguments:
-h, --help show this help message and exit
-e EXPORT, --export EXPORT
Indicate the type of format to export results.
1.json (by default)
2.xlsx
-i INPUT, --input INPUT
File in JSON format containing the domains to check for a CMS
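One common CMS-fingerprinting heuristic is reading the `<meta name="generator">` tag from a site's homepage. The sketch below illustrates that idea only; it is an assumption for illustration, not CMSsc4n's actual detection code, which may use additional checks.

```python
# Hedged sketch of generator-tag CMS fingerprinting; not CMSsc4n's own logic.
import re

def cms_from_html(html):
    """Return (cms, version) parsed from a generator meta tag, or None."""
    m = re.search(r'<meta name="generator" content="([A-Za-z]+)\s*([\d.]*)"',
                  html, re.IGNORECASE)
    if not m:
        return None
    return m.group(1), m.group(2) or "unknown"

# Hypothetical page content for demonstration.
page = '<html><head><meta name="generator" content="WordPress 4.9.1"></head></html>'
print(cms_from_html(page))  # -> ('WordPress', '4.9.1')
```

In practice a scanner would also probe CMS-specific paths (e.g. login pages) since many sites strip the generator tag.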


CMSsc4n - Tool to identify if a domain is a CMS such as Wordpress, Moodle, Joomla, Drupal or Prestashop


WebDavC2 is a PoC that uses the WebDAV protocol, with PROPFIND-only requests, as a C2 communication channel between an agent running on the target system and a controller acting as the actual C2 server.

Architecture
WebDavC2 is composed of:
  • a controller, written in Python, which acts as the C2 server
  • an agent, written in C#/.Net, running on the target system, delivered to the target system via various initial stagers
  • various flavors of initial stagers (created on the fly when the controller starts) used for the initial compromission of the target system
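The architecture above can be illustrated with a minimal sketch of the PROPFIND channel: a server embeds tasking inside an ordinary-looking WebDAV multistatus reply, and the agent extracts it from the response. This is an assumed, simplified illustration of the protocol idea, not WebDavC2's actual implementation.

```python
# Toy PROPFIND-based C2 channel: controller side embeds a command in a
# 207 Multi-Status reply; agent side polls with PROPFIND and parses it.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

COMMAND = "whoami"  # hypothetical tasking the controller wants executed

class PropfindHandler(BaseHTTPRequestHandler):
    def do_PROPFIND(self):
        # Hide the tasking inside an otherwise normal WebDAV response body.
        body = (
            '<?xml version="1.0"?>'
            '<D:multistatus xmlns:D="DAV:">'
            f'<D:response><D:href>/{COMMAND}</D:href></D:response>'
            '</D:multistatus>'
        ).encode()
        self.send_response(207)  # 207 Multi-Status is the normal PROPFIND reply
        self.send_header("Content-Type", "application/xml")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # silence request logging
        pass

def agent_poll(port):
    """Agent side: issue a PROPFIND and extract the tasking from the reply."""
    conn = http.client.HTTPConnection("127.0.0.1", port)
    conn.request("PROPFIND", "/", headers={"Depth": "0"})
    data = conn.getresponse().read().decode()
    conn.close()
    return data.split("<D:href>/")[1].split("</D:href>")[0]

server = HTTPServer(("127.0.0.1", 0), PropfindHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
print(agent_poll(server.server_address[1]))  # -> whoami
server.shutdown()
```

The real tool additionally handles encoding of results back to the controller and rides on Windows' WebClient service rather than a raw HTTP client.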

Features
WebDavC2 main features:
  • Various stagers (PowerShell one-liner, batch file, different types of MS-Office macro, JScript file) - this list is not exhaustive; you can easily come up with your own stagers, check the templates folder to get an idea
  • Pseudo-interactive shell (with environment persistency)
  • Auto start of the WebClient service, even from an unprivileged user using the 'pushd' trick

Installation & Configuration
Installation is pretty straight forward:
  • Git clone this repository:
    git clone https://github.com/Arno0x/WebDAVC2 WebDavC2
  • cd into the WebDavC2 folder:
    cd WebDavC2
  • Give the execution rights to the main script:
    chmod +x webDavC2.py
To start the controller, simply type:
./webDavC2.py

Compiling your own agent
Although it is perfectly OK to use the provided agent.exe, you can very easily compile your own executables of the agent, from the source code provided. You don't need Visual Studio installed.
  • Copy the
    agent/agent.cs
    file to a Windows machine with the .NET Framework installed
  • CD into the source directory
  • Use the .Net command line C# compiler:
    • To get the standard agent executable:
      C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe /out:agent.exe *.cs
    • To get the debug version:
      C:\Windows\Microsoft.NET\Framework64\v4.0.30319\csc.exe /define:DEBUG /out:agent_debug.exe *.cs


WebDavC2 - A WebDAV C2 Tool


These tools work from the Windows terminal command prompt. Published on 01-12-2017; thanks to the friends who supported this project.

Supported With Command Features:

- Admin Panel Finder 

The admin panel is the place where administrators manage site content.

Command Usage : 01

- Dork

A dork (short for Google dork) is a search query built from advanced Google operators such as inurl:, intext:, allinurl: and site: that helps locate potentially vulnerable targets. If you have ever read a carding or defacement tutorial, you have probably come across the term. Dorks are commonly used to find targets for SQL injection, defacement and similar attacks, and anyone with a bit of creativity can craft their own.

Command Usage : 02

- Whois Lookup

Whois, pronounced "who is", is used to obtain domain-specific information such as the domain name, IP address, name servers and domain age. Whois lookup is a command-line application used to query the whois database.

Command Usage : 03

- Port Scanner

In the TCP/IP protocol suite, a port is a mechanism that allows a computer to support multiple connection sessions with other computers and programs on the network. Ports identify the applications and services that use connections within a TCP/IP network.

Command Usage : 04
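The simplest form of port scanning described above is a TCP connect scan: try to open a connection and see whether it succeeds. The sketch below shows that technique with Python's standard library; it is an illustration of the idea, not HaxorScan's own code. Only scan hosts you are authorized to test.

```python
# Minimal TCP connect scan using only the standard library.
import socket

def port_is_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0  # 0 means the connect succeeded

# Example: probe a few common service ports on the local machine.
for port in (22, 80, 443):
    print(port, "open" if port_is_open("127.0.0.1", port) else "closed")
```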

- Robots.txt Finder 

Robots.txt is a file at the root of a site that indicates which parts of the site search-engine crawlers are not allowed to access. The file uses the Robots Exclusion Standard, a protocol with a small set of directives that can restrict access to the site by section and by type of web crawler (such as mobile crawlers versus desktop crawlers).

Command Usage : 05
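Once a robots.txt finder has fetched the file, the rules it evaluates look like the following, here shown with Python's standard-library parser on a made-up robots.txt:

```python
# Parse a sample robots.txt with the stdlib Robots Exclusion Standard parser.
from urllib import robotparser

robots_txt = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "/admin/login.php"))  # False: crawlers are excluded
print(rp.can_fetch("*", "/index.html"))       # True: no rule blocks this path
```

For pentesters the interest is the reverse of the crawler's: disallowed paths often point at admin areas or other content the site owner wanted hidden.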

- Proxy Checker

A proxy is a server that forwards user requests to other servers on the internet; in other words, a proxy server is a server or program that acts as an intermediary between a computer and the internet.

Download HaxorScan 1.0 - a multi-tool for scanning website information, written in Python


ProcDump is a Linux reimagining of the classic ProcDump tool from the Sysinternals suite of tools for Windows. ProcDump provides a convenient way for Linux developers to create core dumps of their application based on performance triggers.

Installation & Usage

Requirements
  • Minimum OS: Ubuntu 14.04 LTS (Desktop or Server)
    • We are actively testing against other Linux distributions. If you have requests for specific distros, please let us know (or create a pull request with the necessary changes).
  • gdb (>=7.7.1)

Install ProcDump

Via Package Manager [preferred method]

1. Add the Microsoft Product feed
curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg

Register the Microsoft Product feed

Ubuntu 16.04
sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-xenial-prod xenial main" > /etc/apt/sources.list.d/microsoft.list'

Ubuntu 14.04
sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-trusty-prod trusty main" > /etc/apt/sources.list.d/microsoft.list'

2. Install Procdump
sudo apt-get update
sudo apt-get install procdump

Via .deb Package
Pre-Depends: dpkg(>=1.17.5)

1. Download .deb Package

Ubuntu 16.04
wget https://packages.microsoft.com/repos/microsoft-ubuntu-xenial-prod/pool/main/p/procdump/procdump_1.0_amd64.deb

Ubuntu 14.04
wget https://packages.microsoft.com/repos/microsoft-ubuntu-trusty-prod/pool/main/p/procdump/procdump_1.0_amd64.deb

2. Install Procdump
sudo dpkg -i procdump_1.0_amd64.deb
sudo apt-get -f install

Uninstall

Ubuntu 14.04+
sudo apt-get purge procdump

Usage
Usage: procdump [OPTIONS...] TARGET
OPTIONS
-C CPU threshold at which to create a dump of the process from 0 to 200
-c CPU threshold below which to create a dump of the process from 0 to 200
-M Memory commit threshold in MB at which to create a dump
-m Trigger when memory commit drops below specified MB value.
-n Number of dumps to write before exiting
-s Consecutive seconds before dump is written (default is 10)
TARGET must be exactly one of these:
-p pid of the process

Examples
The following examples all target a process with pid == 1234
The following will create a core dump immediately.
sudo procdump -p 1234
The following will create 3 core dumps 10 seconds apart.
sudo procdump -n 3 -p 1234
The following will create 3 core dumps 5 seconds apart.
sudo procdump -n 3 -s 5 -p 1234
The following will create a core dump each time the process has CPU usage >= 65%, up to 3 times, with at least 10 seconds between each dump.
sudo procdump -C 65 -n 3 -p 1234
The following will create a core dump each time the process has CPU usage >= 65%, up to 3 times, with at least 5 seconds between each dump.
sudo procdump -C 65 -n 3 -s 5 -p 1234
The following will create a core dump when CPU usage is outside the range [10,65].
sudo procdump -c 10 -C 65 -p 1234
The following will create a core dump when CPU usage is >= 65% or memory usage is >= 100 MB.
sudo procdump -C 65 -M 100 -p 1234
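The memory trigger (-M) boils down to periodically comparing a process's committed memory against a threshold. The sketch below shows the kind of check involved by reading /proc/<pid>/status on Linux; the choice of the VmSize field and the MB conversion are assumptions for illustration, not ProcDump's actual trigger code.

```python
# Rough sketch of a memory-threshold check of the kind behind procdump -M.
import os

def committed_mb(pid):
    """Return the VmSize value for pid, in MB, read from /proc/<pid>/status."""
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            if line.startswith("VmSize:"):
                kb = int(line.split()[1])  # the kernel reports the value in kB
                return kb / 1024
    return 0.0

threshold_mb = 100
usage = committed_mb(os.getpid())
if usage >= threshold_mb:
    print(f"threshold crossed ({usage:.1f} MB) - a dump would be written here")
```

The real tool uses gdb to write the core dump once a trigger fires, which is why gdb is listed as a requirement.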


ProcDump for Linux - A Linux version of the ProcDump Sysinternals tool


Quasar is a fast and lightweight remote administration tool coded in C#. Providing high stability and an easy-to-use user interface, Quasar is the perfect remote administration solution for you.

Features
  • TCP network stream (IPv4 & IPv6 support)
  • Fast network serialization (NetSerializer)
  • Compressed (QuickLZ) & Encrypted (AES-128) communication
  • Multi-Threaded
  • UPnP Support
  • No-Ip.com Support
  • Visit Website (hidden & visible)
  • Show Messagebox
  • Task Manager
  • File Manager
  • Startup Manager
  • Remote Desktop
  • Remote Webcam
  • Remote Shell
  • Download & Execute
  • Upload & Execute
  • System Information
  • Computer Commands (Restart, Shutdown, Standby)
  • Keylogger (Unicode Support)
  • Reverse Proxy (SOCKS5)
  • Password Recovery (Common Browsers and FTP Clients)
  • Registry Editor

Requirements
  • .NET Framework 4.0 Client Profile (Download)
  • Supported Operating Systems (32- and 64-bit)
    • Windows XP SP3
    • Windows Server 2003
    • Windows Vista
    • Windows Server 2008
    • Windows 7
    • Windows Server 2012
    • Windows 8/8.1
    • Windows 10

Compiling
Open the project in Visual Studio and click build, or use one of the batch files included in the root directory.
Batch file Description
build-debug.bat Builds the application in the debug configuration (for testing)
build-release.bat Builds the application in the release configuration (for publishing)

Building the Client
Build configuration Description
debug configuration The pre-defined Settings.cs will be used. The client builder does not work in this configuration; you can run the client directly with the specified settings.
release configuration Use the client builder; otherwise the client will crash.


QuasarRAT - Remote Administration Tool for Windows


w3af is an open source web application security scanner which helps developers and penetration testers identify and exploit vulnerabilities in their web applications.
The scanner is able to identify 200+ vulnerabilities, including Cross-Site Scripting, SQL injection and OS commanding.


Identify and exploit a SQL injection

One of the most difficult parts of securing your application is to identify the vulnerable parameters and define the real risk. This video shows how to easily identify and exploit SQL injection vulnerabilities. As bonus the video shows how to extract information using web application payloads.

Batteries included

Want to know more about the low-level features provided by our framework? Go through our features page in order to understand what’s under the hood.


Plugin architecture


Vulnerabilities are identified using plugins, which are short and sweet pieces of Python code that send specially crafted HTTP requests to forms and query string parameters to identify errors and mis-configurations.


Flexible

Easy to use for novice users, fully customizable for hackers and developers. We’ve built it that way.

Expert tools

Besides the automated scanning features w3af’s GUI provides expert tools which allow the advanced users to manually craft and send custom HTTP requests, generate requests in an automated manner, cluster HTTP responses and more!

More here.

w3af - Web Application Attack and Audit Framework


InSpy is a Python-based LinkedIn enumeration tool. InSpy has two functionalities: TechSpy and EmpSpy.
  • TechSpy - Crawls LinkedIn job listings for technologies used by the provided company. InSpy attempts to identify technologies by matching job descriptions against keywords from a newline-delimited file.
  • EmpSpy - Crawls LinkedIn for employees working at the provided company. InSpy searches for employees by title and/or department from a newline-delimited file. InSpy can also create email addresses for the identified employees if the user specifies an email format.
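The --emailformat expansion described above can be sketched as a simple lookup over the accepted format tokens. This is a hypothetical illustration, not InSpy's own code; the token names are taken from the accepted formats listed in the help text below.

```python
# Hypothetical expansion of an email format string for a harvested name.
def make_email(first, last, fmt):
    """Expand a format such as 'first.last@xyz.com' for one employee."""
    first, last = first.lower(), last.lower()
    local, domain = fmt.split("@", 1)
    patterns = {
        "first.last": f"{first}.{last}",
        "last.first": f"{last}.{first}",
        "first_last": f"{first}_{last}",
        "last_first": f"{last}_{first}",
        "firstl": f"{first}{last[0]}",  # first name + last initial
        "lfirst": f"{last[0]}{first}",  # last initial + first name
        "flast": f"{first[0]}{last}",   # first initial + last name
        "lastf": f"{last}{first[0]}",   # last name + first initial
        "first": first,
        "last": last,
    }
    return f"{patterns[local]}@{domain}"

print(make_email("Jane", "Doe", "first.last@xyz.com"))  # -> jane.doe@xyz.com
```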

Installation
Run
pip install -r requirements.txt
within the cloned InSpy directory.

Help
InSpy - A LinkedIn enumeration tool by Jonathan Broche (@jonathanbroche)

positional arguments:
company Company name to use for tasks.

optional arguments:
-h, --help show this help message and exit
-v, --version show program's version number and exit

Technology Search:
--techspy [file] Crawl LinkedIn job listings for technologies used by
the company. Technologies imported from a new line
delimited file. [Default: tech-list-small.txt]
--limit int Limit the number of job listings to crawl. [Default:
50]

Employee Harvesting:
--empspy [file] Discover employees by title and/or department. Titles
and departments are imported from a new line delimited
file. [Default: title-list-small.txt]
--emailformat string Create email addresses for discovered employees using
a known format. [Accepted Formats: first.last@xyz.com,
last.first@xyz.com, first_last@xyz.com, last_first@xyz.com,
firstl@xyz.com, lfirst@xyz.com,
flast@xyz.com, lastf@xyz.com, first@xyz.com,
last@xyz.com]

Output Options:
--html file Print results in HTML file.
--csv file Print results in CSV format.
--json file Print results in JSON.


InSpy - A Linkedin Enumeration Tool


Sublist3r is a python tool designed to enumerate subdomains of websites using OSINT. It helps penetration testers and bug hunters collect and gather subdomains for the domain they are targeting. Sublist3r enumerates subdomains using many search engines such as Google, Yahoo, Bing, Baidu, and Ask. Sublist3r also enumerates subdomains using Netcraft, Virustotal, ThreatCrowd, DNSdumpster, and ReverseDNS.
The subbrute module was integrated into Sublist3r to increase the chance of finding more subdomains via brute force with an improved wordlist. Credit goes to TheRook, the author of subbrute.

Installation
git clone https://github.com/aboul3la/Sublist3r.git

Recommended Python Version:
Sublist3r currently supports Python 2 and Python 3.
  • The recommended version for Python 2 is 2.7.x
  • The recommended version for Python 3 is 3.4.x

Dependencies:
Sublist3r depends on the requests, dnspython, and argparse python modules.
These dependencies can be installed using the requirements file:
  • Installation on Windows:
c:\python27\python.exe -m pip install -r requirements.txt
  • Installation on Linux
sudo pip install -r requirements.txt
Alternatively, each module can be installed independently as shown below.

Requests Module (http://docs.python-requests.org/en/latest/)
  • Install for Windows:
c:\python27\python.exe -m pip install requests
  • Install for Ubuntu/Debian:
sudo apt-get install python-requests
  • Install for Centos/Redhat:
sudo yum install python-requests
  • Install using pip on Linux:
sudo pip install requests

dnspython Module (http://www.dnspython.org/)
  • Install for Windows:
c:\python27\python.exe -m pip install dnspython
  • Install for Ubuntu/Debian:
sudo apt-get install python-dnspython
  • Install using pip:
sudo pip install dnspython

argparse Module
  • Install for Ubuntu/Debian:
sudo apt-get install python-argparse
  • Install for Centos/Redhat:
sudo yum install python-argparse
  • Install using pip:
sudo pip install argparse
For coloring in Windows, install the following libraries:
c:\python27\python.exe -m pip install win_unicode_console colorama

Usage
Short Form Long Form Description
-d --domain Domain name to enumerate subdomains of
-b --bruteforce Enable the subbrute bruteforce module
-p --ports Scan the found subdomains against specific tcp ports
-v --verbose Enable the verbose mode and display results in realtime
-t --threads Number of threads to use for subbrute bruteforce
-e --engines Specify a comma-separated list of search engines
-o --output Save the results to text file
-h --help show the help message and exit

Examples
  • To list all the basic options and switches use -h switch:
python sublist3r.py -h
  • To enumerate subdomains of specific domain:
python sublist3r.py -d example.com
  • To enumerate subdomains of a specific domain and show only subdomains with open ports 80 and 443:
python sublist3r.py -d example.com -p 80,443
  • To enumerate subdomains of specific domain and show the results in realtime:
python sublist3r.py -v -d example.com
  • To enumerate subdomains and enable the bruteforce module:
python sublist3r.py -b -d example.com
  • To enumerate subdomains and use specific engines such as Google, Yahoo and Virustotal:
python sublist3r.py -e google,yahoo,virustotal -d example.com

Using Sublist3r as a module in your python scripts
Example
import sublist3r 
subdomains = sublist3r.main(domain, no_threads, savefile, ports, silent, verbose, enable_bruteforce, engines)
The main function will return a set of unique subdomains found by Sublist3r
Function Usage:
  • domain: The domain you want to enumerate subdomains of.
  • savefile: save the output into text file.
  • ports: specify a comma-separated list of the tcp ports to scan.
  • silent: set sublist3r to work in silent mode during the execution (helpful when you don't need a lot of noise).
  • verbose: display the found subdomains in real time.
  • enable_bruteforce: enable the bruteforce module.
  • engines: (Optional) to choose specific engines.
Example to enumerate subdomains of Yahoo.com:
import sublist3r 
subdomains = sublist3r.main('yahoo.com', 40, 'yahoo_subdomains.txt', ports= None, silent=False, verbose= False, enable_bruteforce= False, engines=None)

Credits


Sublist3r v1.0 - Fast subdomains enumeration tool for penetration testers