
CryKeX - Linux Memory Cryptographic Keys Extractor

Properties:
  • Cross-platform
  • Minimalism
  • Simplicity
  • Interactivity
  • Compatibility/Portability
  • Application Independent
  • Process Wrapping
  • Process Injection

Dependencies:
  • Unix - should work on any Unix-based OS
    • BASH - the whole script
    • root privileges (optional)

Limitations:
  • AES and RSA keys only
  • Fails most of the time for Firefox browser
  • Won't work for disk encryption (LUKS) and PGP/GPG
  • Needs proper user privileges and memory authorizations

How it works
Some work has already been published on the security of cryptographic keys in DRAM. Basically, we need to find something that looks like a key (entropic and of a specific length) and then confirm its nature by analyzing the memory structure around it (C data types).
The idea is to dump the live memory of a process and use those techniques to find probable keys, since the memory mapping doesn't change. Thankfully, tools exist for that purpose.
The script is not only capable of injecting into already running processes, but also of wrapping new ones, by launching them separately and injecting shortly afterwards. This makes it capable of dumping keys from almost any process/binary on the system.
Of course, access to a process's memory is restricted by the kernel, which means you will still require appropriate privileges for that process.
Linux disk encryption (LUKS) uses an anti-forensic technique to mitigate this issue; however, extracting keys from a whole memory dump is still possible.
The Firefox browser uses somewhat similar memory management and thus seems not to be affected.
The same goes for PGP/GPG.
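The underlying approach can be illustrated by hand: dump a process's memory with gdb's gcore and scan the dump with aeskeyfind (a rough sketch of the idea only; CryKeX automates and refines these steps, and the target name here is just an example):
pid=$(pgrep -n openssl)          # PID of the example target process
sudo gcore -o /tmp/dump "$pid"   # write its memory image to /tmp/dump.<pid>
aeskeyfind "/tmp/dump.$pid"      # search the dump for AES key schedules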

HowTo
Installing dependencies:
sudo apt install gdb aeskeyfind rsakeyfind || echo 'have you heard about source compiling?'
An interactive example for OpenSSL AES keys:
openssl aes-128-ecb -nosalt -out testAES.enc
Enter a password twice, then some text, and before terminating run:
CryKeX.sh openssl
Finally, press Ctrl+D 3 times and check the result.
OpenSSL RSA keys:
openssl genrsa -des3 -out testRSA.pem 2048
When prompted for the passphrase, run:
CryKeX.sh openssl
Verify:
openssl rsa -noout -text -in testRSA.pem
Let's extract keys from SSH:
echo 'Ciphers aes256-gcm@openssh.com' >> /etc/ssh/sshd_config
ssh user@server
CryKeX.sh ssh
From OpenVPN:
echo 'cipher AES-256-CBC' >> /etc/openvpn/server.conf
openvpn yourConf.ovpn
sudo CryKeX.sh openvpn
TrueCrypt/VeraCrypt is also affected: select the "veracrypt" file in VeraCrypt, mount it with the password "pass", and run:
sudo CryKeX.sh veracrypt
Chromium-based browsers (thanks Google):
CryKeX.sh chromium
CryKeX.sh google-chrome
Despite Firefox not being explicitly affected, Tor Browser Bundle is still susceptible due to tunneling:
CryKeX.sh tor
As mentioned, you can also wrap processes:
apt install libssl-dev
gcc cipher.c -o cipher -lcrypto
CryKeX.sh cipher
wrap
cipher


CryKeX - Linux Memory Cryptographic Keys Extractor


Zeus is an advanced reconnaissance utility designed to make web application reconnaissance simple. Zeus comes complete with a powerful built-in URL parsing engine, multiple search engine compatibility, the ability to extract URLs from both Google ban URLs and webcache URLs, the ability to run multiple vulnerability assessments on the target, and the ability to bypass search engine captchas.

Features
  • A powerful built in URL parsing engine
  • Multiple search engine compatibility (DuckDuckGo, AOL, Bing, and Google; the default is Google)
  • Ability to extract the URL from Google's ban URL, thus bypassing IP blocks
  • Ability to extract from Google's webcache URL
  • Proxy compatibility (http, https, socks4, socks5)
  • Tor proxy compatibility and Tor browser emulation
  • Parse robots.txt/sitemap.xml and save them to a file
  • Multiple vulnerability assessments (XSS, SQLi, clickjacking, port scanning, admin panel finding, whois lookups, and more)
  • Tamper scripts to obfuscate XSS payloads
  • Can run with a custom default user-agent, one of over 4000 random user-agents, or a personal user-agent
  • Automatic issue creation when an unexpected error arises
  • Ability to crawl a webpage and pull all the links
  • Can run a singular dork, multiple dorks in a given file, or a random dork from a list of over 5000 carefully researched dorks
  • Dork blacklisting: when no sites are found with the search query, the query is saved to a blacklist file
  • Identify WAF/IPS/IDS protection of over 20 different firewalls
  • Header protection enumeration to check what kind of protection is provided via HTTP headers
  • Saving cookies, headers, and other vital information to log files
  • and much more...

Screenshots

Running without any mandatory options, or running with the --help flag, will output Zeus's help menu:


A basic dork scan with the -d flag will launch an automated browser from the given dork and pull the Google page results:


Calling the -s flag will prompt you to start the sqlmap API server (python sqlmapapi.py -s from sqlmap); Zeus will then connect to the API and perform a sqlmap scan on the found URLs.
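For reference, that workflow might look roughly like this in a terminal (a sketch only; the dork is illustrative and the path to sqlmap is an assumption):
python sqlmapapi.py -s                    # started from your sqlmap directory
python zeus.py -d 'inurl:".php?id="' -s   # dork scan, then sqlmap the found URLs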


You can see more screenshots here

Demo


Requirements
There are some requirements for this to be run successfully.

Basic requirements
  • libxml2-dev, libxslt1-dev, python-dev are required for the installation process
  • The Firefox web browser is required as of now; you will need a Firefox version between 51 and 57 (>=51 and <=57). Full functionality for other browsers will eventually be added.
  • If you want to run sqlmap through the URLs, you will need sqlmap somewhere on your system.
  • If you want to run a port scan using nmap on the URLs' IP addresses, you will need nmap on your system.
  • Geckodriver is required to run the Firefox web browser and will be installed the first time you run Zeus. It will be added to your /usr/bin so that it can be found via your PATH.
  • You must run as sudo the first time you run this so that the driver can be added to your PATH; you may also need to run as sudo afterwards depending on your permissions. NOTE: depending on permissions, you may need to be sudo for any run involving the geckodriver.
  • xvfb is required by pyvirtualdisplay; it will be installed on your first run if it is not already installed.

Python package requirements
  • selenium-webdriver package is required to automate the web browser and bypass API calls.
  • requests package is required to connect to the URL, and the sqlmap API
  • python-nmap package is required to run nmap on the URL's IP addresses
  • whichcraft package is required to check if nmap and sqlmap are on your system if you want to use them
  • pyvirtualdisplay package is required to hide the browser display while finding the search URL
  • lxml is required to parse XML data for the sitemap and save it as such
  • psutil is required to search for running sqlmap API sessions
  • beautifulsoup is required to pull all the HREF descriptor tags and parse the HTML into an easily workable syntax

Installation
You can download the latest tar.gz, the latest zip, or you can find the current stable release here. Alternatively you can install the latest development version by following the instructions that best match your operating system:
NOTE: (optional but highly advised) add sqlmap and nmap to your environment PATH by moving them to /usr/bin or by adding them to the PATH via terminal
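For example, one way to do this from a terminal (the sqlmap location below is an assumption; adjust it to wherever sqlmap actually lives on your system):
sudo ln -s /opt/sqlmap/sqlmap.py /usr/bin/sqlmap   # expose sqlmap in /usr/bin
export PATH="$PATH:/opt/sqlmap"                    # or extend PATH for this shell instead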

Ubuntu/Debian
sudo apt-get install libxml2-dev libxslt1-dev python-dev && git clone https://github.com/ekultek/zeus-scanner.git && cd zeus-scanner && sudo pip2 install -r requirements.txt && sudo python zeus.py

CentOS
sudo yum install gcc python-devel libxml2-devel libxslt-devel && git clone https://github.com/ekultek/zeus-scanner.git && cd zeus-scanner && sudo pip2 install -r requirements.txt && sudo python zeus.py

Others
sudo apt-get install libxml2-dev libxslt1-dev python-dev && git clone https://github.com/ekultek/zeus-scanner.git && cd zeus-scanner && sudo pip2 install -r requirements.txt && sudo python zeus.py
This will install all the package requirements along with the geckodriver.


Zeus-Scanner - Advanced Reconnaissance Utility


Paskto will passively scan the web using the Common Crawl internet index either by downloading the indexes on request or parsing data from your local system. URLs are then processed through Nikto and known URL lists to identify interesting content. Hash signatures are also used to identify known default content for some IoT devices or web applications.

Options

-d, --dir-input directory Directory with common crawl index files with .gz extension. Ex: -d "/tmp/cc/"
-v, --ia-dir-input directory Directory with internet archive index files with .gz extension. Ex: -v "/tmp/ia/"
-o, --output-file file Save test results to file. Ex: -o /tmp/results.csv
-u, --update-db Build/Update Paskto DB from Nikto databases.
-n, --use-nikto Use Nikto DBs. Default: true
-e, --use-extras Use EXTRAS DB. Default: true
-s, --scan domain name Domain to scan. Ex: -s "www.google.ca" or -s "*.google.ca"
-i, --cc-index index Common Crawl index for scan. Ex: -i "CC-MAIN-2017-34-index"
-a, --save-all-urls file Save CSV List of all URLS. Ex: -a /tmp/all_urls.csv
-h, --help Print this usage guide.

Examples

Scan domain, save results and URLs $ node paskto.js -s "www.msn.com" -o /tmp/rest-results.csv -a /tmp/all-urls.csv
Scan domain with CC wildcards. $ node paskto.js -s "*.msn.com" -o /tmp/rest-results.csv -a /tmp/all-urls.csv
Scan domain, only save URLs. $ node paskto.js -s "www.msn.com" -a /tmp/all-urls.csv
Scan dir with indexes. $ node paskto.js -d "/tmp/CC-MAIN-2017-39-index/" -o /tmp/rest-results.csv -a /tmp/all-urls.csv
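Two more invocations suggested by the options above (sketches; the index name is the same one shown for the -i option):
Update the local Paskto DB from the Nikto databases. $ node paskto.js -u
Scan against a specific Common Crawl index. $ node paskto.js -s "www.msn.com" -i "CC-MAIN-2017-34-index" -o /tmp/rest-results.csv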

Create Custom Digest signatures
A quick way to create new digest signatures for default content is to use WARCPinch, a Chrome extension I hacked together based off of WARCreate, except that it creates digest signatures as well as WARC files. (It also adds highlight and right-click functionality, which is useful for highlighting any identifying text to use as the name of the signature.)


Paskto - Passive Web Scanner

A wrapper tool for shadowsocks to consistently bypass firewalls.

Quick start

Automatically connect
The easiest way to run this tool is to just type ssct in a terminal; ssct will acquire available shadowsocks servers from ishadowsocks and connect to one automatically.

Connect to a specific server
First, show all shadowsocks servers with the --list option.
ssct --list
Then, connect to a specific server with the -n option.
ssct -n 5
Alternatively, you can connect to a custom server.
ssct -s <server_addr> -p <server_port> -l <local_port> -k <password> -m <method>
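For example (all values below are placeholders, not a real server):
ssct -s 203.0.113.10 -p 8388 -l 1080 -k mypassword -m aes-256-cfb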

Usage

Requirements
1 Install shadowsocks
# for python2
pip install shadowsocks
# for python3
pip3 install shadowsocks
Note: You can also install shadowsocks with a system package manager (apt, yum, dnf, etc.) or just use the Chrome app version of shadowsocks. However, the Chrome app version can't connect automatically.
2 Install python3 modules
pip3 install requests
pip3 install prettytable
Note: The prettytable module is optional, but the output is nicer if it is installed.

Configuration for google chrome
  1. Install chrome extension SwitchyOmega.


  2. Open the options of SwitchyOmega, and configure as below.
  3. List servers and select one to connect, or just type ssct to connect automatically.
  4. Select the proxy option in Chrome and enjoy it.

Configuration for firefox
  1. Install firefox extension AutoProxy.
  2. AutoProxy preferences: Proxy Server --> Edit proxy server, and add a shadowsocks item.
  3. Start ssct and select the shadowsocks proxy.
Note: For detailed help, see here.

More options
optional arguments:
-h, --help show this help message and exit

ssct options:
-n <num> connect server number
--ss <ss> path to shadowsocks, assumed in the PATH
--list list all ss servers
--stop stop running servers
--version show program's version number and exit
--morehelp show this help message and exit

shadowsocks options:
-c <config> path to config file
-s <addr> server address, auto crawl online
-p <port> server port, auto crawl online
-b <addr> local binding address [default: 127.0.0.1]
-l <port> local port [default: 1080]
-k <password> password, auto crawl online
-m <method> encryption method, auto crawl online
-t <timeout> timeout in seconds [default: 300]
--fast-open use TCP_FASTOPEN, requires Linux 3.7+
-d <daemon> daemon mode, one of start, stop and restart
--pid-file <file> pid file for daemon mode
--log-file <file> log file for daemon mode
--user <user> username to run as
-v, -vv verbose mode
-q, -qq quiet mode, only show warnings/errors
Run ssct without any arguments to connect to an available server automatically.
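If you use the daemon options above, a background run might look like this (a sketch; it assumes ssct passes these options straight through to shadowsocks):
ssct -d start --pid-file /tmp/ssct.pid --log-file /tmp/ssct.log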


ShadowSocks ConnecTion - A Wrapper Tool For Shadowsocks To Consistently Bypass Firewalls


OS X Auditor is a free Mac OS X computer forensics tool.
OS X Auditor parses and hashes the following artifacts on the running system or a copy of a system you want to analyze:
  • the kernel extensions
  • the system agents and daemons
  • the third-party agents and daemons
  • the old and deprecated system and third-party startup items
  • the users' agents
  • the users' downloaded files
  • the installed applications
It extracts:
  • the users' quarantined files
  • the users' Safari history, downloads, topsites, LastSession, HTML5 databases and localstore
  • the users' Firefox cookies, downloads, formhistory, permissions, places and signons
  • the users' Chrome history and archives history, cookies, login data, top sites, web data, HTML5 databases and local storage
  • the users' social and email accounts
  • the WiFi access points the audited system has been connected to (and tries to geolocate them)
It also looks for suspicious keywords in the .plist themselves.
It can verify the reputation of each file on:
  • Team Cymru's MHR
  • VirusTotal
  • your own local database
It can aggregate all logs from the following directories into a zipball:
  • /var/log (-> /private/var/log)
  • /Library/logs
  • the user's ~/Library/logs
Finally, the results can be:
  • rendered as a simple txt log file (so you can cat-pipe-grep in them… or just grep)
  • rendered as an HTML log file
  • sent to a Syslog server

Author
Jean-Philippe Teissier - @Jipe_ & al.

Support
OS X Auditor started as a weekend project and is now barely maintained. It has been forked by the great guys @ Yelp, who created osxcollector.
If you are looking for a production / corporate solution, I recommend you move to osxcollector (https://github.com/Yelp/osxcollector)

How to install
Just copy all files from GitHub.

Dependencies
If you plan to run OS X Auditor on a Mac, you will get full plist parsing support from the OS X Foundation framework through pyobjc:
pip install pyobjc
If you can't install pyobjc, or if you plan to run OS X Auditor on an OS other than Mac OS X, you may experience some trouble with plist parsing:
pip install biplist
pip install plist
These dependencies will be removed when a working native plist module is available in Python.

How to run
  • OS X Auditor runs well with Python >= 2.7.2 (2.7.9 is OK). It does not run with a different version of Python yet (due to the plist nightmare)
  • OS X Auditor is maintained to work on the latest OS X version. It will do its best on older OS X versions.
  • You must run it as root (or via sudo) if you want to use it on a running system, otherwise it won't be able to access some system and other users' files
  • If you're using API keys from environment variables (see below), you need to use sudo -E to preserve the user's environment variables
Type osxauditor.py -h to get all the available options, then run it with the selected options,
e.g. [sudo -E] python osxauditor.py -a -m -l localhashes.db -H log.html

Setting Environment Variables
VirusTotal API:
export VT_API_KEY=aaaabbbbccccddddeeee
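Putting the two together, a full run that can use the VirusTotal API might look like this (the key is a placeholder and the flags are simply those from the earlier example; check -h for the option that enables reputation lookups):
export VT_API_KEY=aaaabbbbccccddddeeee
sudo -E python osxauditor.py -a -m -l localhashes.db -H log.html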

Artifacts

Users
  • Library/Preferences/com.apple.LaunchServices.QuarantineEventsV2
  • Library/Preferences/com.apple.LaunchServices.QuarantineEvents
  • Library/Preferences/com.apple.loginitems.plist
  • Library/Mail Downloads/
  • Library/Containers/com.apple.mail/Data/Library/Mail Downloads
  • Library/Accounts/Accounts3.sqlite
  • Library/Containers/com.apple.mail/Data/Library/Mail/V2/MailData/Accounts.plist
  • Library/Preferences/com.apple.recentitems.plist
  • Firefox
    • Library/Application Support/Firefox/Profiles/
      • cookies.sqlite
      • downloads.sqlite
      • formhistory.sqlite
      • places.sqlite
      • signons.sqlite
      • permissions.sqlite
      • addons.sqlite
      • extensions.sqlite
      • content-prefs.sqlite
      • healthreport.sqlite
      • webappsstore.sqlite
  • Safari
    • Library/Safari/
      • Downloads.plist
      • History.plist
      • TopSites.plist
      • LastSession.plist
      • Databases
      • LocalStorage
  • Chrome
    • Library/Application Support/Google/Chrome/Default/
      • History
      • Archived History
      • Cookies
      • Login Data
      • Top Sites
      • Web Data
      • databases
      • Local Storage

System
  • /System/Library/LaunchAgents/
  • /System/Library/LaunchDaemons/
  • /System/Library/ScriptingAdditions/
  • /System/Library/StartupItems/
  • /Library/ScriptingAdditions/
  • /System/Library/Extensions/
  • /System/Library/CoreServices/SystemVersion.plist
  • /Library/LaunchAgents/
  • /Library/LaunchDaemons/
  • /Library/StartupItems/
  • /Library/Preferences/SystemConfiguration/com.apple.airport.preferences.plist
  • /Library/logs
  • /var/log
  • /etc/localtime
  • StartupParameters.plist
  • /private/var/db/dslocal/nodes/Default/groups/admin.plist
  • /private/var/db/dslocal/nodes/Default/users

Related work

Disk Arbitrator
Disk Arbitrator is a Mac OS X forensic utility designed to help the user ensure that correct forensic procedures are followed during imaging of a disk device. Disk Arbitrator is essentially a user interface to the Disk Arbitration framework, which enables a program to participate in the management of block storage devices, including the automatic mounting of file systems. When enabled, Disk Arbitrator will block the mounting of file systems to avoid mounting as read-write and violating the integrity of the evidence.
https://github.com/aburgh/Disk-Arbitrator

Volafox
Volafox, a.k.a. the 'Mac OS X Memory Analysis Toolkit', is developed in Python 2.x.
https://code.google.com/p/volafox/

Mandiant Memoryze(tm) for the Mac
Memoryze for the Mac is free memory forensic software that helps incident responders find evil in memory… on Macs. Memoryze for the Mac can acquire and/or analyze memory images. Analysis can be performed on offline memory images or on live systems.
http://www.mandiant.com/resources/download/mac-memoryze

Volatility MacMemoryForensics
https://code.google.com/p/volatility/wiki/MacMemoryForensics


OSXAuditor - Free Mac OS X Computer Forensics Tool