80,443 - Pentesting Web Methodology



Basic Information

The web service is the most common and extensive service and a lot of different types of vulnerabilities exist.

Default port: 80 (HTTP), 443 (HTTPS)

```bash
PORT    STATE SERVICE
80/tcp  open  http
443/tcp open  ssl/https
```

```bash
nc -v domain.com 80 # GET / HTTP/1.0
openssl s_client -connect domain.com:443 # GET / HTTP/1.0
```

Web API Guidance

Web API Pentesting

Methodology summary

In this methodology we are going to suppose that you are going to attack a domain (or subdomain) and only it. So, you should apply this methodology to each discovered domain, subdomain or IP with an undetermined web server inside the scope.

  • Start by identifying the technologies used by the web server. Look for tricks to keep in mind during the rest of the test if you can successfully identify the tech.
  • Any known vulnerability of the version of the technology?
  • Using any well known tech? Any useful trick to extract more information?
  • Any specialised scanner to run (like wpscan)?
  • Launch general purposes scanners. You never know if they are going to find something or some interesting information.
  • Start with the initial checks: robots, sitemap, 404 error and SSL/TLS scan (if HTTPS).
  • Start spidering the web page: it's time to find all the possible files, folders and parameters being used. Also, check for special findings.
  • Note that anytime a new directory is discovered during brute-forcing or spidering, it should be spidered.
  • Directory Brute-Forcing: try to brute force all the discovered folders searching for new files and directories.
  • Note that anytime a new directory is discovered during brute-forcing or spidering, it should be Brute-Forced.
  • Backups checking: test if you can find backups of discovered files appending common backup extensions.
  • Brute-Force parameters: try to find hidden parameters.
  • Once you have identified all the possible endpoints accepting user input, check for all kinds of vulnerabilities related to them.
  • Follow this checklist
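
The backups-checking step above can be sketched as a small shell loop that derives candidate backup names from already-discovered files (the file list and the extension list are just example assumptions — extend them with your own findings):

```bash
# Sketch: derive backup-file candidates from already-discovered files.
# FILES and the extension list are example assumptions.
FILES="index.php config.php"
CANDIDATES=""
for f in $FILES; do
  for ext in .bak .old .backup .zip .tar.gz .swp "~"; do
    CANDIDATES="$CANDIDATES $f$ext"
  done
done
printf '%s\n' $CANDIDATES   # feed these to curl/ffuf and look for 200 responses
```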

Server Version (Vulnerable?)

Identify

Check if there are known vulnerabilities for the server version that is running.
The HTTP headers and cookies of the responses could be very useful to identify the technologies and/or version being used. An Nmap scan can identify the server version, but the tools whatweb, webtech or https://builtwith.com/ could also be useful:

```bash
whatweb -a 1 <URL> #Stealthy
whatweb -a 3 <URL> #Aggressive
webtech -u <URL>
webanalyze -host https://google.com -crawl 2
```

Search for vulnerabilities of the web application version

Check if any WAF

Web tech tricks

Some tricks for finding vulnerabilities in different well known technologies being used:

Take into account that the same domain can be using different technologies in different ports, folders and subdomains.
If the web application is using any well known tech/platform listed before or any other, don't forget to search on the Internet for new tricks (and let me know!).

Source Code Review

If the source code of the application is available on github, besides performing by yourself a White box test of the application, there is some information that could be useful for the current Black-Box testing:

  • Is there a Change-log or Readme or Version file or anything with version info accessible via web?
  • How and where are the credentials saved? Is there any (accessible?) file with credentials (usernames or passwords)?
  • Are passwords in plain text, encrypted or which hashing algorithm is used?
  • Is it using any master key for encrypting something? Which algorithm is used?
  • Can you access any of these files exploiting some vulnerability?
  • Is there any interesting information on github (solved and not solved issues)? Or in the commit history (maybe some password introduced inside an old commit)?

Source code Review / SAST Tools

Automatic scanners

General purpose automatic scanners

```bash
nikto -h <URL>
whatweb -a 4 <URL>
wapiti -u <URL>
W3af
zaproxy #You can use an API
nuclei -ut && nuclei -target <URL>

# https://github.com/ignis-sec/puff (client side vulns fuzzer)
node puff.js -w ./wordlist-examples/xss.txt -u "http://www.xssgame.com/f/m4KKGHi2rVUN/?query=FUZZ"
```

CMS scanners

If a CMS is used don't forget to run a scanner, maybe something juicy is found:

Clusterd: JBoss, ColdFusion, WebLogic, Tomcat, Railo, Axis2, Glassfish
CMSScan: WordPress, Drupal, Joomla, vBulletin websites for security issues. (GUI)
VulnX: Joomla, Wordpress, Drupal, PrestaShop, Opencart
CMSMap: (W)ordpress, (J)oomla, (D)rupal au (M)oodle
droopscan: Drupal, Joomla, Moodle, Silverstripe, Wordpress

```bash
cmsmap [-f W] -F -d <URL>
wpscan --force update -e --url <URL>
joomscan --ec -u <URL>
joomlavs.rb #https://github.com/rastating/joomlavs
```

At this point you should already have some information about the web server being used by the client (if any data was given) and some tricks to keep in mind during the test. If you are lucky you have even found a CMS and run some scanner.

Step-by-step Web Application Discovery

From this point we are going to start interacting with the web application.

Initial checks

Default pages with interesting info:

  • /robots.txt
  • /sitemap.xml
  • /crossdomain.xml
  • /clientaccesspolicy.xml
  • /.well-known/
  • Check also the comments in the main and secondary pages.
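
A quick way to run through the default paths above (TARGET is a placeholder; the curl line is commented out so the sketch stays offline):

```bash
# Sketch: enumerate the interesting default paths for a target.
TARGET="https://example.com"   # placeholder target
PATHS="/robots.txt /sitemap.xml /crossdomain.xml /clientaccesspolicy.xml /.well-known/security.txt"
for p in $PATHS; do
  echo "$TARGET$p"
  # curl -sk -o /dev/null -w "%{http_code} $TARGET$p\n" "$TARGET$p"  # uncomment to fetch
done
```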

Forcing errors

Web servers may behave unexpectedly when weird data is sent to them. This may open vulnerabilities or disclose sensitive information.

  • Access fake pages like /whatever_fake.php (.aspx, .html, etc.)
  • Add "[]", "]]", and "[[" in cookie values and parameter values to create errors
  • Generate errors by giving input like /~randomthing/%s at the end of the URL
  • Try different HTTP Verbs like PATCH, DEBUG or wrong ones like FAKE
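
The verb-based error triggering can be sketched by building the raw requests first (HOST is a placeholder; the nc line is commented out so nothing is actually sent):

```bash
# Sketch: build raw requests with odd/fake HTTP verbs to provoke errors.
HOST="domain.com"   # placeholder host
for verb in PATCH DEBUG FAKE; do
  REQ=$(printf '%s /whatever_fake.php HTTP/1.1\r\nHost: %s\r\nConnection: close\r\n\r\n' "$verb" "$HOST")
  printf '%s\n---\n' "$REQ"
  # printf '%s\r\n\r\n' "$REQ" | nc "$HOST" 80   # uncomment to send it
done
```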

Check if you can upload files (PUT verb, WebDav)

If you find that WebDav is enabled but you don't have enough permissions for uploading files in the root folder, try to:

  • Brute Force credentials
  • Upload files via WebDav to the rest of the found folders inside the web page. You may have permissions to upload files in other folders.

SSL/TLS vulnerabilities

  • If the application isn't forcing the use of HTTPS in any part, then it's vulnerable to MitM
  • If the application is sending sensitive data (passwords) using HTTP, then it's a high vulnerability.

Use testssl.sh to check for vulnerabilities (in Bug Bounty programs these kinds of vulnerabilities probably won't be accepted) and use a2sv to recheck the vulnerabilities:

```bash
./testssl.sh [--htmlfile] 10.10.10.10:443
#Use the --htmlfile to save the output inside an htmlfile also

# You can also use other tools, but testssl.sh at this moment is the best one (I think)
sslscan <host:port>
sslyze --regular <ip:port>
```

Information about SSL/TLS vulnerabilities:

Spidering

Launch some kind of spider inside the web. The goal of the spider is to find as many paths as possible from the tested application. Therefore, web crawling and external sources should be used to find as many valid paths as possible.

  • gospider (go): HTML spider, LinkFinder in JS files and external sources (Archive.org, CommonCrawl.org, VirusTotal.com, AlienVault.com).
  • hakrawler (go): HTML spider, with LinkFinder for JS files and Archive.org as external source.
  • dirhunt (python): HTML spider, also indicates "juicy files".
  • evine (go): Interactive CLI HTML spider. It also searches in Archive.org
  • meg (go): This tool isn't a spider but it can be useful. You can just indicate a file with hosts and a file with paths and meg will fetch each path on each host and save the response.
  • urlgrab (go): HTML spider with JS rendering capabilities. However, it looks like it's unmaintained, the precompiled version is old and the current code doesn't compile
  • gau (go): HTML spider that uses external providers (wayback, otx, commoncrawl)
  • ParamSpider: This script will find URLs with parameter and will list them.
  • galer (go): HTML spider with JS rendering capabilities.
  • LinkFinder (python): HTML spider, with JS beautify capabilities, capable of searching for new paths in JS files. It could be worth it to also take a look at JSScanner, which is a wrapper of LinkFinder.
  • goLinkFinder (go): To extract endpoints in both HTML source and embedded javascript files. Useful for bug hunters, red teamers, infosec ninjas.
  • JSParser (python2.7): A python 2.7 script using Tornado and JSBeautifier to parse relative URLs from JavaScript files. Useful for easily discovering AJAX requests. Looks like unmaintained.
  • relative-url-extractor (ruby): Given a file (HTML) it will extract URLs from it using nifty regular expressions to find and extract the relative URLs from ugly (minified) files.
  • JSFScan (bash, several tools): Gather interesting information from JS files using several tools.
  • subjs (go): Find JS files.
  • page-fetch (go): Load a page in a headless browser and print out all the urls loaded to load the page.
  • Feroxbuster (rust): Content discovery tool mixing several options of the previous tools
  • Javascript Parsing: A Burp extension to find path and params in JS files.
  • Sourcemapper: A tool that given the .js.map URL will get you the beautified JS code
  • xnLinkFinder: This is a tool used to discover endpoints for a given target.
  • waymore: Discover links from the wayback machine (also downloading the responses in the wayback and looking for more links
  • HTTPLoot (go): Crawl (even by filling forms) and also find sensitive info using specific regexes.
  • SpiderSuite: Spider Suite is an advanced multi-feature GUI web security Crawler/Spider designed for cyber security professionals.
  • jsluice (go): It's a Go package and command-line tool for extracting URLs, paths, secrets, and other interesting data from JavaScript source code.
  • ParaForge: ParaForge is a simple Burp Suite extension to extract the parameters and endpoints from the requests to create a custom wordlist for fuzzing and enumeration.
  • katana (go): Awesome tool for this.
  • Crawley (go): Print every link it's able to find.
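
However many of the spiders above you run, it pays to merge and de-duplicate their output before the next stage. A minimal sketch (the file names and sample URLs are made up; the printf stands in for real spider output):

```bash
# Sketch: merge several spiders' URL lists and de-duplicate them.
printf '%s\n' \
  'https://example.com/a' \
  'https://example.com/b' \
  'https://example.com/a' > /tmp/spider_urls.txt   # stand-in for gospider/gau output
sort -u /tmp/spider_urls.txt > /tmp/spider_urls_uniq.txt
cat /tmp/spider_urls_uniq.txt
```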

Brute Force directories and files

Start brute-forcing from the root folder and be sure to brute-force all the directories found using this method and all the directories discovered by the Spidering (you can do this brute-forcing recursively and appending at the beginning of the used wordlist the names of the found directories).
Tools:

  • Dirb / Dirbuster - Included in Kali, old (and slow) but functional. Allows auto-signed certificates and recursive search. Too slow compared with the other options.
  • Dirsearch (python): It doesn't allow auto-signed certificates but allows recursive search.
  • Gobuster (go): It allows auto-signed certificates, it doesn't have recursive search.
  • Feroxbuster - Fast, supports recursive search.
  • wfuzz wfuzz -w /usr/share/seclists/Discovery/Web-Content/raft-medium-directories.txt https://domain.com/api/FUZZ
  • ffuf - Fast: ffuf -c -w /usr/share/wordlists/dirb/big.txt -u http://10.10.10.10/FUZZ
  • uro (python): This isn't a spider but a tool that, given the list of found URLs, will delete "duplicated" URLs.
  • Scavenger: Burp Extension to create a list of directories from the burp history of different pages
  • TrashCompactor: Remove URLs with duplicated functionalities (based on js imports)
  • Chamaleon: It uses wappalyzer to detect used technologies and select the wordlists to use.

Recommended dictionaries:

Note that anytime a new directory is discovered during brute-forcing or spidering, it should be Brute-Forced.
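
The recursion rule above can be sketched as a simple work queue: every directory that a brute-force run discovers goes back onto the queue (the directory names and the simulated discovery are made up; the commented line is where the real brute-forcer runs):

```bash
# Sketch: queue-based recursive directory brute-forcing.
QUEUE="/ /admin/"   # seed with the root and any dirs found by spidering
DONE=""
while [ -n "$QUEUE" ]; do
  set -- $QUEUE; DIR=$1; shift; QUEUE="$*"
  DONE="$DONE $DIR"
  # feroxbuster -u "https://example.com$DIR" -w wordlist.txt  # real run goes here
  if [ "$DIR" = "/admin/" ]; then QUEUE="$QUEUE /admin/backup/"; fi  # simulated new finding
done
echo "brute-forced:$DONE"
```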

What to check on each file found

Special findings

While performing the spidering and brute-forcing you could find interesting things that you have to notice.

Interesting files

403 Forbidden/Basic Authentication/401 Unauthorized (bypass)

403 & 401 Bypasses

502 Proxy Error

If any page responds with that code, it's probably a badly configured proxy. If you send an HTTP request like: GET https://google.com HTTP/1.1 (with the host header and other common headers), the proxy will try to access google.com and you will have found an SSRF.
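
The probe described above, sketched as a raw absolute-URI request (the proxy hostname is a placeholder; the nc line is commented out so nothing is sent):

```bash
# Sketch: absolute-URI request that a badly configured proxy may forward.
PROXY="proxy.example"   # placeholder for the suspected proxy host
REQ=$(printf 'GET https://google.com/ HTTP/1.1\r\nHost: google.com\r\nConnection: close\r\n\r\n')
printf '%s\n' "$REQ"
# printf '%s\r\n\r\n' "$REQ" | nc "$PROXY" 80   # Google content coming back == SSRF
```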

NTLM Authentication - Info disclosure

If the server that is asking for authentication is Windows, or you find a login asking for your credentials (and asking for a domain name), you can provoke an information disclosure.
Send the header: “Authorization: NTLM TlRMTVNTUAABAAAAB4IIAAAAAAAAAAAAAAAAAAAAAAA=” and due to how the NTLM authentication works, the server will respond with internal info (IIS version, Windows version...) inside the header "WWW-Authenticate".
You can automate this using the nmap plugin "http-ntlm-info.nse".
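
You can verify locally that the header above really carries an NTLM type-1 message: once base64-decoded, every NTLM blob starts with the ASCII signature NTLMSSP followed by a null byte:

```bash
# Decode the NTLM negotiate blob from the header and check its signature.
BLOB="TlRMTVNTUAABAAAAB4IIAAAAAAAAAAAAAAAAAAAAAAA="
SIG=$(echo "$BLOB" | base64 -d | head -c 7)
echo "$SIG"   # NTLMSSP
```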

HTTP Redirect (CTF)

It is possible to put content inside a Redirection. This content won't be shown to the user (as the browser will execute the redirection) but something could be hidden in there.
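
A quick way to see such hidden content is to fetch the page without following the redirect (e.g. curl -s -i without -L) and read everything after the headers. The canned response below stands in for real curl output so the sketch is self-contained:

```bash
# Sketch: extract the body that hides behind a 302 response.
RESP='HTTP/1.1 302 Found
Location: /home

flag{hidden-in-redirect}'   # stand-in for `curl -s -i <URL>` output
BODY=$(printf '%s\n' "$RESP" | sed '1,/^$/d')   # drop the header block
echo "$BODY"   # flag{hidden-in-redirect}
```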

Web Vulnerabilities Checking

Now that a comprehensive enumeration of the web application has been performed it's time to check for a lot of possible vulnerabilities. You can find the checklist here:

Web Vulnerabilities Methodology

Find more info about web vulns in:

Monitor Pages for changes

You can use tools such as https://github.com/dgtlmoon/changedetection.io to monitor pages for modifications that might introduce vulnerabilities.

HackTricks Automatic Commands

Protocol_Name: Web    #Protocol Abbreviation if there is one.
Port_Number:  80,443     #Comma separated if there is more than one.
Protocol_Description: Web         #Protocol Abbreviation Spelled out

Entry_1:
Name: Notes
Description: Notes for Web
Note: |
https://book.hacktricks.wiki/en/network-services-pentesting/pentesting-web/index.html

Entry_2:
Name: Quick Web Scan
Description: Nikto and GoBuster
Command: nikto -host {Web_Proto}://{IP}:{Web_Port} &&&& gobuster dir -w {Small_Dirlist} -u {Web_Proto}://{IP}:{Web_Port} && gobuster dir -w {Big_Dirlist} -u {Web_Proto}://{IP}:{Web_Port}

Entry_3:
Name: Nikto
Description: Basic Site Info via Nikto
Command: nikto -host {Web_Proto}://{IP}:{Web_Port}

Entry_4:
Name: WhatWeb
Description: General purpose auto scanner
Command: whatweb -a 4 {IP}

Entry_5:
Name: Directory Brute Force Non-Recursive
Description:  Non-Recursive Directory Brute Force
Command: gobuster dir -w {Big_Dirlist} -u {Web_Proto}://{IP}:{Web_Port}

Entry_6:
Name: Directory Brute Force Recursive
Description: Recursive Directory Brute Force
Command: python3 {Tool_Dir}dirsearch/dirsearch.py -w {Small_Dirlist} -e php,exe,sh,py,html,pl -f -t 20 -u {Web_Proto}://{IP}:{Web_Port} -r 10

Entry_7:
Name: Directory Brute Force CGI
Description: Common Gateway Interface Brute Force
Command: gobuster dir -u {Web_Proto}://{IP}:{Web_Port}/ -w /usr/share/seclists/Discovery/Web-Content/CGIs.txt -s 200

Entry_8:
Name: Nmap Web Vuln Scan
Description: Tailored Nmap Scan for web Vulnerabilities
Command: nmap -vv --reason -Pn -sV -p {Web_Port} --script=`banner,(http* or ssl*) and not (brute or broadcast or dos or external or http-slowloris* or fuzzer)` {IP}

Entry_9:
Name: Drupal
Description: Drupal Enumeration Notes
Note: |
git clone https://github.com/immunIT/drupwn.git for low hanging fruit and git clone https://github.com/droope/droopescan.git for deeper enumeration

Entry_10:
Name: WordPress
Description: WordPress Enumeration with WPScan
Command: |
?What is the location of the wp-login.php? Example: /Yeet/cannon/wp-login.php
wpscan --url {Web_Proto}://{IP}{1} --enumerate ap,at,cb,dbe && wpscan --url {Web_Proto}://{IP}{1} --enumerate u,tt,t,vp --passwords {Big_Passwordlist} -e

Entry_11:
Name: WordPress Hydra Brute Force
Description: Need User (admin is default)
Command: hydra -l admin -P {Big_Passwordlist} {IP} -V http-form-post '/wp-login.php:log=^USER^&pwd=^PASS^&wp-submit=Log In&testcookie=1:S=Location'

Entry_12:
Name: Ffuf Vhost
Description: Simple Scan with Ffuf for discovering additional vhosts
Command: ffuf -w {Subdomain_List}:FUZZ -u {Web_Proto}://{Domain_Name} -H "Host:FUZZ.{Domain_Name}" -c -mc all {Ffuf_Filters}
