Exploiting the Power of Curl
Didier explained in a recent diary[1] that it is possible to analyze malicious documents with standard Linux tools. I've been using Linux for more than 20 years and, regularly, I discover new commands or new switches that help me perform recurring (boring?) tasks more efficiently. Basic usage of these tools can be displayed by running them with the '-h' or '--help' flag. They also have a corresponding man page that describes precisely how to use the numerous options available (just type 'man <command>' in your shell). Unfortunately, the man page can be very long to read.

Let's take a look at the "curl" command[2]. curl is a standard tool to transfer data based on URLs. It is not only a command-line web client: it supports many protocols (from the documentation: DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, TELNET and TFTP), which makes it a wonderful tool. Many protocols also mean many options. Indeed, the curl man page is quite long:
$ man curl | col -b | wc -l
2393
I had the opportunity to attend a lightning talk presented by one of the curl developers, and he gave some really nice tips that made me curious. I read the man page carefully and found features that are very useful for security researchers, incident responders or malware analysts. Here is a quick review of the options that could save you some time:
Use curl to download similar URLs (based on patterns) in one command:
$ curl http://www.{domain1,domain2,domain3}.com
$ curl ftp://ftp.malicious.com/dump[001-100].txt
You can even specify dedicated output files per website:
$ curl -o "file_#1.txt" http://www.{domain1,domain2,domain3}.com
curl will dump the content into file_domain1.txt, file_domain2.txt, etc. ('#1' is replaced by the string matched by the first globbing pattern).
Of course, there are multiple ways to use proxies with curl. SOCKS proxies are supported and are very useful for Tor. This is a classic, but I always have a Tor SOCKS proxy running, ready to grab some sensitive content:
$ curl --socks5 torproxy:9050 http://www.suspicious.com/
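Note that with '--socks5', the hostname is still resolved locally. To also push the DNS resolution through Tor (and avoid leaking the lookup), use '--socks5-hostname' and let the proxy resolve the name:
$ curl --socks5-hostname torproxy:9050 http://www.suspicious.com/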
Sometimes, you have to submit a file to a remote server (example: to upload a file through an API):
$ curl -F file=@"myfile.exe" http://application.com/file/upload/
You can modify the default DNS config:
$ curl --dns-ipv4-addr 172.16.0.20 http://www.malicious.com
$ curl --dns-interface eth1 http://www.malicious.com
The DNS request to resolve the hostname will originate from 172.16.0.20 or from eth1. (Note that these two options require curl to be built with the c-ares resolver backend.)
More interesting: you can force a hostname to resolve to a specific IP address:
$ curl --resolve www.malicious.com:80:172.16.0.20 http://www.malicious.com
This command provides a custom address for a specific host and port pair. Consider it a sort of /etc/hosts alternative provided on the command line. This is super useful to remain stealthy!
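Worth noting: because the original hostname is kept, this also plays nicely with HTTPS (SNI and certificate validation still use the real name). The IP address below is just an example:
$ curl --resolve www.malicious.com:443:172.16.0.20 https://www.malicious.com/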
Specify extra HTTP headers (sometimes required to download some pieces of malware):
$ curl --header "X-Application: BotClient" http://cc.domain.com/
Or, more classically, specify a referer:
$ curl --referer http://www.domain.com/login.php http://www.domain.com/admin
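In the same spirit, a custom User-Agent is often needed to mimic the original dropper or browser (the string and the payload URL below are only examples):
$ curl --user-agent "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1)" http://cc.domain.com/update.bin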
If you saved cookies into a file with '--cookie-jar', it can be useful to discard the session cookies when you reuse them:
$ curl --cookie cookies.txt --junk-session-cookies http://www.malicious.com
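As a reminder, such a cookie file can be created with '--cookie-jar' during an initial request (the login URL is just an illustration):
$ curl --cookie-jar cookies.txt http://www.malicious.com/login.php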
Perform multiple operations in a single command (here, a GET followed by a POST to the same host):
$ curl www.malicious.com --next -d "$data_to_post" www.malicious.com
Capture a full trace of the HTTP requests:
$ curl --trace - https://isc.sans.edu
('--trace-ascii' produces the same trace without the hex dump)
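For example, to keep an ASCII-only trace in a file (the filename is arbitrary):
$ curl --trace-ascii trace.txt https://isc.sans.edu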
Finally, we all need log files! curl can also generate nice output to be processed by other tools. It supports personalized output formats:
$ curl --silent --write-out "Response code: %{http_code}\nTotal time: %{time_total}" https://isc.sans.edu
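A common trick when only the statistics matter is to discard the body with '--output /dev/null':
$ curl --silent --output /dev/null --write-out "Response code: %{http_code}\n" https://isc.sans.edu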
You can also prepare a more complex format in a file and read it from the command line. In the example below, we generate a simple JSON output:
$ cat format.txt
{"code":"%{http_code}","remote_ip":"%{remote_ip}","url":"%{url_effective}"}
$ curl --silent --write-out @format.txt http://isc.sans.edu | jq "."
{
  "code": "200",
  "remote_ip": "204.51.94.153",
  "url": "https://isc.sans.edu/"
}
You can also change the way curl connects to remote sites (retries, timeouts, following redirects or not). Just read the man page to find the options that are most useful to you.
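A minimal example combining a few of them (the values are arbitrary):
$ curl --retry 3 --connect-timeout 5 --max-time 30 --location http://www.malicious.com/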
Do you have nice tips about curl? Please share!
[1] https://isc.sans.edu/forums/diary/Maldoc+analysis+with+standard+Linux+tools/23900/
[2] https://curl.haxx.se/
Xavier Mertens (@xme)
Senior ISC Handler - Freelance Cyber Security Consultant
PGP Key