Secrets of a botnet developer

Downloading Web Pages

The harvesters for the sniper botnet not only had to download web pages from the sales site, they also had to request tasks from the central server. In either case, they were essentially requesting web pages from a web server over HTTP. If you use the LIB_http library [4], the code to download a web page is quite simple; Listing 1 shows an example.

Listing 1

Downloading a Web Page

<?php
include('LIB_http.php');               // include the HTTP code from the library
$target   = 'http://www.schrenk.com';  // specify a target website
$referrer = '';                        // specify a referrer, if needed
$result   = http_get($target, $referrer);
$raw_data = $result['FILE'];
?>

Listing 1 uses LIB_http, which makes extensive use of PHP's cURL extension [5], a binding for libcurl. The cURL project is a cross-platform toolset for accessing resources over a variety of protocols (not just HTTP). cURL also automatically takes care of obstacles like cookie management, page redirection, encryption, and other low-level tasks.
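To make the relationship to cURL concrete, here is a minimal sketch of what a function like http_get() might do internally with PHP's cURL extension. The function name http_get_simple() and the ['FILE'] return key are assumptions for illustration, not the actual LIB_http source.

```php
<?php
// A hypothetical, simplified stand-in for LIB_http's http_get(),
// built directly on PHP's cURL extension.
function http_get_simple($target, $referrer = '')
{
    $ch = curl_init($target);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // return the page as a string
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);  // follow redirects automatically
    curl_setopt($ch, CURLOPT_COOKIEFILE, '');        // enable in-memory cookie handling
    if ($referrer !== '') {
        curl_setopt($ch, CURLOPT_REFERER, $referrer); // send a Referer header if given
    }
    $file  = curl_exec($ch);
    $error = curl_error($ch);
    curl_close($ch);
    return array('FILE' => $file, 'ERROR' => $error);
}
```

This is why the Listing 1 code stays so short: the redirect, cookie, and encryption handling all happen inside the cURL layer.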

Parsing Data

Once you've downloaded a web page, you'll need to separate the important data from the surrounding text, which is of little interest to your harvester. Take, for example, the instructions coming from the central server, which are essentially specialized web pages. For this demonstration, assume the data the harvester receives from the central server looks like the XML shown in Listing 2.

Listing 2

From Central Server to Harvester

<xml>
    <task>purchase</task>
    <target>somewebsite.com</target>
    <username>clientname</username>
    <password>supersecret</password>
    <vehicles>
        <vin>192KF8Q2MSJE8T921</vin>
        <vin>192KF8IWECOWIMCE3</vin>
        <vin>192KF0032HJS002AS</vin>
    </vehicles>
</xml>

Although there are many ways to parse XML documents, the library LIB_parse is used in Listing 3 to demonstrate how you can use the same technique to parse a variety of web information.

Listing 3

Parsing Web Data

<?php
include('LIB_parse.php');              // include the parsing code from the library

$task      = return_between($raw_data, "<task>", "</task>", EXCL);
$target    = return_between($raw_data, "<target>", "</target>", EXCL);
$username  = return_between($raw_data, "<username>", "</username>", EXCL);
$password  = return_between($raw_data, "<password>", "</password>", EXCL);
$vin_array = parse_array($raw_data, "<vin>", "</vin>");
?>

The example in Listing 3 shows two of the many useful functions in LIB_parse. The return_between() function simply returns data found between two endpoints. The EXCL passed as the last parameter in the function specifies that the returned sample is exclusive of the endpoints. If you also wanted the endpoints returned with the data, the example would have used INCL instead. The other function shown in Listing 3 is parse_array(). This function creates an easy-to-use array of all characters found between every instance of the two endpoints. In this case, those endpoints are the XML tags that encase the Vehicle Identification Numbers. In regular web pages, the parse_array() function might return table cells, hyperlinks, or all references to images, as shown in Listing 4.

Listing 4

Using the parse_array() Function

<?php
include('LIB_parse.php');              // include the parsing code from the library

$td_array   = parse_array($raw_data, "<td", "</td>"); // array of table cells
$img_array  = parse_array($raw_data, "<img", ">");    // array of images
$link_array = parse_array($raw_data, "<a", "</a>");   // array of links
?>

Form Submission

Another task required of the sniper botnet was the ability to submit a form. Form submission is how vehicles were purchased and how payment and shipping were arranged. Again, with the help of LIB_http, this was an easy task. There are actually two ways of submitting a form, depending on what the form handler expects: you can make a GET method request by tacking a query string onto the end of the URL, or you can make a POST method request, as shown in Listing 5.

Listing 5

Submitting a POST Method Form

<?php
include('LIB_http.php');               // include the HTTP code from the library

// initialize the form elements as an array
$form_array['session']  = "12301231#e0sds0122342";
$form_array['user']     = $username;
$form_array['password'] = $password;
$form_array['Login']    = "login";
$form_handler = "http://www.WebbotsSpidersScreenScrapers.com/form_analyzer.php";
$referrer     = "";

$submission_result = http_post_form($form_handler, $referrer, $form_array);
$web_page          = $submission_result['FILE'];
?>
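For comparison, the GET alternative mentioned earlier simply serializes the same form fields into a query string appended to the form handler's URL. The sketch below uses PHP's built-in http_build_query(); the field values are the illustrative ones from Listing 5.

```php
<?php
// Sketch of the GET method alternative: encode the form fields as a query
// string and append it to the form handler's URL.
$form_array = array(
    'session'  => '12301231#e0sds0122342',
    'user'     => 'clientname',
    'password' => 'supersecret',
    'Login'    => 'login',
);
$form_handler = 'http://www.WebbotsSpidersScreenScrapers.com/form_analyzer.php';

// http_build_query() URL-encodes each name/value pair and joins them with '&',
// so reserved characters like '#' in the session token are escaped safely.
$get_url = $form_handler . '?' . http_build_query($form_array);
// The harvester could then fetch $get_url the same way it fetched Listing 1's page.
```

Whether GET or POST is appropriate depends entirely on what the target's form handler expects, which is why a harvester needs both in its toolkit.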
