The Ratproxy security scanner looks for vulnerabilities in web applications
Google's Ratproxy is a free testing tool that searches for security problems in web applications.
Several test suites help you look for vulnerabilities in web-based applications, but many of these tools are expensive or difficult to use. Wouldn't it be nice just to press a button to find out what vulnerabilities exist in your own software – along with a line reference to help you find the problems in the source code?
Ratproxy is a tiny but powerful tool with a simple approach to searching for problems in web applications. The Ratproxy security testing tool originated in Google's development labs, where it was created to test the company's own applications. In July 2008, Google released the current version to the general public under the Apache License 2.0.
Because Ratproxy does not cause a noticeable increase in network traffic, it even lets you check applications that are deployed in production environments. (Other scanners launch DoS or cross-site scripting attacks that are likely to bring a production system to its knees.)
Setting the Mousetrap
Deploying Ratproxy is simple: Just download the source code from the homepage and run make to build the dozen or so source files. The tool does not require a configure script or have any major dependencies. What you do need are the libcrypto and libssl libraries (typically supplied as part of the OpenSSL distribution) and corresponding headers.
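Assuming a standard Linux system with the OpenSSL development packages installed, the whole build boils down to a few commands. (The download URL and version number below are examples only; check the project homepage for the current release.)

```shell
# Fetch and unpack the source archive
# (file name is an example -- the homepage lists the current release)
wget http://ratproxy.googlecode.com/files/ratproxy-1.58.tar.gz
tar xzf ratproxy-1.58.tar.gz
cd ratproxy

# No configure step: libcrypto/libssl plus headers are the only
# notable dependencies
make
```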
Starting the test tool is slightly more complicated: No fewer than 22 parameters (Table 1) govern the nature and scope of the tests Ratproxy performs. The parameters are also responsible for defining the level of detail to output. To avoid being plowed under in an avalanche of messages when you first launch the program, start with the default settings:
./ratproxy -v /tmp -w ratproxy.log -d mydomain.com -lfscm
This command points Ratproxy at the web application in the mydomain.com domain. Ratproxy will ignore any URLs not on this server. (This approach is a way of making sure that Ratproxy will not run off and accidentally test external ad sites.) The HTTP traffic sniffed by Ratproxy is dumped into a multitude of tiny files in the temporary directory (-v /tmp), whereas the analysis of the results – that is, the information you are actually interested in – is stored in ratproxy.log. The Ratproxy Parameters box explains the Pandora's box of command-line options.
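The flat log file is fairly terse. The Ratproxy tarball includes a small helper script, ratproxy-report.sh, that turns it into an annotated, prioritized HTML report you can read in a browser:

```shell
# Render the raw log as an HTML report with severity annotations
./ratproxy-report.sh ratproxy.log > report.html
```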
If you prefer a full broadside, you can change the parameters as follows:
./ratproxy -v /tmp -w ratproxy.log -d mydomain.com -lextifscgjm
The optional duo -XC (note the uppercase letters) releases Ratproxy from its passive role. Once released, Ratproxy will check to see how well your web application withstands XSS and XSRF attacks (-X), and it will repeat requests with modified parameters (-C).
If the web application returns Flash objects, Ratproxy can disassemble and analyze them. Ratproxy relies on the Flare ActionScript decompiler for this; unfortunately, Flare is only available as a prebuilt, closed source application. The copy bundled with Ratproxy runs on x86 processors; a version for 64-bit Linux is available on the Flare homepage. First you must download the file, unpack it, and store the result in the flare-dist directory.
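On a 64-bit system, swapping in the Linux build from the Flare homepage might look like the following sketch (the download URL and archive name are assumptions – take whatever the Flare homepage currently offers):

```shell
# Download the 64-bit Flare build (file name is hypothetical;
# use the archive actually offered on the Flare homepage)
wget http://www.nowrap.de/download/flare-linux64.tgz
tar xzf flare-linux64.tgz

# Replace the bundled x86 binary in Ratproxy's flare-dist directory
cp flare ratproxy/flare-dist/
```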
Shadowing the User
Ratproxy's interactive orientation has several benefits, but it is also the tool's major deficiency. If the user does not execute a function, Ratproxy does not test it. Before you launch Ratproxy, you should think carefully about which parts of the web application you want to test – and in which order.
Once you see the message Accepting connections on port 8080/tcp (local only), you know that the test tool is listening on port 8080 for incoming browser requests. The next step is to set up the browser to direct all communications via Ratproxy. The easiest way of doing this is to enter your own machine (127.0.0.1 or localhost) with port 8080 as the proxy in your browser settings (Figure 1).
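Before settling into the browser, you can confirm that the proxy is actually answering with a single curl request – one request only, as a connectivity check; the testing itself should still happen interactively in the browser (replace mydomain.com with your own server):

```shell
# Send one request through Ratproxy and print only the HTTP status code
curl -s -o /dev/null -w "%{http_code}\n" \
     -x 127.0.0.1:8080 http://mydomain.com/
```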
This tells the browser to forward all requests to localhost:8080, where Ratproxy will analyze the requests before passing them on to the web application (Figure 2). Because the test tool sniffs traffic passively, all of this is absolutely transparent and has only a minimal effect on execution speed. The -X and -C parameters, however, are an exception to this rule. They tell Ratproxy to switch to "disruptive mode" and actively interfere with communications. (The effects of these parameters will vary.)
If you use a genuine proxy to access the web, which is the case in many corporate environments, you need to pass the -P host:port parameter to Ratproxy, in which host and port represent the data for your proxy. This feature means you can deploy Ratproxy as part of a chain of other test tools.
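With an upstream corporate proxy at, say, proxy.example.com on port 3128 (placeholder values), the earlier default invocation grows by one option:

```shell
# Chain Ratproxy behind the upstream corporate proxy
./ratproxy -v /tmp -w ratproxy.log -d mydomain.com -lfscm \
           -P proxy.example.com:3128
```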
The next step is to access the web application in your browser and work in the normal way. To avoid interference from other sources, Google recommends closing all other browser windows and flushing the browser cache before you start. Ratproxy will now monitor all your actions and log them in ratproxy.log.
In the case of SSL-encrypted data, Ratproxy will replace the certificate served up by the web application with its own. A good browser will alert you to this. To carry on with the test, you must accept the new certificate. The Ratproxy documentation warns against storing the certificate permanently in your browser; after all, everyone who downloads Ratproxy knows the certificate. Because Ratproxy forces a certificate on you, another problem appears: Ratproxy negotiates all further steps with the web application, so you can't be 100 percent certain that you are talking to your own server. Thus, you should avoid using critical (administrative) accounts or entering sensitive data while being monitored by the tool.
On top of this, you should resist the temptation to use wget to feed the website to Ratproxy. Most of Ratproxy's tests rely on user interaction and would simply be dropped in the case of a wget command.