Sunday, June 21, 2009

Using the Google Safe Browsing API to check if your site is listed as suspicious

Google maintains a couple of lists of bad URLs identified by its crawlers. This is a quick example of how to make good use of one of those lists to monitor whether your web site has been compromised and flagged by Google.

My two Perl scripts use the available CPAN modules:
Net::Google::SafeBrowsing::UpdateRequest
Net::Google::SafeBrowsing::Blocklist


I update the Google list every hour and then check my URLs, using crontab jobs:


1st script (register with Google to get an API key):

#!/usr/bin/perl
use strict;
use warnings;
use Net::Google::SafeBrowsing::UpdateRequest;

my $apikey    = 'put-your-key-here';
my $dbfile    = '/feeds/url/glist.txt';
# Use the same table name the second script queries
my $blocklist = 'goog-malware-hash';

my $u = Net::Google::SafeBrowsing::UpdateRequest->new($apikey, $dbfile, $blocklist);
if ($u->update and $u->close) {
    print "Successfully updated $blocklist in $dbfile\n";
}



The 2nd script is an example of how to compare your URL against the Google Safe Browsing blocklist:

#!/usr/bin/perl
use strict;
use warnings;
use Net::Google::SafeBrowsing::Blocklist;

my $apikey    = 'put-your-key-here';
my $dbfile    = '/feeds/url/glist.txt';
my $tablename = 'goog-malware-hash';
my $uri       = 'http://www.yourdomain.com/';

my $blocklist = Net::Google::SafeBrowsing::Blocklist->new($tablename, $dbfile, $apikey);
my $matched_uri = $blocklist->suffix_prefix_match($uri);
if (defined($matched_uri)) {
    print "Matched '$matched_uri'\n";
} else {
    print "No match\n";
}
$blocklist->close;
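To automate this, the hourly list update and the URL check mentioned above can be wired up with crontab entries along these lines (the script paths and log files here are just examples — adjust them to wherever you saved the two scripts):

```
# m  h dom mon dow  command
0  * * * * /usr/local/bin/gsb_update.pl >> /var/log/gsb_update.log 2>&1
10 * * * * /usr/local/bin/gsb_check.pl  >> /var/log/gsb_check.log  2>&1
```

Running the check 10 minutes after the update gives the update script time to finish writing the local database file before it is queried.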
