Categories
linux Perl Projects

DNS for DHCPd in the FUTURE

I have a dream.

My dream is that one day, a giant carrot carved into the shape of a submarine will sail down the Thames before sinking below the waves to take back America using only the power of Latin.

But also, I want machines on my local network to be accessible as “$hostname.d.water.gkhs.net” to everyone else on the same local network. That’s a more technical dream, and this is how I did it:

First, we google “smoothwall dhcp to dns”. The first result seems to be exactly what we need, so we click it and find ourselves on Kryogenix, the website of Aquarius, whom I have known for somewhere close to a decade, which is an aeon in internet time. The article is now close to seven years old, and while it’s lost its styling, it is (a) entirely what I want to do, (b) comprehensive and (c) now completely broken.

The page where Douglas Warner’s dhcp2dnrd script lives is now somewhere else on the site, and appears to be having some kind of formatting problem, but it can still be downloaded from this direct link. At the bottom of this post is a link to my own version of the file, with all these changes already made.

Although the Class::Date problem no longer exists, a few other things have changed since the article was written. So, this is what to do to get it working. Most of this is built on the stuff Sil said already in his article, just updated for Smoothwall 3.0:

Log in to your Smoothwall box over SSH. (If you cannot do this, you need to go to the web interface, Services, Remote Access, and tick SSH.) Then, using your favourite terminal, log in to the same IP, port 222. Username root, password whatever you chose when you set up the firewall so long ago. I do hope you remember it.
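From a terminal that looks something like this (192.168.72.1 is just a placeholder for whatever IP your Smoothwall answers on):

ssh -p 222 root@192.168.72.1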

mkdir dhcp2dnrd; cd dhcp2dnrd # (Being neat and tidy is good)

wget http://www.silfreed.net/download/progs/dhcp2dnrd.pl
wget http://search.cpan.org/CPAN/authors/id/D/DL/DLUX/Class-Date-1.1.9.tar.gz

tar xzvf Class-Date-1.1.9.tar.gz # to extract the perl module.
mv Class-Date-1.1.9/Date* /usr/lib/perl5/5.8.8/Class/ # to put the perl module in place
vim dhcp2dnrd.pl # Or use your personal favourite editor. Unless it’s emacs or something, because I don’t think that’s installed.

Personally, I change the “home.net” line to “d.water.gkhs.net”, because it fits my network model better. You do need to change the “$dhcpdpath” to “/usr/etc/dhcpd.leases”, however.
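In my copy those two edits end up looking roughly like this (the name of the domain variable is from memory and may differ in the version you download; $dhcpdpath is the important one):

$domain    = "d.water.gkhs.net";       # was "home.net"
$dhcpdpath = "/usr/etc/dhcpd.leases";  # where Smoothwall 3.0 keeps its leases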

Finally, Smoothwall no longer uses dnrd, so either comment out the entire bottom of the file after “# restart dnrd”, or rewrite that section to work. I’ve modified the code in mine to “work”, but it’s mostly cargo-culty.
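If you’d rather rewrite it than delete it, the idea is something along these lines, assuming the DNS proxy on your box is dnsmasq (which re-reads /etc/hosts when it gets a HUP); adjust for whatever is actually running:

# restart dnrd -- except there is no dnrd on Smoothwall 3.0,
# so poke dnsmasq to re-read /etc/hosts instead
system("killall", "-HUP", "dnsmasq") == 0
    or warn "Couldn't HUP dnsmasq: $?\n";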

Downloading Douglas’ script, I found it had Windows line endings, which confused me. You can convert it back to Unix format in vim with “:set fileformat=unix”. If you’re using mine, you shouldn’t need to.

Once that’s done, run it, check that the output in /etc/hosts is roughly what you expect, then throw the script into cron like this:
cp dhcp2dnrd.pl /etc/cron.often/
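For reference, the sort of entry you’re checking for in /etc/hosts looks something like this (hostname and address invented):

192.168.72.34   laptop.d.water.gkhs.net laptop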

And that appears to work. You can grab my copy of the code from github should you want to.

Categories
aqcom Imported From Epistula Perl programming

Logging

Aquarionics’ logging system was designed to work against mod_log_sql, a module that, er, logs to an SQL database. This worked until we upgraded to Apache 2, which log_sql didn’t support until recently. Since part of the logging system is the bit of AqCom that shows who linked here recently, I’d rather not convert it to run off plain text files (though I may be converting it to use Sqlite at some point), so I created a perl script that feeds the log into the database in log_sql’s format. It looks like this:

#!/usr/bin/perl

use DBI;   # DBI loads DBD::mysql for us via the DSN below

#Database options:
$dbUser = "user";
$dbPass = "password";
$dbName = "epistula";

$database = DBI->connect("dbi:mysql:$dbName:localhost:1114", $dbUser, $dbPass);

# An example of the log format we're parsing:
# 204.95.98.252 - - [24/Dec/2003:15:23:38 +0000] "GET /archive/writing/2003/08/19 HTTP/1.0" 200 11873 "-" "msnbot/0.11 (+http://search.msn.com/msnbot.htm)"

while (<>) {
  my ($client, $identuser, $authuser, $date, $method,
      $url, $protocol, $status, $bytes, $referer, $agent) =
    /^(\S+) (\S+) (\S+) \[(.*?)\] "(\S+) (.*?) (\S+)" (\S+) (\S+) "(.*?)" "(.*?)"$/;
  # ...
        #$database->quote($thisdir);
        $q = "insert into apachelogs (remote_host, remote_user, request_time,
			request_method, request_uri, request_protocol, status, bytes_sent, referer, agent)
        values
        (".$database->quote($client).", ".$database->quote($authuser).", '".$date."', "
			.$database->quote($method).", ".$database->quote($url).", "
			.$database->quote($protocol).", ".$database->quote($status).", "
			.$database->quote($bytes).", ".$database->quote($referer).", "
			.$database->quote($agent).")";

        #print $database->quote($url)."\n";
        my $sth = $database->prepare($q);
        $sth->execute();

}

…and is run using this crontab line:

@reboot tail -f /var/log/apache2/www.aquarionics.com | $EPBIN/apache2db.pl &
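Because the script just reads standard input, you can also backfill from the existing log by running it over the file once by hand, something like this:

$EPBIN/apache2db.pl < /var/log/apache2/www.aquarionics.com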

Now, the important thing to remember is that this gets pretty big pretty quickly, since it logs every line. It’s vitally important that you don’t, under any circumstances, forget that you commented out this crontab line:

@daily echo "delete from apachelogs where time_stamp < `date +%Y%m%d --date '1 month ago'`" | mysql epistula

Because otherwise you’ll discover that your daily database dumps start to hit 16Mb each… bzip2 compressed… 380Mb uncompressed… oh, let’s say four months and twelve days later.

For example.

(I ran the above query, or one like it, just before I started this entry. It’s just stopped:

mysql> delete from apachelogs where time_stamp < 20040825;
Query OK, 913830 rows affected (21 min 44.87 sec)

Reformatting for the girlymen who don’t have 2000px-wide displays and are reading the RSS feed. See? This is why I want to only do partial content, because that way when I do something like this it only fucks up in IE.)

Categories
aqcom Christmas Imported From Epistula Perl

Geek at Christmas

So, as is traditional, I spend my Christmas holidays playing with Epistula. Now I have referer tracking working again.

The problem with referer tracking is extracting the data from the log files. When the server had mod_log_sql it was easy (I have an entire log stats suite built for mod_log_sql), but since log_sql doesn’t support Apache 2 yet (a patch to make it do so was released yesterday; it remains untested), I had to brush off my extremely limited perl skillz to create this, a Perl program to send Apache logs to MySQL:

#!/usr/bin/perl
use DBI;   # DBI loads DBD::mysql via the DSN below

#Database options:
$dbUser = "username";
$dbPass = "password";
$dbName = "database";

$database = DBI->connect(
"dbi:mysql:$dbName:localhost:1114",
$dbUser, $dbPass
);

while (<>) {

  my ($client, $identuser, $authuser, $date, $method,
      $url, $protocol, $status, $bytes, $referer, $agent) =
    /^(\S+) (\S+) (\S+) \[(.*?)\] "(\S+) (.*?) (\S+)" (\S+) (\S+) "(.*?)" "(.*?)"$/;

$q = "insert into apachelogs
(remote_host, remote_user, request_time, request_method,
request_uri, request_protocol, status, bytes_sent, referer, agent)
values
(".$database->quote($client).", ".$database->quote($authuser).", '"
.$date."', ".$database->quote($method).", ".$database->quote($url)
.", ".$database->quote($protocol).", ".$database->quote($status)
.", ".$database->quote($bytes).", ".$database->quote($referer)
.", ".$database->quote($agent).")";

my $sth = $database->prepare($q);
$sth->execute();

}