APLawrence.com -  Resources for Unix and Linux Systems, Bloggers and the self-employed

We no longer offer ftp downloads. If there is a file you need referenced here, please contact me by email and I will get it to you.

Squidlog II

© May 2003 Tony Lawrence

The original Squid Log Analyzer I wrote wasn't what I really wanted. I had promised myself that I would get back to it to do a better job, but of course I forgot about that until someone asked me to provide some changes. I then looked at the code again and said no, this won't do.

The first problem is that the original version creates static HTML pages. That's a waste of space and time. This new version doesn't; it generates pages as they are requested. It also uses a Perl .db file to store its data.
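The .db file approach relies on Perl's ability to tie a hash to an on-disk database. As a rough sketch of the idea only (the actual file name, key layout, and module used by squidlog2 may differ), the cron-run parser and the CGI viewer can share data like this:

```perl
#!/usr/bin/perl
# Sketch, not the original squidlog2 code: persisting squid log data
# in a .db file so a separate CGI script can read it later.
# The file name 'squidlog.db' and the key format are assumptions.
use strict;
use warnings;
use Fcntl;      # for O_RDWR, O_CREAT, O_RDONLY
use DB_File;

my $dbfile = 'squidlog.db';

# The parser ties a hash to the .db file and records entries:
tie my %hits, 'DB_File', $dbfile, O_RDWR|O_CREAT, 0644, $DB_HASH
    or die "Cannot open $dbfile: $!";
$hits{'192.168.1.10 http://example.com/'} = time();
untie %hits;

# Later, the CGI script re-ties the same file read-only and
# builds its pages from the stored entries on request:
tie my %stored, 'DB_File', $dbfile, O_RDONLY, 0644, $DB_HASH
    or die "Cannot reopen $dbfile: $!";
print exists $stored{'192.168.1.10 http://example.com/'} ? "stored\n" : "missing\n";
untie %stored;
```

Because the data lives in one shared file, the CGI side never has to re-parse the raw squid log, which is what makes on-demand page generation cheap.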

You can download this from ftp://ftp.aplawrence.com/pub/squidlog2.tar. You need to edit two files to match your site. The first is the parsesquidlog.pl file. This needs to be run (by cron) to create the Perl .db file. Edit these lines:

$log_file = '/whereveritis/access.log';

The $log_file is the location of your squid log file. The $wlogs is wherever you would like to store your .db file (the program creates it). The $ipmatch matches your local LAN address pattern. Finally, the $purge_date determines which log lines are kept; you will want to adjust this every now and then to avoid having gigantic pages to look through.
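For illustration, the four variables might be set like this. Every path and value below is a placeholder, not the script's actual defaults, and the exact form $purge_date takes in the script isn't shown here, so the epoch arithmetic is just one possibility:

```perl
# Placeholder values -- edit each line to match your own site.
$log_file   = '/usr/local/squid/logs/access.log';  # squid access log location
$wlogs      = '/usr/local/squidlog';               # directory for the generated .db file
$ipmatch    = '192\.168\.1\.';                     # pattern matching your local LAN addresses
$purge_date = time() - (30 * 24 * 60 * 60);        # keep roughly the last 30 days
```

A crontab entry such as `0 2 * * * /usr/local/bin/parsesquidlog.pl` (path assumed) would then rebuild the .db file nightly.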

The other program is what you'll install in your cgi-bin area. For this, modify just two variables:


The $wlogs needs to match whatever you used in the parsesquidlog.pl file, and $cgiloc just matches your cgi location.
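Again as placeholders rather than the script's shipped values, those two variables might look like this:

```perl
# Placeholder values -- $wlogs must match parsesquidlog.pl exactly.
$wlogs  = '/usr/local/squidlog';   # same directory the parser writes the .db file to
$cgiloc = '/cgi-bin/showlogs.pl';  # URL path where you installed this script
```

If the two $wlogs settings disagree, the CGI script will simply find no .db file and have nothing to display.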

Got something to add? Send me email.




Thu Dec 11 17:05:56 2008: 4904   JeffSouza

I love your script and how simple it is. However, the output doesn't go past the first-level HTML file. All I get is the first page; any links on the main output are just pages with the filename at the top. As a side note, I could not get squidlog2 to work (I've never worked with CGI scripts and I'm not sure how to view the information from showlogs.pl).
Please help if you can.

Thu Dec 11 17:21:58 2008: 4905   TonyLawrence

Gosh, I haven't looked at this in years. I'm not sure how to help you - if you don't grok CGI, I can't educate you through comment replies.

Do you understand Perl? Or know someone who does? Any half-competent Perl person could take this base idea and improve it considerably. If you don't want to run CGI, just use the script as is and edit out the HTML output... it's really that easy.

If you don't know anyone, I could do a rewrite specifically for you - that would run a couple of hundred dollars at most.

Thu Dec 11 17:52:43 2008: 4906   JeffSouza

I completely understand. I only wanted to get the output you posted as an example log. I get the first page, but the information in the links isn't there. I just thought there might be a "level" setting in the script. I don't know Perl, but I know C, which is close enough to edit the file. Honestly, your program is the easiest with the fewest dependencies I've found. It's basically a drop-in with Slackware to get running. It's just that somewhere in the parsing it fails to output any details to the sub .html files, like you show in your example.

Thu Dec 11 17:54:01 2008: 4907   JeffSouza

Sorry, I should have mentioned that I was talking about your original Squidlog *not* Squidlog2 in the previous post.

Thu Dec 11 18:02:25 2008: 4908   TonyLawrence

OK. Well, if I'm not real busy over the next few weeks I might find some time to look at it. I can't promise anything, of course.

I detest code that depends on obscure modules... so often the module changes slightly or depends on something else, and soon you are chasing never-ending rabbits...

Thu Dec 11 18:18:21 2008: 4909   TonyLawrence

Send me (email) a sample of your log file and I'll definitely look at it within the next two weeks if at all possible.

Include enough lines for a reasonable test, but don't send me tens of thousands of lines, please.

Thu Dec 11 18:24:23 2008: 4910   JeffSouza

Let me know where you would like me to email the log to. I'll send you 1000 lines - let me know if you want more (or less).

Thu Dec 11 18:29:15 2008: 4911   TonyLawrence

1,000 lines is good. See (link) for email address.

Fri Dec 12 20:08:55 2008: 4924   TonyLawrence

I haven't seen that log yet??

