
Sitemaps: Influencing Google for Web Site Promotion and Adsense Revenue

There really isn't much you can do directly with Google in the area of web site promotion. However, there are a few tools you can use. The newest is a set of comment tags that may help Google identify the "real" content of your site. The idea is that you mark off sidebars and other boilerplate content whose words might otherwise confuse Google's analysis. The tags are:

<!-- google_ad_section_start -->
<!-- google_ad_section_start(weight=ignore) -->
<!-- google_ad_section_end -->
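
For example, you might wrap the boilerplate in a (weight=ignore) section and the article body in a normal section. This is just a sketch - the div ids are made up, and you'd adjust it to your own page layout:

```html
<!-- google_ad_section_start(weight=ignore) -->
<div id="sidebar">navigation links, blogroll, other boilerplate</div>
<!-- google_ad_section_end -->

<!-- google_ad_section_start -->
<div id="article">The real content you want analyzed goes here.</div>
<!-- google_ad_section_end -->
```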

I suppose this is better than nothing, and could help in some places. The "ad_section" wording suggests that they are only using these to determine Adsense content, but I don't know why their search indexing spiders wouldn't pay some attention to that also.


I do wish Google would add to this to include at least a "not about" property. I realize that Google isn't going to let anyone tell them what a page IS about, but a "not about" property can't really be abused as easily and could help their accuracy in search results.

If they are worried about cheaters, they could put a bounty out for such sites. If someone is abusing Google (Adsense or search) by stuffing keywords or deceptive use of tags, offer the bounty hunter that site's Adsense income for the month. That would sniff this kind of scum out.

For the search and indexing side of things, Google now lets web sites submit an XML file that lists URLs along with some information about how often each page changes and how important it is relative to other pages. Basically, it gets you to do part of the work for them - which we would hope helps everyone.
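
A minimal sitemap file looks like this - one url entry per page, with lastmod, changefreq and priority as optional hints (the domain and date here are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
 <url>
  <loc>http://www.example.com/index.html</loc>
  <lastmod>2005-06-04</lastmod>
  <changefreq>daily</changefreq>
  <priority>1.0</priority>
 </url>
</urlset>
```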

Google provides a Python script that can produce the file for your site; I wrote a Perl script that does the same:

# Build the sitemap from every .html file under the current directory.
# Adjust $base for your own domain.
$base = "http://www.example.com";
@stuff = `find . -type f -name "*.html"`;
open(O, ">sitemap") or die "Can't write sitemap: $!";
print O <<EOF;
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
EOF
foreach (@stuff) {
    chomp;
    my $mtime = (stat $_)[9];                # file modification time
    my ($mday, $mon, $year) = (localtime $mtime)[3, 4, 5];
    $year += 1900;
    my $lastmod = sprintf("%04d-%02d-%02d", $year, $mon + 1, $mday);
    (my $url = $_) =~ s!^\./!!;              # strip find's leading "./"
    my ($freq, $priority) = ("monthly", "0.5");
    ($freq, $priority) = ("daily", "1.0") if /index\.html/;
    print O <<EOF;
<url><loc>$base/$url</loc><lastmod>$lastmod</lastmod><changefreq>$freq</changefreq><priority>$priority</priority></url>
EOF
}
print O <<EOF;
</urlset>
EOF
close O;
system("gzip sitemap");

By the way, that chunk of Perl looks like something Google should ignore - so I added those tags around it.

Season to taste. See Google Webmaster Tools.
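
If you'd rather not resubmit through the Webmaster Tools pages each time, the sitemaps.org protocol also lets you announce the file with a single line in robots.txt (the URL here is a placeholder for your own):

```
Sitemap: http://www.example.com/sitemap.xml.gz
```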

Got something to add? Send me email.


© Tony Lawrence

