
TAMU Webmaster's Blog


Information and insight from the A&M Webmasters

Robots Files

November 7th, 2007 by Erick Beck

Yesterday we got our first call about the new search engine hitting a “black hole” site and causing problems for their server. I’ve identified a few others by keeping watch on the Google logs and have added those sites to the “do not crawl” list. This brings up a good point, though: these same sites are still going to be problematic in the public Google search, and in Microsoft’s, and Yahoo’s, and…

This reinforces the need for us as webmasters to be mindful that our sites are always being crawled and to be diligent about using robot exclusion standards to control access. Robots files give us control over what gets searched, which in turn means that our site users are better served by getting only the most relevant pages returned.
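
For a “black hole” site like an endlessly navigable calendar, a few lines in a robots.txt file at the site root will keep compliant crawlers out of the problem area. Here is a minimal sketch; the paths and the bot name are hypothetical examples, not actual A&M URLs or crawlers:

    # Keep all crawlers out of the calendar, whose
    # date-navigation links generate infinite URLs
    User-agent: *
    Disallow: /calendar/

    # Block one misbehaving crawler entirely
    # (hypothetical name for illustration)
    User-agent: BadBot
    Disallow: /

Keep in mind that robots.txt is advisory, not access control: well-behaved crawlers such as Googlebot honor it, but it will not stop a crawler that chooses to ignore it.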
