January 7, 2006

Scraper blocker blocked me

I'd installed a site-scraper blocker script that would block anyone who accessed too many pages in too short a time (i.e., downloading the whole site in one go, as many bots do). Strangely, it blocked me even though I hadn't accessed the site in hours; I was summarily advised that I could not access it for another 2.8 hours. Hm. So I have disabled it for now.
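
The script itself isn't shown here, but blockers of this kind generally keep a per-IP count of recent requests and ban an address once it passes some threshold within a short window. A rough sketch of the idea (Python, with made-up names and thresholds rather than the actual script's settings), including the ban-expiry bookkeeping that can go wrong and leave someone locked out hours after their last visit:

    import time

    # Hypothetical thresholds -- not the values the actual script used.
    MAX_HITS = 50           # requests allowed ...
    WINDOW_SECONDS = 60     # ... within this many seconds
    BAN_SECONDS = 3 * 3600  # how long an offending IP stays blocked

    hits = {}    # ip -> list of recent request timestamps
    banned = {}  # ip -> time the ban expires

    def is_blocked(ip, now=None):
        """Return True if this request should be refused."""
        now = now if now is not None else time.time()

        # An existing ban only applies until it expires; forgetting to
        # clear it is one way a blocker keeps blocking people hours later.
        if ip in banned:
            if now < banned[ip]:
                return True
            del banned[ip]

        # Keep only the timestamps that fall inside the current window.
        recent = [t for t in hits.get(ip, []) if now - t < WINDOW_SECONDS]
        recent.append(now)
        hits[ip] = recent

        if len(recent) > MAX_HITS:
            banned[ip] = now + BAN_SECONDS
            return True
        return False

In practice the counters would live in a database or a flat file rather than in memory, and a slip in the expiry check (or timestamps recorded in the wrong units) is the sort of bug that could produce a leftover "2.8 hours remaining" ban long after the visits that triggered it.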

My apologies to anyone whom this affected.

4 Comments to "Scraper blocker blocked me"

  1. Bill Slawski says:

    I hate it when something like that happens.

    Good idea though.

  2. DianeV says:

    LOL. Thanks.

    'Tis a strange thing that one would even have to think about blocking scrapers.

  3. Brad says:

    >>'Tis a strange thing that one would even have to think about blocking scrapers.

    I have been having the same strange thoughts about RSS: will allowing full posts in feeds help scrapers, and will teaser-only feeds annoy my readers? I hate having to think that way.

  4. DianeV says:

    Brad, for those who are using your RSS feed to display content on their own sites, I think a full-post feed gives them the whole post.

    (I'll note that some aggregator-type sites display only the title and a snippet, but I'm not sure whether that's due to a limitation they've placed on what's displayed or to the blog owner limiting the feed length.)

    However, for those who send bots around to sites to pick up all pages (as opposed to feeds), that's another approach altogether.
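
On the full-post versus teaser question above: whether a scraper (or any other reader of the feed) gets the whole article or just a snippet is visible in the feed itself, since a full-post feed ships the body in each entry's content element while a summary-only feed ships just an excerpt. A quick check along these lines (a sketch using the third-party feedparser library for Python, with a placeholder URL) shows which case a given feed falls into:

    import feedparser

    # Placeholder URL -- substitute the feed you want to inspect.
    feed = feedparser.parse("https://example.com/feed/")

    for entry in feed.entries:
        # Full-post feeds carry the article body in entry.content;
        # teaser/summary-only feeds usually expose just entry.summary.
        if "content" in entry:
            body = entry.content[0].value    # full HTML of the post
        else:
            body = entry.get("summary", "")  # title plus a snippet
        print(entry.title, "-", len(body), "characters of body text")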

