


Comments/Ratings for a Single Item

Jianying Ji wrote on Sat, Jul 1, 2006 08:47 AM UTC:
Welcome back everyone! I'm creating this thread both for the site
administrators to explain the situation and for us to brainstorm solutions.
Of course, we all thank the people who keep this site up. And if there are ways to
solve this bandwidth issue, let us all pitch in whatever we can to help.

My first suggestion is that some sort of mirror system could be set up so that
when the main site goes down, traffic gets redirected to a different site.

🕸Fergus Duniho wrote on Sat, Jul 1, 2006 12:09 PM UTC:
It was robots. One robot in particular was responsible for almost half of
our bandwidth usage in June, and it has now been banned in a robots.txt
file. As a precaution, Googlebot is being restricted and all other robots
are being completely banned until I can confirm that the robots exclusion
protocol is keeping the ravenous robot away.
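A robots.txt expressing that policy might look something like the sketch below. The user-agent tokens and the restricted path are assumptions for illustration, not the site's actual file; under the robots exclusion protocol, each crawler follows the most specific group that matches its name, falling back to the `*` group.

```
# Ban the offending crawler entirely (token is illustrative)
User-agent: BecomeBot
Disallow: /

# Restrict Googlebot to part of the site (path is illustrative)
User-agent: Googlebot
Disallow: /play/

# All other robots: completely banned
User-agent: *
Disallow: /
```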

Joe Joyce wrote on Sat, Jul 1, 2006 12:22 PM UTC:
Glad to be back! I would be happy to help, but am probably limited to a small
monetary contribution, and that may well not help. If it will, please let
us know; I'd pay to post and play. One probably dumb question: could we
go after the perpetrator of the ravenous robot for something like a denial-of-service
attack, or is it an anonymous robot from some place that will
block any such efforts? Or is this just our tough luck: is the idea that
people and other things use the net, and we simply need far more
bandwidth than we would expect in order to absorb this?

🕸Fergus Duniho wrote on Sat, Jul 1, 2006 12:32 PM UTC:
The responsible robot is BecomeBot from the Become.com shopping search
engine. Although the site says that some people have complained about
spurious BecomeBots not from their site, the robot's IP address was
within the range used by Become.com.
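That kind of check, confirming that a logged client address falls inside a crawler operator's published address block, can be done with Python's standard `ipaddress` module. The address and network below are illustrative values, not Become.com's actual range:

```python
import ipaddress

# Hypothetical values: a client IP taken from the access log and a
# CIDR block attributed to the crawler's operator.
logged_ip = ipaddress.ip_address("64.124.85.77")
crawler_block = ipaddress.ip_network("64.124.85.0/24")

# Membership test: True when the logged IP is inside the block.
print(logged_ip in crawler_block)  # → True
```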


