Over the last few months, the share of visitors using Mozilla Firefox has grown to about 25%. The amount of bandwidth being used has also increased significantly. Part of the reason is that many Firefox users run an extension called “Fasterfox”. This extension “pre-fetches” links on a page so that if the user clicks one of them, it loads much faster because it has already been downloaded. This may be more convenient for the viewer, but it is a major problem for webmasters who are short on bandwidth. Since Fasterfox constantly requests new files, it can overload a server much faster than a visitor browsing the same content without Fasterfox would.
Fasterfox is in fact one of the most popular extensions for Firefox; it is currently ranked as the 3rd most downloaded extension on the Mozilla Update page (a.k.a. Firefox Add-ons). The latest version of Fasterfox, v1.0.3, checks for a robots.txt file on the site the visitor is viewing to decide whether or not it should pre-fetch. This new feature allows webmasters to add the following text to their robots.txt file to prevent Fasterfox from pre-fetching links. Text to add to “robots.txt”:
- User-agent: Fasterfox
- Disallow: /
Adding these two lines somewhere in your robots.txt file and placing that file in the root folder of your site (e.g. yourwebsite.com/robots.txt) will prevent Fasterfox from pre-fetching links anywhere on your site. Webmasters can also modify the rules so that Fasterfox is only prevented from pre-fetching within specified directories, as shown below. Still have questions? Ask us! (reply in the comments)
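For example, to block pre-fetching only inside a particular directory (the directory name here is purely illustrative), the Disallow line can name that path instead of the whole site:
- User-agent: Fasterfox
- Disallow: /downloads/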
Links: Fasterfox | Firefox Add-ons
Related: “robots.txt” Tutorial
Recent: Firefox Extend Contest Finalists
Thanks for the info. I was just looking up Fasterfox to see if it was worth using, but after reading this I think I will wait the extra half a second for pages to download rather than cause problems for people who spend a lot of their own personal time for the benefit of others. I only checked this out because Firefox put a block on the download; maybe they are being responsible.
Do you know how to go about discovering installed instances of this plug-in? As in, what if I have some users who have installed this and I don’t want them using it… I didn’t see any additional processes or executables caused by its installation.
Well, if people are going to ignore the request anyway, then webmasters will have to take more drastic measures. We implemented a system where if you grab more than x pages in a 5-second period you are banned. That stopped all the Fasterfox traffic. We shouldn’t have to do that; Fasterfox is broken. We had some Firefox users downloading the same pages HUNDREDS of times, and some users’ machines would download 1,500 pages in a 5-minute period because of Fasterfox. That is just dumb. There is NO WAY a human could view all of them.
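For anyone wondering how such a limit might be implemented, here is a minimal PHP sketch of the idea, not the commenter’s actual system; the counter path, the 10-request threshold, and the 5-second window are all illustrative assumptions:

<?php
// Minimal rate-limit sketch: reject an IP that requests more than
// $limit pages within a $window-second period. Paths and numbers
// are placeholders, not the configuration described above.
$ip = $_SERVER['REMOTE_ADDR'];
$limit = 10;  // max requests allowed per window (assumed value)
$window = 5;  // window length in seconds
$file = '/tmp/hits_' . md5($ip);

$hits = file_exists($file) ? unserialize(file_get_contents($file)) : array();
$now = time();

// Keep only the timestamps that fall inside the current window.
$recent = array();
foreach ($hits as $t) {
    if ($t > $now - $window) {
        $recent[] = $t;
    }
}
$recent[] = $now;
file_put_contents($file, serialize($recent));

if (count($recent) > $limit) {
    header('HTTP/1.1 503 Service Unavailable');
    exit('Too many requests; please slow down.');
}
?>

Including something like this at the top of each page (and clearing the counter files periodically) would throttle aggressive pre-fetchers without affecting normal visitors.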
The old fasterfox xpi (1.0.0) does work with Firefox 2.0.0.4 if you alter the archive with WinRAR to change the install.rdf version checking.
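For reference, that version check lives in install.rdf inside the XPI (an XPI is just a ZIP archive). The edit being described is raising the em:maxVersion value under the Firefox targetApplication entry, roughly like this (the exact surrounding markup in Fasterfox’s own file may differ):

<em:targetApplication>
  <Description>
    <!-- Firefox application ID -->
    <em:id>{ec8030f7-c20a-464f-9b0e-13a3a9e97384}</em:id>
    <em:minVersion>1.0</em:minVersion>
    <!-- raise this so Firefox 2.0.0.x will accept the old extension -->
    <em:maxVersion>2.0.0.*</em:maxVersion>
  </Description>
</em:targetApplication>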
SiteSucker is still around if you really want everything all at once and don’t care how much damage you do. It’s all about personal responsibility.
I just bumped Fasterfox 1.0.0 to work with Firefox 2.
Now ignoring robots.txt. BWAHAHHAHA!
The user can easily switch off prefetching: go to Add-ons in the Tools menu, click Options on the Fasterfox entry under Extensions, and in the Presets menu switch to Courteous, then back to Custom if you want to tune the remaining settings yourself. In the Pipelining menu, enable Pipelining and Proxy Pipelining and set Max Pipelining Requests to 10; under Rendering, set Pages In Memory to 10. This should get rid of any prefetching problems while still allowing great speed.
It is too bad that the XPI’s code has been released to the general population. Now it is a simple matter for anyone with a smidgen of programming knowledge to remove the polite-mode check, and one rogue group calling itself something like Frozey Crew 200* has released a version that is simply nastier than anything that has come before: it uses all of the available bandwidth, and when that runs out it seeks other sites, apparently selected at random, to continue the data pull. Just plain nasty. Why people have to do crud like that is beyond me; there ought to be international laws against those kinds of hacker groups.
I just implemented this on my website. That’ll teach those bandwidth thieves a lesson or two. It might also lessen the “Digg Effect”, considering that a lot of people who visit from Digg are probably using this extension. When your server is taking a beating from 100,000 people and 20,000 of them are using Fasterfox without this, your server is already gone.
– Dwayne Charrington.
http://www.dwaynecharrington.com
What the hell are you talking about???
I didn’t figure it out!!!!!
Is it dangerous??
Since there is a Fasterfox fork which ignores robots.txt, I will implement a blocking mechanism. Care needs to be taken not to block search engines, but I think this can be done easily.
– Tell all search engines not to spider a specific directory via robots.txt
– Setup a hidden link on every page, let’s call it VandalTrap in this example
– Run a server-side script, such as PHP, on the index page of the VandalTrap directory. If the page is loaded, the .htaccess file is modified to block the user (a minimal sketch of such a script follows below)
This could also be extended to display an informative message, but if they choose to install software which circumvents the rules, then blocking the IP for 20 minutes seems the best choice.
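Here is a minimal sketch of what that trap script might look like, assuming Apache honours “deny from” directives in .htaccess and the script is allowed to write to that file; the paths are assumptions, and the 20-minute unban would still need a separate cleanup step:

<?php
// VandalTrap sketch: any client that loads this hidden page gets its
// IP appended to the site's .htaccess as a deny rule. Paths and the
// deny syntax assume a typical Apache setup with overrides enabled.
$ip = $_SERVER['REMOTE_ADDR'];
$htaccess = dirname(__FILE__) . '/../.htaccess';

// Avoid adding the same IP twice.
$current = file_exists($htaccess) ? file_get_contents($htaccess) : '';
if (strpos($current, "deny from $ip") === false) {
    file_put_contents($htaccess, "deny from $ip\n", FILE_APPEND);
}

header('HTTP/1.1 403 Forbidden');
echo 'Automated pre-fetching detected; your IP has been blocked temporarily.';
?>

A cron job (or a timestamp stored next to each rule) could remove entries after 20 minutes so legitimate users are not locked out for good.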
Faster 3.1.2 for Firefox 2.0 at 3 !! https://addons.mozilla.org/fr/firefox/addon/14833?collection_id=4985b32f-408b-7861-5ee1-9acc446a9cfb