Over the last few months, the share of visitors using Mozilla Firefox has grown to about 25%, and the amount of bandwidth being used has increased significantly as well. Part of the reason is that many Firefox users run an extension called “Fasterfox”. This extension “pre-fetches” links on a page so that if the user clicks one of them, it loads much faster because it has already been downloaded. That may be more convenient for the viewer, but it is a major problem for webmasters who are low on bandwidth: since Fasterfox constantly requests new files, it can overload a server much faster than a visitor browsing the same content without Fasterfox would.
Fasterfox is in fact one of the most popular extensions for Firefox; it is currently ranked as the 3rd most downloaded extension on the Mozilla Update page (a.k.a. Firefox Add-ons). The latest version of Fasterfox, v1.0.3, checks for a robots.txt file on the site the viewer is visiting to decide whether or not it should pre-fetch. This new feature allows webmasters to add the following lines to their robots.txt file to prevent Fasterfox from pre-fetching links. Text to add to “robots.txt”:
User-agent: Fasterfox
Disallow: /
Adding these two lines to the robots.txt file in your site’s root folder (e.g., yourwebsite.com/robots.txt) will prevent Fasterfox from pre-fetching links anywhere on your site. Webmasters can also modify the rules so that Fasterfox is only blocked from pre-fetching in specific directories, as in the sketch below. Still have questions? Ask us! (reply in the comments)
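For example, a minimal sketch that blocks pre-fetching only under two directories (the directory names here are hypothetical; substitute your own bandwidth-heavy paths):

User-agent: Fasterfox
Disallow: /images/
Disallow: /downloads/

Per the robots.txt convention, any path not matched by a Disallow line remains open to pre-fetching.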
Links: Fasterfox | Firefox Add-ons
Related: “robots.txt” Tutorial
Recent: Firefox Extend Contest Finalists
I just added this to the robots file. I’ll see if this actually works, since Fasterfox is sorta stupid at times.
If I had known about this, I would have used it a long time ago. I just checked the Fasterfox changelog, and indeed they have added a robots.txt check for instructions.
It’s actually neat that they created this, so people with bandwidth issues can block it and others with bandwidth to spare can allow it. I guess no one needs to complain about Fasterfox anymore :)
This is a good move on the part of the developers. While I’m not sure the entire concept of these things is particularly sound – most people will never stumble across this fix, particularly casual bloggers and such who might attract a lot of traffic but are bandwidth-limited – having a way to block it is probably more than many developers would bother to implement, I’m sure.
Fasterfox isn’t all that anyway… use Google Web Accelerator.
“most people will never stumble across this fix, particularly casual bloggers and such who might attract a lot of traffic but are bandwidth limited” – Isn’t the Digg frontpage enough advertising for it?
Great tip! Adding this to robots.txt now.
No, DD32. Unless the fix is integrated as an option with popular software such as WordPress, a lot of *casual* users will not be aware of it. Still, the fact that it exists at all is a good step. :)
Cool! Thanks for the warning, just downgraded to 1.0.0, the last version before the robots check!
Get it here:
http://downloads.mozdev.org/fasterfox/fasterfox-1.0.0-fx.xpi
Daniel is a fine example of human decency.
If I have the robots.txt file set up like this:
User-agent: *
Disallow: /
Will that block fasterfox as well, or do I have to make a separate entry?
If most webhosts use this robots trick, then the extension becomes useless. I downloaded this extension to decrease load times on pages across the web, not just the ‘high bandwidth’ pages.
shawgo, your setup should technically work, because all standards-compliant robots (search engines included) would read that and know not to touch anything. But the creators of Fasterfox have specified the method posted in this article, so if you are really worried, I’d add another two lines alongside what you have to spell out a rule for Fasterfox explicitly, as below.
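For instance (a sketch assuming you want to keep your blanket rule), the combined robots.txt would look like this:

User-agent: *
Disallow: /

User-agent: Fasterfox
Disallow: /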
Daniel, I don’t think you understood the purpose of this article. It is not to help people bring servers down, but to help everyone. Please upgrade to 1.0.3 again; that will help the sites you like stay up. If you use the old version on servers that are already overloaded, you will just help bring them down.
I don’t think Daniel understands that webmasters generally pay for limited bandwidth resources, and that in a lot of cases this limit isn’t awfully high – especially on blog websites and such. If one of these websites were to post a good article and have lots of visitors come to read it, there is a significant risk that the site would be knocked offline for the remainder of the month unless the webmaster foots the bill for additional resources. If it’s a hobby site, this is unlikely.
This risk is basically quadrupled (or more) on a per-user basis for everybody using the unfixed extension. So, Daniel, if you’re happy with sites you visit being knocked offline, then sure, go ahead and spread the good word about 1.0.0 ignoring robots.txt. You will then see your entire browser become useless. ;)
I added the extra line, just to be safe. Thanks for the advice, Sahas.
Daniel:
Fasterfox won’t become useless, just restricted. If it becomes annoying enough to content providers, the backlash would make Firefox less popular, and the Firefox developers themselves would move to block it. Controlled is better for everyone. Besides, like the poster above mentions, most of the world will never utilize this.
Personally, I think the prefetch aspect is dumb anyway. I am never interested in 100% of the links on any given page. I use Fasterfox but turned prefetching off.
Does Fasterfox send an X-Moz header with its prefetch requests? I blocked that back when Google Web Accelerator came out, would it also stop Fasterfox prefetches?
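(For context, what I used back then was a standard mod_rewrite block in .htaccess – a sketch, assuming Apache with mod_rewrite enabled, and assuming Fasterfox marks its prefetches with the same X-moz header that Firefox’s built-in prefetching sends:)

# Return 403 Forbidden for any request whose X-moz header says "prefetch".
RewriteEngine On
RewriteCond %{HTTP:X-moz} ^prefetch$ [NC]
RewriteRule .* - [F,L]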
The sad part is that Fasterfox is open source, much like every other extension and the Mozilla browser itself, which means that any user can just remove the user-agent header from the Fasterfox extension and totally bypass the robots.txt file.
It would have been nice to mention that the developers of the extension have this information publicly posted:
Admin: Link has been removed because part of the URL corrupted the page.
Here is the new URL: http://tinyurl.com/zctne
User-agent: *
Disallow: /
This should NEVER be done unless you want to take yourself out of all the search engines… it’s search-engine-ranking suicide.
Not everyone depends on search engines for traffic.
Thanks for the heads-up. It’s getting ridiculous how many ways Firefox is being extended (even in the trunk build) to sap bandwidth from websites just to give the user a false impression of speed. If only Firefox itself respected robots.txt for its own href prefetching.