Over the last few months the share of visitors using Mozilla Firefox has grown to about 25%, and the amount of bandwidth being used has also increased significantly. Part of the reason is that many Firefox users run an extension called “Fasterfox”. This extension “pre-fetches” links on a page, so that if the user clicks a link it loads much faster because it has already been downloaded. That may be more convenient for the viewer, but it is a major problem for webmasters who are short on bandwidth: since Fasterfox constantly requests new files, it can overload a server much faster than a visitor viewing the same content without Fasterfox would.
Fasterfox is in fact one of the most popular extensions for Firefox; it is currently the 3rd most downloaded extension on the Mozilla Update page (aka Firefox Add-ons). The latest version of Fasterfox, v1.0.3, checks for the robots.txt file on the site the viewer is visiting to decide whether or not it should pre-fetch. This new feature lets webmasters add the following text to their robots.txt file to prevent Fasterfox from pre-fetching links. Text to add to “robots.txt”:
- User-agent: Fasterfox
- Disallow: /
Adding these two lines somewhere in your robots.txt file and placing it in the root folder of your site (ex: yourwebsite.com/robots.txt) will prevent Fasterfox from pre-fetching links anywhere on your site. Webmasters can also modify the text so that Fasterfox is only prevented from pre-fetching specified directories, as in the example below.
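For instance, to let Fasterfox pre-fetch most of a site while keeping it out of a few bandwidth-heavy directories, list those paths instead of the site root (the directory names here are just placeholders; substitute your own):
- User-agent: Fasterfox
- Disallow: /downloads/
- Disallow: /gallery/
Still have questions? Ask us! (reply in the comments)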
Links: Fasterfox | Firefox Add-ons
Related: “robots.txt” Tutorial
Recent: Firefox Extend Contest Finalists
Fred: If there is one thing I have learned in all my years of Open Source development, it is that the overwhelming majority of folks are lazy. I don’t see that as a problem.
Thanks for the tip!
If the problem becomes serious enough, scripts that temporarily block what is clearly prefetching from a site are likely to be developed (something like the sketch at the end of this comment) – they could even be set to ignore search engines, using an existing IP database as a head start.
If people like Daniel insist on sapping people’s bandwidth, the chances are that it will get to the point where webmasters retaliate with more than casual blocking.
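To sketch what I mean, in Python (the thresholds and the search-engine allowlist are purely hypothetical):

    import time

    BAN_SECONDS = 600   # how long a suspected prefetcher stays blocked
    MAX_HITS = 30       # requests allowed per WINDOW before banning
    WINDOW = 10         # seconds
    SEARCH_ENGINE_IPS = {"66.249.66.1"}  # hypothetical allowlist seeded from an IP database

    hits = {}    # ip -> recent request timestamps
    banned = {}  # ip -> time the temporary ban expires

    def allow_request(ip):
        """Return True if the request should be served, False if blocked."""
        now = time.time()
        if ip in SEARCH_ENGINE_IPS:
            return True                  # never throttle known crawlers
        if banned.get(ip, 0) > now:
            return False                 # still inside the temporary ban
        recent = [t for t in hits.get(ip, []) if now - t < WINDOW]
        recent.append(now)
        hits[ip] = recent
        if len(recent) > MAX_HITS:       # bursty enough to look like prefetching
            banned[ip] = now + BAN_SECONDS
            return False
        return True

A real version would live behind the web server and persist its state, but the idea really is that simple.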
See more extensions for power surfing.
http://pchere.blogspot.com/2005/12/50-best-firefox-extensions-for-power.html
> Stwo says
> fasterfox isnt all that anyway… use google web accelerator
While I agree Fasterfox ain’t all that, perhaps you should open your eyes to the fact that not everyone can meet this requirement:
To use Google Web Accelerator, your computer must have a Windows XP or Windows 2000 SP 3+ operating system. Google Web Accelerator works for the Internet Explorer 5.5+ or Firefox 1.0+ browsers.
Plus, I have 100+ links on quite a few of my pages, so Fasterfox is virtually no help to anyone coming to my website.
Yeah, but, at least in the version I have, prefetching is turned off by default…
Hey Daniel
Thanks for that link to Fasterfox 1.0.0.
get it here:
http://downloads.mozdev.org/fasterfox/fasterfox-1.0.0-fx.xpi
Update: BUGGER, it doesn’t work with Firefox 1.5.0.1.
Fasterfox just forked, so now you can either stick with the original Fasterfox branch, or get FastestFox, which does not include the robots.txt check.
Daniel, Matt: Why is it so critical to leave prefetch on? Are you on network connections with enough latency that you really need it enabled for a fast experience? Besides, if you disable the check, webmasters may just decide to add absolute throttling or write behavioral plugins (easy enough to do — I’m no programmer and even I could do it) to block pre-fetching or ban hosts using it. In fact, if enough folks like you are insistent enough to ignore it when people have politely asked you to stop, I fully expect that to happen. If they’ve asked you to stop, why not play nice and just stop? It doesn’t seem like too much to ask.
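To make that concrete: assuming Fasterfox goes through Firefox’s built-in prefetch service, which marks its requests with an “X-moz: prefetch” header, a couple of mod_rewrite lines in an Apache .htaccess could refuse those requests outright (a rough sketch, not tested against Fasterfox itself):

    # Return 403 Forbidden for any request flagged as a browser prefetch
    RewriteEngine On
    RewriteCond %{HTTP:X-moz} prefetch
    RewriteRule .* - [F]

Anything more aggressive, like banning hosts outright, would take a real script, but this alone stops the flagged prefetch traffic.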
Fortunately my website isn’t really large enough for prefetching to cause too much of an issue (I think) – but I know that if people rammed my bandwidth and I went offline, I couldn’t easily sort out the cash to pay for more. Ideally I would rather my bandwidth be used by people downloading my music than by prefetching!
The idea that bandwidth could be eaten up prefetching pages someone might never visit seems crazy.
Very handy; Fasterfox is bad news for us web developers.
http://www.thewebdesignblog.com
At last, it reads robots.txt. Well, that was much needed.
Now that is really good news.
Have added it to my robots.txt. Now to wait and monitor :)
Any clue on how to check how much bandwidth it was actually eating up?
My websites are updated, but most won’t be.
This is another invention to save our beloved cyberspace… much like segregating nodes by IP address (for the elite), p2p “traffic shaping” (censorship), and DRM (jacking the consumers some more). Why don’t you twats write a plugin that will make HTTP requests more efficient rather than making some crap that limits what people can do? You make the users out to be the enemies when in fact the bandwidth providers are the bastards…
That’s great that it is finally possible to block prefetching. Before this, people could download gigabytes of data that would never be read. As for Daniel and Matt, I ban that type of person from ever entering my websites.
Prefetch easily makes web browsing 10x faster! I use 1.0.3 and respect those sites that do not want prefetch, but I have never noticed any problems with my own site. (Then again, I’m on university webspace.)
Unfortunately, in some of my tests the robots.txt entry doesn’t stop it…
Good, at least they can bork the robots.txt thing :D