Wikka Mod 033
Type: Bug Fix / Feature
Credit: CharlesNepote @ WikiNi, MatthiasAppel
This is the method used in Wikka:
In header.php:
<?php if ($this->GetMethod() != 'show') echo "<meta name=\"robots\" content=\"noindex, nofollow\" />\n";?>
History from WakkaWiki
Hm, does anybody know how to keep search robots from indexing the ../edit, ../revisions and ../referrers links on every page? You can search for WakkaWiki on Google and see the result.
--MatthiasAppel
Yes. See: http://www.robotstxt.org/wc/exclusion.html.
-- CharlesNepote
Does that mean that I have to add a robots.txt with this content:
User-agent: Googlebot
Disallow: /edit$
Disallow: /revisions$
Disallow: /referrers$

User-agent: *
Disallow: /edit
Disallow: /revisions
Disallow: /referrers
--MatthiasAppel
Did it work with robots.txt? And what does the $ mean? And what about referrers_sites? --MoE
Don't really know, but I think the $ doesn't belong there. You could use something like this instead:
User-agent: googlebot
Disallow: /*/edit
Disallow: /*/revisions
Disallow: /*/referrers
Disallow: /*/referrers_sites
But you could also do it like Charles' method noted below. --JanPiotrowski
- Yahoo! recently announced wildcard support, so the above-mentioned solution works for Yahoo! Slurp as well. So far I haven't found a solution for MSN. :( --DaC
On WikiNi we coded it like this in header.php:
<?php if ($this->GetMethod() != 'show') echo "<meta name=\"robots\" content=\"noindex, nofollow\" />\n";?>
It seems to work quite well.
-- CharlesNepote
You can't get the correct contents of robots.txt if you install the wiki at the root of your web server, i.e. if the URL to HomePage looks like my_domain.com/HomePage, because /robots.txt will be redirected to /wakka.php?wakka=robots.txt.
- There is a fix for this problem at RobotsDotTxt. - BaxilDragon
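For reference, a minimal sketch of that kind of workaround (not the actual RobotsDotTxt code): intercept the rewritten robots.txt request before normal page handling and answer with plain-text robot rules. The placement near the top of wakka.php and the exact rules (taken from the wildcard suggestion above) are assumptions here.
<?php
// Hypothetical sketch, not the RobotsDotTxt fix itself: when /robots.txt has been
// rewritten to wakka.php?wakka=robots.txt, answer it with plain text and stop.
if (isset($_GET['wakka']) && strtolower($_GET['wakka']) == 'robots.txt')
{
	header('Content-Type: text/plain');
	echo "User-agent: *\n";
	echo "Disallow: /*/edit\n";
	echo "Disallow: /*/revisions\n";
	echo "Disallow: /*/referrers\n";
	echo "Disallow: /*/referrers_sites\n";
	exit;
}
?>
Note that the wildcard rules only help crawlers that support them (Google, Yahoo! Slurp per the discussion above); the meta robots tag in header.php remains the more general solution.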