[webkit-dev] trac.webkit.org links via Google.com
Yaar Schnitman
yaar at chromium.org
Tue Dec 1 11:04:21 PST 2009
Robots.txt can exclude most of the trac site and then allow the
sitemap.xml. This way you block most of the junk and only grant permission
to the important files. All major search engines support sitemap.xml, and
those that don't will be blocked by robots.txt.
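As a rough sketch, the robots.txt could look something like the following. The disallowed paths are illustrative guesses at Trac's noisier dynamic views, not a vetted list; the sitemap URL is hypothetical:

```
# Block Trac's dynamic views that generate crawler junk,
# leave browse/changeset pages crawlable, and point engines
# at the generated sitemap.
User-agent: *
Disallow: /log/
Disallow: /report/
Disallow: /timeline
Disallow: /search

Sitemap: http://trac.webkit.org/sitemap.xml
```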
A script could generate sitemap.xml from a local svn checkout of trunk. It
would produce one URL for each source file (changefreq=daily) and one URL
for each revision (changefreq=yearly). That would cover most of the search
requirements.
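A minimal sketch of such a generator, assuming Trac's usual /browser/trunk and /changeset/N URL layout and a plain svn checkout on disk (the base URLs and helper names are mine, not an existing script):

```python
import os
import xml.etree.ElementTree as ET

# Assumed Trac URL layout; adjust for the real deployment.
TRAC_BROWSER = "http://trac.webkit.org/browser/trunk"
TRAC_CHANGESET = "http://trac.webkit.org/changeset"

def _add_url(urlset, loc, changefreq):
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "changefreq").text = changefreq

def build_sitemap(checkout_root, latest_revision):
    """One <url> per source file (daily) and one per revision (yearly)."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for dirpath, dirnames, filenames in os.walk(checkout_root):
        dirnames[:] = [d for d in dirnames if d != ".svn"]  # skip svn metadata
        for name in sorted(filenames):
            rel = os.path.relpath(os.path.join(dirpath, name), checkout_root)
            _add_url(urlset,
                     f"{TRAC_BROWSER}/{rel.replace(os.sep, '/')}", "daily")
    for rev in range(1, latest_revision + 1):
        _add_url(urlset, f"{TRAC_CHANGESET}/{rev}", "yearly")
    return urlset

def write_sitemap(urlset, path):
    ET.ElementTree(urlset).write(path, encoding="utf-8",
                                 xml_declaration=True)
```

Run periodically (e.g. from a cron job after `svn update`) it would keep the sitemap in step with trunk.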