Commit 8efc653

Create robots.txt

This GitHub Pages site seems to serve from the root of the repo. I'm not sure whether GitHub Pages will serve this file correctly, but it's worth a shot. Our goal here is to prevent crawlers from indexing non-prod docs sites. See Raku/doc-website#384

1 parent 1611f8e commit 8efc653

File tree

1 file changed: +2 −0 lines changed


robots.txt (+2)

@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /
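
The two added lines tell every crawler (`User-agent: *`) not to fetch any path (`Disallow: /`). As a quick sanity check of what these rules mean to a compliant crawler, Python's stdlib robots.txt parser can evaluate them directly (the URL below is illustrative, not the actual site):

```python
from urllib.robotparser import RobotFileParser

# Parse the exact two rules added in this commit.
rp = RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Disallow: / under User-agent: * blocks every path for every crawler.
print(rp.can_fetch("Googlebot", "https://example.github.io/docs/index.html"))  # False
print(rp.can_fetch("*", "/"))  # False
```

Whether this actually takes effect still depends on GitHub Pages serving the file at the site root, which is the open question in the commit message.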

0 commit comments