I am writing a simple blog using Revel and deploying it on Heroku.
However, I want to have a robots.txt file so that not all pages get crawled
by web crawlers.
What is the best way to achieve this?



  • Kyle Lemons at Dec 19, 2013 at 5:35 pm
    Add a handler for /robots.txt and generate/print the list of URLs you don't
    want to index?
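
    A sketch of that idea as a Revel controller action. The type name and
    the Disallow paths are assumptions, not from this thread, and the
    import path is the one Revel used at the time (github.com/robfig/revel;
    it later moved to github.com/revel/revel):

        package controllers

        import "github.com/robfig/revel"

        type Robots struct {
            *revel.Controller
        }

        // RobotsTxt generates the robots.txt body, listing the paths
        // we do not want crawlers to index.
        func (c Robots) RobotsTxt() revel.Result {
            // Serve as plain text; checkers reject HTML at /robots.txt.
            c.Response.ContentType = "text/plain"
            return c.RenderText("User-agent: *\nDisallow: /admin/\nDisallow: /drafts/\n")
        }

    wired up in conf/routes with a line like:

        GET     /robots.txt     Robots.RobotsTxt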

  • Dar at Dec 20, 2013 at 2:40 am
    I tried adding this route:

        GET     /robots.txt     Static.Serve("public", "robots.txt")

    However, online robots.txt checkers do not validate it.
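
    For comparison, a minimal robots.txt body that checkers accept (the
    Disallow path is a placeholder):

        User-agent: *
        Disallow: /admin/

    If the route is defined but checkers still reject the result, it is
    worth confirming what the URL actually returns, e.g. (hypothetical
    app name):

        curl -i https://yourapp.herokuapp.com/robots.txt

    A 404 or an HTML error page at that path will fail validation even
    though the route itself looks correct.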
