Static files in Flask – robots.txt, sitemap.xml (mod_wsgi)
The best way is to set static_url_path to the root URL:

    from flask import Flask

    app = Flask(__name__, static_folder="static", static_url_path="")
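For context, here is a minimal runnable sketch built on that idea; the file layout (robots.txt and sitemap.xml placed in ./static) and the index route are assumptions for illustration:

    # Minimal sketch: with static_url_path="", files in ./static are served
    # from the site root, so static/robots.txt appears at /robots.txt and
    # static/sitemap.xml at /sitemap.xml.
    from flask import Flask

    app = Flask(__name__, static_folder="static", static_url_path="")

    @app.route("/")
    def index():
        return "Hello, world"

    if __name__ == "__main__":
        app.run()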
The answer is, “it depends”. The robots.txt “standard” as defined at robotstxt.org is the minimum that bots are expected to support. Googlebot, MSNbot, and Yahoo Slurp support some common extensions, and there’s really no telling what other bots support. Some say what they support and others don’t. In general, you can expect the major search …
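If you want to test how the baseline standard (plain User-agent/Disallow rules, no extensions) treats a given file, Python ships a reference parser in urllib.robotparser; the rules and URLs below are made-up examples:

    from urllib.robotparser import RobotFileParser

    # Parse an in-memory robots.txt that uses only baseline directives.
    rp = RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /private/",
    ])

    print(rp.can_fetch("Googlebot", "http://example.com/private/page"))  # False
    print(rp.can_fetch("Googlebot", "http://example.com/public/page"))   # True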
robots.txt:

    User-agent: *
    Disallow: /

This will block all search bots from indexing. For more info see: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=40360
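If you would rather generate that file from Flask than deploy it as a static asset, a hedged sketch (the view name is an assumption) looks like this:

    from flask import Flask, Response

    app = Flask(__name__)

    @app.route("/robots.txt")
    def robots_txt():
        # Serve a robots.txt that blocks every crawler from the whole site.
        return Response("User-agent: *\nDisallow: /\n", mimetype="text/plain")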
Here’s a solution if you want to disallow query strings:

    Disallow: /*?*

or if you want to be more precise on your query string:

    Disallow: /*?dir=*&order=*&p=*

You can also add to the robots.txt which URL to allow:

    Allow: /new-printer$

The $ will make sure only /new-printer will be allowed. More info:

http://code.google.com/web/controlcrawlindex/docs/robots_txt.html
http://sanzon.wordpress.com/2008/04/29/advanced-usage-of-robotstxt-w-querystrings/
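Note that * and $ in paths are crawler extensions (the Googlebot style mentioned above), not part of the base standard, so urllib.robotparser ignores them. One quick way to sanity-check such a pattern is to translate it to a regex by hand; this rule_to_regex helper is a hypothetical illustration, not any crawler’s actual matcher:

    import re

    def rule_to_regex(rule: str) -> "re.Pattern":
        # Escape the rule, then restore the two wildcard operators:
        # '*' matches any run of characters, a trailing '$' anchors the end.
        pattern = re.escape(rule).replace(r"\*", ".*")
        if pattern.endswith(r"\$"):
            pattern = pattern[:-2] + "$"
        return re.compile("^" + pattern)

    print(bool(rule_to_regex("/*?*").match("/catalog?dir=asc")))       # True
    print(bool(rule_to_regex("/new-printer$").match("/new-printer")))  # True
    print(bool(rule_to_regex("/new-printer$").match("/new-printers"))) # False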