Robots.txt: Is this wildcard rule valid?

The answer is, “it depends”. The robots.txt “standard” as defined at robotstxt.org is the minimum that bots are expected to support. Googlebot, MSNbot, and Yahoo Slurp support some common extensions, and there’s really no telling what other bots support. Some say what they support and others don’t. In general, you can expect the major search … Read more
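To make the distinction concrete, here is a hedged sketch of a wildcard rule that is not part of the minimal robotstxt.org specification but that Googlebot and Bingbot document support for; the /private/ path is only an illustrative placeholder:

User-agent: *
# "*" matches any sequence of characters and "$" anchors the end of the URL.
# These are extensions honored by the major crawlers, not part of the 1994 standard.
Disallow: /private/*.pdf$
# Fallback for bots that only implement the minimal standard (plain prefix matching):
Disallow: /private/

A bot that only understands the original standard will treat the first rule as a literal prefix, so keeping a plain prefix rule alongside it is the safer approach.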

Do SEO-friendly URLs really affect a page’s ranking? [closed]

I will let Google answer your question: http://googlewebmastercentral.blogspot.com/2008/09/dynamic-urls-vs-static-urls.html From the article: Which can Googlebot read better, static or dynamic URLs? […]While static URLs might have a slight advantage in terms of clickthrough rates because users can easily read the URLs, the decision to use database-driven websites does not imply a significant disadvantage in terms … Read more
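For readers unsure what the article means by the two kinds of URL, the pair below is a purely hypothetical illustration of the same page addressed dynamically and statically:

http://www.example.com/product.php?id=42&cat=7   (dynamic: parameters visible in the query string)
http://www.example.com/products/blue-widget       (static / rewritten: readable path, no parameters)

The article's point is that Googlebot can crawl both; the static form mainly helps humans reading or clicking the link.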

Ignore URLs in robots.txt with specific parameters?

Here’s a solution if you want to disallow query strings: Disallow: /*?* Or, if you want to be more precise about the query string: Disallow: /*?dir=*&order=*&p=* You can also tell robots.txt which URLs to allow: Allow: /new-printer$ The $ makes sure that only /new-printer is allowed. More info: http://code.google.com/web/controlcrawlindex/docs/robots_txt.html http://sanzon.wordpress.com/2008/04/29/advanced-usage-of-robotstxt-w-querystrings/
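Putting those directives together, a complete robots.txt group might look like the sketch below; the rules are the ones quoted above, and the wildcard and $ matching are Google/Bing extensions rather than part of the minimal standard:

User-agent: *
# Block any URL containing a query string
Disallow: /*?*
# Or, more narrowly, block only this parameter combination:
# Disallow: /*?dir=*&order=*&p=*
# Allow exactly /new-printer ("$" anchors the end of the URL)
Allow: /new-printer$

Google documents that when an Allow and a Disallow both match a URL, the more specific (longer) rule wins, so a precise Allow like the one above can carve an exception out of a broad Disallow.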

Google search results site map?

Google calls them sitelinks. You can’t enforce them currently: We only show sitelinks for results when we think they’ll be useful to the user. If the structure of your site doesn’t allow our algorithms to find good sitelinks, or we don’t think that the sitelinks for your site are relevant for the user’s query, we … Read more