# /robots.txt file for http://www.domain.com/

Sitemap: http://www.domain.com/sitemap.xml

User-agent: *
Disallow: /*.axd
Disallow: /mobile-page-not-found/
Disallow: /page-not-found/
Disallow: /site-offline/
Disallow: /web-support/
Disallow: /xml/project-gallery.xml
Disallow: /performancepartners/
Disallow: /_https-paypal-images/
Disallow: /fsp93j3nkrjn21nbee0/

# /*.axd - When the Google, Yahoo, or Bing crawlers visit a page, they also cache the
# session-keyed resources WebResource.axd and ScriptResource.axd. Because these URLs
# expire with their session, the crawler will not find them on its next crawl and will
# report an "Unreachable URL" 500 error for these files.

# end robots.txt
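
# Illustrative note (the query-string values below are placeholders, not taken from this
# site): a crawled page may reference session-keyed handler URLs such as
#   /WebResource.axd?d=<encrypted-key>&t=<timestamp>
#   /ScriptResource.axd?d=<encrypted-key>&t=<timestamp>
# The /*.axd rule above matches these paths, so crawlers skip the short-lived URLs
# rather than re-requesting them after the session keys have expired.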