Why crawl when a sitemap is provided? Honest question.
IME, using a sitemap is much more efficient. For example, HTTP/1.1 pipelining can be used to reduce the number of TCP connections needed.
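The efficiency argument can be sketched roughly as follows: the sitemap gives the full URL list up front, so every page can be fetched over a single reused connection instead of opening new ones while discovering links. (One caveat: HTTP/1.1 pipelining is poorly supported by servers in practice; plain keep-alive, or HTTP/2 multiplexing, gets the same connection savings.) The sitemap content and example.com hostname below are placeholders, not from any real site.

```python
# Minimal sketch: extract all URLs from a sitemap so they can be fetched
# over one persistent connection, rather than discovered by crawling links.
import xml.etree.ElementTree as ET

# Placeholder sitemap document (real ones live at /sitemap.xml).
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/a</loc></url>
  <url><loc>https://example.com/b</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the <loc> entries of a sitemap document, in order."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

urls = sitemap_urls(SITEMAP)
# With the whole URL list known up front, a single keep-alive connection
# (e.g. http.client.HTTPSConnection) can request each path in turn:
# one TCP handshake for the whole site, versus one per discovered link.
print(urls)
```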
Is resource exhaustion what draws a public website^1 operator's attention to "bots"? If it is not resource exhaustion, then what is it?
1. For this question, assume "public website" means a website serving public information where there are no legitimate intellectual property rights in the information that can be asserted by the site operator.