What are Crawl Errors?
Crawl errors are issues reported by search engine spiders as they crawl your website and encounter specific problems. Much of what a search engine knows about your website starts with your sitemap, which lists your pages and posts along with related information (tags, meta, categories, etc.). This is how "spiders" learn about your website, scanning or "crawling" your publicly available resources. Crawling is a fundamental process for search engines, so keeping your sitemap healthy is extremely important. There are different types of errors: some are minor, while others are severe issues that need immediate attention.
4xx Client errors | returned when the error appears to have been caused by the client, i.e. the browser or crawler making the request. Commonly encountered examples include 403 Forbidden, 404 Not Found, and 410 Gone.
5xx Server errors | indicate a problem or misconfiguration on the server itself and should be resolved immediately. Commonly encountered examples include 500 Internal Server Error, 502 Bad Gateway, and 503 Service Unavailable.
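The 4xx/5xx distinction above can be sketched in a few lines of Python. This is an illustrative helper, not part of any particular SEO tool's API; the function name and category labels are assumptions.

```python
def classify_status(code: int) -> str:
    """Return a rough crawl-report category for an HTTP status code."""
    if 200 <= code < 300:
        return "ok"            # resource served successfully
    if 300 <= code < 400:
        return "redirect"      # client should follow the Location header
    if 400 <= code < 500:
        return "client error"  # e.g. 403 Forbidden, 404 Not Found, 410 Gone
    if 500 <= code < 600:
        return "server error"  # e.g. 500, 502, 503 -- fix immediately
    return "unknown"

print(classify_status(404))  # client error
print(classify_status(503))  # server error
```

In a crawl report, "server error" entries are the ones to triage first, since they affect every visitor, not just search engine spiders.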
Technical sitemap auditing
Quickly crawl your website to identify search engine crawl issues.
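A quick audit like this can be sketched with the Python standard library alone: fetch the sitemap, extract its URLs, and flag any that return a 4xx or 5xx status. The sitemap URL and the 10-URL limit below are placeholder assumptions, and a real crawler should also respect robots.txt and rate limits.

```python
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET

# Sitemaps use this XML namespace (per the sitemaps.org protocol).
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def parse_sitemap(xml_text: str) -> list[str]:
    """Extract the <loc> URLs from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall(".//sm:loc", NS)]

def check_url(url: str) -> int:
    """Request a URL and return its HTTP status code (4xx/5xx included)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 4xx/5xx responses raise HTTPError

if __name__ == "__main__":
    # Placeholder sitemap URL -- replace with your own.
    xml_text = urllib.request.urlopen("https://example.com/sitemap.xml").read()
    for url in parse_sitemap(xml_text.decode())[:10]:  # audit first 10 URLs
        status = check_url(url)
        if status >= 400:
            print(f"{status}  {url}  <- crawl error")
```

Using HEAD requests keeps the audit lightweight, since only the status line and headers are transferred, not the page body.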
A full list of Hypertext Transfer Protocol (HTTP) response status codes can be found on Wikipedia.