This document explains how to optimize a website so that search engine crawlers such as Googlebot make better use of its crawl budget. It defines what a crawler is and explains crawl budget, which is determined by two factors: the crawl rate limit and crawl demand. Factors that waste crawl budget include faceted navigation that generates duplicate URLs, on-site duplicate content, soft 404 errors, hacked or spam pages, infinite content spaces, and low-quality pages. The document recommends removing unnecessary URL parameters, limiting duplicate and infinite content, disallowing pages that should not be crawled, and specifying canonical URLs where duplicate versions of content exist.
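To illustrate two of the recommendations above, here is a minimal sketch of a robots.txt file that disallows parameterized faceted-navigation URLs and an internal search page. The specific paths and parameter names (`color`, `sort`, `/search`) are hypothetical examples, not taken from the document:

```
# robots.txt — illustrative sketch; paths and parameters are hypothetical
User-agent: *
# Keep crawlers out of faceted-navigation URLs that only vary by parameter
Disallow: /*?color=
Disallow: /*?sort=
# Internal search results can generate effectively infinite URL combinations
Disallow: /search
```

For duplicate pages that must remain crawlable, the canonical-URL recommendation is typically implemented with a `<link rel="canonical" href="...">` element in the page's `<head>`, pointing every duplicate at the single preferred URL.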