An automated program that scans websites to determine their content and purpose. The name reflects how the software “crawls” from page to page by following hyperlinks, which is why crawlers are also referred to as “spiders”. Google uses crawlers to discover new content and to evaluate the quality of webpages for its index; webmasters and SEOs can request additional crawls through Google Search Console.
At its core, a crawler is simply a program designed to browse and read the web by following hyperlinks. Search engines use the information a crawler finds to build and update their index.
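To make the link-following loop concrete, here is a minimal sketch of a crawler in Python, using only the standard library. The start URL, page limit, and class name are illustrative; a real crawler such as Googlebot additionally respects robots.txt, throttles its requests, deduplicates content, and handles far more edge cases.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, harvest its links, repeat."""
    queue = [start_url]
    seen = set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="replace")
        except OSError:
            continue  # unreachable page; a real crawler would log and retry
        parser = LinkExtractor()
        parser.feed(html)
        # Resolve relative links against the current page, then enqueue them.
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen


if __name__ == "__main__":
    for page in crawl("https://example.com"):
        print(page)
```

The `seen` set is what keeps the crawl from revisiting pages, and the breadth-first queue is what makes the program spread outward across the web one hop at a time, which is exactly the “crawling” behavior the name describes.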