Sitemap is a tool for web developers that crawls a website to build a list of its pages and each page's HTTP server status at the time of the crawl. It also checks the links on each page, as well as media such as image and video files.
After a crawl is complete, an XML Sitemap can be generated, along with error reports listing broken links (including where to find them), missing media files, and pages that have changed since the last crawl.
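For reference, XML Sitemaps follow the sitemaps.org protocol. A minimal generated file might look like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per crawled page -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```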
Before Sitemap can crawl a site, you must verify that you own it by uploading a verification file to the site; Sitemap generates this file for you when you add the new site.
Your crawl data is saved, so you can browse the results of your last crawl without having to re-crawl the site every time you open the program.