

A good configuration of your tool will be very important to analyze a site in depth and get all the data you need. Find out how to configure it step by step.

Launch your first crawl on Screaming Frog:

When you get to the tool, the first thing you will see is a bar where you can enter the URL of the site you want to analyze. Just enter your URL and select Start to launch your first crawl. Once the crawl is running, you can stop it at any time by clicking Pause, then Clear to reset your crawl.
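If you prefer to launch crawls from a script, Screaming Frog also offers a command-line mode. Here is a minimal sketch using Python's subprocess module; it assumes the screamingfrogseospider executable is on your PATH (the Linux naming) and that your version supports the --crawl and --headless flags, so check the CLI documentation of your installation first.

```python
# Minimal sketch: launching a headless Screaming Frog crawl from a script.
# Assumes the "screamingfrogseospider" binary is on PATH (Linux naming) and
# that your version supports these flags; adjust for your installation.
import subprocess

site = "https://www.example.com"  # placeholder URL

subprocess.run(
    [
        "screamingfrogseospider",
        "--crawl", site,   # URL to start crawling from
        "--headless",      # run without opening the GUI
    ],
    check=True,
)
```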

Configure your first crawl on Screaming Frog:

There are many ways to configure your crawler so that it only returns the information you are interested in. This can be very useful when you crawl sites with a lot of pages or when you want to analyze a subdomain. To do this, go to the top menu and select Configuration.

Spider is the first option available when you enter the Configuration menu. Here, you can decide which resources you want to crawl. Usually, you should have a window that looks like this:

You can, for example, decide not to crawl the images of the site. Screaming Frog will then exclude all the IMG elements (img src="image.jpg"). Your crawl will be faster, but you will not be able to analyze the weight of the images, their attributes, and so on. It is up to you to choose what you need to analyze.

Good to know: some images may still come up in the crawl if they are referenced as a plain a href link and do not have an img src. In the screenshot above, the element in yellow will not be crawled, while the image link with the red cross will be crawled.

On most of your crawls, you will have a very basic configuration in this window. We recommend selecting the crawl and storage of images, CSS, JavaScript, and SWF files for an in-depth analysis.
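To illustrate that distinction, here is a short, self-contained Python sketch (the sample HTML and the list of extensions are made up for the example) that lists image URLs referenced only through a href links, which is exactly the case where an image can still show up even though image crawling is disabled.

```python
# Sketch: distinguish images referenced via <a href="..."> from <img src="...">.
# Purely illustrative; the HTML snippet and extension list are assumptions.
from html.parser import HTMLParser

IMAGE_EXTENSIONS = (".jpg", ".jpeg", ".png", ".gif", ".webp")

class ImageRefParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.href_images = set()      # images linked with <a href="...">
        self.img_src_images = set()   # images embedded with <img src="...">

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a" and attrs.get("href", "").lower().endswith(IMAGE_EXTENSIONS):
            self.href_images.add(attrs["href"])
        elif tag == "img" and "src" in attrs:
            self.img_src_images.add(attrs["src"])

html = """
<a href="/photos/team.jpg">See the team photo</a>
<img src="/img/logo.png" alt="Company logo">
"""

parser = ImageRefParser()
parser.feed(html)

# Images that exist only as plain links: these can still be reported
# even when the crawl of <img> elements is switched off.
print(parser.href_images - parser.img_src_images)
```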
In this part, you can also decide what you want to crawl in terms of links. You can, for example, say that you do not want to crawl canonicals or external links. Again, it is up to you to choose what you want to analyze. Crawl with external links and without external links:

In this part, you can also give instructions to the tool to crawl some specific pages of your site. Say you have pages in a subdomain and want to crawl them in the same way to analyze them. Below you can see a site that has a subdomain of type: blog. By checking Crawl All Subdomains before launching your crawl, Screaming Frog will bring up all the URLs of this subdomain in its crawler.
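Once such a crawl is finished, you can quickly check how many URLs each subdomain contributed by exporting the results. The sketch below makes an assumption about that export: it expects a CSV (for instance the Internal tab export, saved here as internal_all.csv) with an Address column listing the crawled URLs.

```python
# Sketch: count crawled URLs per subdomain from a crawl export.
# Assumes a CSV export with an "Address" column; the file name is a placeholder.
import csv
from collections import Counter
from urllib.parse import urlparse

counts = Counter()

with open("internal_all.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        host = urlparse(row["Address"]).netloc  # e.g. "blog.example.com"
        counts[host] += 1

for host, n in counts.most_common():
    print(f"{host}: {n} URLs")
```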
Do you have a Sitemap on your site? Then it might be interesting to analyze the gaps between what you have in your Sitemap and the URLs that Screaming Frog finds. Here you can tell Screaming Frog where your Sitemap is so that it can analyze it.
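One simple way to spot those gaps is to compare the URLs listed in your sitemap with the URLs of a crawl export. The sketch below assumes a standard XML sitemap at /sitemap.xml and a CSV export with an Address column; the domain and file names are placeholders.

```python
# Sketch: compare sitemap URLs with the URLs found during a crawl.
# The sitemap location, export file name, and column name are assumptions.
import csv
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# 1. URLs declared in the sitemap
with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)
sitemap_urls = {loc.text.strip() for loc in tree.findall(".//sm:loc", NS)}

# 2. URLs actually found by the crawler (Internal tab export)
with open("internal_all.csv", newline="", encoding="utf-8") as f:
    crawled_urls = {row["Address"] for row in csv.DictReader(f)}

print("In the sitemap but never crawled:", sorted(sitemap_urls - crawled_urls))
print("Crawled but missing from the sitemap:", sorted(crawled_urls - sitemap_urls))
```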
