While helping clients deal with major algorithm updates, troubleshoot technical SEO problems, and more, I’m often auditing large-scale sites. That almost always requires a thorough site crawl (typically several crawls over the life of an engagement). And when you’re hunting down SEO gremlins that could be wreaking havoc on a site, it’s extremely important to slice and dice that crawl data in order to focus your analysis.
With good data filtering, you can often surface the page types, sections or subdomains that might be causing serious problems. Once surfaced, you can dig into those areas to better understand the core issues and then address what needs to be fixed.
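To make that concrete, here’s a minimal sketch of the kind of filtering I mean, in Python. The sample data and column names ("Address", "Status Code") are hypothetical stand-ins for a typical crawler CSV export; check the headers your own tool produces.

```python
import csv
import io

# Hypothetical snippet of a crawl export in CSV form.
# A real export would be read from a file with open(...).
raw = """Address,Status Code
https://example.com/blog/post-1,200
https://example.com/shop/item-9,404
https://example.com/blog/post-2,200
https://example.com/tag/widgets,301
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Isolate one section of the site (here, the /blog/ directory)...
blog_pages = [r for r in rows if "/blog/" in r["Address"]]

# ...or surface every non-200 URL for closer review.
problem_urls = [r for r in rows if r["Status Code"] != "200"]

print(len(blog_pages), len(problem_urls))  # 2 2
```

The same two moves — filter by URL pattern, filter by a problem signal — are what the in-tool filters in DeepCrawl and Screaming Frog do for you at scale.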
From a crawler perspective, I’ve already covered two of my favorites here on Search Engine Land: DeepCrawl and Screaming Frog. Both are excellent tools, and I typically use DeepCrawl for enterprise crawls while using Screaming Frog for surgical crawls, which are more focused. (Note: I’m on the customer advisory board for DeepCrawl.) In my opinion, the combination of DeepCrawl and Screaming Frog is killer, and I often say that 1 + 1 = 3 when using both tools together.
Below, I’ll cover several examples of using filtering in both tools so you can get a feel for what I’m referring to. By filtering crawl data, you’ll be ready to isolate and surface specific areas of a site for further analysis. And after you start doing this, you’ll never look back. Let’s rock and roll.