How to Do SEO Analysis With Screaming Frog?

Since the internet’s emergence in 1983, widely considered the starting point of the Information Age, digital marketing has been growing, evolving, and adjusting to trends. Alongside the internet’s development, search engines are continually updating to provide the best results for every search query. With so much to keep track of in order to stand out among millions of websites, search engine optimization (SEO) has become an essential practice, and Screaming Frog is one of the major players in SEO.

SEO specialists and the analysis they provide allow website owners to take action and adjust their websites to be visible on the search engine results page (SERP). However, since checking every corner of a website by hand can be a difficult task, there are some great tools to help you scan and analyze your website. In this article, we will go into the details of one of those tools: the Screaming Frog SEO Spider Tool.

A dependable tool for advanced and technical SEO, the Screaming Frog SEO Spider Tool is an easy-to-use helper. Before we start learning how to use Screaming Frog for SEO, we first need to understand what it is and what it does.

What Is Screaming Frog?

The Screaming Frog SEO Spider Tool is one of the most reliable and advanced SEO tools, providing detailed and helpful information for adjusting and improving a website. Just as search engine spiders crawl every web page on the internet, the Screaming Frog SEO Spider Tool is a desktop application that crawls a website’s images, backlinks, CSS, content, and much more. In essence, it crawls a website just as a search engine crawler would and shows you what the crawler would see, searching for and filtering out existing SEO issues. It then presents SEO specialists and website owners with bad redirects, duplicate pages or content, missing metadata, and many other helpful insights for improving their ranking on the SERP.

Here is a list of what the SEO tool crawls on your website:

  • Images
  • CSS
  • JavaScript
  • SWF
  • Internal and external links
  • Canonicals
  • Pagination
  • Hreflang
  • AMP
  • Meta refresh
  • Links outside of the start folder
  • All subdomains
  • Internal or external “nofollow”
  • XML sitemaps
  • Robots.txt

The Screaming Frog SEO Spider Tool offers everything you could ask for in a site audit, including the backlink, XML sitemap, pagination (rel=“next” and rel=“prev”), hreflang, and canonical audits we explain further in the article. The results can help you improve your site and your SEO work so that it performs better in searches.

Now that we have explained concisely what the Screaming Frog SEO Spider Tool is and what it does, it is finally time to learn how to do a Screaming Frog analysis for SEO.

How to Do Screaming Frog SEO Analysis?

After installing the desktop application from Screaming Frog’s official website, we recommend getting familiar with the application’s menus and items before touching anything. We also highly recommend using the full version, as it is more than helpful in configuring your website.

The menu tabs are:

  • File: The tab where you can save a crawl as a file to have the data for further analysis. The tool also saves the last six crawls in its database just in case you forget to save.
  • Configuration: The most important menu tab, as it holds the options for setting up the crawl. Clicking “Spider” lets you customize the content you want the tool to crawl and the data you want to collect. You can include or exclude images, CSS, JavaScript, subdomains, or any other elements of your content. Clicking “Include & Exclude” lets you paste the specific URLs you want to exclude from or include in the crawling process. The “API Access” option lets you connect the crawl with Google Analytics or Google Search Console.
  • Bulk Export: The option to export the crawled data quickly. It also allows you to export addresses with specific response codes like 401s, directives, anchor text, images, inlinks, and more.
  • Reports: The menu tab that allows you to download the crawl data as a report. It also provides data such as redirect chains and canonical errors.
  • Sitemaps: This menu tab allows you to construct a sitemap.

Now that you have learned how to operate the desktop application, it is time to learn the crawling process.

SEO Analysis Using Screaming Frog SEO Spider Tool

Now that you are ready, paste the URL of the website you want to crawl and hit the Start button. The process may take a few minutes or more, depending on the size of the website.

1-Checking the Response Code

Clicking the “Response Codes” tab will show the response codes for all of the crawled website’s URLs. The response codes you should keep an eye out for are:

  • 200 OK, indicating that the request has been successful.
  • 301 Moved Permanently, indicating that the content has permanently moved to a new URL.
  • 302 Found, previously called “Moved Temporarily,” indicating a temporary redirect.
  • 404 Not Found, indicating that the server cannot find the requested content.
  • 500 Internal Server Error, which appears when the server encounters an unexpected condition.
  • 503 Service Unavailable, indicating that the server cannot handle the request, either because it is overloaded or because it is down for maintenance.

Briefly, the 2xx, 3xx, 4xx, and 5xx codes are success, redirection, client error, and server error codes, respectively. First, you should identify all the pages returning 404 errors and 301-redirect them to a related page. You can also check for broken internal and external links on the Response Codes tab: after selecting a URL with a 4xx or 5xx code, click the “Inlinks” or “Outlinks” tab to view the broken links.

Moreover, fixing the web pages with broken redirects and error codes will improve the user experience and can significantly improve your rankings.
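If you want to double-check a handful of URLs outside the tool, a minimal Python sketch like the one below can re-test their status codes. The CSV filename and the “Address” column header are assumptions about a Screaming Frog export; adjust them to match your own file.

```python
# A minimal sketch: re-check status codes for URLs exported from the
# Response Codes tab. "response_codes.csv" and the "Address" column are
# assumptions about the export format; adjust them to your file.
import csv
import requests

def recheck_status_codes(csv_path="response_codes.csv"):
    with open(csv_path, newline="", encoding="utf-8") as f:
        urls = [row["Address"] for row in csv.DictReader(f)]

    for url in urls:
        try:
            # HEAD keeps the check light; allow_redirects=False exposes 301/302s.
            # Some servers answer HEAD differently, so switch to requests.get if needed.
            response = requests.head(url, allow_redirects=False, timeout=10)
            if response.status_code >= 300:
                print(f"{response.status_code}  {url}")
        except requests.RequestException as exc:
            print(f"ERROR  {url}  ({exc})")

if __name__ == "__main__":
    recheck_status_codes()
```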

2-Checking the URL Structures

A good URL structure has a clear influence on SEO. The Screaming Frog SEO Spider Tool analyzes your website’s URL structure under the “URL” tab. Analyzing the URL structure matters because it reveals URLs that contain unusual parameters and characters, which are harder for search engine crawlers to process and rank.

In addition, keeping URLs short (around four to five words), descriptive, and unique will provide a better user experience.
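To illustrate the kinds of problems worth flagging, here is a small, hypothetical Python sketch (not part of Screaming Frog) that marks URLs containing query parameters, uppercase letters, underscores, or very deep paths.

```python
# A minimal sketch: flag URLs whose structure may be hard for crawlers and
# users. The thresholds and example URLs are illustrative only.
from urllib.parse import urlparse

def flag_messy_urls(urls, max_segments=5):
    for url in urls:
        parsed = urlparse(url)
        segments = [s for s in parsed.path.split("/") if s]
        issues = []
        if parsed.query:
            issues.append("query parameters")
        if any(c.isupper() for c in parsed.path):
            issues.append("uppercase characters")
        if "_" in parsed.path:
            issues.append("underscores")
        if len(segments) > max_segments:
            issues.append(f"more than {max_segments} path segments")
        if issues:
            print(f"{url} -> {', '.join(issues)}")

flag_messy_urls([
    "https://example.com/blog/how-to-do-seo-analysis",
    "https://example.com/Category/Product_Page?id=42&ref=home",
])
```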

3-Correcting Page Titles

Moving on to the “Page Titles” tab, you will see all the page titles on your website along with their pixel width and character length. The Screaming Frog SEO Spider Tool also lets you filter the titles that are problematic. For example:

  • Duplicate Titles: If you see two titles that are the same, correcting them will improve your SEO.
  • Keyword Optimization: Ensuring your page titles contain the most crucial keywords in your field, while avoiding excessive keyword use, is also essential for your SEO.
  • Title Length: Using the character length column, you can separate out titles that are longer than 50-65 characters. Longer titles may get cut off on the SERP, while much shorter ones will not be appealing to your targeted audience (see the sketch after this list).
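For readers who like to sanity-check exports with a script, here is a minimal Python sketch of the same length and duplicate checks. The (url, title) pairs are made-up examples standing in for data exported from the Page Titles tab.

```python
# A minimal sketch: flag duplicate titles and titles outside the 50-65
# character range. The example pages are illustrative only.
from collections import defaultdict

def audit_titles(pages, min_len=50, max_len=65):
    seen = defaultdict(list)
    for url, title in pages:
        seen[title.strip().lower()].append(url)
        if len(title) < min_len:
            print(f"Too short ({len(title)} chars): {url}")
        elif len(title) > max_len:
            print(f"Too long ({len(title)} chars): {url}")
    for title, urls in seen.items():
        if len(urls) > 1:
            print(f"Duplicate title '{title}' on: {', '.join(urls)}")

audit_titles([
    ("https://example.com/", "Example Shop | Affordable Outdoor Gear and Accessories"),
    ("https://example.com/tents", "Example Shop | Affordable Outdoor Gear and Accessories"),
])
```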

4-Analyzing Meta Descriptions

Meta descriptions appear on search engine results pages under the page title. Ensuring that a meta description does not exceed the 160-character limit is crucial. It is also important to check that it does not duplicate other descriptions and that it is optimized with keywords, so the page can rank better.
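As a quick illustration of the 160-character guideline, the hypothetical Python sketch below fetches a page and measures its meta description using only the standard library and the requests package. The URL is an example.

```python
# A minimal sketch: fetch a page and check its meta description length
# against the 160-character guideline.
from html.parser import HTMLParser
import requests

class MetaDescriptionParser(HTMLParser):
    """Collects the content of <meta name="description">."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def check_meta_description(url, max_len=160):
    parser = MetaDescriptionParser()
    parser.feed(requests.get(url, timeout=10).text)
    if not parser.description:
        print(f"Missing meta description: {url}")
    elif len(parser.description) > max_len:
        print(f"Too long ({len(parser.description)} chars): {url}")
    else:
        print(f"OK ({len(parser.description)} chars): {url}")

check_meta_description("https://example.com/")
```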

5-Testing SERP Snippets

The snippet is, of course, one of the most important elements that appear on the SERP and affect how your result performs. The SEO Spider Tool also previews how your snippet would appear on the SERP. This way, you can keep editing your URL, page title, and meta description until they fit your SEO goals.

6-Optimizing Images

Images on your web pages do more than improve visual quality; they also play a significant part in SEO and should be optimized carefully. Since page speed is essential both for mobile users and for search engines, it is best to keep images under 100 kb. Larger images hurt page speed, which can eventually result in lower rankings.

With Screaming Frog, you can see the images that are over 100 kb, that are missing alt text, or that have alt text longer than 100 characters.
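If you want to spot-check image weights yourself, a minimal sketch like the one below reads the Content-Length header of each image URL (when the server provides it) and compares it with the 100 kb guideline. The image URL is a made-up example.

```python
# A minimal sketch: compare image byte sizes against the 100 kb guideline
# using the Content-Length response header, when available.
import requests

def check_image_sizes(image_urls, max_kb=100):
    for url in image_urls:
        response = requests.head(url, allow_redirects=True, timeout=10)
        size = response.headers.get("Content-Length")
        if size is None:
            print(f"Size unknown (no Content-Length header): {url}")
        elif int(size) > max_kb * 1024:
            print(f"{int(size) / 1024:.0f} kb (over {max_kb} kb): {url}")

check_image_sizes(["https://example.com/images/hero-banner.jpg"])
```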

7-Checking Directives

Screaming Frog’s “Directives” tab allows you to check for technical issues. You can filter the web addresses that contain canonical, no canonical, follow, nofollow, index, noindex, and other significant directives. You can also see how robots.txt affects your website and make adjustments.
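To see how robots.txt treats a few specific URLs, you can also use Python’s built-in robots.txt parser; the sketch below is a rough, standalone equivalent of that part of the check, with an example domain and paths.

```python
# A minimal sketch: test how robots.txt treats a handful of paths for a given
# user agent. The site and paths are examples.
from urllib.robotparser import RobotFileParser

def check_robots(site, paths, user_agent="Googlebot"):
    parser = RobotFileParser(f"{site}/robots.txt")
    parser.read()
    for path in paths:
        allowed = parser.can_fetch(user_agent, f"{site}{path}")
        print(f"{'allowed' if allowed else 'blocked'}  {path}")

check_robots("https://example.com", ["/", "/admin/", "/blog/seo-analysis"])
```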

8-Analyzing Crawl Depth

The crawl depth of a URL is the number of clicks it takes a user to get from the homepage to a specific page. Naturally, typical crawl depths will vary with a website’s size.

If the crawl depth reported by the SEO tool is high, it will be harder for those pages to rank. For example, let’s say that after crawling your website, Screaming Frog indicates that an essential product page has a crawl depth of 5. It would be better to change the website’s organization and hierarchy to bring that crawl depth down to 2.
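Conceptually, crawl depth is just a breadth-first walk from the homepage. The toy Python sketch below shows how the depths are counted over a made-up link graph; Screaming Frog does the same thing over your live site.

```python
# A minimal sketch: count crawl depth with a breadth-first walk over a toy
# link graph. The page paths are illustrative only.
from collections import deque

def crawl_depths(link_graph, start="/"):
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for linked in link_graph.get(page, []):
            if linked not in depths:
                depths[linked] = depths[page] + 1
                queue.append(linked)
    return depths

toy_site = {
    "/": ["/category/", "/blog/"],
    "/category/": ["/category/tents/"],
    "/category/tents/": ["/category/tents/ultralight-tent/"],
}
print(crawl_depths(toy_site))
# {'/': 0, '/category/': 1, '/blog/': 1, '/category/tents/': 2,
#  '/category/tents/ultralight-tent/': 3}
```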

9-Generating XML Sitemap

An XML sitemap is an important element for helping search engines understand your website better. In other words, it is a roadmap that guides crawlers through your website and signals which pages matter.

You will find the “Create XML Sitemap” feature in the top navigation bar under “Sitemaps.” You can manually adjust the preferences and modify settings such as the change frequency and priority of individual pages.
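For reference, here is roughly what the generated file contains. The sketch below builds a bare-bones sitemap with Python’s standard library; Screaming Frog produces this for you, so the snippet only illustrates the format, with made-up URLs and values.

```python
# A minimal sketch: write a bare-bones XML sitemap with <loc>, <changefreq>,
# and <priority> entries. The URLs and values are examples.
import xml.etree.ElementTree as ET

def build_sitemap(entries, path="sitemap.xml"):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url, changefreq, priority in entries:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "changefreq").text = changefreq
        ET.SubElement(entry, "priority").text = priority
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

build_sitemap([
    ("https://example.com/", "weekly", "1.0"),
    ("https://example.com/blog/seo-analysis", "monthly", "0.8"),
])
```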

Types of Audits in Screaming Frog SEO Spider Tool

An audit is a subset of SEO analysis: it means you are checking for specific technical issues. These issues act as red flags for Google when it indexes and ranks your website. For this reason, Screaming Frog supports several audits within the application. Below you can find brief guides on how to do each one.

Canonical Audit

The SEO Spider will crawl canonical link elements found in the HTML and HTTP headers and report common errors. (A rough scripted illustration of the basic check follows the steps below.)

  1. Enable ‘Store’ and ‘Crawl’ canonicals via “Configuration>Spider>Crawl.”
  2. Enter the website URL and start the crawling.
  3. View the “Canonicals” tab. Filters ‘contains canonical’, ‘self-referencing’, ‘canonicalized’, ‘missing’, ‘multiple’, and ‘non-indexable canonical’ are available.
  4. View ‘Indexability Status’ of non-indexable canonical URLs via the “URL Info” tab.
  5. Bulk export source URLs, non-indexable canonical URLs, and response codes via “Reports>Canonicals>Non-Indexable Canonicals.”
  6. View chained canonicals and loops via “Reports>Canonicals>Canonical Chains.”
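If you want to understand what the tool is checking, the hypothetical Python sketch below pulls the canonical link element from a single page and compares it with the requested URL; it is only a rough, standalone illustration of the same idea, with an example URL.

```python
# A minimal sketch: read a page's canonical link element and compare it with
# the URL that was requested.
from html.parser import HTMLParser
import requests

class CanonicalParser(HTMLParser):
    """Collects the href of <link rel="canonical">."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")

def check_canonical(url):
    parser = CanonicalParser()
    parser.feed(requests.get(url, timeout=10).text)
    if parser.canonical is None:
        print(f"Missing canonical: {url}")
    elif parser.canonical.rstrip("/") != url.rstrip("/"):
        print(f"Canonicalized elsewhere: {url} -> {parser.canonical}")
    else:
        print(f"Self-referencing canonical: {url}")

check_canonical("https://example.com/blog/seo-analysis")
```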

Hreflang Audit

The SEO Spider will crawl rel=”alternate” hreflang annotations in the HTML, in HTTP headers, or in XML sitemaps. (A small sitemap-parsing sketch follows the steps below.)

  1. Enable ‘Crawl’ and ‘Store’ hreflang via “Config>Spider>Crawl.”
  2. Select ‘Crawl Linked XML Sitemaps’ to crawl hreflang in XML sitemaps via “Config>Spider>Crawl.”
  3. Crawl the website URL.
  4. Populate hreflang filters on the “Hreflang” tab via “Crawl Analysis>Start.”
  5. View errors via the “URL Info” tab.
  6. Bulk export source URLs and errors via “Reports>Hreflang.”
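As a side note, if your hreflang annotations live in an XML sitemap, you can list them with a short script as well. The sketch below is a rough illustration with an example sitemap URL; it simply prints the alternate-language URLs declared for each entry so you can spot missing or one-way annotations.

```python
# A minimal sketch: list hreflang alternates declared in an XML sitemap.
# The sitemap URL is an example.
import xml.etree.ElementTree as ET
import requests

NS = {
    "sm": "http://www.sitemaps.org/schemas/sitemap/0.9",
    "xhtml": "http://www.w3.org/1999/xhtml",
}

def list_hreflang(sitemap_url):
    root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
    for url in root.findall("sm:url", NS):
        loc = url.findtext("sm:loc", namespaces=NS)
        alternates = {
            link.get("hreflang"): link.get("href")
            for link in url.findall("xhtml:link", NS)
            if link.get("rel") == "alternate"
        }
        print(loc, alternates or "no hreflang annotations")

list_hreflang("https://example.com/sitemap.xml")
```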

Rel=”next” and Rel=”prev” Audit

The SEO Spider will crawl rel=”next” and rel=”prev” pagination attributes.

  1. Via “Configuration>Spider,” select ‘Crawl’ and ‘Store’ pagination.
  2. Start crawling.
  3. Populate pagination filters on the “Pagination” tab via “Crawl Analysis>Start.”
  4. Bulk export via “Reports>Pagination.”

XML Sitemaps Audit

  1. Enable ‘Crawl Linked XML Sitemaps’ via “Config>Spider>Crawl.”
  2. Start crawling.
  3. Populate sitemap filters on the “Sitemaps” tab via “Crawl Analysis>Start.”
  4. Click “Inlinks” to view the XML sitemap source.
  5. Bulk export the report.

Backlinks Audit

  1. Configure XPath custom extraction via “Config>Custom>Extraction.”
  2. Then, switch to list mode via “Mode>List.”
  3. View the URLs that are blocked by robots.txt via “Config>Spider>Basic.”
  4. Upload the backlinks and start the crawling.
  5. You will be able to review the report and then bulk export.

To Sum Up

With the Screaming Frog SEO Spider Tool, you can analyze a website’s most critical elements. It is an exceptional tool for optimizing a website and boosting its performance on search results pages, and it belongs in the toolkit of every web designer and SEO specialist.

Frequently Asked Questions About Screaming Frog

Why can’t Screaming Frog crawl my website?

There might be a number of reasons. However, the most common issues that prevent a website from being crawled are status codes, robots.txt rules, JavaScript, and the “nofollow” attribute.

What does the 503 Service Unavailable status code mean?

The 503 Service Unavailable status code appears when a web server refuses to serve the SEO Spider’s request, for example because it is overloaded or down for maintenance.

How is response time calculated?

It is calculated from the time it takes to issue an HTTP request and receive the full HTTP response back from the server.
