Search engines evaluate websites against hundreds of criteria, and they perform most of these evaluations through software. The set of practices that addresses these software-level criteria is called technical SEO. Broken down into a checklist, the technical elements include:
- Canonical link structures of websites
- URL structures of websites
- Robots directives
The items listed above are software issues. The content and design criteria, by contrast, include:
- Keyword densities
- Heading tags
- Alt tags
The most important ranking factors fall within the scope of technical SEO, so do not expect success from a website with unresolved technical SEO problems. You may perform a technical SEO checkup of your website yourself using the checklist we have prepared, and many tools can help you carry out a smoother, more comprehensive analysis. Leaving your website's technical SEO issues unsolved may lead to many problems, the first of which is a drop in your website's search engine rankings.
The second is negative engagement metrics. Unresolved technical SEO issues may also cause crawl errors, which may even make your website disappear from search engine results pages altogether. To prevent all this, you should create an action plan, and it should be designed so that it wastes neither the SEO specialist's nor the developer's resources. Such a plan will help improve your website's ranking.
Search engines' ranking algorithms change with every new update; for Google, this figure is over 500 updates a year. Technical SEO work therefore has to renew itself constantly to keep up with this pace. The details change along the way, but the main goal is always the same: improving SERP rankings through website and server optimizations.
What is Technical SEO?
Technical SEO refers to website and server optimizations built into a site's code or architecture. The term covers a wide range of topics, from meta tags to website security. Let's take a detailed look at what this scope includes.
- CSS Optimization
- Rich Snippets
- SSL
- Robots.txt
- Structured Data
- Meta Tag
- Crawl Budget
- LSI
- Breadcrumb
- Canonical URL
- Title and Description Optimization
- URL Optimization
- SMF
- HTML5
- Htaccess
- Sitemap
Our technical SEO checklist template;
https://docs.google.com/spreadsheets/d/14wI-fToCjl9q3ifoKZLghRGD7z7VTMGuaaIkuRQ3Lmk/copy
Optimization of CSS Resources
In SEO, your website's speed is essential to providing a successful user experience, so you should invest real effort in it while optimizing your website. Today, user experience is one of the most critical factors in search engine algorithms. That is why you should work on improving your website's loading speed; in this context, concentrate on the render-blocking resources flagged in Google's speed tests.
Related articles about optimization of CSS resources;
How to Load CSS Asynchronously
Minimizing CSS, HTML, and JavaScript Files
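As an illustration, one common pattern for loading non-critical CSS without blocking rendering looks like this (the stylesheet filename is a placeholder):

```html
<!-- Preload the stylesheet, then apply it once it has downloaded -->
<link rel="preload" href="styles.css" as="style"
      onload="this.onload=null;this.rel='stylesheet'">
<!-- Fallback for visitors with JavaScript disabled -->
<noscript><link rel="stylesheet" href="styles.css"></noscript>
```

Critical CSS needed for the first paint can stay inline in the `<head>`, so the page can render before the full stylesheet arrives.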
Rich Snippets
Rich snippets are regular Google search results displayed with additional data. This extra data is pulled from structured data contained in the page's HTML.
SSL
SSL stands for Secure Sockets Layer (succeeded today by TLS). It establishes an encrypted connection between the website and the user, which makes it especially important for e-commerce sites.
Robots.txt
Robots.txt is a text file located in the root directory of your website. It instructs search engine crawlers on which pages they may and may not crawl.
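A minimal robots.txt might look like this (the domain and paths are placeholders for illustration):

```
# https://example.com/robots.txt
User-agent: *        # applies to all crawlers
Disallow: /admin/    # do not crawl the admin area
Allow: /             # everything else may be crawled

Sitemap: https://example.com/sitemap.xml
```

Listing the sitemap here helps crawlers discover it on their own.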
Structured Data
Structured data is the term that attracts the most attention from anyone interested in SEO. It is a standardized format used to provide information about a web page and to classify its content, giving Google clear clues about the page's meaning.
Meta Tag
Meta tags are snippets of information attached to a website's pages; the best known is the meta description. They are used to inform users, and search engines, about the page's content.
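For illustration, the most common meta tags sit in a page's `<head>`; the titles and text below are placeholders:

```html
<head>
  <title>Technical SEO Checklist | Example Site</title>
  <meta name="description"
        content="A step-by-step technical SEO checklist covering crawling, indexing, and site speed.">
  <meta name="robots" content="index, follow">
</head>
```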
Crawl Budget
Crawl budget refers to how many times Google's bots visit your existing pages each day. It usually hovers around an average figure, but it is dynamic and may change from day to day under the influence of various factors.
LSI
LSI stands for Latent Semantic Indexing, one of the ways Google gets smarter and more sophisticated day by day: it relates terms and concepts to understand content beyond exact keyword matches. By covering semantically related terms, you may increase your website's traffic.
Breadcrumb
Breadcrumbs help users and search engine bots understand the hierarchy of your website and provide links for moving around within that structure. They allow users to browse a website built from subcategories much faster.
Related article about breadcrumb;
What Is Structured Data? (+How to Benefit From It?)
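Breadcrumbs can also be described to search engines with structured data. Here is a sketch using schema.org's BreadcrumbList type; the names and URLs are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Home",
      "item": "https://example.com/" },
    { "@type": "ListItem", "position": 2, "name": "Blog",
      "item": "https://example.com/blog/" },
    { "@type": "ListItem", "position": 3, "name": "Technical SEO" }
  ]
}
</script>
```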
Canonical URL
As websites grow, duplicate page content becomes very difficult to prevent; some pages end up as near-identical copies. The canonical URL determines which link search engine spiders treat as the original when the same content lives at two different URLs. With the rel=canonical tag, your users will not be affected by this duplication.
Related article about Canonical URL;
What Are the Advantages of Canonical URL?
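The tag itself is a single line in the `<head>` of each duplicate version of a page, pointing at the version you want indexed (the URL below is a placeholder):

```html
<link rel="canonical" href="https://example.com/technical-seo-checklist/">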
Title and Description Optimization
Title and description optimization is of great importance for keyword optimization. The title is the content visitors first encounter in Google search results, and almost all users enter your website by clicking on these blue titles. This title content should therefore not be spammy; use clear, compelling language instead.
Related article about Title and Description;
URL Optimization
Keep your URLs as short and natural as possible so that search engine algorithms can interpret them without ambiguity. URL structures may vary across your site; where they do, simplify new URLs rather than changing existing ones.
Related article; https://www.dopinger.com/blog/how-to-optimize-title-description-and-url-for-seo
SMF
SMF stands for Simple Machines Forum, free software that lets you set up a professional-quality forum in a few minutes.
HTML5 and SEO
HTML5 has a shorter, more concise coding structure with far less code clutter, and it offers many new semantic tags that are useful for your SEO work.
Related article; https://www.dopinger.com/blog/what-is-html
Htaccess
.htaccess (short for "hypertext access") is a file that allows configuration changes on the webspace used by most web servers, most notably Apache. With this file you may perform many editing, authorization, and restriction operations on your website. It is a very useful file with hundreds of different directives, each requiring the correct syntax to work as intended.
Sitemap
Everyone uses a map or navigation to be sure of their destination when they set out on a journey. The same is true for search engine bots: having a sitemap file on your website helps Google's bots crawl and index your web pages more easily.
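A minimal XML sitemap, with placeholder URLs and dates, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/technical-seo-checklist/</loc>
  </url>
</urlset>
```

Reference the sitemap from robots.txt or submit it in Google Search Console so crawlers can find it.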
Detailed Technical SEO Checklist
Now that we've explained the scope and terms of technical SEO, it's time to sort out our checklist!
Technical SEO Check List;
- Detect your mistakes
- Website Architecture
- Duplicate Content
- Schema Markup
- Website Speed
- Using HTTPS
- Htaccess 301 Redirect
- Crawling
- Cache
- Indexing
- Link Building
- HTTP Response Codes
- Mobile-First Indexing
Detect your mistakes
The quickest way to start your technical SEO checklist is to spot your mistakes; this gives you a roadmap for correcting them. Technical errors can prevent crawlers from indexing your pages properly, and in the worst case the search engine won't index a page at all. Fixing these errors is a necessary first step in any technical SEO checklist.
Related article; How to Perform an SEO Audit
Website Architecture
Website architecture shows how information is structured on a website: how web pages are organized into categories, for example, and how users navigate between sections. It affects how both users and search engines experience your site.
Websites with bad architecture tend to have many dead-end pages and unwieldy subdomains or subdirectories. A website with good architecture, on the other hand, does not waste crawling time, and search engines rank such websites faster on results pages. Here is what you can do to give your website a good architecture.
First, make sure your website has an XML sitemap. Then organize your URLs in a logical flow, from domain to category to subcategory, so that new URLs fit the same architecture. Finally, optimize your URL structures for search: move the main keywords closer to the root domain, and do not let URLs grow longer than 60 characters.
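The domain-to-category-to-subcategory flow described above might look like this, with placeholder paths:

```
https://example.com/                       root domain
https://example.com/shoes/                 category
https://example.com/shoes/running-shoes/   subcategory, still under 60 characters
```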
Duplicate Content
Duplicate content is content that appears on multiple pages within the same domain, or across domains, and it matters for any technical SEO checklist. Duplicate content is not malicious, so Google does not penalize it, but you still need to fix it for several reasons. Search engines need to know which version of a page is the original, or canonical, page; otherwise they will not know which result to show. As a result, your pages may not achieve the rankings they deserve, or may rank lower than their value warrants.
Related article; How Does Duplicate Content Affect SEO?
Schema Markup
Schema markup, also called structured data or microdata, has no direct effect on search rankings, but it helps search engines determine the content of a page. Users search Google with specific purposes in mind, and schema markup enables search engines to serve both users and website owners better. When you start using microdata, search engines get to know your site better, understand your content better, and provide more relevant results to users. Schema markup can therefore earn your website much more engagement, which may lead to better rankings on Google and other search engines.
Use Dopinger’s brand new tool: Schema Markup Generator for your content!
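As a sketch, schema markup is usually added as a JSON-LD block in the page's HTML; the headline, author, and date below are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO Checklist",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2021-06-01"
}
</script>
```

Google's Rich Results Test can validate markup like this before you deploy it.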
Website Speed
Let's share the result of online research on this subject: if a webpage takes more than three seconds to load, almost half of users leave the website. When users abandon your website like this, it sends negative signals to search engines about its quality. Google confirmed this in 2010: the speed of your website is a signal that affects ranking. There are several improvements you can make to increase your website's speed:
- Minify HTML, JavaScript, and CSS files; combining them may also help.
- Eliminate unnecessary characters such as line breaks and redundant whitespace.
- Compress large images with tools like HandBrake and TinyPNG so you can serve them in highly compressed formats.
- Reduce plugins, scripts, and redirect chains. The fewer files and requests your website makes, the faster it opens, so keep page weight and request counts to a minimum.
Using HTTPS
Servers originally ran on plain Hypertext Transfer Protocol (HTTP), a fast way to send data but not a safe one. It is better to use HTTPS, the secure version of the protocol, which encrypts data and transports it securely across the web using SSL/TLS. Google announced in 2014 that it recognizes HTTPS as a ranking factor, and it is a good way to protect users' data.
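Once an SSL certificate is installed, a common way to force HTTPS on an Apache server is a .htaccess rule like this sketch (it assumes mod_rewrite is enabled):

```apache
RewriteEngine On
# If the request did not arrive over HTTPS...
RewriteCond %{HTTPS} off
# ...permanently redirect it to the HTTPS version of the same URL
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```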
Htaccess 301 Redirect
The .htaccess file is a control file that allows Apache server configuration changes. It is located in the root directory of your website; if it doesn't exist, you may create it yourself. For a 301 redirect, you first need to locate the .htaccess file. To do this, follow the steps below.
- Login to your FTP.
- Go to the root directory.
- Open the file with a plain-text editor such as Notepad. If it does not exist in the root directory, create it.
- Add your 301 redirect commands to redirect a particular old page to the new page.
- Add one more blank line at the end of the file. Your server reads this file line by line, so a final blank line signals that the file is complete.
That’s all you have to do.
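A 301 redirect command in .htaccess can be as simple as the following sketch; the paths are placeholders:

```apache
# Send visitors and bots from the old page to its new location
Redirect 301 /old-page.html https://example.com/new-page/
```

mod_rewrite rules can express the same redirect when you need pattern matching across many URLs at once.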
Crawling
Crawling is the process by which search engine spiders browse the pages of your website at certain intervals. They do this by following the links your pages give each other within your website, as well as links pointing to your pages from outside it. These bots cannot spend time exploring every single page on the internet.
At this point, we may assume that they proceed based on a quality value they assign to each website. In the SEO world, this value is called the crawl budget. The crawl budget has no specific numerical value, but with this technical SEO checklist you may use it efficiently, enabling spiders to discover, crawl, and index your web pages more frequently.
Using the Browser Caching Feature
Browser caching increases the opening speed of your website by letting browsers avoid loading images and other resources over and over again. After a user's first visit, the browser pulls these resources from its cache for a specified period, so the user sees the site much faster. The same is true for bots: re-downloading static resources that rarely change, such as logos and favicons, costs them extra waiting time on every visit.
You should let bots that crawl your pages load all of your page content as quickly as possible. You may achieve this by activating caching: set an expiration date or a maximum age in the HTTP headers of static resources. These settings tell the browser to load copies already downloaded to the local disk instead of fetching the resources over the network.
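On an Apache server, expiration times for static resources can be set in .htaccess with mod_expires; the lifetimes below are illustrative choices, not requirements:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change, so cache them for a long time
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
  # Stylesheets change more often, so use a shorter lifetime
  ExpiresByType text/css   "access plus 1 month"
</IfModule>
```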
Indexing
Spiders associate your web pages with your website's keywords and then add them to their index. This process is called indexing, and it is one of the first terms that come to mind in any technical SEO checklist. Here is how it plays out. A user runs a search query; the search engine does not index pages live at that moment. Instead, it looks up the groups of pages it previously associated with the query and indexed, then sorts the relevant results among them. Spiders consider many algorithmic factors when ranking these page groups.
They also weigh a large number of ranking signals; this process is called ranking. The top 10 organic results on the first page are the most important, so all your web pages should be crawlable and indexable.
Link Building
Link building is the process of getting links from other websites to your website. Links come in two types: off-site links, which point to your site from elsewhere, and on-site links, which connect pages within your website. A hyperlink is how users navigate between pages on the internet, and search engines use links to crawl web pages: search engine bots scan your pages one by one and follow the links in them. There are many link-building techniques, but for today's algorithms the quality of the connections you create is critical, and the wrong moves may hurt your website. Here are a few techniques for earning backlinks.
Create fun, high-quality, original, or unusual content that people want to share. Before creating it, determine who your target audience is; once you create valuable content and know where your audience is, earning links becomes easier. If your content is relevant to your users, they may link to your website. Alongside these external links, create internal links, hyperlinks between your own pages, which guide both Googlebot and users.
HTTP Response Codes
An HTTP response code, or status code, is the server's response to a request made by the browser. These codes are one of the clearest ways to see what is happening between the browser and the server, and search engine spiders read them to learn the status of each page. The exchange begins when a browser makes its first request to the server to load a website. A few of these codes matter most for technical SEO: the 2xx, 3xx, 4xx, and 5xx response classes.
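In summary, the four classes break down like this:

```
2xx  Success        e.g. 200 OK: the request succeeded and the page loaded
3xx  Redirection    e.g. 301 Moved Permanently: the page lives at a new URL
4xx  Client error   e.g. 404 Not Found: the requested page does not exist
5xx  Server error   e.g. 503 Service Unavailable: the server cannot respond
```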
Related article;
How to Fix HTTP Error 503
Mobile-First Indexing
Before this practice, desktop pages determined both mobile and desktop search results. Factors such as mobile click-through rates and mobile site speed carried less weight, which caused the same page to rank a few positions apart in desktop and mobile results. In 2018, Google announced that it had switched to mobile-first indexing, meaning the content served by mobile pages now determines the position of desktop pages as well. Google states that it will always maintain a single indexing method; in other words, there will not be a separate index alongside the main one.
Today, Google still indexes many sites desktop-first, but over time these will be replaced by their mobile versions. Google Search Console sends messages to website owners when their sites have switched to mobile-first indexing, and most websites have already received this message.
In Google Search Console, you may see crawl rates from the Smartphone Googlebot increase over time. With mobile-first indexing, the mobile version of a page appears in the search results, so you should be prepared for the update to avoid losing traffic and positions. Since page content is the basis of mobile-first indexing, make sure that all the content you offer in the desktop version is also available on your mobile pages. This content includes:
- Images
- Category Descriptions
- CSS and JS Resources
- Structured Data Markings
- Breadcrumb
- In-Site Link Setup
- All data types presented on the webpage
You need to make sure you are presenting the full version of your content; you may use Google's mobile-friendly test for this.
Mobile-First Indexing Update Checklist
Time needed: 1 hour
Here are the elements of a mobile-first indexing update checklist:
- Mobile Compatibility
All of your website’s category links should also be on mobile pages.
- Copy Your Links
You must include all links within your website in the mobile version.
- Synchronize Content
The number of products and content shown on the listing pages should be the same on mobile and desktop.
- Place Category Descriptions
You should place the category and product descriptions on mobile pages.
- Pay Attention to Structured Data Markups
Mobile user-agents should be able to see all of the structured data markups on the desktop.
- Put Breadcrumb Links
You should also include breadcrumb links in the mobile version.
- Enable User Engagement
You must include user content such as comment fields and reviews in the mobile version.
- Be Available for Crawlers
CSS and JS resources should be open to crawling by mobile user-agents.
- Include Metadata
You must include metadata such as an open graph, Twitter cards, meta robots in the mobile version.
- Place Annotations
You must place annotations such as canonical, prev-next, and hreflang on mobile pages.
- Allow Googlebot Mobile
Googlebot Mobile should be able to access the sitemap of your website.
- Optimize Speed
You should optimize your mobile website speed performance.
Many tools automatically test the checklist we have listed above. You may access many of them free of charge on the internet, or get premium versions to suit your needs.
Completing the Technical SEO Work
Applying the checklist in this article isn't enough on its own: you also need to track and measure the impact of these adjustments on your website over time. That way, you can see which technical SEO factors are helping your website rank and which are hurting it most. Based on this data, you may shape your future technical SEO checklist strategies.
You will observe the positive effects of your SEO optimization work in your website's organic traffic. One tool offers great help in completing your technical SEO work: Google Analytics. With it, you may measure the results of your technical SEO checklist, as mentioned above, across a wide range of traffic metrics. Let's talk about how to use it.
Google Analytics also lets you compare traffic measurements across specific time intervals, and it is a free service from Google. You need reliable data at hand to create a healthy technical SEO strategy, and this tool will be your biggest assistant. Now that we have basic knowledge of Google Analytics, let's share the Google Analytics checklist we prepared for you, and then how to evaluate Google Analytics data.
Google Analytics Checklist and How to Evaluate Google Analytics Data
Let's first share the Google Analytics checklist with you. The best way to learn this tool is to perform the necessary checks; the following ten will be enough to start:
- Have you added additional platforms like Google Ads?
- Did you activate the demographics and interests reports?
- Have you enabled website searches?
- Can users find what they are looking for when they search the website?
- Have you created alerts that require quick action? These are notification settings.
- What is your bounce rate, the rate at which visitors leave your site?
- Have you blocked your own IP address and websites that send spam traffic to your website with the option of filters?
- Has there been a serious traffic change on your website in the last week?
- Which keywords and terms are more successful?
- What is the most successful content on your site? What is the reason for this?
As promised, we have shared our checklist for this amazing tool. Now let's look at how to evaluate the important data it provides. Google Analytics gives instant access to a great deal of valuable data, including:
- How do users reach your website?
- What kind of interaction did users have with your website?
- How many people have visited any page on your website?
- How much time did users spend on this page?
- Where do the people who visit your website live?
- What are the top-performing keywords?
Technical SEO Checklist in Short
SEO work is very important for websites, and technical SEO, which covers the technical side of that work, has an especially wide scope. In this article, we tried to explain the terms within this scope and to show you how to evaluate them. As we mentioned, evaluating this data is the most important part of technical SEO work, and several tools and apps can help you do it.
We devoted the largest share of attention to Google Analytics, the most important of these tools, and discussed how important mobile compatibility is for technical SEO. We hope this article inspires your technical SEO work. If you think you need professional help with this checklist, consider getting help from an SEO consulting company.
Now that you know exactly what to do, here are two detailed guides that show you how to perform on-page and off-page SEO. Take a look!