Who Is a Technical SEO Specialist? Job Description and Duties
1.7 billion websites already exist, yet only a few of them show up on the first page of Google. One of the biggest barriers to improving a site's ranking is ignoring technical SEO. The technical SEO specialist is the person in charge of this task and helps you promote your site.
SEO and Its Components
SEO is commonly divided into these parts:
- Internal or on-page
- External or off-page
- Technical
Internal (on-page) SEO covers activities that improve the site's content and structure. External (off-page) SEO, as its name implies, relates to signals from outside the site, such as backlinks. Finally, technical SEO is the foundation of both internal and external SEO and includes all the technical aspects of the site.
Learn more: Four Best WordPress SEO plugins
What Is Technical SEO?
Observing SEO best practices in content, although crucial, does not cover the whole SEO process. To optimize a site for search engines, you need more than this. If you follow technical SEO strategies along with on-page and off-page SEO, you can achieve an excellent ranking in Google results.
Technical SEO is the optimization of the website and server that helps the crawlers of search engines such as Google and Bing crawl and index the site. Your website should meet the technical requirements of modern search engines to rank better. As mentioned, this part of SEO has nothing to do with content; it deals with the website's infrastructure, such as code optimization, building a sitemap, and so on. That is why it is known as "technical" SEO.
The Primary Purpose of Technical SEO
The primary purposes of the demanding technical SEO process can be summarized as follows:
- Complete indexing of the site
- Helping search engine robots to understand the structure of the site fully
- Facilitating users’ access to what they need
- Helping search engine algorithms to rank site pages
What Does a Technical SEO Expert Do?
What technical SEO experts do is beyond what is summarized in this article. However, their most important duties are as follows:
Changing URLs With 301 Redirects:
In short, it is sometimes necessary to direct visitors from one page to another. To do this, technical SEO specialists use redirects, which come in several types. Among them, they use the 301 (permanent) redirect to move from an address with www to an address without www or vice versa, as well as to move from an old address to a new one (for example, from soject.com to sooject.com).
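As a minimal sketch, a www-to-non-www 301 redirect on an Apache server might look like this in an .htaccess file (example.com is a placeholder domain):

```apacheconf
# Hypothetical sketch: permanently redirect www.example.com to example.com.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

The R=301 flag marks the redirect as permanent, which tells search engines to transfer the old address's ranking signals to the new one.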
Creating and Configuring robots.txt File:
The robots.txt file controls the access of search engine robots such as Google's to the site content. This file tells bots not to access the blocked host paths. As soon as search engines enter a site, they check the robots.txt file to see which directories they may access and then begin crawling the site.
The technical SEO expert writes and modifies the robots.txt commands for each search engine separately. For example, only the Yandex search engine supports the "Host" directive, while Google treats it as invalid.
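As a sketch, a robots.txt with per-bot sections and a Yandex-only Host directive might look like this (the domain and paths are placeholders):

```txt
# Hypothetical robots.txt sketch with separate rules per search engine.
User-agent: Googlebot
Disallow: /private/

User-agent: Yandex
Disallow: /private/
Host: example.com

User-agent: *
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```

Each User-agent block applies only to the named bot; the final block with `*` covers everyone else.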
Checking the Title and Meta Description of Pages:
Each page of the website should have exactly one title tag and one meta description tag that identify its content.
To check that title tags and meta descriptions are unique, SEO experts typically use a website audit tool, which displays all errors page by page.
In this article, we are not going to discuss how to change and edit these tags on web pages. We only mention that this is one of the duties of a technical SEO expert.
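For reference, both tags live in the page's head; this is a hypothetical sketch with made-up content:

```html
<!-- Hypothetical sketch: one unique title and one meta description per page. -->
<head>
  <title>Blue Widgets – Example Store</title>
  <meta name="description" content="Shop durable blue widgets with free shipping.">
</head>
```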
Creating Friendly URLs for Site Pages:
The technical SEO expert examines all the pages of the site for proper URLs and corrects the problematic ones. The appropriate URL must be selected according to the contents of the page.
Checking Website Performance in the Face of Error 404:
In general, to display a web page, the browser requests various elements from the server and renders them for the user. But what happens if the browser asks the server for a file that is not available? A 404, or Not Found, error occurs.
In fact, a 404 error is generally displayed when the page you are looking for is not found on the server.
Let’s take a closer look. 404 is a status code in the HTTP protocol and belongs to the 4xx class of client errors reported in Search Console.
The appropriate message for this type of error is “Error 404: page not found,” and it should be displayed for incorrect URLs in all sections of the website. The 404 error page should be designed like any other part of the website, and it should not redirect the user elsewhere on the site. It should contain important site links and a search form.
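As a sketch, serving a custom 404 page could be configured in nginx like this (assuming the designed error page is saved as /404.html):

```nginx
# Hypothetical nginx sketch: show a custom 404 page for missing URLs.
error_page 404 /404.html;
location = /404.html {
    internal;  # the error page cannot be requested directly
}
```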
Checking Server Response for All Web Pages:
All pages that are accessible via links must return a success response code (200 OK).
Accordingly, when a page address is changed with a 301 redirect, it is necessary to update all internal links to point to the destination page so that the user reaches the final destination as quickly as possible. The technical SEO expert makes sure the server responds correctly for all pages of the website.
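As a self-contained sketch, this check can be scripted with Python's standard library; a throwaway local server stands in for the real site here:

```python
# Hypothetical sketch: verify that a page returns HTTP 200 (redirects followed).
import http.server
import threading
import urllib.request

def check_status(url: str) -> int:
    """Return the final HTTP status code for a URL."""
    with urllib.request.urlopen(url) as resp:
        return resp.status

class OKHandler(http.server.BaseHTTPRequestHandler):
    """Minimal local server that answers 200 OK for any path."""
    def do_GET(self):
        body = b"ok"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
    def log_message(self, *args):  # silence request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), OKHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
status = check_status(f"http://127.0.0.1:{port}/any-page")
server.shutdown()
print(status)  # 200
```

In practice the same `check_status` helper would be run against a list of the site's real URLs.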
Checking the Pages Loading Time:
Recommended values are up to 0.7 seconds (700 ms) for downloading the source document, up to 0.2 seconds (200 ms) for the server response, and up to 120 KB for the source code size. Google does not favor websites that take more than 3 seconds to load. Therefore, increasing the site's loading speed is one of the most important things the technical SEO expert does.
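The thresholds above can be turned into a simple automated budget check; this is a hypothetical sketch, with the metric names invented for illustration:

```python
# Hypothetical sketch: flag measured page metrics that exceed the budgets above.
THRESHOLDS = {
    "server_response_s": 0.2,    # server response: up to 200 ms
    "document_download_s": 0.7,  # source document download: up to 700 ms
    "source_kb": 120,            # source code size: up to 120 KB
}

def over_budget(measured: dict) -> list:
    """Return the names of the metrics that exceed their budget."""
    return [name for name, value in measured.items() if value > THRESHOLDS[name]]

over = over_budget({
    "server_response_s": 0.35,   # too slow
    "document_download_s": 0.6,  # fine
    "source_kb": 95,             # fine
})
print(over)  # ['server_response_s']
```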
The Uniqueness of the H1 Tag for All Web Pages
The content of this tag should indicate the page's subject, and there should be only one on each page of the site.
If you use WordPress, know that the text you enter as the title of a post automatically gets the H1 tag, so you do not need to add another H1 tag in the content text. The technical SEO expert checks all H1 tags to make sure they are unique.
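As a minimal sketch, the one-H1-per-page check can be automated with Python's standard HTML parser (the sample HTML strings are made up):

```python
# Hypothetical sketch: count <h1> tags in a page's HTML with the stdlib parser.
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.count = 0
    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.count += 1

def count_h1(html: str) -> int:
    """Return how many <h1> elements appear in the given HTML."""
    parser = H1Counter()
    parser.feed(html)
    return parser.count

print(count_h1("<h1>Title</h1><h2>Sub</h2>"))  # 1 (valid page)
print(count_h1("<h1>A</h1><h1>B</h1>"))        # 2 (needs fixing)
```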
Making Sure That H1 to H6 Tags Are Not Used as Site Design Elements
These tags should only be used in the content.
Statistical Reports on Server Uptime
The SEO expert constantly reviews these reports to make sure uptime is 99.85% or higher. The UptimeRobot website is one of the tools that make this possible.
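To put the 99.85% figure in perspective, a quick back-of-the-envelope calculation shows how much downtime per year that target actually allows:

```python
# Downtime allowed per year by a 99.85% uptime target.
uptime_target = 0.9985
hours_per_year = 365 * 24  # 8760
allowed_downtime_hours = (1 - uptime_target) * hours_per_year
print(round(allowed_downtime_hours, 2))  # roughly 13.14 hours per year
```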
Creating a Favicon or Favorites Icon
A favicon (short for "favorites icon") is the small icon you see next to a website's address in the browser's address bar or tab. A unique favicon catches visitors' attention. The SEO expert uploads this icon to the root path of the website host.
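As a sketch, once favicon.ico sits at the host's root, pages can reference it with a single line in the head (the path is the conventional default):

```html
<!-- Hypothetical sketch: reference a favicon placed at the site root. -->
<link rel="icon" href="/favicon.ico" type="image/x-icon">
```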
Hiding Banned Indexed Links With Ajax:
The SEO expert uses Ajax to hide links to pages that are blocked from indexing in the robots.txt file. The site code should contain no plain "a href=..." links to such non-indexed pages, and the Ajax scripts themselves should also be prevented from being indexed.
Combining All JS and CSS Code Into a Single File:
It is advisable to combine all the JS and CSS files into a single file each. The technical SEO expert also deletes comments in the code, which speeds up the retrieval and interpretation of the code by Google bots and browsers. This step is necessary when there are more than eight JS and CSS files; in any case, comments longer than three lines should be deleted.
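As a hypothetical sketch (file names are made up), the bundling and comment-stripping step could look like this, roughly what a build tool or CMS plugin does:

```python
# Hypothetical sketch: concatenate CSS files into one bundle, removing comments.
import re
import tempfile
from pathlib import Path

def bundle_css(paths, out):
    """Join the given CSS files and strip /* ... */ comment blocks."""
    css = "\n".join(p.read_text() for p in paths)
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.DOTALL)  # drop comments
    out.write_text(css)

# Demo with throwaway files.
tmp = Path(tempfile.mkdtemp())
(tmp / "reset.css").write_text("/* reset */\nbody{margin:0}")
(tmp / "type.css").write_text("h1{font-size:2rem}")
bundle_css([tmp / "reset.css", tmp / "type.css"], tmp / "bundle.css")
bundled = (tmp / "bundle.css").read_text()
print(bundled)
```

Real sites would typically also minify the result; this sketch only shows the concatenation and comment removal described above.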
Checking the Site HTML Elements:
The technical SEO expert checks that all HTML elements used on the site are properly closed. For example, if a tr tag is used, it must be closed with a matching /tr. This is the least you can do to keep your code valid.
Checking the Pages in Different Browsers
A technical SEO specialist checks that all web pages are displayed correctly in popular browsers. The most important ones include Google Chrome, Android Browser, Mobile Safari, Firefox, Yandex Browser, and Opera.
301 Redirect Configuration
These are among an SEO specialist's duties:
- Properly configuring 301 redirects from pages such as “index.php,” “index.html,” “default.html” to other pages.
- Configuring 301 redirects from pages without a trailing slash ("/") to pages with a slash, or vice versa, depending on the CMS used and the server settings.
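As a sketch, the trailing-slash rule on an Apache server might look like this in .htaccess (assuming the slashed form is the canonical one; adapt to the CMS's convention):

```apacheconf
# Hypothetical sketch: 301-redirect URLs without a trailing slash to the
# slashed version, skipping real files such as CSS and images.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```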
Switching From HTTP To HTTPS
The SEO expert makes sure that a 301 redirect is used to move from the HTTP version of the site to the HTTPS version. Search engines currently check the HTTPS version of sites for indexing. If you have not used the appropriate redirect, Google may treat the other version of your site as duplicate content.
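As a sketch, forcing HTTPS with a single 301 redirect might look like this in an Apache .htaccess file:

```apacheconf
# Hypothetical sketch: send every HTTP request to its HTTPS equivalent.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```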
Avoid Indexing Login Pages
The SEO expert prevents login pages from being indexed using the robots.txt file. Depending on the CMS, these pages can be: "/bitrix", "/login", "/admin", "/administrator", "/wp-admin".
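As a sketch, the robots.txt rules for those login paths might look like this (keep only the paths that exist in the CMS in use):

```txt
# Hypothetical robots.txt sketch: keep login and admin paths out of the index.
User-agent: *
Disallow: /bitrix
Disallow: /login
Disallow: /admin
Disallow: /administrator
Disallow: /wp-admin
```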
Building The Site Map
Creating a sitemap that includes the characteristics of all pages is one of the tasks of the technical SEO expert.
If the number of pages is more than 50,000, a sitemap is essential, and it must be split into multiple files, since a single sitemap file is limited to 50,000 URLs. It is better to submit the sitemap directly in Google Search Console rather than only pointing search engines to it through the robots.txt file.
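A minimal sitemap.xml sketch, with placeholder URLs and date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap sketch: one <url> entry per page. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2021-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```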
Opening External Links on a New Page
It’s a good idea to open all external links in new tabs by adding the target="_blank" attribute to the link tags. However, if you do not want external links to drain your website's authority, the SEO expert can use Ajax to hide them from Google bots.
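As a sketch, such a link looks like this (the rel attribute is a common safeguard so the new tab cannot script the opening page):

```html
<!-- Hypothetical sketch: external link opening in a new tab. -->
<a href="https://example.org" target="_blank" rel="noopener noreferrer">Example</a>
```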
The Key Pages of the Website
The SEO expert opens the site's key pages and examines them in terms of code completeness, caching, and character encoding.
Preventing Some Pages From Being Indexed
The "cgi-bin", "wp-includes", "cache", and "backup" folders should be prevented from being indexed via the robots.txt file.
Preventing Unused Files From Being Indexed
The SEO specialist must use the robots.txt file to prevent unused files (such as SWF files) or blank DOC and PDF files from being indexed. However, if DOC and PDF files are valuable and contain essential information, they should not be blocked.
Additional SEO Requirements for Store Sites With Search and Access Control
For technically more complex projects, such as sites with user access control and internal search across products, posts, etc., there are a few additional practices the SEO expert follows.
- Using the canonical tag (rel="canonical") to eliminate duplicate pages, after observing all reference and behavioral criteria. (This recommendation is justified for small and simple sites, but due to the complexity and difficulty of implementing it, it often remains just a recommendation.)
- Preventing sorting pages and product filter pages from being indexed. If traffic is not going to be directed to these pages, their links should also be hidden from search engine robots using Ajax.
- Prohibiting indexing of login, password change, and order processing pages via the robots.txt file.
- Preventing search result pages from being indexed using the robots.txt file.
- Preventing print versions of pages, such as URLs containing "_print" or "version=print", from being indexed.
- Prohibiting indexing of action pages such as "?action=ADD2BASKET" and "?action=BUY".
- Prohibiting indexing of sections with duplicate content, such as "feed", "RSS", and "wp-feed".
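As a sketch, the canonical tag from the first point above is a single line in the page's head; the URL here is a placeholder:

```html
<!-- Hypothetical sketch: point duplicate product URLs at one canonical page. -->
<link rel="canonical" href="https://example.com/product/blue-widget/">
```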