Lesson 6: Technical SEO Issues & Best Practices

For this lesson we have gathered some insight into technical SEO: the part of SEO that deals with the architecture of your website and makes it more attractive to search engine crawlers.

Topics covered in this lesson:

  1. Technical SEO components
  2. 8 Common Technical SEO issues
  3. How to fix your site architecture


We introduced the concept of technical SEO back in Lesson 1. Since it's one of the few aspects of SEO that is entirely under your control, we've decided to explain it further, so you can become an expert at optimizing the skeleton of your pages.

 

What is Technical SEO?

Technical SEO is...          Technical SEO is not...
Page elements                Analytics
HTTP header responses        Keyword research
XML sitemaps                 Backlinks
301 redirects                Social media strategies
Metadata

  

8 Common Technical SEO issues

Often overlooked, these 8 technical SEO issues are easy to fix, and fixing them can give a real boost to your search visibility and SEO success.

  1.  No HTTPS Security
  2.  Site Isn’t Indexed Correctly
  3.  No XML Sitemaps
  4.  Missing or Incorrect Robots.txt
  5.  Meta Robots NOINDEX Set
  6.  Slow Page Speed
  7.  Multiple Versions of the Homepage
  8.  Incorrect Rel=Canonical



1. No HTTPS Security 

Site security with HTTPS is more important than ever. In October 2017, Google began rolling out a “not secure” warning in Chrome every time a web user lands on an HTTP site.

Browser Version Market Share Worldwide - June 2018

 

Browser                Usage
Chrome 67.0            15.47%
Chrome for Android     29.31%
Chrome 66.0            9.34%
Safari iPhone          9.00%
Firefox 60.0           3.40%
IE 11.0                2.68%

Do you have to upgrade your affiliate websites to HTTPS? Almost certainly yes.

Review your website's analytics. Which browsers do your visitors use? If they use browsers such as Chrome, Safari, or Firefox, then the answer is YES.

You will lose a lot of traffic if you don’t upgrade.

 

Check if your site is HTTPS:

Type your domain name into Google Chrome. If you see the “secure” message (picture below), your site is secure.

[Image: HTTPs_secure_google.jpg]

However, if your site is not secure, when you type your domain name into Google Chrome, it will display a gray background—or even worse, a red background with a “not secure” warning. This could cause users to immediately navigate away from your site.

[Image: not-secure-site.jpg]

How To Fix It: 

  • To convert your site to HTTPS, you need an SSL certificate from a Certificate Authority.
  • Once you purchase and install your certificate, your site will be secure.
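Once the certificate is installed, every HTTP URL should permanently redirect (301) to its HTTPS counterpart. As an illustrative sketch (not any particular server's config), here is how the redirect target can be derived with Python's standard library:

```python
from urllib.parse import urlsplit, urlunsplit

def https_redirect_target(url):
    """Return the HTTPS equivalent of an HTTP URL, or None if already secure."""
    parts = urlsplit(url)
    if parts.scheme == "https":
        return None  # already secure, no redirect needed
    # Keep host, path, query, and fragment; swap only the scheme.
    return urlunsplit(("https",) + tuple(parts[1:]))

# Example: every HTTP request should 301 to its HTTPS twin.
print(https_redirect_target("http://www.example.com/page?x=1"))
# -> https://www.example.com/page?x=1
```

In practice this rewrite lives in your web server or CMS configuration; the point is that the redirect must preserve the full path and query string.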

     

2. Site Isn’t Indexed Correctly

This is the most fundamental SEO requirement of all. Google must index your site in order for your site to get any organic traffic from Google. If your site isn’t indexed, you are lost: no one will find your content organically, because it’s not part of Google’s search index.

Not sure how to check? Here’s how…

  • Type the following into Google’s search bar: “site:yoursitename.com” and instantly view the count of indexed pages for your site.

[Image: Indexing.jpg]

Ideally, this number should be roughly proportional to the total number of pages on your site, minus the ones you don’t want indexed. If the gap is bigger than you expected, you’ll need to review your disallowed pages, which brings us to the next point.

 

How To Fix It: 

  1. If your site isn’t indexed at all, you can begin by adding your URL to Google via Search Console.
  2. Your Site Is Indexed Under a www or Non-www Domain: Technically, www is a subdomain, so http://example.com is not the same as http://www.example.com. Make sure you add both versions to your account to ensure they are both indexed.
  3. Google Hasn’t Found Your Site Yet: This is usually a problem with new sites. Give it a few days (at least), but if Google still hasn’t indexed your site, make sure your sitemap is uploaded and working properly. If you haven’t created or submitted a sitemap, this could be your problem. You should also request Google crawl and fetch your site.
  4. You Have Lots of Duplicate Content: Too much duplicate content on a site can confuse search engines and make them give up on indexing your site. If multiple URLs on your site are returning the exact same content, then you have a duplicate content issue on your site. To correct this problem, pick the page you want to keep and 301 the rest.
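The "pick the page you want to keep and 301 the rest" step can be sketched in code. A minimal example, assuming the duplicates differ only in scheme, www prefix, or trailing slash (real duplicate detection usually compares page content, not just URLs):

```python
from urllib.parse import urlsplit

def normalize(url):
    """Collapse scheme, www prefix, and trailing slash so duplicate
    URLs for the same content map to one key."""
    p = urlsplit(url.lower())
    host = p.netloc[4:] if p.netloc.startswith("www.") else p.netloc
    path = p.path.rstrip("/") or "/"
    return host + path

def plan_301s(urls):
    """Group URLs that serve the same content; keep the first seen as the
    canonical target and mark the rest for 301 redirects."""
    redirects, canonical = {}, {}
    for url in urls:
        key = normalize(url)
        if key in canonical:
            redirects[url] = canonical[key]   # duplicate -> 301 to the keeper
        else:
            canonical[key] = url              # first seen wins
    return redirects

dupes = plan_301s([
    "https://example.com/post",
    "http://www.example.com/post/",   # same content, different URL
    "https://example.com/about",
])
print(dupes)
# -> {'http://www.example.com/post/': 'https://example.com/post'}
```

The output is a source-to-target map you would hand to your developer to implement as 301 redirects.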

 

3. No XML Sitemaps

XML sitemaps help Google search bots understand more about your site pages, so they can effectively and intelligently crawl your site.

Not sure how to check? Here’s how…

Type your domain name into your browser's address bar and add “/sitemap.xml” to the end, as pictured below.

[Image: domain_xml.jpg]

If your website has a sitemap, you will see something like this:

[Image: website_sitemap.jpg]

How To Fix It:

If your website doesn’t have a sitemap (and you end up on a 404 page), you can create one yourself or hire a web developer to create one for you. The easiest option is to use an XML sitemap generating tool. If you have a WordPress site, the Yoast SEO plugin can automatically generate XML sitemaps for you.
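To see what a generator or plugin produces under the hood, here is a minimal sketch that builds a sitemap in the sitemaps.org XML format from a list of page URLs, using only Python's standard library (real sitemaps often also include lastmod and other optional tags):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from page URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url  # one <loc> per page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/blog"])
print(sitemap)
```

You would save the result as sitemap.xml in your site root and submit it to Google.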

 

4. Missing or Incorrect Robots.txt

A missing robots.txt file is a big red flag, and an improperly configured robots.txt file can destroy your organic site traffic.

To determine if your robots.txt file is incorrect, type your website URL into your browser with a “/robots.txt” suffix. If you get a result that reads "User-agent: * Disallow: /" then you have an issue.

[Image: disallow.jpg]

How To Fix It:

This text file, which sits in the root of your website's folder, communicates a set of guidelines to search engine crawlers. For instance, if your robots.txt file contains the line User-agent: * Disallow: / it's telling every crawler on the web to take a hike and not crawl ANY of your site's content. To fix it, remove the blanket Disallow: / rule, or narrow it to the specific paths you actually want blocked, and re-test the file.
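You can test robots.txt rules offline, before deploying them, with Python's built-in urllib.robotparser. For example:

```python
from urllib.robotparser import RobotFileParser

def allowed(robots_txt, path, agent="*"):
    """Check whether a crawler may fetch `path` under the given robots.txt rules."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())  # parse rules from text, no network needed
    return rp.can_fetch(agent, path)

bad = "User-agent: *\nDisallow: /"         # blocks the whole site
good = "User-agent: *\nDisallow: /admin/"  # blocks only /admin/

print(allowed(bad, "/blog/post"))   # False: nothing can be crawled
print(allowed(good, "/blog/post"))  # True: public pages stay crawlable
```

Running a few representative paths through a check like this catches the "Disallow: /" disaster before search engines ever see it.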

 

5. Meta Robots NOINDEX Set

When the NOINDEX tag is appropriately configured, it signifies certain pages are of lesser importance to search bots. (For example, blog categories with multiple pages.) However, when configured incorrectly, NOINDEX can immensely damage your search visibility by removing all pages with a specific configuration from Google’s index.

Not sure how to check? Here’s how…

 

  • Right-click on your site’s main pages and select View Page Source. Use the Find command to search the source code for lines that read “NOINDEX” or “NOFOLLOW”, such as:
    • <meta name="robots" content="NOINDEX, NOFOLLOW">

 How To Fix It: 

  • If you see any “NOINDEX” or “NOFOLLOW” in your source code, check with your web developer as they may have included it for specific reasons.
  • If there’s no known reason, have your developer change it to read <meta name="robots" content="index, follow"> or remove the tag altogether (indexing and following links are the defaults).
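The source-code check above can be automated across many pages. A small sketch using only Python's standard html.parser (the class and function names are our own):

```python
from html.parser import HTMLParser

class MetaRobotsFinder(HTMLParser):
    """Collect the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)  # attr names arrive lowercased by HTMLParser
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

def find_noindex(html):
    """True if the page's meta robots tags contain a NOINDEX directive."""
    finder = MetaRobotsFinder()
    finder.feed(html)
    return any("noindex" in d for d in finder.directives)

page = '<html><head><meta name="robots" content="NOINDEX, NOFOLLOW"></head></html>'
print(find_noindex(page))  # True: this page is excluded from the index
```

Feeding each important page's HTML through find_noindex quickly surfaces pages that are accidentally blocked.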

  

6. Slow Page Speed

If your site doesn’t load quickly (typically 3 seconds or less), your users will go elsewhere. Site speed matters to both your users' experience and to Google.


How To Fix It: 

  • The solutions to site speed issues can vary from simple to complex. Common site speed solutions can include image optimization/compression, browser caching improvement, server response time improvement and JavaScript minifying.
  • Speak with your web developer to ensure the best possible solution for your site's particular page speed issues.

 

7. Multiple Versions of the Homepage

Remember when you discovered “yourwebsite.com” and “www.yourwebsite.com” go to the same place? While this is convenient, it also means Google may be indexing multiple URL versions, diluting your site's visibility in search.

How To Fix It: 

  • First, check whether the different versions of your URL all redirect to one standard (canonical) URL. This includes the HTTPS and HTTP versions, as well as variants like “www.yourwebsite.com/home.html.” Check each possible combination. Alternatively, use the “site:yoursitename.com” search operator to determine which pages are indexed and whether they stem from multiple URL versions.
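A quick way to make sure you check each possible combination is to generate the combinations programmatically. A sketch (the path variants are just the examples from this lesson; add any others your site has used):

```python
from itertools import product

def homepage_variants(domain):
    """Enumerate common homepage URL variants that should all
    301-redirect to one canonical version."""
    schemes = ("http", "https")
    hosts = (domain, "www." + domain)
    paths = ("/", "/index.html", "/home.html")
    return [f"{s}://{h}{p}" for s, h, p in product(schemes, hosts, paths)]

variants = homepage_variants("example.com")
print(len(variants))  # 12 combinations to test against the canonical URL
```

Each generated URL can then be entered in a browser (or fetched by a script) to confirm it resolves to the single canonical homepage.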

 

 

8. Incorrect Rel=Canonical

Rel=canonical is particularly important for all sites with duplicate or very similar content (especially e-commerce sites). Dynamically rendered pages (like a category page of blog posts or products) can look like duplicate content to Google search bots. The rel=canonical tag tells search engines which “original” page is of primary importance (hence: canonical)—similar to URL canonicalization.

How To Fix It: 

  • This one also requires you to spot check your source code. Fixes vary depending on your content structure and web platform. (Here’s Google’s Guide to Rel=Canonical.) If you need assistance, reach out to your web developer.
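While spot-checking your source code, a small script can pull the canonical URL out of each page for you. A sketch using Python's standard html.parser (the class name is our own):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extract the href of <link rel="canonical"> from a page's source."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href")

page = ('<head><link rel="canonical" '
        'href="https://example.com/products/widget"></head>')
finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # https://example.com/products/widget
```

Comparing the extracted canonical against the URL you actually want indexed flags pages whose tag points at the wrong place.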

 


9. URL Canonicalization: One Version per URL

The average user doesn't really care if your home page shows up separately under several URLs, for example http://yourwebsite.com, https://yourwebsite.com, http://www.yourwebsite.com, and https://www.yourwebsite.com.

But the search engines do, and this configuration can dilute link equity and make your work harder.

Google will generally decide which version to index, but they may index a mixed assortment of your URL versions, which can cause confusion and complexity.

How to fix it: 

  • Manually enter multiple versions of your home page in the browser to see if they all resolve to the same URL
  • Look also for HTTP vs HTTPS versions of your URLs — only one should exist
  • If they don’t, you’ll want to work with your developer to set up 301 redirects to fix this
  • Use the “site:” operator in Google search to find out which versions of your pages are actually indexing
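Once the 301 redirects are in place, every variant should collapse to one canonical URL, with no loops and no long chains. The check can be simulated offline against your redirect rules (a toy model: the `redirects` dict stands in for your server's 301 mappings):

```python
def final_destination(url, redirects, max_hops=10):
    """Follow a chain of 301 mappings until reaching a URL that
    doesn't redirect; fail loudly on loops or overly long chains."""
    seen = set()
    while url in redirects:
        if url in seen or len(seen) >= max_hops:
            raise ValueError("redirect loop or chain too long at " + url)
        seen.add(url)
        url = redirects[url]
    return url

# Every variant should collapse to the single canonical homepage.
rules = {
    "http://example.com/":      "https://example.com/",
    "http://www.example.com/":  "https://example.com/",
    "https://www.example.com/": "https://example.com/",
}
for src in rules:
    assert final_destination(src, rules) == "https://example.com/"
```

Keeping chains to a single hop (variant straight to canonical, rather than variant to variant to canonical) preserves the most link equity.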

 


Source:

https://www.seoclarity.net/resources/knowledgebase/8-common-technical-seo-issues-how-to-solve-them-17372/

www.searchenginejournal.com

 
