
Ultimate SEO Series Part 2: Technical SEO

Last updated on December 2, 2021
15 minute read
Key Takeaways

  • Technical SEO is important for your overall SEO strategy.
  • See the most important technical SEO elements to optimize on your website.
  • Fix common problems that come up with technical SEO.

Skill Level: Intermediate, Advanced

There are so many different categories of search engine optimization (hopefully you’ve checked out part one of our Ultimate SEO Guide, which covers on-page SEO), and technical SEO is one of the most important parts of your overall SEO strategy. Technical SEO includes optimizing your site for crawling and indexing, sitemap and URL structure, internal links, and more.

As with all things SEO, there’s a lot to wrap your head around! In this guide, we cover essential technical SEO elements and how to make sure your site is properly utilizing them to improve search visibility.

Technical SEO Best Practices

Sitemap

Sitemaps are pretty much what they sound like: a map or list of the pages on your website that tells Google and other search engines how the content on your site is organized. A sitemap helps them crawl and index your site properly so that your pages are eligible to show up on the SERP.

To give your pages the best chance of being indexed and visible on the SERP, submit your sitemap to Google Search Console in the form of an XML sitemap. If you don’t, you’re leaving it up to Google to discover your pages on its own, and some of them may never make it into the search results. If you’re a WordPress user, you can use Yoast to generate the sitemap, or search around and choose one of the many sitemap generators out there. Once you’ve generated it, you’ll just enter the sitemap URL into Google Search Console under the Sitemaps tab.
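If you’re curious what an XML sitemap actually looks like, here’s a minimal sketch (the URLs and dates are placeholders; a generator like Yoast builds the real thing for you):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want search engines to find -->
      <url>
        <loc>https://www.yourwebsite.com/kayak-tour</loc>
        <lastmod>2021-12-02</lastmod>
      </url>
      <url>
        <loc>https://www.yourwebsite.com/about-us</loc>
        <lastmod>2021-11-15</lastmod>
      </url>
    </urlset>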

Robots.txt

No, this isn’t a character from your favorite science fiction movie! The robots.txt file tells search engine crawlers which parts of your site they should not crawl, giving you control over how bots access your pages. Normally you wouldn’t add any restrictions to the file because you want Google to crawl all pages of your site, but there are some exceptions.

For example, you probably don’t want Google crawling billing pages under “/billing/” or login pages under “/wp-admin/”.

You can check to see if you already have a robots.txt file on your site by going to [yourdomain.com]/robots.txt.
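For reference, a simple robots.txt covering the examples above might look something like this (the paths are illustrative; adjust them to your own site):

    # Block crawlers from admin and billing pages, allow everything else
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /billing/

    # Optional, but handy: point crawlers at your sitemap
    Sitemap: https://www.yourwebsite.com/sitemap.xml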

SSL (Secure Sockets Layer)

With the rise of online fraud, it’s more important than ever to have a secure website. SSL (and its modern successor, TLS) is a security technology that encrypts the data passing between a web server and a visitor’s browser. It adds an extra layer of security to your site by protecting users’ sensitive information like credit card or Social Security numbers.

If you’re on a FareHarbor website, you don’t have to worry about adding SSL as we’ve already done it for you. Even if you’re not on a FareHarbor site, your bookings are still encrypted through our software, but the rest of your site may not be. To find out, enter your URL and look for the lock icon and https in the browser’s address bar.

Don’t see them? Not to worry. Our guide to SSL takes you through all the steps you need to follow to add this security technology to your website. Benefits include creating a more secure site, establishing customer trust, and improving your ranking on Google. Don’t skip this important technical SEO factor.

Short, descriptive URLs

You may think of a URL as a small detail within your site, but optimizing your URLs can have a big impact on searchability. Firstly, URLs should be readable, meaning they don’t contain any confusing strings of random letters and numbers. The shorter, the better! Search engines and your readers love concise, easy-to-follow URLs that give some indication as to what the page might be about.

Here are a few tips for crafting perfect URLs every time.

  • Include your target keyword in the URL.
  • Keep your URL short.
    • Good: www.yourwebsite.com/kayak-tour
    • Bad: www.yourwebsite.com/best-family-fun-kayak-tour-maine
  • Avoid stop words: Stop words are a list of nonessential words (usually articles and conjunctions) that search engines partially or completely ignore. Some examples are: the, a, of, many, or, and so on. From Google’s perspective, these words are little more than fluff that don’t add substance to a URL, so there’s no need to read them. Go ahead and leave them out.

If you decide to change the URL structure, be sure to use a 301 redirect so that you don’t lose any site traffic.
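How you set up a 301 redirect depends on your platform; many CMSs and SEO plugins include a redirect manager. If your site happens to run on an Apache server, a sketch of the rule in an .htaccess file might look like this (both paths are placeholders):

    # Permanently redirect the old, wordy URL to the new short one
    Redirect 301 /best-family-fun-kayak-tour-maine /kayak-tour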

Links to relevant internal and external resources

Building links is an easy way to help visitors navigate your site from topic to topic within the text of each page, instead of relying on menu items alone.

Internal links: These links direct users from one page of your site to another. They’re a great way to keep traffic on your site and help users find what they’re looking for. As a general rule of thumb, whenever you publish a new page to your site, link to two to five other pages on your site from that page. You can do this by using anchor text, words that appear highlighted in a hypertext link and can be clicked to direct you to a new page.

When using internal links, be sure to periodically check that your links are not broken, meaning they point to a page that doesn’t exist. This can happen through user error, when the person adding internal links uses the wrong URL, or when a page is later deleted or moved. Broken links have a negative impact on user experience, since the user was expecting to arrive on a certain page and instead sees a 404 error.

Find and fix broken links by using a tool like Ahrefs Site Audit or Google Search Console crawl errors.

External links: External or outbound links direct users to an authoritative outside resource that is related to the content found on your website. These links help you build authority and trust for your website since you’re linking out to other sites that are established in the industry and can lend valuable information to your readers. For example, if you run tours at an animal conservatory, and National Geographic published guides to some of the animals at the conservatory, linking to those articles would make for a nice resource your readers might enjoy. Again, do this by using anchor text within the body content on your page.

Pro tip: Set external links to open in a new tab. This way, readers can check out the link without being directed away from your website.
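In HTML, that pro tip comes down to adding target="_blank" to the link (rel="noopener" is a good companion for security). A quick sketch with a placeholder URL:

    <!-- Anchor text pointing to an authoritative outside resource, opening in a new tab -->
    <a href="https://www.nationalgeographic.com/animals/" target="_blank" rel="noopener">
      National Geographic's animal guides
    </a>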

Optimize your images with descriptive Alt Text

Alt text, or alternative text, allows search engines to “see” the images you put on your site by reading the alt attribute for a description of the photo. This is especially important for accessibility purposes, so that visually impaired visitors using screen readers can understand the images based on their descriptions.

Here’s an example of how to write good alt text. Let’s use an image of a woman riding a mountain bike on a forested mountain trail.

Bad alt text: Bike ride
Better alt text: Woman riding bike
Best alt text: Woman riding an all-terrain bike on a forested mountain trail
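In the backend of your site, alt text lives in the image tag’s alt attribute. Here’s a rough sketch of what the “best” version above would look like in HTML (the filename is a placeholder):

    <img src="/images/mountain-bike-tour.jpg"
         alt="Woman riding an all-terrain bike on a forested mountain trail">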

Tips for writing alt text:
1. Keep descriptions under 75 characters.
2. Don’t use the words “photo of” or “image of” when writing alt text.

Duplicate Content

Duplicate content consists of substantive blocks of content within or across domains that either completely match other content or are appreciably similar. There are three main problems that search engines find with duplicate content:

  • They don’t know which version(s) to include/exclude from their index.
  • They don’t know whether to direct the link metrics (trust, authority, anchor text, link equity, etc.) to one page or keep them separated between multiple versions.
  • They don’t know which version(s) to rank for query results.

From the perspective of human users, nobody wants to read the exact same paragraph over and over on multiple tour pages. Avoid duplicate content by creating unique, engaging content for each page.

Duplicate Title Tags

Title tags are essential for showing up on search engine results pages and for helping your audience easily find your site. The right title tag tells your audience what to expect before they ever click on the link. Your title tags have the power to directly impact your click-through rate, and eventually your bottom line.

When two or more pages on your site have the same or very similar title tags, you create more work for search engines and put them in the position of making decisions for you: it falls upon the search engine to determine which page the user really wants to see in the results.

Make each title tag unique, targeting keywords that explain what each page is about in 60-75 characters (keep in mind that longer titles may be truncated on the SERP).
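Title tags live in the <head> of each page’s HTML, though most website builders and SEO plugins let you edit them without touching code. A sketch for the kayak tour page from earlier, with placeholder wording:

    <head>
      <!-- A unique, keyword-targeted title for this specific page -->
      <title>Kayak Tour in Maine | Your Company Name</title>
    </head>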

Duplicate Meta Descriptions

The meta description is an HTML attribute that provides a brief summary of a web page. Search engines such as Google often display the meta description in search results, which can influence click-through rates.

Google frowns upon duplicate descriptions because each page should have a unique meta description. A recycled description can also confuse visitors, since it most likely doesn’t explain what that particular page is about.

Create a unique meta description for each page that describes what the page is about, uses keywords, and includes a call to action. Keep each description to no more than 160 characters.
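Like the title tag, the meta description sits in the page’s <head>. A sketch for the same kayak tour page, with purely illustrative wording:

    <!-- A unique summary with a keyword and a call to action, under 160 characters -->
    <meta name="description" content="Paddle Maine's rocky coastline on a guided kayak tour. Small groups, all gear included. Book your spot today!">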

Mixed Content Issues

If a page that is served over HTTPS loads any elements (images, scripts, stylesheets) over plain HTTP, this is known as mixed content, and it may lead to security issues.

Browsers will warn users about loading non-secure content, and this may negatively affect user experience and reduce people’s confidence in your website.

To fix this issue, only embed HTTPS content on HTTPS pages.
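In practice, that usually means finding hard-coded http:// references in your pages and switching them to https:// (assuming the resource is actually available over HTTPS). For example:

    <!-- Mixed content: an image loaded over plain HTTP on an HTTPS page -->
    <img src="http://www.yourwebsite.com/images/kayak-tour.jpg" alt="Guests kayaking at sunset">

    <!-- Fixed: the same image loaded over HTTPS -->
    <img src="https://www.yourwebsite.com/images/kayak-tour.jpg" alt="Guests kayaking at sunset">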

Redirect Chains and Loops

Redirecting one URL to another is appropriate in many situations. However, if redirects are done incorrectly, it can lead to disastrous results. Two common examples of improper redirect usage are redirect chains and loops.

Long redirect chains and infinite loops lead to a number of problems that can damage your SEO efforts. They make it difficult for search engines to crawl your site, which wastes crawl budget and hurts how well your web pages are indexed. They also slow down your site’s load speed and, as a result, may have a negative impact on your rankings and user experience.

The best way to avoid any issues is to follow one general rule: do not use more than three redirects in a chain.

If you are already experiencing issues with long redirect chains or loops, we recommend that you redirect each URL in the chain to your final destination page.

We do not recommend that you simply remove redirects for intermediate pages as there can be other links pointing to your removed URLs, and, as a result, you may end up with 404 errors.
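To make that concrete, if your redirects live in an Apache .htaccess file, collapsing a chain might look like this (all paths are placeholders):

    # Before: a chain crawlers must follow hop by hop
    Redirect 301 /old-tour /kayak-tours/old-tour
    Redirect 301 /kayak-tours/old-tour /kayak-tour

    # After: every old URL points straight at the final destination
    Redirect 301 /old-tour /kayak-tour
    Redirect 301 /kayak-tours/old-tour /kayak-tour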

No redirect or canonical to HTTPS homepage from HTTP version

If you’re running both HTTP and HTTPS versions of your homepage, it is very important to make sure that their coexistence doesn’t impede your SEO.

Without a clear signal, search engines are not able to figure out which version to index and which one to prioritize in search results. As a result, you may experience a lot of problems, including pages competing with each other, traffic loss and poor placement in search results. To avoid these issues, you must instruct search engines to only index the HTTPS version.

To fix it, do either of the following:

    • Redirect your HTTP page to the HTTPS version via a 301 redirect.
    • Mark up your HTTPS version as the preferred one by adding a rel="canonical" link tag to your HTTP pages that points to the HTTPS version.
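Sticking with the Apache example from earlier, here is a sketch of each option; your server setup may differ, so treat these as illustrative rather than copy-paste-ready:

    # Option 1 (.htaccess): 301 redirect every HTTP request to its HTTPS equivalent
    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]

    <!-- Option 2 (in the <head> of the HTTP page): point to the HTTPS version as canonical -->
    <link rel="canonical" href="https://www.yourwebsite.com/">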

Some of these technical SEO fixes can be tackled pretty quickly, like updating your title tags and meta descriptions, while others, like looking for duplicate content, might require more effort. Work your way through this list when you have the time and see your search visibility improve. Then, head over to the Ultimate SEO Series Part 3: Analytics & Tracking.
