
Stuck on the second page of Google for your desired keywords?

No matter which or how many SEO tools you use, committing these common yet critical SEO errors will keep you from ranking on the first page of SERPs.

So, audit your website regularly and steer clear of the following SEO mistakes if you want to rank higher and ensure an optimal user experience. If you need help with any of this, you can always reach out to an SEO agency.

1. Slow site speed

Speed is an official ranking factor. So besides frustrating users, slow-loading pages can hurt your rankings big time.

Fix: Use Google PageSpeed Insights to check the current performance of your pages and get actionable recommendations to improve your site speed and page-experience metrics (aka Core Web Vitals).

2. Bad reviews on Google

Negative reviews are unavoidable. But too many bad reviews on Google signal a poor website or brand experience which, in turn, can repel traffic and ultimately dilute rankings.

Fix: Understand why people leave negative reviews. Respond to all reviews in a timely and courteous manner. Whenever possible, make amends by fixing the problem right away and ask reviewers to update their reviews. Post on social media about the measures you took to fix the issues to add to your website’s credibility.

3. Duplicate Content

If you have two or more identical pages, it’s difficult for Google to determine which one to rank in the SERPs.

Fix: Use the rel="canonical" link element to specify which version of the pages with similar content is the main one and thus should be indexed.
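
For example, a canonical link element placed in the <head> of each duplicate or near-duplicate page might look like this; the domain and path are placeholders for your own preferred URL:

```html
<!-- Placed in the <head> of every duplicate or near-duplicate page -->
<!-- The href points at the one version you want indexed -->
<link rel="canonical" href="https://www.example.com/red-widgets/" />
```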

4. Broken images and missing Alt text

It could happen that your images do not render, say, due to an incorrect file path or extension. In such cases, a missing alt attribute hurts your user experience and thus, rankings.

Fix: Make sure to add descriptive alt text (with the primary keyword included) for each image on your website.
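
A minimal sketch of what that looks like in the markup; the file name and alt text are placeholders:

```html
<!-- Descriptive alt text that reflects what the image actually shows -->
<img src="/images/red-widget-front.jpg" alt="Red widget shown from the front" />
```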

5. Outdated content & information on the website

Both Google and users prefer fresh content. If your website content is outdated and irrelevant, it will eventually drop in rankings.

Fix: Make sure all information on your website is up-to-date. Always publish fresh content backed with recent data and periodically refresh all your content to keep things relevant.

6. Not optimising for mobile

Over half of all web traffic now comes from mobile devices, and Google uses your website’s mobile performance to determine its rankings (mobile-first indexing). So, not optimising your website for mobile is a mistake you can’t afford.

Fix: Use the Mobile-Friendly Test tool to check whether your website’s pages are mobile-friendly. Redesign your website for mobile-friendliness and use Accelerated Mobile Pages (AMP) to ensure lightning-fast content delivery when users click through to your website.
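
Whatever the design approach, a responsive page also needs a viewport meta tag in its <head>; this is the standard declaration and nothing in it is site-specific:

```html
<!-- Tells mobile browsers to render the page at the device width instead of a zoomed-out desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1" />
```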

7. Unclear contact form

A poorly-designed contact form diminishes your website’s credibility and thus, hurts rankings. Not to mention it leads to poor conversion rates.

Fix: Keep your contact form simple with fields that are truly required. Enable auto-fill for your form and ensure it’s optimised for mobile. Use a clear call to action button and write supportive text for the fields.

8. Broken links

“404 not found” errors not only harm the user experience but also cost your crawl budget and ultimately, hit your rankings.

Fix: Use the crawl stats feature in Google Search Console to identify which pages are throwing a 404 error and use Ahrefs Free Broken Link Checker to know which pages have broken links. Use 301 redirects to point crawlers to another relevant page on your site. For external links, remove the link or replace it with a different source.

9. Unindexed web pages

Google can’t rank pages that aren’t indexed, simple as that. So if you have pages that you want to showcase in the SERPs but aren’t indexed, they won’t rank.

Fix: A range of issues can affect the indexability of your website, from thin content to hreflang conflicts. Use Google Search Console to check and resolve indexing issues.

10. Improper Meta tags

Meta tags help search engines identify the content of your pages and better connect them with the keywords used by searchers. Writing irrelevant or unoptimised meta title tags and descriptions, or not writing them at all, means there will be gaps in Google’s (and users’) understanding of your website and click-through rates may decrease.

Fix: Take the time to write a unique meta title (50-60 characters) and description (up to 155 characters) for each page, choosing relevant keywords to form a click-worthy link for users in the SERPs.
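
As a rough sketch, each page’s <head> would carry a unique title and description along these lines; the wording and brand name are placeholders:

```html
<head>
  <!-- Roughly 50-60 characters, leading with the page's main keyword -->
  <title>Red Widgets - Buy Handmade Widgets Online | Example Co</title>
  <!-- Up to roughly 155 characters, written as a click-worthy summary of the page -->
  <meta name="description" content="Shop handmade red widgets with free shipping. Compare sizes, read reviews and order online from Example Co." />
</head>
```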

11. Improper URL Structure

Your URL structure plays a role in SEO, and underscores or non-ASCII characters make your URLs less readable and less search-friendly.

Fix: Craft short, lowercase, descriptive URLs with relevant keywords separated by hyphens. Avoid having more than two subfolders in your URLs.

12. Faulty redirects

Unnecessary and/or too many 301 redirects dilute SEO value and slow down page load times.

Fix: Redirect the HTTP version of your page to HTTPS. Using a tool like Screaming Frog, fix all the broken redirects, internal redirect chains, and 404 pages.

13. Using iFrames

An Inline Frame (iFrame) embeds one HTML document inside another. As the embedded content doesn’t technically live in the page’s own code, some content within the iFrame is invisible to Google, which often results in low-quality pages with thin content. iFrames can also cause security issues and hurt your load speeds and Core Web Vitals scores.

Fix: Avoid having iFrames in your website’s HTML code.

14. Orphaned pages

A page without any internal links to it is called an orphan page. If you want Googlebot to crawl and index your content, it should be able to reach it when it goes around your website via internal links. Consequently, orphan pages can remain unindexed and thus invisible in the SERPs even if included in the XML sitemap.

Fix: Review your internal linking structure. Make sure the pages you want to be indexed (and ranked) in the search engines have a good number of relevant “dofollow” internal links.

15. Not using anchors or writing generic anchors for internal links

Placing the bare URL with no anchor text, or writing generic, unoptimised text such as “Click here” for internal links, doesn’t help your SEO.

Fix: To pass more internal link juice, write keyword-optimised and unique anchors for all internal links and CTAs so users and crawlers better understand the page they’ll be taken to.
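
For instance, compare a generic anchor with a descriptive, keyword-optimised one; the URL and wording are placeholders:

```html
<!-- Generic anchor: tells users and crawlers nothing about the target page -->
<a href="/guides/keyword-research/">Click here</a>

<!-- Descriptive anchor: signals exactly what the linked page is about -->
Read our <a href="/guides/keyword-research/">step-by-step keyword research guide</a> to get started.
```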

16. Putting Too Many Keywords On A Single Page

An extremely common SEO mistake, stuffing a page or piece of content with keywords is considered spammy and detrimental to your rankings.

Fix: Include relevant keywords naturally and always prioritise readability. Don’t stuff keywords.

17. Sitemap.xml not found or not specified in robots.txt

Without a sitemap, it’s difficult for Googlebot to explore, crawl and index the pages of your site. And without a link to your sitemap in your robots.txt file, search crawlers can’t fully understand your website structure.

Fix: Maintain an up-to-date sitemap.xml file with zero broken pages, and reference it in your robots.txt file.
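
A minimal sitemap.xml looks something like this (the domain and dates are placeholders), and a single Sitemap: https://www.example.com/sitemap.xml line in your robots.txt file points crawlers to it:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per indexable page; every <loc> should return a 200 status -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/red-widgets/</loc>
    <lastmod>2022-01-10</lastmod>
  </url>
</urlset>
```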

18. Incorrect pages found in sitemap.xml

Having a sitemap with broken links and redirect chains hurts your SEO.

Fix: Check your sitemap for any 404 errors, redirect chains, and non-canonical pages. Make sure every URL listed in the sitemap returns a 200 status code.

19. Links that lead to HTTP pages on an HTTPS site

If an internal link on your secure (HTTPS) website points to an HTTP URL, browsers will show a warning about it being a non-secure page. This can damage your overall website credibility, user experience, and thus SEO.

Fix: Replace all HTTP internal links with the HTTPS versions.

20. Non-canonicalised version of the URLs used in internal linking

Pointing internal links to the canonical (aka primary/preferred) version of a page is an SEO best practice. But on many websites, because of default CMS settings, the non-canonicalised (aka duplicate) version of a page is used in internal linking.

Fix: Don’t point internal links to the non-canonicalised version by default. Instead, point them to the canonical URL so you don’t waste crawl budget and crawlers can index the correct version of the URL to display in the SERPs.

21. Uncrawlable/Blocked pagination URLs

It is crucial to ensure that your paginated pages are not being blocked or canonicalised away along with other parameter URLs. This is a common mistake on ecommerce sites.

Fix: Use a crawler tool like Screaming Frog to check if paginated pages are accidentally canonicalised or blocked by robots.txt, and to identify query string URLs (parameters) that are being generated by facets/filters. Ensure pages linked via pagination have a chance to be indexed by including them in your XML sitemap.

22. Internal URLs redirect chains

A redirect chain occurs when a URL redirects to another URL that itself redirects again, so crawlers must pass through one or more intermediate URLs before reaching the final page. These extra hops make it harder for crawlers to crawl your website and eat into your crawl budget; Googlebot may even leave altogether if it finds too many redirects. Also, the more URLs in the redirect chain, the longer it will take for the page to load, hurting UX.

Fix: Use a tool like Screaming Frog to find redirect chains. Wherever possible, change the 301 redirects to have everything redirect directly to the final page.

23. Invalid HTML elements in the <head>

This issue arises when a page’s <head> section contains a <noscript> tag that includes HTML elements not allowed there. The <noscript> tag defines alternate content for users who have disabled scripts in their browser or whose browser doesn’t support scripts.

Fix: The <noscript> tag, when used inside the <head>, must only contain <link>, <style>, and <meta> elements. The inclusion of other HTML elements can cause issues for search engine crawlers that do not render JavaScript. So, simply move the <noscript> into the <body>, where other elements are valid. Or, remove all invalid elements from the <noscript> tag.
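
As a sketch, this is the kind of markup that triggers the issue, along with one valid way to resolve it; the tracking pixel is a placeholder:

```html
<!-- Invalid: <img> is not allowed inside a <noscript> that sits in the <head> -->
<head>
  <noscript>
    <img src="https://tracker.example.com/pixel.gif" alt="" />
  </noscript>
</head>

<!-- Valid: move the <noscript> (and the <img>) into the <body> -->
<body>
  <noscript>
    <img src="https://tracker.example.com/pixel.gif" alt="" />
  </noscript>
</body>
```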

24. Trailing slash issue (/)

If your content can be seen on both the trailing slash and non-trailing slash versions of a page’s URL, the page can be treated as two separate URLs. So serving both versions can lead to issues like duplicate content.

Fix: Pick one: all pages with or without a trailing slash. Whatever you decide, ensure all the different linking elements like 301 redirects, XML sitemaps, internal links, canonical tags, etc. point to the preferred URL version you want indexed.

25. Mixed content in use (loads HTTP resources on HTTPS URL)

Google has frequently warned about the delivery of mixed content, and has stated that Chrome blocks mixed content by default. That’s because if a page loads both secure (HTTPS) and insecure (HTTP) content, the insecure resources can be intercepted or tampered with, leaving the page vulnerable to attack.

Fix: HTTPS pages shouldn’t load HTTP resources. Amend or remove any insecure references.
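
For example, an insecure script reference on an HTTPS page and its fix; the CDN URL is a placeholder:

```html
<!-- Mixed content: an HTTP resource loaded on an HTTPS page -->
<script src="http://cdn.example.com/widget.js"></script>

<!-- Fixed: load the same resource over HTTPS -->
<script src="https://cdn.example.com/widget.js"></script>
```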

26. An image format that Google does not support

Google Images supports images in the following formats: BMP, GIF, JPEG, PNG, WebP, and SVG. It is possible that you’re using an unsupported format or tag, such as a .jpg image in the <image> tag inside an inline SVG.

Fix: Stick to the supported image formats: BMP, GIF, JPEG, PNG, WebP, and SVG.
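
As a sketch, one way to serve a supported format is the standard <picture> element with a WebP source and a JPEG fallback; the file paths are placeholders:

```html
<!-- Browsers that support WebP use the first source; everything else falls back to the JPEG in <img> -->
<picture>
  <source srcset="/images/red-widget.webp" type="image/webp" />
  <img src="/images/red-widget.jpg" alt="Red widget shown from the front" />
</picture>
```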

27. 200 response for 404 pages (Soft 404s)

Many websites have “404 not found” pages that return a 200 “OK” status code instead of a 404 code or a 301 redirect to a valid page. The issue with showing a 200 code to crawlers is that it tells them a page that no longer exists technically does exist. Google then indexes those pages and counts them as soft 404s.

Fix: Make the server return a 404 status code directly. Don’t use a redirect.

28. Internal links to noindexed pages

The noindex attribute is an instruction that tells search engines not to index a page. It is typically used on pages like the user login page, pagination, dynamically created pages, theme pages, taxonomies, internal search results, and registration pages. Internal links pointing to a page with a meta robots noindex give Google a conflicting signal: the links suggest it’s an important page, while the meta robots says it isn’t. Such conflicting signals hurt SEO.

Fix: Add noindex tags only to pages that you don’t want indexed. If the page is worth building an internal link to, then consider changing your meta robots to index. Or, create a better resource page that you want Google to index and point your internal links to that page instead.
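
For reference, the noindex directive is a single meta robots tag in the page’s <head>; this is the standard syntax:

```html
<!-- Keep this only on pages you genuinely don't want in the index,
     e.g. login pages, internal search results or thin taxonomy pages -->
<meta name="robots" content="noindex, follow" />
```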

29. Improper Language Declarations

If you have multiple versions of a page for different languages/regions, you can declare these variations to Google using hreflang. But while Google favours hreflang tags for multilingual sites, Bing doesn’t use the hreflang tag when matching language to page content; it uses the “content-language” meta tag in the <head>. Using the wrong language or country codes can cause usability and accessibility issues, which indirectly impact SEO.

Fix: Ensure every hreflang value uses an ISO 639-1 language code, optionally followed by the region of the alternate URL in ISO 3166-1 Alpha-2 format. Specifying only the region is invalid.
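
A minimal sketch of valid declarations for a US English page and a French page, covering both the hreflang link elements Google reads and the content-language meta tag Bing reads; the URLs are placeholders:

```html
<!-- hreflang: ISO 639-1 language code, optionally followed by an ISO 3166-1 Alpha-2 region code -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />

<!-- content-language meta tag, read by Bing -->
<meta http-equiv="content-language" content="en-us" />
```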

30. Content getting filtered by Safe Search

It is possible that Google’s SafeSearch filter is incorrectly filtering your pages out of the SERPs. You won’t get any warnings in Search Console, yet your website’s overall visibility and traffic can take a major hit (if your audience has SafeSearch turned on).

Fix: Check if your pages are getting filtered by Google SafeSearch. Use the Safe Search Bookmarklet to easily see two versions of Google’s SERPs, one with SafeSearch enabled and the other where it’s off. Place all NSFW content in a specific directory to help Google’s algorithm understand where explicit content resides on your site. Use adult meta tags to self-mark pages that contain NSFW content. And if your website is getting flagged incorrectly, post on Google’s webmaster forums to make your case.
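
The adult meta tags mentioned above take the form of Google’s standard rating meta tag, placed in the <head> of each page that carries explicit content:

```html
<!-- Self-labels this page as adult content so SafeSearch can filter it accurately -->
<meta name="rating" content="adult" />
```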

31. Missing Intent Completely

Google’s biggest priority is satisfying its users’ search intent: whether it’s informational, transactional, navigational or a combination of them. Failing to align your content with user intent will lead to more bounces, fewer conversions, and poor rankings overall.

Fix: For each page and piece of content, identify user intent and optimise for it. If it’s an informational search, cover the topic comprehensively so the user doesn’t feel the need to head to a competing article. If it’s a commercial investigation (x vs. y, best xyz, etc.), structure the content with charts, tables, pros and cons, etc. for easier decision making. Determine which content type (article, product page, landing page) and format (guides, infographics, etc.) would best serve your users. Check the SERPs to see what’s already performing well, and what related searches Google recommends. Use keyword research tools to align your keyword optimisation with search intent.

32. Hacked site with hidden links and content

It is possible that you have content and/or links on your site without your permission due to loopholes in your website’s security. Hacked content can include malicious JavaScript, user-generated spammy pages and comments, or hidden text, links and redirects. To protect users and maintain the integrity of search results, Google tries its best to keep hacked pages out of the SERPs. This means your website’s SEO can take a huge hit due to hidden links and content.

Fix: Use security plugins to find and clean up hacked/hidden content and links. Extensions like Sucuri will check your website, from theme to database, to identify and remove malicious links and content. In general, regularly scan your website for malware, spam injections, and errors in code.

Check out this infographic we have created outlining all the SEO issues discussed above and share it!

[Infographic: Common SEO issues (Supple)]
