Adnan123 asked
Member (7 upvotes)

How to prevent duplicate URLs?

After having my website built by a web design company, I've noticed that several different URLs serve the same page.

For example

is the same as text

How do I prevent Google/Bing from indexing both pages & tell them which page is the one I want to use?
Chedders answered
Moderator (243 upvotes)

This is a job for the canonical tag. In your case, you would need to add the following to the head of the page:

<link rel="canonical" href="https:// text" />

Assuming that's how you want it to appear in Google search results.

It's a powerful and often underused tag. It helps remove duplicate content and cuts down the number of URL variations for the same page, and it gives Google a clear signal as to which page is the correct one.

Consider the following as well:


On most websites, all of the above would render the same page under a different URL, so you would have five versions of the same page, producing duplicate content by its very nature.

The canonical tag tells Google which one is correct.

And yes, I did also mean to include the version with a capital E, as that is still a separate page in Google's eyes. Since you have no control over how users may link to you, every page on your site should have a self-referencing canonical tag.
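A self-referencing canonical can be sketched like this (example.com and the page path are hypothetical placeholders, not taken from the original post):

```html
<!-- Hypothetical page served at https://example.com/contact-us -->
<!-- Placed inside <head>; the href points at the page's own preferred URL, -->
<!-- so every variant of this URL declares the same canonical version. -->
<link rel="canonical" href="https://example.com/contact-us" />
```

However visitors or other sites link to the page, the tag keeps pointing search engines at the one preferred URL.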

Hope that makes sense.

Thanks for leaving no stone unturned. Really appreciate it!
Neha answered
Member (-4 upvotes)

Duplicate content is a complicated issue for search engine optimization, so canonicalization helps avoid content problems; otherwise Google may pick the wrong page to rank. You can also use redirection, which may be a better option for you.

Let's take a look at some common ways duplicate content is created:

  1. URL variations
  2. HTTP vs. HTTPS or WWW vs. non-WWW pages
    If your website has both versions with and without the "www" prefix, for example "" and "", with the same content on each, then you have created duplicate pages. In the same way, if your website has both http:// and https:// versions live with the same content, you also have a duplicate content issue.
  3. Scraped or copied content
Fixing duplicate content issues:
  • 301 redirect
  • Rel="canonical"
  • Meta robots noindex
  • Preferred domain and parameter handling in Google Search Console
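As a sketch of the 301 redirect option above, a hypothetical Apache .htaccess rule can consolidate the www/non-www and http/https variants onto a single host (example.com is a placeholder; assumes mod_rewrite is enabled):

```
# Hypothetical .htaccess rules for Apache with mod_rewrite enabled.
RewriteEngine On

# If the request is not HTTPS, or the host is not exactly example.com,
# permanently (301) redirect it to the canonical https://example.com/ URL.
RewriteCond %{HTTPS} !=on [OR]
RewriteCond %{HTTP_HOST} !^example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

This collapses http://, https://www, and http://www variants into one version, so search engines only ever see a single URL for each page.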

Copyright ©2024 SEOChat. All Rights Reserved. Owned and operated by Search Ventures Ltd and Chris Chedgzoy.