Way to Detect Duplicate Content - PowerPoint PPT Presentation

About This Presentation
Title:

Way to Detect Duplicate Content

Description:

When the same content appears at more than one web address, it is considered duplicate content. Duplicate content is not penalized by Google, but it is important to know when to remove it from your site. Find out how to detect duplicate content on your website. – PowerPoint PPT presentation

Slides: 17
Provided by: shrushtidigital
Category: Other
Tags: seo


Transcript and Presenter's Notes

Title: Way to Detect Duplicate Content


1
(No Transcript)
2
What is Duplicate Content?
  • Duplicate content exists when the same content appears in more than one place on the web, that is, at more than one website address. In this context, "one place" means "one URL".
  • The existence of duplicate content does not always result in a penalty, but it can still hurt search engine rankings. When multiple copies of the same piece of content exist, it can be challenging for search engines to determine which version is most relevant to a given search query.

3
How Bad is Duplicate Content for SEO?
  • Duplicate content is not penalized by Google. The search engine does, however, filter out identical content, which has much the same consequence as a penalty: the affected web pages lose their rankings.
  • It is confusing for Google to be shown the same content on identical pages, so it must decide which one to display. Whether the duplicate was created by a third party or by the site owner, it is likely that the original version will not appear in the top search results.
  • Another reason duplicate content harms SEO is that it can confuse search engines.
  • Here is what else makes duplicate content bad for SEO.

4
Internal duplication of content
Elements on the Page
  • To avoid duplicate content issues, your website should have each of the following:
  • Page titles and meta descriptions that are unique in the HTML code
  • Headings that follow the H1, H2, H3 hierarchy and are unique to the page
  • Only a handful of words make up a page's title, meta description, and headings. Nevertheless, you should keep your website out of the grey area of duplication as much as possible. Moreover, you can craft a meta description that search engines will see as valuable.
  • If you have too many pages, you may not be able to write a unique meta description for each one; in such cases Google generates the snippet from the content of the page. However, it is still better to write a custom meta description if you can, as it is a critical element in driving click-throughs. (A sketch for spotting duplicate titles and descriptions follows this list.)
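
A quick way to check for this kind of internal duplication is to compare each page's title and meta description programmatically. Below is a minimal Python sketch, assuming the requests and beautifulsoup4 packages are installed; the URLs are placeholders for pages taken from your own sitemap or crawl.

# Minimal sketch: flag pages that share the same <title> or meta description.
# Assumes `requests` and `beautifulsoup4` are installed; the URL list below is
# hypothetical and would come from your own sitemap or crawl.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/blog/filter-coffee",
]

def page_elements(url):
    """Return the title and meta description of a page (empty strings if missing)."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = meta.get("content", "").strip() if meta else ""
    return title, description

def find_duplicates(urls):
    """Group URLs by (title, description) and keep groups with more than one URL."""
    seen = defaultdict(list)
    for url in urls:
        seen[page_elements(url)].append(url)
    return {key: group for key, group in seen.items() if len(group) > 1}

if __name__ == "__main__":
    for (title, description), urls in find_duplicates(PAGES).items():
        print(f"Duplicate title/description ({title!r}) on: {', '.join(urls)}")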

5
Descriptions of products
  • It can be difficult for eCommerce sites to write a unique product description for every item on their website.
  • You must, however, differentiate your product page for filter coffee from the other websites offering that product in order to rank for a query such as "filter coffee from Coorg".
  • Provide a unique description to each website that sells your product, including resellers.
  • See our article on how to write a great product description page if you want yours to stand out from the rest.
  • Product variations such as size and color should not be displayed on separate pages. Use web design elements to present multiple variations on a single page.

6
The Trailing Slash, the WWW, and HTTP
  • Internal duplicate content can often be found in URL variants such as:
  • Without www (http://XYZ.com) and with www (http://www.XYZ.com)
  • http (http://www.example.com) and https (https://www.example.com)
  • With a trailing slash at the end of the URL (http://www.example.com/) and without one (http://www.example.com)
  • An easy way to test your landing pages is to take the most valuable text on the page, put the text in quotes, and Google it. Google then searches for that exact text. If more than one search result page appears, the first step is to investigate whether any of the variants listed above are the cause.
  • Where www vs. non-www or trailing-slash vs. non-trailing-slash versions of a site exist, the only way to resolve the conflicting versions is to implement a 301 redirect that sends users from the unpreferred version to the preferred one (see the sketch after this list).
  • Using the trailing slash or www in your URLs has no significant SEO benefit; you can choose whether to use them or not.
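
Before and after setting up those redirects, it helps to verify how each variant actually resolves. Here is a minimal Python sketch, assuming the requests package is installed; example.com stands in for your own domain.

# Minimal sketch: check whether the common URL variants (http/https, www/non-www,
# trailing slash) all redirect to a single preferred version.
import requests

VARIANTS = [
    "http://example.com",
    "http://www.example.com",
    "https://example.com",
    "https://www.example.com/",
    "https://www.example.com",
]

def final_destination(url):
    """Follow redirects and return (status code of the first response, final URL)."""
    response = requests.get(url, allow_redirects=True, timeout=10)
    first_hop = response.history[0].status_code if response.history else response.status_code
    return first_hop, response.url

if __name__ == "__main__":
    destinations = set()
    for url in VARIANTS:
        status, final_url = final_destination(url)
        destinations.add(final_url)
        print(f"{url} -> {final_url} (first response: {status})")
    if len(destinations) > 1:
        print("Warning: variants resolve to different URLs; add 301 redirects to one preferred version.")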

7
Issues with Externally Duplicated Content
  • If your website has a significant amount of valuable content, the chances are good that some of it will be republished on another website. Unfortunately, you will not be the one who benefits from it. Externally duplicated content can arise in a number of ways:

8
Using Scraped Content
  • Scraped content is content that a website owner steals from another site to increase their own organic visibility. Some webmasters even automate the collection of stolen content.
  • Scraped content can sometimes be easy to recognize because scrapers often do not bother to replace branded terms within it. (A rough similarity check, sketched after this list, can also help you spot copies of your pages.)
  • The penalty works as follows: Google staff review websites to determine whether they comply with Google's Webmaster Quality Guidelines. If a site has been flagged for manipulating the search index, Google can either lower its ranking or remove it from the search results altogether.
  • If scraped copies of your content are being used on another site, notify Google by reporting the webspam under the "Copyright and other legal issues" section.
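
One rough way to check whether another page is a copy of yours is to compare the visible text of the two pages. Below is a minimal Python sketch, assuming requests and beautifulsoup4 are installed; both URLs and the 0.8 threshold are placeholders to tune for your own content.

# Minimal sketch: compare the visible text of your page with a suspected copy
# and report how similar they are.
import difflib

import requests
from bs4 import BeautifulSoup

def visible_text(url):
    """Fetch a page and return its visible text with scripts and styles removed."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()
    return " ".join(soup.get_text(separator=" ").split())

def similarity(url_a, url_b):
    """Return a rough 0..1 similarity score between two pages' text."""
    return difflib.SequenceMatcher(None, visible_text(url_a), visible_text(url_b)).ratio()

if __name__ == "__main__":
    score = similarity("https://www.example.com/article", "https://suspected-copy.example/article")
    print(f"Text similarity: {score:.2f}")
    if score > 0.8:  # arbitrary threshold; adjust for your content
        print("The second page looks like a scraped copy worth reporting.")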

9
Published content
  • When the content on your blog is published elsewhere with your permission, it is commonly called content syndication. Content you voluntarily share with another site is not scraped.
  • It may seem counterproductive, but syndicating your content has its benefits. You can increase traffic to your website by making your content more visible. Essentially, you trade your content, and perhaps some search engine ranking, for backlinks.

10
How do I Fix Duplicate Content?
  • There is no solution that is universally applicable to duplicate content. However, there are solutions to some of the most common issues:

11
  1. Pages in print-friendly format
  2. Issues with HTTP/HTTPS or Subdomains
  3. Session identifiers and UTM (campaign, source, medium, term, and content) parameters (see the sketch after this list)
  4. Pagination
  5. A Single Page in Different Languages/Countries
  6. Plagiarized Content
  7. Syndicated Content
  8. Boilerplate Content
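
For item 3, one common fix is to normalize URLs by stripping session and tracking parameters before they are linked or indexed. Below is a minimal Python sketch using only the standard library; the parameter names are common examples, not an exhaustive list.

# Minimal sketch: strip UTM and session parameters so tracked URLs collapse
# to one canonical address.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PARAMS = {
    "utm_campaign", "utm_source", "utm_medium", "utm_term", "utm_content",
    "sessionid", "sid", "phpsessid",
}

def normalize(url):
    """Drop tracking/session query parameters from a URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k.lower() not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), parts.fragment))

if __name__ == "__main__":
    print(normalize("https://www.example.com/shop?utm_source=newsletter&utm_medium=email&page=2"))
    # -> https://www.example.com/shop?page=2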
12
Does Duplicate Content Get Penalized?
  • Duplicate content is widely understood to carry a penalty. Generally speaking, though, regular sites are not affected by one, except in rare cases. Google's human reviewers may flag a page if it contains copied or scraped content. If a reviewer finds that content was duplicated with the intent of manipulating search engine results, Google will impose a penalty, potentially causing the site to be ranked lower or dropped from the search results completely. It is imperative that you do not steal content. Produce great content that interests your readers instead.

13
Google Duplicate Content Checker
  • Below is the link where you can read exactly what Google says about having duplicate content on your website:
  • https://developers.google.com/search/docs/advanced/guidelines/duplicate-content
  • I will take an excerpt from the above link and put it into simple, non-technical language.
  • Google does not recommend blocking crawlers from reaching duplicate content on your site, whether with a robots.txt file or any other method.

14
  • If search engines cannot crawl the duplicate pages, they cannot detect that the pages are duplicates, so they will treat each one as an individual, distinct page. Search engines should be allowed to crawl these URLs, but the duplicates should be marked with rel="canonical" links, 301 redirects, or a URL parameter handler (a sketch for checking rel="canonical" tags follows this list). If duplicate content causes Google to crawl your website too frequently, you can set the crawl rate in Search Console.
  • The existence of duplicate content on a site does not warrant action unless there is evidence that the content exists in order to deceive and manipulate search engines. As long as you follow these guidelines, if your site has duplicate content issues, Google will display the most relevant version of the content in its search results.
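
A quick way to confirm that a duplicate URL points back to its preferred version is to read its rel="canonical" tag. Below is a minimal Python sketch, assuming requests and beautifulsoup4 are installed; the page and expected URLs are placeholders.

# Minimal sketch: confirm that a page declares the rel="canonical" link you expect.
import requests
from bs4 import BeautifulSoup

def canonical_url(url):
    """Return the href of the page's rel="canonical" link, or None if absent."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    link = soup.find("link", rel="canonical")
    return link.get("href") if link else None

if __name__ == "__main__":
    page = "https://www.example.com/shop?page=2"
    expected = "https://www.example.com/shop?page=2"
    found = canonical_url(page)
    print(f"{page} declares canonical: {found}")
    if found != expected:
        print("Canonical tag missing or pointing somewhere unexpected.")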

15
Final say
  • Content duplication can be a menace for your website, especially when you own an e-commerce site. Follow the steps mentioned above to keep your website free of duplication and save yourself time.
  • If you have any other tips, feel free to comment below, and contact us if you have any questions regarding SEO.

16
(No Transcript)