On Page SEO - PowerPoint PPT Presentation

Title: On Page SEO

Description: This presentation contains detailed
information on on-page SEO.

Provided by: immanishacharya

Transcript and Presenter's Notes

1
Tips for On-Page SEO

2
  • Let's start with a few SEO best practices
    everyone should have in the bag. These won't
    directly improve rankings, but they're important
    in setting yourself up to rank higher in Google.

3
Points to be considered
  • Title
  • Meta Description
  • URL Optimization
  • Anchor Tag Optimization
  • Image Optimization
  • Header Tag Optimization
  • Robots Meta Tag
  • Robots.txt
  • Set up Google Search Console

4
Title Tag
  • Keep them short. Under 70 characters is best to
    avoid truncation.
  • Match search intent. Tell searchers you have what
    they want.
  • Be descriptive. Don't be vague or generic.
  • Don't clickbait. Make sure they align with your
    content.
  • Include the keyword. Use a close variation if it
    makes more sense.
  • Include the year. For topics that demand
    freshness. (See the example below.)
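  • Putting these tips together, a minimal sketch of a
    title tag (the page name and year are invented for
    illustration):

    <head>
      <!-- Under 70 characters, descriptive, includes the keyword -->
      <title>On-Page SEO: 10 Practical Tips for 2024</title>
    </head>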

5
Meta Description
  • Keep them short. Under 160 characters is best to
    avoid truncation.
  • Expand on the title tag. Include USPs that you
    couldn't fit there.
  • Match search intent. Double down on what
    searchers want.
  • Use an active voice. Address the searcher
    directly.
  • Include your keyword. Google often bolds these in
    the results. (See the example below.)
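  • A hedged example of a meta description tag (the
    copy is invented for illustration):

    <head>
      <!-- Under 160 characters, active voice, keyword included -->
      <meta name="description" content="Learn on-page SEO step by step: title tags, meta descriptions, URLs, images, and more.">
    </head>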

6
URL Optimization
  • If you've set up your website for SEO success,
    your URL structure should be sound. But you still
    need a descriptive slug for each page. Google
    says to use words that are relevant to your
    content. Often the easiest way to do that is
    to use your target keyword, as in the example
    below.
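  • For instance, a descriptive slug built from the
    target keyword (the domain and paths are
    hypothetical):

    Vague:       https://example.com/p?id=859045
    Descriptive: https://example.com/blog/on-page-seo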

7
Anchor Tag Optimization
  • Anchor text should indicate to users what kind of
    page they'll be taken to if they click the link.
    A few tips are as follows (see the sketch after
    this list):
  • Make Sure Anchor Words Are Relevant. Relevant
    anchor text helps Google understand your site
    structure, which means it could help your website
    rank higher.
  • Don't Over-Optimize Your Anchor Text. There are no
    set length guidelines for anchor text, but it's
    best to keep it as succinct as possible. We
    recommend keeping anchor text to five words or
    fewer.
  • Pay Attention to Surrounding Text. The text
    surrounding your anchor text can help readers
    better understand the context of what you're
    linking to.
  • Fix Alt Text Issues. To check for missing alt
    text on your site, head to the Issues tab in
    the Site Audit tool.
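  • A quick sketch of generic versus descriptive
    anchor text (the URL is a placeholder):

    <!-- Generic: tells users and Google nothing -->
    <a href="https://example.com/guide">click here</a>

    <!-- Descriptive, succinct, and relevant to the target page -->
    <a href="https://example.com/guide">on-page SEO guide</a>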

8
Image Optimization
  • Images from your pages can rank in Google image
    search and send more traffic your way. You need
    to do three things to optimize them; an example
    follows the list.
  • Use descriptive filenames
  • Google says that filenames give it clues about
    the image's subject matter. So dog.jpg is
    better than IMG_859045.jpg.
  • Use descriptive alt text
  • Google also uses alt text (alternative text) to
    understand the subject matter of an
    image. This is an HTML attribute used on <img>
    tags to describe the image.
  • Compress them
  • Compressing images makes file sizes smaller,
    leading to faster load times. Plenty of tools
    exist for doing this. ShortPixel is a good
    option.
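  • Putting the first two tips together (the filename
    and alt text are illustrative):

    <!-- Descriptive filename plus alt text describing the image -->
    <img src="/images/golden-retriever-puppy.jpg"
         alt="Golden retriever puppy playing in the grass">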

9
Header Tag Optimization
  • Always include an H1 tag.
  • Use only one per page.
  • Include the primary keyword for your content.
  • Avoid populating the tag with too many keywords.
  • Ensure that your target audience can easily read
    the H1 tag.
  • Keep the tag length to 70 characters or fewer.
  • Make the tag unique. (See the sketch below.)
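  • A minimal sketch of a page with a single, readable
    H1 (the heading text is invented):

    <body>
      <!-- One H1 per page, primary keyword included, not stuffed -->
      <h1>On-Page SEO: A Beginner's Guide</h1>
      <h2>Why On-Page SEO Matters</h2>
    </body>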

10
Robots Meta Tag
  • 1) Do not use meta robots and X-Robots-Tag on the
    same page, as one of them will become redundant.
  • 2) If you don't want pages to be indexed but do
    want to pass link equity to linked pages, use the
    meta robots tag with the directives "noindex,
    follow". This is the best technique to control
    indexing, rather than blocking via robots.txt.
  • 3) You don't need to add "index" or "follow"
    directives to each page of your website to get it
    indexed. They are applied by default.
  • 4) If your pages are indexed, do not block them
    in robots.txt and use meta robots simultaneously.
    In order to consider the meta robots tag, crawlers
    need to crawl the page, and a robots.txt block
    won't allow them to do so. In short, your meta
    robots tag will become redundant.
  • In such cases, deploy the robots meta tag first
    on your pages and wait for Google to de-index
    them. Once de-indexed, you can block them through
    robots.txt to save your crawl budget. However,
    avoid this if those pages can still pass link
    equity to your important pages. Block de-indexed
    pages via robots.txt only if they serve no
    purpose at all.
  • 5) Use the X-Robots-Tag header to control indexing
    of non-HTML files like images, PDFs, video files,
    and so on (see the sketch below).
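  • As a sketch, the same "noindex, follow" directive
    expressed both ways (the HTTP header would be set
    in your server configuration, e.g. for PDFs):

    <!-- Meta robots tag in the page's head -->
    <meta name="robots" content="noindex, follow">

    # Equivalent HTTP response header for non-HTML files
    X-Robots-Tag: noindex, follow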

11
Robots.txt
  • First and foremost, research and understand which
    areas of the website you don't want to be crawled.
    Do not simply copy or reuse someone else's
    robots.txt file.
  • Always place your robots.txt file in the root
    directory of your website so that search engine
    crawlers can easily find it.
  • Do not name your file anything other than
    robots.txt, since the filename is case-sensitive.
  • Always specify your sitemap URL in robots.txt, as
    it helps search engine bots find your website's
    pages more easily.
  • Do not hide private information or future event
    pages in robots.txt. Because it is a public file,
    any user can access your robots.txt file by
    simply adding /robots.txt after your domain name.
    Anyone can see which pages you want to hide, so
    do not use robots.txt to hide sensitive pages.
  • Create a dedicated, customised robots.txt file
    for each and every sub-domain that belongs to
    your root domain.
  • Before going live, be absolutely sure that you are
    not blocking anything that you don't wish to.
  • Always test and validate your robots.txt file
    using Google's robots.txt testing tool to find
    any errors and check that your directives are
    actually working.
  • Googlebot won't follow any links on pages blocked
    through robots.txt. Hence, ensure that the
    important links present on blocked pages are
    linked from other pages of your website as well.
  • While setting up your robots.txt file, keep in
    mind that blocked pages won't pass any link
    equity to the pages they link to.
  • Do not link to pages blocked in robots.txt from
    any other pages of your website. If linked,
    Google may still end up indexing those pages even
    though it cannot crawl them. (A sample robots.txt
    follows below.)
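  • A minimal robots.txt sketch (the blocked path and
    domain are hypothetical):

    # Block a private area for all crawlers
    User-agent: *
    Disallow: /admin/

    # Help bots find your pages
    Sitemap: https://example.com/sitemap.xml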

12
Set up Google Search Console
  • To optimize your website for organic search, you
    probably use Google Search Console to learn which
    pages receive the most impressions and clicks,
    and which queries drive them. To get the
    information you need, you may need to visit
    several areas within GSC and view multiple
    reports.
  • Now you can quickly assess your overall SEO
    performance in a single dashboard that monitors
    fundamental metrics, such as:
  • Impressions. See how many impressions and clicks
    your website pages receive in Google.
  • Average position. Track your average search
    position and monitor daily, weekly, or monthly
    fluctuations.
  • Position by pages. Learn the search results page
    position of any page on your website.
  • Position by queries. See how many search queries
    each position group receives.
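  • One common way to verify ownership when first
    setting up Search Console is an HTML meta tag on
    your homepage (the token below is a placeholder
    issued by Google):

    <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN">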

13
References
  • To learn more about how to optimize on-page SEO,
    subscribe to Digital Pundit's SEO course. Digital
    Pundit is one of the best institutes offering a
    Digital Marketing Course in Ahmedabad, Gujarat.

14
  • Thank you