
Indexing Monitoring

Track which pages are indexed by Google and identify crawling issues.

Indexing monitoring helps you understand which of your pages are in Google's index and why some pages might not be appearing in search results.

What is Indexing?

When Google crawls your website, it evaluates each page and may add it to its index. Only indexed pages can appear in search results. Common indexing issues include:

  • Pages not discovered by Google
  • Pages discovered but not indexed
  • Pages blocked by robots.txt
  • Pages with noindex directives

How It Works

VitalSentinel integrates with Google Search Console's URL Inspection API to provide:

  1. URL inspection data - Status of individual URLs
  2. Coverage reports - Overview of indexing across your site
  3. Issue tracking - Problems affecting indexation
  4. Mobile usability - Mobile-friendliness issues
  5. Rich results - Structured data detection and validation
  6. AMP status - Accelerated Mobile Pages detection (if applicable)
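The inspection request itself is small. As a rough sketch, assuming the public URL Inspection API's endpoint and request fields (`inspectionUrl`, `siteUrl`), with placeholder property values:

```python
# Sketch of a single call to Search Console's URL Inspection API.
# Endpoint and field names follow the public API; the URLs below are
# placeholders.
INSPECT_ENDPOINT = (
    "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"
)

def build_inspect_request(url: str, property_url: str) -> dict:
    """Build the JSON body for a urlInspection.index.inspect call."""
    return {
        "inspectionUrl": url,     # the page to inspect
        "siteUrl": property_url,  # the verified Search Console property
    }

body = build_inspect_request(
    "https://example.com/pricing", "sc-domain:example.com"
)
# POST `body` to INSPECT_ENDPOINT with an OAuth bearer token
# (e.g. via google-auth plus an HTTP client); authentication omitted here.
```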

Indexing monitoring requires connecting your Google Search Console account. See Google Search Console Integration.

Dashboard Overview

Summary Statistics

  • Total Inspections - URLs checked for indexing status
  • Indexed - Pages in Google's index
  • Not Indexed - Pages not in the index
  • Coverage Percentage - Percentage of pages successfully indexed
  • Error Count - Number of indexing errors
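Coverage Percentage follows directly from the first two counts. A minimal sketch (the zero-inspection handling is an assumption):

```python
def coverage_percentage(indexed: int, total_inspections: int) -> float:
    """Percentage of inspected URLs that are in Google's index."""
    if total_inspections == 0:
        return 0.0  # avoid division by zero before any inspections run
    return 100.0 * indexed / total_inspections

coverage_percentage(180, 200)  # 90.0
```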

Status Breakdown

Pages are categorized as:

  • Indexed - In Google's index, can appear in results
  • Crawled, not indexed - Google found it but chose not to index it
  • Discovered, not indexed - Known to Google but not yet crawled
  • Excluded - Intentionally or unintentionally blocked
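One way to picture the categorization: a hypothetical mapping from Search Console coverage-state strings to these four buckets (the example strings and matching rules are illustrative, not VitalSentinel's actual logic):

```python
def status_bucket(coverage_state: str) -> str:
    """Map a coverage-state string to one of the four status buckets."""
    state = coverage_state.lower()
    # Check the "not indexed" variants first, since their strings also
    # contain the word "indexed".
    if "crawled" in state and "not indexed" in state:
        return "Crawled, not indexed"
    if "discovered" in state and "not indexed" in state:
        return "Discovered, not indexed"
    if "indexed" in state:
        return "Indexed"
    return "Excluded"

status_bucket("Submitted and indexed")            # "Indexed"
status_bucket("Crawled - currently not indexed")  # "Crawled, not indexed"
```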

URL Inspection Details

For each URL, you can see:

  • Index Verdict - PASS, PARTIAL, FAIL, or NEUTRAL
  • Coverage State - Current indexing status
  • Last Crawl Time - When Google last crawled the page
  • Crawled As - DESKTOP or MOBILE
  • Canonical URL - The page Google considers canonical
  • Mobile Usability - Mobile-friendliness verdict
  • Rich Results - Detected structured data types
  • Referring URLs - Pages linking to this URL
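A response carrying these fields can be flattened for display. The sample below assumes the URL Inspection API's `inspectionResult.indexStatusResult` shape; treat the exact keys as assumptions:

```python
# Sample payload mirroring the URL Inspection API response shape.
sample = {
    "inspectionResult": {
        "indexStatusResult": {
            "verdict": "PASS",
            "coverageState": "Submitted and indexed",
            "lastCrawlTime": "2024-05-01T08:12:00Z",
            "crawledAs": "MOBILE",
            "googleCanonical": "https://example.com/pricing",
            "referringUrls": ["https://example.com/"],
        }
    }
}

def summarize(result: dict) -> dict:
    """Pull the display fields out of an inspection response."""
    status = result["inspectionResult"]["indexStatusResult"]
    return {
        "verdict": status.get("verdict"),
        "coverage_state": status.get("coverageState"),
        "last_crawl": status.get("lastCrawlTime"),
        "crawled_as": status.get("crawledAs"),
        "canonical": status.get("googleCanonical"),
        "referrers": status.get("referringUrls", []),
    }
```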

URL Status Categories

Indexed

Your page is in Google's index. This means:

  • Google has crawled the page
  • Content was deemed indexable
  • Page can appear in search results

Crawled, Not Indexed

Google crawled the page but didn't add it to the index. Common reasons:

  • Low-quality content - Thin or duplicate content
  • Soft 404 - Page looks like an error but returns 200
  • Duplicate content - Very similar to another page
  • Low value - Not useful for searchers

What to do:

  • Improve content quality
  • Add unique value
  • Check for duplicate content issues

Discovered, Not Indexed

Google knows the URL exists but hasn't crawled it yet. Reasons:

  • Crawl budget - Site has too many URLs
  • Low priority - Page seems less important
  • Recent submission - Not yet processed

What to do:

  • Wait for Google to crawl (can take weeks)
  • Improve internal linking
  • Submit sitemap

Excluded by robots.txt

The page is blocked by a rule in robots.txt. If intentional, this is expected. If not:

What to do:

  • Remove or loosen the blocking rule in robots.txt
  • Remember that robots.txt blocks crawling, not indexing; use a noindex directive to keep a page out of search results
Excluded by noindex

The page has a noindex directive. Check:

  • Meta robots tag: <meta name="robots" content="noindex">
  • X-Robots-Tag HTTP header
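Both checks can be automated. A rough sketch (the regex-based meta parsing is illustrative; a real crawler would use a proper HTML parser):

```python
import re

def has_noindex(html: str, headers: dict) -> bool:
    """Detect noindex via the X-Robots-Tag header or a robots meta tag."""
    # Check the HTTP header first.
    header = headers.get("X-Robots-Tag", "")
    if "noindex" in header.lower():
        return True
    # Then look for <meta name="robots" content="..."> in the markup.
    meta = re.search(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE,
    )
    return bool(meta and "noindex" in meta.group(1).lower())

has_noindex('<meta name="robots" content="noindex">', {})  # True
has_noindex("<html></html>", {"X-Robots-Tag": "noindex"})  # True
```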

What to do:

  • Remove noindex if the page should be indexed
  • Keep noindex for pages that shouldn't appear in search

Submitting URLs

Via Sitemap

The best way to inform Google about your pages:

  1. Create an XML sitemap
  2. Reference it in robots.txt
  3. Submit in Google Search Console
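Step 1 can be sketched in a few lines. This builds a minimal sitemap (the helper name and field set are illustrative; real sitemaps often add <lastmod> and similar tags):

```python
from xml.sax.saxutils import escape

def build_sitemap(urls: list[str]) -> str:
    """Render a minimal XML sitemap from a flat list of page URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

xml = build_sitemap(["https://example.com/", "https://example.com/pricing"])
# For step 2, reference it in robots.txt with a line like:
#   Sitemap: https://example.com/sitemap.xml
```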

URL Inspection

Request indexing for specific URLs:

  1. Go to Indexing in VitalSentinel
  2. Select a URL
  3. View its current status
  4. Use "Request Indexing" in Search Console if needed

Common Issues

Crawl Budget Issues

If you have many pages, Google may not crawl them all. Solutions:

  • Remove low-quality pages
  • Improve site speed
  • Fix redirect chains
  • Block non-essential pages

Duplicate Content

Multiple URLs with the same content. Solutions:

  • Use canonical tags
  • Implement redirects
  • Remove duplicate pages

Redirect Chains

Multiple redirects slow crawling. Solutions:

  • Redirect directly to final URL
  • Update internal links
  • Fix redirect loops
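The first fix can be automated against a redirect map. A hypothetical helper that resolves every source straight to its final destination and flags loops:

```python
def flatten_redirects(redirects: dict[str, str]) -> dict[str, str]:
    """Collapse redirect chains so each source points at its final URL."""
    flat = {}
    for source in redirects:
        seen = {source}
        target = redirects[source]
        # Follow the chain until we reach a URL that is not itself redirected.
        while target in redirects:
            if target in seen:  # revisiting a URL means a redirect loop
                raise ValueError(f"redirect loop at {target}")
            seen.add(target)
            target = redirects[target]
        flat[source] = target
    return flat

flatten_redirects({"/old": "/interim", "/interim": "/new"})
# {"/old": "/new", "/interim": "/new"}
```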

Soft 404s

Pages that look like errors but return 200. Solutions:

  • Return proper 404 for missing content
  • Add content to thin pages
  • Redirect to relevant pages
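A rough heuristic for flagging candidates (the marker phrases are illustrative, not VitalSentinel's actual detection logic):

```python
# Phrases that suggest an error page; extend for your own templates.
ERROR_MARKERS = ("page not found", "404", "no longer available")

def looks_like_soft_404(status_code: int, body: str) -> bool:
    """Flag pages that return HTTP 200 but read like an error page."""
    if status_code != 200:
        return False  # a real error status is not a *soft* 404
    text = body.lower()
    return any(marker in text for marker in ERROR_MARKERS)

looks_like_soft_404(200, "<h1>Page not found</h1>")  # True
looks_like_soft_404(404, "<h1>Page not found</h1>")  # False
```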

Best Practices

Check New Content

After publishing:

  1. Wait a few days for discovery
  2. Check indexing status
  3. Request indexing if needed

Fix Issues Promptly

Indexing issues can:

  • Delay content appearing in search
  • Reduce organic traffic
  • Impact SEO performance

Data Source

Indexing data comes from the Google Search Console API. You must connect your GSC account to use this feature.

See Google Search Console Integration to connect your account.

Alerts

Set up alerts for indexing issues. Available metrics:

  • Index Coverage - Alert when coverage percentage drops below a threshold
  • Indexed Pages - Alert on changes to indexed page count
  • Index Errors - Alert when error count exceeds a threshold
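As an illustration of how such thresholds might be evaluated (the function name, defaults, and message formats are hypothetical, not VitalSentinel's configuration):

```python
def indexing_alerts(coverage_pct: float, error_count: int,
                    min_coverage: float = 90.0,
                    max_errors: int = 0) -> list[str]:
    """Return alert messages for any metric that crosses its threshold."""
    alerts = []
    if coverage_pct < min_coverage:
        alerts.append(f"coverage dropped to {coverage_pct:.1f}%")
    if error_count > max_errors:
        alerts.append(f"{error_count} indexing errors")
    return alerts

indexing_alerts(84.5, 3)
# ["coverage dropped to 84.5%", "3 indexing errors"]
```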

See Setting Up Alerts for configuration details.
