Premium Crawlers & Indexing Checker
Check Crawlers & Indexing (Single or Bulk)
Enter one or more URLs, choose a User-Agent to simulate, then run live checks. For maximum accuracy, run the optional server-side API; in the browser, a public proxy is used to reduce CORS errors.
Run a Check
How this tool checks sites
What it checks
- robots.txt: fetches /robots.txt and looks for Disallow and Sitemap directives.
- meta robots: inspects the page HTML for <meta name="robots"> or bot-specific tags.
- Sitemap: checks the common /sitemap.xml location and any sitemaps listed in robots.txt.
- Index verdict: synthesises the robots and meta signals into an indexability verdict.
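The checks above can be sketched as a few plain functions. This is a minimal illustration, not the tool's actual code; the names (parseRobots, metaRobotsBlocks, indexVerdict) and the simple prefix-matching of Disallow rules are assumptions for the example.

```javascript
// Parse robots.txt text into Disallow rules and Sitemap URLs.
function parseRobots(robotsTxt) {
  const disallow = [];
  const sitemaps = [];
  for (const line of robotsTxt.split(/\r?\n/)) {
    const m = line.match(/^\s*(Disallow|Sitemap)\s*:\s*(.*)$/i);
    if (!m) continue;
    const value = m[2].trim();
    if (/^disallow$/i.test(m[1]) && value) disallow.push(value);
    else if (/^sitemap$/i.test(m[1]) && value) sitemaps.push(value);
  }
  return { disallow, sitemaps };
}

// Does the HTML carry a blocking <meta name="robots"> (or bot-specific) tag?
function metaRobotsBlocks(html, botName = 'robots') {
  const re = new RegExp(
    `<meta[^>]+name=["'](?:robots|${botName})["'][^>]+content=["']([^"']*)["']`,
    'i'
  );
  const m = html.match(re);
  return m ? /noindex/i.test(m[1]) : false;
}

// Synthesise robots + meta signals into a simple indexability verdict.
// Real robots matching also handles wildcards and Allow overrides;
// a naive prefix check is used here for brevity.
function indexVerdict(path, robots, metaBlocked) {
  if (metaBlocked) return 'noindex (meta robots)';
  if (robots.disallow.some((rule) => path.startsWith(rule)))
    return 'blocked by robots.txt';
  return 'indexable';
}
```

For example, a page at /private/page with `Disallow: /private/` in robots.txt and no meta tag would come back as "blocked by robots.txt".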
Note: browser checks use a public proxy (AllOrigins) to reduce CORS limitations. For reliable enterprise-scale checks, deploy the included Express API server locally or on your own host and set the API base URL in the client code.
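A minimal sketch of how a client might switch between the public proxy and a self-hosted API. The API_BASE setting and /check path are hypothetical, and AllOrigins' raw endpoint (api.allorigins.win/raw?url=...) is assumed here; check the client code for the tool's actual configuration.

```javascript
// Hypothetical API base; leave empty to fall back to the public proxy.
const API_BASE = ''; // e.g. 'https://your-host.example/api' once the Express server is deployed

// Wrap a target URL in the AllOrigins raw proxy to sidestep CORS in the browser.
function proxied(targetUrl) {
  return `https://api.allorigins.win/raw?url=${encodeURIComponent(targetUrl)}`;
}

// Choose the fetch URL: direct API when configured, public proxy otherwise.
function checkUrl(targetUrl) {
  return API_BASE
    ? `${API_BASE}/check?url=${encodeURIComponent(targetUrl)}`
    : proxied(targetUrl);
}
```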