rankion.ai

Content Audit (Site Crawl)

Audit a whole site — issues per page, recommendations, JSON export for tooling.

Content Audit crawls a whole domain (or a defined URL set) and rates every individual page on content quality, on-page SEO, GEO suitability, and technical hygiene. Unlike the single-URL Content Optimizer, it gives you the site-level overview: which pages are thin, which cannibalize each other, which are missing structured data, and which are candidates for refresh, merge, or delete. The result is a navigable report with page-level drilldown, plus JSON export for external tools and reporting pipelines.

What it can do

  • Site-wide crawl — entire domain or sitemap subset, depending on type (full = up to 1,000 pages, quick = up to 50, sitemap/single for the UI paths).
  • Per-page findings — every crawled URL gets its own ContentAuditPage with issues, scores (SEO 0–100, quality 0–100), word count, and a priority bucket.
  • Issue classification — thin content, duplicates, missing meta, broken internal links, missing H1, citation gaps (GEO), poor readability. Issues are stored per page as a JSON array of {type: error|warning|info, category, message} objects.
  • Solved toggle + bulk mark — mark pages as solved one by one or via filter-based bulk PATCH (e.g. "all low-priority error pages → solved").
  • Pages filter — listing filterable by status, priority, min/max_seo_score, issue_type, sortable by 7 columns.
  • Export — JSON, CSV, or XLSX for your own dashboards / sheets / Looker.
  • Async crawl — background job; you can leave the page and come back later.
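The per-page issues array described above can be consumed directly. A minimal sketch of bucketing issues by severity — the {type, category, message} shape follows the docs, while the sample pages and the `summarize_issues` helper are illustrative, not part of the API:

```python
from collections import Counter

def summarize_issues(pages):
    """Count issues by severity across audited pages.

    Expects each page dict to carry an 'issues' list of
    {type, category, message} objects, as documented above.
    """
    counts = Counter()
    for page in pages:
        for issue in page.get("issues", []):
            counts[issue["type"]] += 1
    return dict(counts)

# Illustrative sample data, not real audit output.
pages = [
    {"url": "/a", "issues": [
        {"type": "error", "category": "meta", "message": "Missing meta description"},
        {"type": "warning", "category": "content", "message": "Thin content"},
    ]},
    {"url": "/b", "issues": [
        {"type": "error", "category": "structure", "message": "Missing H1"},
    ]},
]

print(summarize_issues(pages))  # {'error': 2, 'warning': 1}
```

A summary like this maps naturally onto the priority buckets when deciding which pages to bulk-mark first.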

When to use

  • You're taking over an existing site and want to know the status quo first.
  • You're planning a relaunch and need a content inventory.
  • You want to identify candidates for merge / delete / refresh.
  • You want a monthly health report for stakeholders.

Workflow

  1. Start audit — POST /content-audits with {source_url, type:"full"|"quick"}. Returns 202 + audit_id.
  2. Crawl runs in background — poll GET /content-audits/{id} until status=completed. You can leave the page.
  3. Pages listing — GET /content-audits/{id}/pages?status=open&priority=high&issue_type=error shows only the pages that still need work.
  4. Drilldown — GET /content-audit-pages/{id} for the detail of a single page (issues + recommendations + page data).
  5. Single toggle — PATCH /content-audit-pages/{id}/solved marks one page as solved (body optional — flips when omitted).
  6. Bulk mark — PATCH /content-audits/{id}/pages/bulk with a filter and is_solved flips a whole batch in one call (see the body example below).
  7. Export — GET /content-audits/{id}/export?format=json|csv|xlsx for external processing.
  8. Follow up — send critical URLs to Content Optimizer or run a deeper Page Deep Audit (Vision + AI Render).
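Steps 1–2 boil down to "start, then poll until completed". A hedged sketch of the polling half — the endpoint and status values come from the workflow above, but the `fetch` callable, the `failed` status, and the response shape are assumptions you would adapt to your HTTP client:

```python
import time

def wait_for_audit(fetch, audit_id, interval=5.0, timeout=600.0):
    """Poll the audit summary until status == "completed".

    `fetch` is any callable that returns the audit summary as a dict,
    e.g. one wrapping GET /v1/content-audits/{audit_id} with your
    auth headers; the exact response envelope is an assumption.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        audit = fetch(audit_id)
        if audit.get("status") == "completed":
            return audit
        if audit.get("status") == "failed":  # assumed failure state
            raise RuntimeError(f"audit {audit_id} failed")
        time.sleep(interval)
    raise TimeoutError(f"audit {audit_id} not completed after {timeout}s")
```

Passing the fetcher in as a callable keeps the polling logic testable without network access and independent of whichever HTTP library you use.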

API

Method Endpoint Notes Credits
GET /v1/content-audits List of your audit runs (paginated)
POST /v1/content-audits Body {source_url, type:"full"|"quick"}, async 202 10
GET /v1/content-audits/{id} Summary + eager-loaded pages relation
GET /v1/content-audits/{id}/pages Pages listing. Filters: status, priority, min_seo_score, max_seo_score, issue_type, sort, dir, per_page
PATCH /v1/content-audits/{id}/pages/bulk Bulk-mark. Body: {filter:{priority?, status?, min_seo_score?, max_seo_score?, issue_type?}, is_solved}. Throttle 30/min.
GET /v1/content-audits/{id}/export Export, ?format=json|csv|xlsx
GET /v1/content-audit-pages/{id} Detail of a single page
PATCH /v1/content-audit-pages/{id}/solved Toggle a single page solved/open

Body example for PATCH /content-audits/{id}/pages/bulk — close all low-priority open pages that still carry an error issue:

{
  "filter": {
    "priority": "low",
    "status": "open",
    "issue_type": "error"
  },
  "is_solved": true
}

Response: {data:{affected:int, is_solved:bool}, meta:{applied_filter:{...}}}. filter.status is the FROM-state (only rows currently in this state are flipped). issue_type matches against issues[].type (error / warning / info).

Body example for POST /content-audits:

{
  "source_url": "https://meinedomain.de",
  "type": "full"
}

Credits & Limits

  • Starting an audit costs 10 credits per run, regardless of crawl depth.
  • Async — the page can be closed, the crawl keeps running on the server.
  • Crawl limit per run depends on the plan; very large sites are pulled in sitemap chunks.
  • Protected areas (login, robots block) are skipped and flagged in the report.

Related modules

  • Content Optimizer — optimize problem URLs found here, one by one.
  • Page Deep Audit — deep audit of critical pages with visual + Lighthouse + AI.
  • Content Freshness — additionally monitor outdated pages.
  • Competitor Analysis — benchmark against competitors.
  • Content Audit ≠ Site Audit. Content Audit (/audit) is the content-quality scanner (SEO score, quality score, recommendations per page). Site Audit (/site-audit) is the crawler with crawl-issue detection plus an automatic grounding bridge — two distinct modules with complementary focus.
Last updated: May 10, 2026
