Community Monitor API
Fetch and filter mentions, reviews, and discussion threads about your brand.
The Community Monitor crawls public platforms (Reddit, Hacker News, Twitter/X, Quora, Trustpilot, YouTube comments, etc.) for mentions of your brand or defined keywords. The API delivers raw mentions plus aggregated alerts — perfect for wiring reputation dashboards or Slack bots up to Rankion.
Module context: Community Monitor.
Mentions
A mention is a single hit: platform, author, text excerpt, sentiment, permalink.
| Method | Endpoint | Description | Credits |
|---|---|---|---|
| GET | `/v1/community/mentions` | List all mentions, paginated | — |
| GET | `/v1/community/mentions/{id}` | Detail of a single mention | — |
Filters (query parameters):
- `?platform=reddit|hackernews|twitter|trustpilot|quora|youtube`
- `?sentiment=positive|neutral|negative`
- `?keyword=<string>` (text match on the mention body)
- `?from=YYYY-MM-DD&to=YYYY-MM-DD`
- `?page=1&per_page=25`
```shell
curl "$BASE/community/mentions?platform=reddit&sentiment=negative&from=2026-04-01" \
  -H "Authorization: Bearer $TOKEN" | jq '.data[] | {id, platform, sentiment, url, snippet}'
```
Response (truncated):
```json
{
  "data": [
    {
      "id": 451,
      "platform": "reddit",
      "author": "u/foo",
      "snippet": "I tried Rankion and the GEO scoring was…",
      "sentiment": "negative",
      "score": -0.42,
      "url": "https://reddit.com/r/seo/comments/…",
      "discovered_at": "2026-04-28T14:21:00Z"
    }
  ],
  "meta": {"current_page": 1, "per_page": 25, "total": 138}
}
```
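Assuming `meta` always carries `current_page`, `per_page`, and `total` as in the response above, a client can walk all pages of filtered mentions like this. This is a Python sketch: `fetch_json` is a stand-in for your HTTP layer, and the base URL is a placeholder, not the real endpoint host.

```python
from urllib.parse import urlencode

BASE = "https://api.example.com/v1"  # placeholder; use your actual API base URL


def mentions_url(page=1, per_page=25, **filters):
    """Build a /community/mentions URL from the documented query filters."""
    params = {"page": page, "per_page": per_page}
    # Include only filters that are actually set (platform, sentiment, keyword, from/to).
    params.update({k: v for k, v in filters.items() if v is not None})
    return f"{BASE}/community/mentions?{urlencode(params)}"


def iter_mentions(fetch_json, **filters):
    """Yield mentions across all pages; fetch_json(url) -> parsed response dict."""
    page = 1
    while True:
        body = fetch_json(mentions_url(page=page, **filters))
        yield from body["data"]
        meta = body["meta"]
        if meta["current_page"] * meta["per_page"] >= meta["total"]:
            break
        page += 1
```

The pagination stop condition relies on `total` from `meta`, so the loop never issues a request past the last page.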
Dispatch a scan
A scan triggers an immediate crawl run for a specific keyword across selected platforms — independent of the auto-schedule.
| Method | Endpoint | Body | Credits |
|---|---|---|---|
| POST | `/v1/community/scan` | `{keyword, platforms[]}` | 5 |
```shell
curl -X POST "$BASE/community/scan" \
  -H "Authorization: Bearer $TOKEN" -H "Content-Type: application/json" \
  -d '{
    "keyword": "rankion ai",
    "platforms": ["reddit", "hackernews", "trustpilot"]
  }'
```
Response `202 Accepted`:
```json
{
  "scan_id": 77,
  "status": "queued",
  "message": "Community scan dispatched"
}
```
The scan runs in the background. Newly found mentions appear in the list as the job progresses (typically 30–120 s, depending on the number of platforms).
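Because the scan endpoint returns `202` immediately, a common client pattern is to dispatch the scan and then poll the mention list until new results land. A hedged Python sketch of that pattern follows; the `post_json`/`get_json` callables abstract the HTTP layer and are assumptions of this example, not part of the API:

```python
import time


def run_scan(post_json, get_json, keyword, platforms, timeout_s=180, interval_s=10):
    """Dispatch a community scan, then poll until new mentions appear or we time out.

    post_json(path, body) and get_json(path) return parsed response dicts.
    Returns the number of newly discovered mentions (0 on timeout).
    """
    # Baseline: how many mentions match the keyword before the scan runs.
    before = get_json("/community/mentions?keyword=" + keyword)["meta"]["total"]

    scan = post_json("/community/scan", {"keyword": keyword, "platforms": platforms})
    assert scan["status"] == "queued"

    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        total = get_json("/community/mentions?keyword=" + keyword)["meta"]["total"]
        if total > before:
            return total - before  # mentions discovered by this run
        time.sleep(interval_s)
    return 0
```

The 180 s default timeout is generous relative to the documented 30–120 s run time; tune `interval_s` to your rate-limit budget.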
Alerts
Alerts are aggregated signals: clusters of negative mentions, sentiment drops, or review-score crashes. The system raises them on a daily or weekly cadence.
| Method | Endpoint | Description | Credits |
|---|---|---|---|
| GET | `/v1/community/alerts` | Active + historical alerts | — |
```shell
curl "$BASE/community/alerts" -H "Authorization: Bearer $TOKEN" \
  | jq '.data[] | {id, type, severity, message, triggered_at, mentions_count}'
```
Typical `type` values:
- `sentiment_drop`: average sentiment is more than one standard deviation below the baseline
- `negative_burst`: unusually many negative mentions in a short window
- `review_alert`: external review sources report a score crash (see Review Sources API)
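If you route alerts to different channels (e.g. page on-call for some, log the rest), a small triage helper can split them by severity. Note the severity scale used here (`info`/`warning`/`critical`) is an assumption of this sketch; check the actual alert payloads for the real values:

```python
def triage(alerts, min_severity="warning"):
    """Split alerts into (page, log) lists by severity.

    Severity names and their ordering are assumed, not documented;
    unknown severities are treated as lowest priority.
    """
    order = {"info": 0, "warning": 1, "critical": 2}
    page, log = [], []
    for alert in alerts:
        if order.get(alert["severity"], 0) >= order[min_severity]:
            page.append(alert)
        else:
            log.append(alert)
    return page, log
```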
Notes
- Mentions are deduplicated by `(platform, external_id)`: the same Reddit comment is never stored twice, even if multiple scans run.
- Sentiment is heuristic (model-based). For legal or compliance purposes, always check the original URL, not the score label.
- Auto-crawls run in the background several times a day; the manual `POST /scan` is only for ad-hoc pulls. The mention list keeps filling up even if you never call it.
- Platform availability changes. The Twitter/X crawl depends on that platform's API status; during an outage, a targeted `POST /scan` with `platforms: ["twitter"]` returns a `503` or an empty result.
- A webhook for mentions is not (yet) publicly exposed; if you need real-time push, poll `/community/alerts` every minute.
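Since there is no webhook yet, a polling client should track which alert ids it has already handled so repeated polls do not re-notify. A minimal sketch of that bookkeeping (the helper name and `seen` set are this example's own, not part of the API):

```python
def new_alerts(alerts, seen):
    """Return only alerts not yet handled; records their ids in `seen`.

    `alerts` is the parsed `data` array from GET /community/alerts;
    `seen` is a set of alert ids the caller persists between polls.
    """
    fresh = [a for a in alerts if a["id"] not in seen]
    seen.update(a["id"] for a in fresh)
    return fresh
```

Persist `seen` (e.g. to disk or a key-value store) if your poller restarts, otherwise every alert fires again on the first poll after a restart.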
Related: Review Sources API · Community Monitor.