Manus AI is an autonomous AI agent platform that browses the web on behalf of users. Its
agent identifies itself with a Manus-User/1.0 token appended to the user-agent string.
Kaistone Radar’s bot detection patterns do not include a match for this token, so
Manus AI visits are classified as “Unknown / Human” despite being automated
AI agent traffic.
The BOT_PATTERNS array in beacon.mjs contains regular expressions
matched against the user-agent header. When a hit arrives, the patterns are tested in order
and the first match determines the bot label. If no pattern matches and the IP is not in a
known infrastructure range, the hit is labelled “Unknown / Human.”
At the time of this finding, the pattern list includes 30+ entries covering OpenAI, Anthropic, Google, Bing, Meta, and various HTTP client libraries — but Manus AI is absent.
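The first-match behaviour described above can be sketched as follows. The entries and the `classify` helper are simplified illustrations, not the actual contents of beacon.mjs:

```javascript
// Illustrative sketch of first-match classification against BOT_PATTERNS.
// The entries shown are examples; the real list has 30+ patterns.
const BOT_PATTERNS = [
  { pattern: /gptbot|chatgpt/i, name: 'OpenAI' },
  { pattern: /claudebot|anthropic/i, name: 'Anthropic' },
  { pattern: /googlebot/i, name: 'Google' },
  // ...no Manus entry at the time of the finding
];

function classify(userAgent) {
  for (const { pattern, name } of BOT_PATTERNS) {
    if (pattern.test(userAgent)) return name; // first match wins
  }
  return 'Unknown / Human'; // fallback when nothing matches
}

const manusUA =
  'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) ' +
  'Chrome/132.0.0.0 Safari/537.36; Manus-User/1.0';

console.log(classify(manusUA)); // logs 'Unknown / Human'
```

Because no pattern matches the Manus-User/1.0 suffix, the hit falls through to the default label, which is exactly the misclassification recorded below.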
On 25 March 2026, a hit was recorded with the following attributes:
| Field | Value |
|---|---|
| Timestamp | 2026-03-25T02:19:42.973Z |
| IP | 54.236.223.224 (AWS us-east-1) |
| User-Agent | Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Safari/537.36; Manus-User/1.0 |
| Page | /tree/ |
| Classified as | Unknown / Human |
The user-agent string appends Manus-User/1.0 after a standard Chrome UA, following
the common convention of bot identification via UA suffix. The IP resolves to AWS infrastructure
in the us-east-1 region, consistent with a cloud-hosted AI agent rather than a human visitor.
Notably, Manus AI visited the /tree/ root page, suggesting it followed the link
from the home page. However, it did not proceed deeper into the tree structure — no hits
were recorded for any child pages.
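The infrastructure-range check mentioned earlier can be sketched as a CIDR containment test. The helper functions and the AWS prefix below are illustrative assumptions, not the project's actual range list:

```javascript
// Illustrative IPv4 CIDR containment check for flagging cloud-hosted visitors.
function ipToInt(ip) {
  // Pack dotted-quad octets into an unsigned 32-bit integer.
  return ip.split('.').reduce((acc, oct) => (acc << 8) + Number(oct), 0) >>> 0;
}

function inCidr(ip, cidr) {
  const [base, bits] = cidr.split('/');
  const mask = bits === '0' ? 0 : (~0 << (32 - Number(bits))) >>> 0;
  return (ipToInt(ip) & mask) === (ipToInt(base) & mask);
}

// Example prefix covering 54.236.223.224 (assumption, not an exhaustive AWS list).
const CLOUD_RANGES = ['54.224.0.0/11'];

const isCloud = CLOUD_RANGES.some((r) => inCidr('54.236.223.224', r));
```

A real implementation would consult a maintained list of cloud provider prefixes; the point is that the hit's IP falling in such a range is a second signal, independent of the user-agent, that the visitor is not human.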
Manus is an AI agent platform that performs multi-step web tasks autonomously. Unlike traditional search engine crawlers that index pages for later retrieval, Manus operates in real time — browsing, reading, and acting on web content as part of user-initiated tasks. This makes it a distinct category of AI web visitor: not a crawler in the traditional sense, but an autonomous agent that interacts with sites on behalf of its users.
The AI agent landscape is expanding rapidly. New platforms like Manus, Devin, and others are deploying web-browsing agents that visit sites with custom user-agent strings. Each new platform requires a corresponding pattern in the detection list. Without proactive pattern updates, an increasing proportion of AI traffic will be misclassified.
This highlights a structural challenge: the bot detection approach relies on a manually maintained list of known patterns. As the number of AI agent platforms grows, maintaining this list becomes a continuous effort. A community-maintained registry of AI bot user-agent patterns would benefit projects like Kaistone Radar.
Add a Manus pattern to the BOT_PATTERNS array:

```javascript
{ pattern: /manus/i, name: 'Manus AI' }
```

Additionally, add a corresponding colour entry in the dashboard:

```javascript
'Manus AI': '#e040fb' // purple-pink
```
Longer term, consider implementing a more scalable approach to bot detection — such as loading patterns from a configurable JSON file or fetching from a shared registry — to reduce the friction of adding new AI platforms as they emerge.
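One way to realise the configurable approach is to serialise patterns as strings in JSON and compile them at startup. The file shape and helper below are hypothetical, not the project's actual format:

```javascript
// Hypothetical sketch: patterns kept in a JSON file rather than hard-coded.
// A bot-patterns.json might look like (assumed shape):
//   [ { "pattern": "manus", "flags": "i", "name": "Manus AI" },
//     { "pattern": "gptbot", "flags": "i", "name": "OpenAI" } ]

function compilePatterns(entries) {
  // Turn serialised entries into the { pattern: RegExp, name } shape
  // the matching loop expects.
  return entries.map(({ pattern, flags = 'i', name }) => ({
    pattern: new RegExp(pattern, flags),
    name,
  }));
}

// At startup the beacon could read and compile the file, e.g.:
//   const BOT_PATTERNS = compilePatterns(
//     JSON.parse(readFileSync('bot-patterns.json', 'utf8')));
const BOT_PATTERNS = compilePatterns([
  { pattern: 'manus', flags: 'i', name: 'Manus AI' },
]);
```

With this layout, supporting a new AI platform becomes a data change rather than a code change, and the same JSON file could be shared across projects or fetched from a community registry.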
A /manus/i pattern was added to the
BOT_PATTERNS array in beacon.mjs, and a corresponding colour
(#e040fb) was added to the dashboard. Future Manus AI visits will be correctly
identified and displayed with a purple-pink badge.
Beacon source: netlify/functions/beacon.mjs
Dashboard: /dashboard/
Findings index: /findings/