# Robots
A robot is an AI-generated crawler compiled to Rust for a specific host and country combination. Robots understand a site's navigation, product pages, and data structure.
## Creating a robot
Creating a robot starts with a robot build. You provide a URL from the target site and a country code. Extralt's AI analyzes the site and compiles a crawler.
```shell
curl -s -X POST "https://api.extralt.com/robot-builds" \
  -H "Authorization: Bearer $EXTRALT_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "url": "https://example-store.com/products/sample",
    "country": "US"
  }' | jq
```

## Build lifecycle
Robot builds go through these stages:
| Status | Description |
|---|---|
| pending | Build is queued |
| building | AI is analyzing the site and generating the crawler |
| completed | Build succeeded, robot is ready |
| failed | Build could not complete |
Builds typically take 3-5 minutes. The AI needs to load pages, understand the site structure, and generate extraction logic.
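The statuses above can be handled programmatically. A minimal sketch; the `handle_status` helper is our own name, not part of the Extralt API:

```shell
# Hypothetical helper: maps a build status from the table above to a
# human-readable decision. Not part of the Extralt API itself.
handle_status() {
  case "$1" in
    pending|building) echo "in progress" ;;
    completed)        echo "robot ready" ;;
    failed)           echo "build failed" ;;
    *)                echo "unknown status: $1"; return 1 ;;
  esac
}
```

The two non-terminal statuses (pending, building) are grouped together, since a client treats them the same way: keep waiting.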
## Monitoring builds
Check build status by polling the build endpoint.
```shell
curl -s "https://api.extralt.com/robot-builds/$BUILD_ID" \
  -H "Authorization: Bearer $EXTRALT_API_KEY" | jq '.status'
```

In the dashboard, you can track build progress in the Builds tab. When a build succeeds, the robot appears in the List tab.
## Robot details
Each robot is unique to a host + country combination:
- Host: The domain the robot can crawl (e.g., example-store.com)
- Country: The country context for the extraction (affects language, pricing, availability)

Robots are reusable. Once built, you can create multiple runs with the same robot.
## Rebuilding a robot
If a site changes significantly and extraction quality degrades, you can trigger a new build for the same host + country. The new robot replaces the old one when the build completes.
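Assuming a rebuild is triggered the same way as the original build (another POST to `/robot-builds` with the same URL and country), a sketch; the `rebuild_robot` wrapper is hypothetical:

```shell
# Hypothetical wrapper: triggers a new build for the same host + country.
# Assumes EXTRALT_API_KEY is set and reuses the URL from the example above.
rebuild_robot() {
  curl -s -X POST "https://api.extralt.com/robot-builds" \
    -H "Authorization: Bearer $EXTRALT_API_KEY" \
    -H "Content-Type: application/json" \
    -d '{"url": "https://example-store.com/products/sample", "country": "US"}'
}
```

The old robot keeps serving runs until the replacement build completes, so a rebuild carries no downtime for existing workflows.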
## What's next
- Running Extractions -- use your robot to extract data
- API Reference -- full robot build endpoints