The Single Page Requester lets you run a one-off crawl for a single URL inside an existing project. It’s ideal for validating fixes, checking a recently updated page, or troubleshooting an issue without waiting for the next scheduled crawl or creating a separate project.
Most single-page requests use just one credit. Any enabled data extensions may slightly increase the total credits used. Only the last 10 requests are stored, and request data expires after 90 days.
Common Use Cases
Use the Single Page Requester when you want to:
- Verify recent changes: Check that a fix (e.g., a canonical tag, meta robots change, or performance tweak) is live and being seen by the crawler.
- Troubleshoot a specific issue: Quickly run diagnostics on a problem URL without waiting for the next full crawl.
- Test configuration changes safely: Temporarily override rendering or scope settings for a single URL to see the impact, without altering the main project configuration.
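As an independent sanity check outside the requester itself, you can also confirm a fix such as a canonical tag directly against the page's raw HTML. This is a minimal sketch using only the Python standard library; the example URL and markup are illustrative, not taken from any particular project:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            a = dict(attrs)
            if (a.get("rel") or "").lower() == "canonical":
                self.canonical = a.get("href")

def find_canonical(html: str):
    """Return the canonical URL declared in the HTML, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical

# Example: paste in the raw HTML you fetched (or copied from the Body tab).
html = '<html><head><link rel="canonical" href="https://example.com/page"></head></html>'
print(find_canonical(html))  # https://example.com/page
```

Comparing this result with what the requester's Raw vs Rendered view reports can help separate a deployment problem from a crawler-side rendering issue.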
Where to Find the Single Page Requester
You can find the Single Page Requester in the left-hand navigation of any SEO, Site Speed, or Accessibility project. Click the link to open the Single Page Requester page.
Running a Single Page Request
Enter the URL
- In the Page URL field, paste or type the full URL you want to crawl.
- The request will run using the currently selected project’s settings by default.
Choose whether to generate link data (for SEO projects)
- Toggle Generate link data on or off.
- Generating link data increases the request time, so enable this setting only when you need link data.
Review which settings will be used
- By default, the requester uses your existing project settings.
- The button on the form will read Using default project settings.
(Optional) Override project settings for this request
- Click Override Project Settings to adjust settings just for this single URL.
- A label (for example, “1 settings overridden”) clearly shows that you’ve changed defaults for this request.
- These overrides apply only to the current request; every new request starts again from the project's default settings.
- To return to your standard configuration, click Reset to project settings.
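The override behavior described above can be pictured as a non-destructive merge: each request applies its one-off overrides to a copy of the project defaults, which themselves never change. The setting names below are purely illustrative, not the product's actual configuration keys:

```python
# Hypothetical project defaults (illustrative keys only).
PROJECT_DEFAULTS = {"rendering": "javascript", "user_agent": "default-bot"}

def build_request_settings(overrides=None):
    """Merge per-request overrides onto a copy of the project defaults."""
    settings = dict(PROJECT_DEFAULTS)   # copy; defaults are never mutated
    settings.update(overrides or {})    # apply overrides for this request only
    return settings

first = build_request_settings({"rendering": "static"})   # overridden request
second = build_request_settings()                         # next request: defaults again
```

Here `first` uses static rendering while `second`, like `PROJECT_DEFAULTS` itself, is untouched by the earlier override.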
Start the request
- Click Request.
- You’ll see a confirmation that the request is running, and the URL will appear in Latest Requests on the right.
Reviewing the Data
Once the single page request is complete, you’ll see a page-level report with different tabs for the relevant data:
Metrics
Detailed metrics for the page.
Metrics may differ slightly from those in full crawl reports, because post-crawl aggregate metrics are not generated for single page requests.
Preview, Body, Request Data
A preview screenshot of the page, the Raw vs Rendered HTML, and full request data.
The HTML and screenshot views let you visually confirm what the crawler saw, which is helpful for debugging rendering, layout, or content issues.
In addition to the tabs above, you’ll also see the following, depending on the project type:
- SEO:
- Links: A list of all the links found on the page (if you chose to generate link data).
- Accessibility:
- A11y Issues: A list of identified accessibility issues found on the page.
- Site Speed:
- Audits: A list of audits run on this page, along with the relevant score.
- Audit Issues: A list of individual resources identified as issues on the page.
Key Things to Remember
Each single page request uses one credit per URL; any enabled data extensions may add to that cost.
Requests inherit the current project settings by default.
Any overrides you make apply only to that request – the next one starts with project defaults again.
Single page requests show crawler-only metrics, so numbers may differ slightly from aggregate project reports.