Flatten iFrames & Shadow DOM Now Enabled by Default
Release Date: May 19th 2025
We already offer the option to flatten iFrames and Shadow DOM in the crawl settings, to keep crawl data as closely aligned with Google and Bing as possible and to raise awareness of how broad use of componentized DOM elements affects renderability. When we first launched this capability, our tests showed that Bing could not flatten the Shadow DOM. More recent testing indicates that Bing’s ability to flatten DOM elements has improved, so we no longer need to worry that enabling the option would mask anything Bing cannot render, and crawls can more accurately reflect Google’s rendering capabilities. All new projects will therefore have these settings enabled by default.
There is no additional cost for these settings, so they do not affect how your existing credits are used. For some websites, flattening iFrames and Shadow DOM will change crawl results—most notably increased internal link detection—so we suggest noting the date the feature was enabled by default and any changes reported in your crawls. This will help you better understand how Shadow DOM features are used on your site. Please note that when cloning a crawl, or creating new crawls in an existing account, these options will be enabled by default, which may explain differences from pre-existing crawls that still run without them.
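To see why internal link counts can rise, here is a minimal sketch (TypeScript in a browser context; the custom element name and URL are hypothetical) of a link that exists only inside a shadow root. It is absent from a plain serialization of the page HTML, but becomes available to link extraction once shadow content is flattened into the rendered DOM.

```typescript
// Minimal sketch: a link rendered only inside a shadow root.
// The 'product-card' element and the URL are hypothetical.
const card = document.createElement('product-card');
const shadow = card.attachShadow({ mode: 'open' });
shadow.innerHTML = '<a href="/products/widget-42">Widget 42</a>';
document.body.appendChild(card);

// Serializing the light DOM alone misses the link entirely,
// which is how an unflattened crawl would see the page:
console.log(document.body.innerHTML.includes('/products/widget-42')); // false

// Flattening folds shadow content into the rendered DOM, so the same
// link becomes available to link extraction, hence the higher internal
// link counts some sites will see:
const shadowLinks = card.shadowRoot?.querySelectorAll('a') ?? [];
console.log(shadowLinks.length); // 1
```

The same reasoning applies to iFrames: links that exist only inside a frame's document are surfaced when the frame is flattened into the parent page.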
If you do need to disable these options, you can do so in the JavaScript Rendering section of Advanced Settings in Step 4 of the project setup. This is also where you’ll find additional JavaScript settings (e.g. blocking ad scripts, analytics scripts, and 3rd-party cookies) and where you can add custom rejections, custom JavaScript, or external JavaScript resources.
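As a rough illustration of the custom JavaScript option, the snippet below shows the kind of script you might inject at render time; the selectors are hypothetical and purely for illustration, for example removing a consent overlay so it doesn’t get in the way of rendering:

```typescript
// Hypothetical custom JavaScript: remove a consent overlay before the page
// is rendered so it doesn't obscure content or interfere with link extraction.
// The selectors are made up and will differ from site to site.
document
  .querySelectorAll('#cookie-banner, .consent-overlay')
  .forEach((el) => el.remove());
```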
It should be noted that there are a number of nuances in this area, particularly in relation to flattening iFrames (for example, while Google can flatten iFrames, it doesn’t always do so). These options can therefore be useful for troubleshooting how a page differs when iFrames are and aren’t flattened. To find out more, we recommend having a read of these useful resources:
Project Tags
Release Date: May 13th 2025
To make it easier to find the projects you need in Analyze (especially when you have a lot of projects), we’ve added the ability to tag projects so you can cut through the ‘noise’ of irrelevant ones. Once created, tags can be used to filter the project list views and can be assigned different colors so they’re quick to spot in the project list. You can apply multiple tags to a project, and you can also apply a tag to multiple projects at the same time.
Find out more about creating and assigning project tags.
Changed Status Code Report Fix
Release Date: May 13th 2025
We identified and fixed a bug that caused some pages with valid status changes (e.g. 200 to 301) to be excluded from the Changed Status Code report. The report’s filtering previously considered only the current Content-Type, so cases where a page changed from or to a different format were excluded. Now that this has been corrected, you may notice an increase in the pages returned in this report.
New Project Setting: Ignore X-Robots
Release Date: May 1st 2025
Staging environments often block indexing by adding a global noindex directive via the X-Robots-Tag: noindex response header. During a crawl, this makes every URL look non-indexable.
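For context, the kind of blanket header a staging server sends might look like the following minimal, hypothetical Node sketch (the port and directive values are illustrative):

```typescript
// Minimal sketch of a staging server adding a blanket X-Robots-Tag header.
// Port and directive values are illustrative only.
import { createServer } from 'node:http';

createServer((req, res) => {
  // Every response carries the header, so every URL looks non-indexable
  // to a crawler that respects it.
  res.setHeader('X-Robots-Tag', 'noindex, nofollow');
  res.setHeader('Content-Type', 'text/html');
  res.end('<html><body>Staging page</body></html>');
}).listen(8080);
```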
A new project-level setting, “Ignore X-Robots headers”, solves this:
- When enabled, Lumar will disregard all directives in X-Robots-Tag headers for that project.
- Meta robots tags, canonicals, and robots.txt are still respected—only the header is ignored.
- Find it in Step 4 of the project settings, under Advanced Settings ▸ Spider Settings ▸ Robots Directives, when creating or editing a project.
Use this for staging sites (or any controlled environment) where the header is present for safety, but you still need full indexability analysis.
Filter In / Filter Out on Breakdown Chart Click
Release Date: May 1st 2025
You can now choose whether to Filter In or Filter Out just by clicking a segment of any Breakdown chart.
- Previously, a click always filtered in the selected chart segment; reversing the filter meant opening the filter panel and updating it manually.
- Now, clicking a chart segment or legend item opens a modal with two options:
  - Filter In – include only the selection.
  - Filter Out – exclude the selection.
This makes exploratory analysis much faster—flip between focused and exclusion views without breaking your flow.
Breakdown Charts Compatible with Added / Moved / Missing
Release Date: May 1st 2025
Breakdown charts now stay perfectly in sync when you jump between the Added, Moved, and Missing tabs on any comparison report:
- Charts redraw instantly as you switch tabs, so the visual context always matches the table below.
- Any filters applied in one tab persist when you move to another.