Web pages appear to be missing after crawl
If your crawl completed but some of your web pages appear to be missing, check your Source Gap analysis: your web source may not have been selected, or it may have failed to crawl. Please refer to 'Crawl not starting' below.
On the Crawl Overview page (in the left-hand navigation) you'll see a 'Pages in Crawl Sources' chart. In the example below, you can see that the Backlinks and Google Search Console sources found all of the URLs, while the web source found only one.
HTML element appears in the source but doesn't seem to be recognized by Lumar
The webpage might not have been crawled properly because Lumar was blocked, or because the page failed to load or returned a soft 404. You can verify this by checking the status code reported in Lumar, or by looking for other discrepancies in the report.
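If you want to verify the status code outside of Lumar, a small script works too. Below is a minimal sketch (Node.js 18+ with its built-in fetch, run as an ES module; the URL and text patterns are placeholders) that flags the classic soft 404 pattern, where the server returns 200 OK but the body is really an error page:

```js
// Fetch a page and flag a likely soft 404: a 200 response whose body
// looks like an error page. The URL and the regex are examples only;
// adjust them to whatever your site's error template actually says.
const url = 'https://example.com/some-page';

const res = await fetch(url, { headers: { 'User-Agent': 'soft-404-check' } });
const body = await res.text();

console.log('HTTP status:', res.status);
if (res.status === 200 && /page not found|404/i.test(body)) {
  console.log('Likely soft 404: 200 OK with error-page content.');
}
```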
The second reason could be a tag that should only exist in the body, such as an iframe or div, residing in the head of the page's HTML. This causes our parser to conclude that the head has ended early and the body has begun, so the head metrics that follow go unrecognized and unreported by Lumar.
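This behaviour follows the standard HTML parsing rules, so you can reproduce it in a browser console with DOMParser. In this minimal sketch (the markup is made up), the iframe forces the head to close, and the title that follows it ends up in the body:

```js
// Body-only elements such as <iframe> are not allowed in the <head>, so the
// parser closes the head early and everything after them lands in the body.
const doc = new DOMParser().parseFromString(
  '<html><head><iframe src="/x.html"></iframe><title>Page</title></head></html>',
  'text/html'
);
console.log(doc.head.children.length);        // 0: the head ended at the iframe
console.log(doc.body.querySelector('title')); // the <title> was pushed into the body
```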
- Moving the tag from the head to the body will resolve this.
- If you have the JS rendering add-on, you can take advantage of our Custom JS Scripts feature. This script will remove iframes from the head during the crawl:

  ```js
  document.head.querySelectorAll('iframe').forEach(function (e) { e.remove(); });
  ```
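If the offending element is a div rather than an iframe, the same pattern applies. A hypothetical variant (the selector here is an assumption; adjust it to match the elements actually leaking into your head):

```js
// Hypothetical variant of the script above: remove both stray <div>s and
// <iframe>s from the head during rendering.
document.head.querySelectorAll('div, iframe').forEach(function (e) { e.remove(); });
```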