
Pages with Blocked Resources

What This Means

This filter flags pages with resources (such as images, JavaScript and CSS) that are blocked from rendering by robots.txt or by an error. It only populates when JavaScript rendering is enabled; in the default 'text only' crawl mode, blocked resources appear under 'Blocked by Robots.txt' instead. This can be an issue because search engines may be unable to access the critical resources they need to render pages accurately.

What Triggers This Issue

This issue is triggered when a page's resources, such as images, JavaScript, and CSS, are blocked from rendering due to restrictions in the robots.txt file or because of an error. For example, a directive like:

Disallow: /wp-includes/

in the robots.txt file would block search engines from crawling any resource whose URL begins with https://www.getasky.com/wp-includes/, such as https://www.getasky.com/wp-includes/js/jquery/jquery.min.js.
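As a quick sanity check, the effect of such a directive can be reproduced with Python's standard-library robots.txt parser. This is a minimal sketch using the example domain and directive above; note that real search engine behaviour (e.g. Google's longest-match rule) can differ slightly from this parser.

```python
from urllib.robotparser import RobotFileParser

# The example robots.txt directive from above,
# applied to all user agents.
robots_txt = """User-agent: *
Disallow: /wp-includes/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The jQuery resource under /wp-includes/ is blocked from crawling...
blocked = parser.can_fetch(
    "Googlebot",
    "https://www.getasky.com/wp-includes/js/jquery/jquery.min.js",
)
print(blocked)  # False

# ...while a URL outside the disallowed path is not.
allowed = parser.can_fetch("Googlebot", "https://www.getasky.com/about/")
print(allowed)  # True
```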

How To Fix

Update the robots.txt file and resolve any errors so that all critical resources can be crawled and used to render the website's content. Resources that are not critical (e.g. a Google Maps embed) can be ignored.
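For the WordPress-style example above, one way to keep the broad Disallow while re-opening the critical assets is to add more specific Allow rules. The paths below are illustrative assumptions; Google applies the most specific (longest) matching rule, so the Allow lines take precedence for those subdirectories.

```
User-agent: *
# Hypothetical example: re-allow the script and style
# subdirectories that pages need in order to render.
Allow: /wp-includes/js/
Allow: /wp-includes/css/
Disallow: /wp-includes/
```

After updating robots.txt, re-crawl with JavaScript rendering enabled to confirm the resources no longer appear under this filter.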
