A new feature on The Crawl Tool is the ability to see what offsite JavaScript a website is using. There are two reports: "Offsite JS Scripts" and "Offsite JS By Page". The former lists all offsite scripts found across the site, which is useful for a general overview; the latter lists them by page, to help pinpoint exactly where they are used.
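As a rough illustration of the difference between the two views, here is a minimal sketch. It assumes crawl results are available as a simple mapping of page URL to the offsite script URLs found on that page - the data and URLs are made up and this is not The Crawl Tool's actual format:

```python
# Hypothetical crawl results: page URL -> offsite script URLs found on that page.
crawl_results = {
    "https://example.com/": [
        "https://cdn.example-cdn.com/lib.min.js",
        "https://analytics.example-tracker.com/tag.js",
    ],
    "https://example.com/contact": [
        "https://cdn.example-cdn.com/lib.min.js",
    ],
}

# "Offsite JS Scripts"-style view: every offsite script used anywhere on the site.
site_wide = sorted({script for scripts in crawl_results.values() for script in scripts})

# "Offsite JS By Page"-style view: the same data kept per page, so you can see
# exactly which pages pull in which script.
by_page = {page: sorted(scripts) for page, scripts in crawl_results.items()}

print(site_wide)
print(by_page)
```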
There are several reasons you might want to know this information.
It has become common practice amongst many web developers to link to offsite CDN copies of the scripts they use. This has the advantage that those scripts can update automatically without someone having to manage the site and keep them up to date - a kind of lazy man's security benefit. Combined with the CDN likely being closer to the end user, this is often cited as making pages faster.
That would probably have been true in the early days of the internet. Various things slow down web requests, starting with converting a domain name to an IP address, continuing with the latency of setting up a connection to the server, and going through to the overhead of requesting pages. Modern web protocols minimize these by batching requests to a particular server together and reducing the overhead, mostly connecting just once to each server.
The key phrase there is "mostly connecting just once to each server". If there is just one server, it is very likely to be quicker than if the browser must connect to multiple servers to fetch the data, particularly when that data isn't very large, as is the case with many scripts.
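To get a feel for what each extra origin costs, here is a rough sketch that times DNS resolution plus a TCP and TLS handshake for a few hostnames. The hostnames are placeholders, and real numbers depend heavily on network conditions and caching:

```python
import socket
import ssl
import time


def connection_setup_time(host: str, port: int = 443) -> float:
    """Time DNS resolution plus TCP connect plus TLS handshake for one origin."""
    context = ssl.create_default_context()
    start = time.perf_counter()
    ip = socket.gethostbyname(host)  # DNS lookup
    with socket.create_connection((ip, port), timeout=5) as raw_socket:
        with context.wrap_socket(raw_socket, server_hostname=host):  # TLS handshake
            pass
    return time.perf_counter() - start


# Placeholder hostnames: your own site plus a couple of third-party script hosts.
for host in ["example.com", "cdn.jsdelivr.net", "www.googletagmanager.com"]:
    print(f"{host}: {connection_setup_time(host) * 1000:.0f} ms to set up")
```

Each new origin pays this setup cost before a single byte of script arrives, which is the overhead that serving everything from one server avoids.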
Moving scripts from remote hosting to the website itself can therefore have performance advantages, and the reports let you see which scripts you might want to move.
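If you do decide to self-host a script, the move itself can be as simple as downloading the file into your site's static assets and pointing the script tag at the local copy. A rough sketch - the CDN URL and output path are placeholders, and you take on responsibility for keeping the file updated yourself:

```python
import urllib.request
from pathlib import Path

# Placeholder CDN URL and local static directory.
remote_script = "https://cdn.example-cdn.com/lib/1.2.3/lib.min.js"
local_path = Path("static/js/lib.min.js")

# Download the script once and save it alongside your own assets.
local_path.parent.mkdir(parents=True, exist_ok=True)
with urllib.request.urlopen(remote_script) as response:
    local_path.write_bytes(response.read())

# The page's script tag would then reference /static/js/lib.min.js
# instead of the CDN URL.
```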
Having your website run code from a remote site means you have to trust that site not only to deliver safe code now but to keep delivering safe code in the future. A recent, well-known example of why this is a problem is the case of Polyfill (we won't link to the actual site, for reasons that will become obvious). This was a popular script that added functionality to older browsers, and many sites loaded it directly from its CDN.
A company bought the GitHub repository and domain for the project and started injecting code into some websites that redirected users to malicious and scam sites.
Linking to a remote script is equivalent to giving whoever controls that script complete control of your site, which you may not want to do.
With privacy policies and cookie policies being something that virtually every website needs nowadays, it's important to know exactly which third-party services you are relying on and quite likely sharing data with. A scan for scripts gives you an instant overview of where the website's frontend is communicating and, thus, where it is likely sending data.
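A very rough version of that kind of scan can be done for a single page with nothing but the Python standard library. This is only a sketch - the page URL is a placeholder, and a real crawl covers every page and catches scripts that other scripts inject later:

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urlparse


class ScriptSrcParser(HTMLParser):
    """Collect the src attribute of every <script> tag on a page."""

    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)


page_url = "https://example.com/"  # placeholder page to check
with urllib.request.urlopen(page_url) as response:
    html = response.read().decode("utf-8", errors="replace")

parser = ScriptSrcParser()
parser.feed(html)

# Keep only hosts that differ from the page's own host.
page_host = urlparse(page_url).hostname
offsite_hosts = {
    urlparse(src).hostname
    for src in parser.sources
    if urlparse(src).hostname and urlparse(src).hostname != page_host
}
print(offsite_hosts)  # the third parties this page's frontend talks to
```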
Start with a free crawl of up to 1,000 URLs and get actionable insights today.
Try The Crawl Tool Free