In the third of our weekly update on new features, we introduce the Public API.
A lot of effort has been put into the user interface of The Crawl Tool to make it easy to use and to access the data on your crawled projects. But you might want to build your own small tool or perform a simple task programmatically. Now you can, with our public API.
At the moment there are three functions you can call.
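As a rough sketch of what calling the API from your own script might look like, the snippet below builds an authenticated request. Note that the base URL, endpoint name, and bearer-token auth scheme here are assumptions for illustration only; check the API documentation for the real values.

```python
import urllib.parse

# Assumed base URL for illustration -- not the documented endpoint.
API_BASE = "https://thecrawltool.com/api/v1"

def build_request(endpoint: str, api_key: str, **params) -> tuple[str, dict]:
    """Construct the URL and headers for an authenticated API call.

    The auth header format is a common convention (Bearer token),
    assumed here rather than taken from the API docs.
    """
    query = urllib.parse.urlencode(params)
    url = f"{API_BASE}/{endpoint}" + (f"?{query}" if query else "")
    headers = {"Authorization": f"Bearer {api_key}"}
    return url, headers

# Hypothetical example: list crawled projects, limited to 10 results.
url, headers = build_request("projects", api_key="YOUR_KEY", limit=10)
```

From there you would pass `url` and `headers` to any HTTP client (for example, `urllib.request` or `requests`) and parse the response, letting you pull crawl data into spreadsheets, dashboards, or your own tooling.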
This is the first step in our goal of enabling all types of website owners, SEOs, and agencies to improve their sites in whatever way they want, and we'll keep building on it. For example, there's a large backlink project in the works that we hope to add to the API in the near future.
Start with a free crawl of up to 1,000 URLs and get actionable insights today.
Try The Crawl Tool Free