It's the second of our new weekly videos showcasing the improvements to The Crawl Tool over the last week. The first update seemed to cover a million things, and this second one covers only two. But they are big things.
The crawling and reporting aspects of The Crawl Tool are the exciting part, so they've been the focus of most of the work. That left things like the home page, the guide, and the blog looking a bit dated. It's been more noticeable since these weekly posts began, because they weren't that easy to read.
So all those pages have had a fresh, modern, clean redesign that makes everything easier to see, read, and digest.
A clear request from users is for backlink information. There are lots of ways we could use that: from a simple report that you can search or export, to per-page backlink information on the page detail pages, to a full backlink research tool. We've been unable to build these because access to backlink databases is expensive, and it would compromise our low-cost approach.
Until now...we've started building our own backlinks database. The database building process will take a couple of weeks more before we can start adding the features we just mentioned. Except for one: we already have a partial database. When you crawl a website, it will query that partial database and fill in a "Back Links" report for you on the dropdown. Until the database is complete, this is just a fraction of the backlink information we'll eventually be able to provide, but we figured we may as well give you what we have now.
We're considering opening an API to this data, and we'd be interested to hear any ideas for potential use cases, requirements, and so on, to sound out whether that's a worthwhile effort.
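To help think about use cases, here's a minimal sketch of the kind of records a backlinks API might return and how they could be grouped into a per-page report like the "Back Links" dropdown described above. The field names, URLs, and structure here are illustrative assumptions, not the actual API.

```python
# Hypothetical sketch only: record shape and field names are assumptions,
# not The Crawl Tool's real API.
from collections import defaultdict

# Each backlink record: the external page linking in, the page on the
# crawled site it points to, and the anchor text of the link.
backlinks = [
    {"source": "https://example.com/a", "target": "https://site.test/", "anchor": "home"},
    {"source": "https://example.org/b", "target": "https://site.test/pricing", "anchor": "pricing"},
    {"source": "https://example.net/c", "target": "https://site.test/", "anchor": "start here"},
]

def backlinks_by_page(records):
    """Group backlink records by the target page they point to."""
    report = defaultdict(list)
    for record in records:
        report[record["target"]].append((record["source"], record["anchor"]))
    return dict(report)

report = backlinks_by_page(backlinks)
print(len(report["https://site.test/"]))  # two external pages link to the homepage
```

A per-page grouping like this is roughly what a "backlink info by page" view would need; a research tool would layer searching and filtering on top of the same records.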