Google has recently made the IP addresses of its Googlebot service public.
Since the dawn of the Internet, search engines have been an important part of the web. People might not remember life before Google, but there were other search engines before it. That being said, once Google stepped in, it changed the rules of search forever.
It started with their PageRank algorithm, which took the world by storm in the mid-2000s. Everybody started following that trend, and many SEO practices were born in that period. Then Google brought Panda in 2011, Penguin in 2012, Hummingbird in 2013, and so on, all the way to BERT in 2019.
Another one of the advances they brought to the table was Googlebot. This is the name used for Google's two main crawlers: one simulates a user browsing from a desktop computer, and the other simulates a mobile user.
Recently, Google released the IP addresses of these crawlers so that webmasters can know whether the crawlers visiting their site really belong to Google, and when. Previously, you needed to perform a reverse DNS lookup to determine the origin and identity of a crawler.
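If you want to see what that older check looks like, here is a minimal sketch in Python, using only the standard library: look up the PTR record for the visiting IP, confirm the name ends in googlebot.com or google.com, then resolve that name back and make sure it points at the same address. The helper name and the example IP are purely illustrative.

import socket

def verify_googlebot_by_dns(ip: str) -> bool:
    """Classic verification: reverse DNS lookup, then forward-confirm the result."""
    try:
        # Reverse lookup: IP -> host name (e.g. crawl-66-249-66-1.googlebot.com)
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False  # no PTR record at all

    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False

    try:
        # Forward-confirm: the name must resolve back to the original IP,
        # otherwise the PTR record could simply be spoofed.
        addresses = {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except socket.gaierror:
        return False
    return ip in addresses

if __name__ == "__main__":
    print(verify_googlebot_by_dns("66.249.66.1"))  # example IP to test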
How can I do this?
You can find the list of crawler IPs here! They also offer a full list of IP addresses here. Keep in mind that these addresses might change from time to time, so it's important to keep checking the list. You can verify the IPs using either the command line method or the automated list method.
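As a rough illustration of the automated list method, the Python sketch below downloads the published JSON file and checks whether a visiting IP falls inside one of the listed ranges. The URL and the field names (prefixes, ipv4Prefix, ipv6Prefix) reflect the format Google commonly uses for these lists, so treat them as assumptions and double-check them against the official documentation.

import ipaddress
import json
import urllib.request

# Assumed endpoint for the published Googlebot ranges; verify against Google's docs.
GOOGLEBOT_RANGES_URL = "https://developers.google.com/search/apis/ipranges/googlebot.json"

def load_googlebot_networks(url: str = GOOGLEBOT_RANGES_URL):
    """Download the published list and return it as ip_network objects."""
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    networks = []
    for prefix in data.get("prefixes", []):
        cidr = prefix.get("ipv4Prefix") or prefix.get("ipv6Prefix")
        if cidr:
            networks.append(ipaddress.ip_network(cidr))
    return networks

def is_official_googlebot(ip: str, networks) -> bool:
    """True if the visiting IP falls inside one of the published ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks)

if __name__ == "__main__":
    nets = load_googlebot_networks()
    print(is_official_googlebot("66.249.66.1", nets))  # example IP to test

Since the list can change, it makes sense to re-download it periodically rather than caching it forever.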
Why is Googlebot so important?
Today, every website owner should expect to have crawlers on their website, and this, of course, consumes resources on your server. If you think something suspicious is crawling your website, this newly released information can make your life easier.
You can see the IP of the crawler that's visiting your site, and then check it against the list that Google has just released. Is the IP on the list? Good! Google is doing its work on your website, congratulations!
On the other hand, if the IP is not on the list, you can go ahead and block it to prevent it from wasting resources on your server.
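How you block it depends on your setup. As a minimal sketch, assuming a Linux server where you have root access and manage the firewall with iptables, the snippet below drops all traffic from a single address; on other platforms you would use your host's firewall or a managed service instead.

import subprocess

def block_ip(ip: str) -> None:
    """Drop all traffic from a single IP with iptables (Linux, requires root)."""
    subprocess.run(
        ["iptables", "-A", "INPUT", "-s", ip, "-j", "DROP"],
        check=True,
    )

# Example: block a crawler whose IP is not on Google's published list.
# block_ip("203.0.113.42")  # documentation/test address (RFC 5737)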
Server resources are one of a website's most valuable assets. If too many crawlers hit your site at the same time, it could slow down or even go down. That's bad news, considering that the real Google crawler could visit your site at any moment, and if the site is down it could hurt your performance in the search engine results.
Now you have another tool that makes it easier to keep useless bots from lurking on your site. You could also check with your hosting provider to see whether they offer services that help you manage this. For example, Cloudflare has tools that can be really useful in these situations.