Most people have heard of “crawl rate”, but what is it? Crawl rate is the frequency and speed at which Google's bots – and those of other search engines – “crawl” your website looking for new content to index. Even though most people know being indexed is a good thing, many may not realize that crawl rate is more about speed than frequency. A high crawl rate means your website has a better chance of being properly indexed and discovered. Here are some pointers on how to improve your crawl rate.
SEO vs. Crawlers
Although SEO and crawlers are both related to your website and will affect its traffic, what is the difference? To put it simply, SEO is optimization for users, and Crawl Rate Optimization (CRO) is optimization for the Google spider.
1. Why Spiders Are Awesome
Crawl rate is assuming greater significance in the SEO process as businesses develop their websites and feed the Google spider more information daily. One myth about crawl rate is that it refers to the frequency of crawling. In fact, crawl rate refers to the speed of crawling. The question is: how do you get Google to crawl faster? Here are some tips and tricks to ensure the Google spiders visit you regularly and find information quickly. Crawling, according to the SEO periodic table, is one of the more effective factors (rated +3) for SEO.
2. Feed the Bot
Content is the fodder on which the bot feeds. Have a blog on your website that not only provides information to your visitors (human and bot), but also invites them to take a look through the rest of the site right then and there. Some sites update daily, inviting Google back for a look, but if you can’t manage that – and it’s overkill anyway – posting one to three times a week is a good bet.
3. Make Sure You Have 99% Uptime
Your website will be down some of the time for maintenance. But if your website is down even half the time, Google can’t pay you a visit – and neither can humans. Getting a good hosting plan that keeps your site up 99% of the time is as important as having a well-developed website. This article suggests some great hosting plans. High uptime not only ensures that the Google spider can crawl and do its job, but also that you are available when visitors come knocking – remember, the internet never sleeps! Check out this article, which suggests some great tools to check your uptime.
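The 99% figure is worth putting in perspective. Here is a quick back-of-the-envelope sketch (the helper function below is hypothetical, just simple arithmetic) showing how much monthly downtime a given uptime percentage still permits:

```python
def allowed_downtime_minutes(uptime_pct: float, days: int = 30) -> float:
    """Maximum downtime, in minutes, that a given uptime percentage allows."""
    total_minutes = days * 24 * 60
    return (1 - uptime_pct / 100) * total_minutes

# 99% uptime over a 30-day month still allows roughly 432 minutes (7.2 hours) down
print(f"99%   -> {allowed_downtime_minutes(99):.1f} min/month")
print(f"99.9% -> {allowed_downtime_minutes(99.9):.1f} min/month")
```

Even 99% uptime leaves over seven hours of potential downtime a month, which is why better hosting plans advertise 99.9% or higher.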
4. Pay Attention to the Title
Make sure your page title tells the story of the page. Having a common <title> tag for all your pages may be a good idea programming-wise, but it’s not so good when you consider CRO. When you write about women’s wear, having “Women’s Wear” in your title only tells Google that the page has content about women’s clothing. Separate titles like “Women’s Summer Wear” and “Accessories for Black Outfits” tell the crawler exactly what is on each page. More specific, even unique, titles and meta descriptions help both your visitors and Google understand the content a lot better. Google Webmaster even encourages this.
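As a sketch (the shop name, URL, and copy below are invented for illustration), the difference looks like this:

```html
<!-- Generic: the same title on every page tells the crawler very little -->
<title>Women's Wear</title>

<!-- Specific: a unique title and meta description for each page -->
<head>
  <title>Women's Summer Wear | ExampleShop</title>
  <meta name="description"
        content="Lightweight linen dresses, tops, and skirts for summer.">
</head>
```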
5. Make Use of robots.txt
There is nothing more frustrating than traveling a complicated route without a map. Make navigating your website easy. Give the spider a sitemap that tells it what to look for and where to find it. Websites today have a lot of images and other media. While this is necessary, all that flash can confuse a bot that is looking for words. A sitemap can guide the bot to the relevant pages where it can get the information it needs to index your site. It will also help the many users visiting your site. There are many different ways to design your sitemap, but here are a few suggestions.
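A minimal XML sitemap following the standard sitemaps.org protocol looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/womens-summer-wear</loc>
    <lastmod>2024-05-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

List one `<url>` entry per page you want crawled, and keep `<lastmod>` honest so the bot knows what has actually changed.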
When Google pays you a visit, it uses robots.txt to understand which pages are available for crawling. Use the robots.txt file to tell the crawler what not to crawl. Use it to keep Google out of routine, mundane, or unchanging pages. This will speed up the crawl and encourage the bot to return sooner for another visit. One difference between humans and the Google spider is that humans need repetition, while if Google sees too much repetition, it may pass over your site on its next round. So make sure it stays out of repetitive pages, media-heavy pages, and other irrelevant pages. Save those for your customers.
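A sketch of such a robots.txt (the paths are hypothetical; the `User-agent`, `Disallow`, and `Sitemap` directives themselves are standard) might look like:

```text
# robots.txt — served at https://www.example.com/robots.txt
User-agent: *
Disallow: /admin/       # routine back-office pages
Disallow: /search       # auto-generated, repetitive result pages
Disallow: /media/raw/   # heavy originals the crawler doesn't need

# Point the crawler at your sitemap
Sitemap: https://www.example.com/sitemap.xml
```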
6. Reduce Loading Time
Let’s get back to your hosting plan for a moment. If your site loads slowly – for reasons such as a heavy media load – the bot will not wait around. A site that loads quickly can be crawled and indexed quickly by the Google spider. One may argue that loading time is a necessary evil because media is required. However, a media-heavy web page not only takes time to load, it is also likely to confuse spider and human visitors alike! So save the media for the converts. Give the spider words to understand your product. By all means throw in an image or two so that people get a feel for the product, but keep the resolution low, and scale the file down or compress it if you have to – aim to keep image files under 1 MB. Give the Google spider more information about your business than pictures of it. That will give you the advantage of the Google index as well as improve your conversion rate. If you’d like to quickly check your load time, try the PageSpeed tool from Google.
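As a sketch (the file name and dimensions are invented), a scaled-down image with explicit dimensions and deferred loading might look like:

```html
<!-- Explicit width/height prevents layout shift; loading="lazy" (a standard
     HTML attribute) defers off-screen images until they are needed -->
<img src="/images/summer-dress-800w.jpg"
     width="800" height="600"
     loading="lazy"
     alt="Light blue linen summer dress">
```

Serving an image already scaled to its display size, rather than letting the browser shrink a huge original, is one of the cheapest load-time wins available.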
7. Use Google Tools Effectively
Over the years, Google has offered a number of tools like Google AdWords and Webmaster Tools to help you optimize your website for search engine crawling. Use these tools effectively to tell the Google spider how to crawl your site and what to look for. Google AdWords is more about SEO wording, but it can still be very effective for your site. Webmaster Tools is very helpful for telling Google how to crawl your site: for instance, if your site is in multiple languages, the spider will have trouble crawling it unless you provide that information to make its job easier. In its own article about Google Bot, Google says, “Google spider shouldn’t access your site more than once every few seconds on average.”
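For the multilingual case, the standard way to declare language versions is with `hreflang` link elements in each page’s `<head>` (the URLs below are placeholders):

```html
<!-- Tell the crawler which language versions of this page exist -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/womens-wear">
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/vetements-femme">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/womens-wear">
```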
8. Interlink Your Content
Back to maps once again! If your web pages are well linked, the spider can crawl faster, since one page leads it to the next – so interlink your content. Interlinking creates a veritable route map for the bot to follow. A well-linked site is easier and faster to crawl than a site with many independent pages. Again, Google’s Webmaster Tools is a handy way to check whether your interlinking is optimal. It also lets Google know you are updating.
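In practice this just means descriptive anchor text pointing at related pages (the URLs and copy below are invented):

```html
<p>
  Pair this dress with our
  <a href="/accessories-for-black-outfits">accessories for black outfits</a>,
  or browse the full <a href="/womens-summer-wear">summer collection</a>.
</p>
```

The anchor text itself tells the spider what the linked page is about, which plain “click here” links do not.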
9. Use Tags and Alt Text
No website is complete without images, so following the less-is-more rule, caption the ones you have effectively. The images you do have should be accompanied by alt text, captions, and tags that help the Google spider understand what the images are about. Make good use of all the attributes of the <img> tag. There are many tools on the web to check your alt text, like this one. Again, while alt text should be brief, give as much information as you can so that a clear picture can be formed.
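Putting those pieces together, a fully described image (file name and copy invented for illustration) might look like:

```html
<figure>
  <img src="/images/black-clutch-bag.jpg"
       alt="Small black leather clutch bag with a gold clasp"
       title="Black leather clutch">
  <figcaption>Black leather clutch – pairs well with evening outfits</figcaption>
</figure>
```

The `alt` text is what the spider (and screen readers) actually read, while the visible caption serves your human visitors.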
As the internet gets flooded with more and more websites, Google is forced to decide which sites NOT to visit. Improving your crawl rate is not just about doing the right things, but also about knowing what not to do. In fact, it would not be wrong to say that the next step after SEO is CRO (Crawl Rate Optimization). They happen together and are built into the fabric of your site and content creation. So here’s to better building!
If you found this piece helpful, follow Oursky’s Medium Publication for more startup/entrepreneurship/project management/app dev/design hacks! 💚