If you’re creating content, it may seem obvious that you need Google to know you exist.
You’ve just created your blog and written your first masterpiece. You get that nervous feeling when you hit publish, knowing it’s off onto the internet for everyone to read.
But there’s a problem. Unless your audience already knows about it, Google may take weeks or even months to find it! I’ve been there, and trust me, it’s not fun waiting around.
Fortunately, in this article, I’m going to show you five ways to get Google to index your site and blog posts in record time.
It’s part of the fundamentals of being a webmaster. Never, ever overlook SEO techniques to help you not only get indexed but also get ranked.
Search engines work by regularly sending out crawlers (also known as web spiders) to find websites. Once a site is found, they add it to their index for filing.
It’s important to know here that indexing and ranking are not the same thing. Once you’ve been indexed, you can be found via a relevant search term.
Where you end up on the search page is your ranking.
What we’re talking about here is telling Google that you actually exist.
A study by Ahrefs found that almost 60% of the pages holding top-10 positions on page one of the search results were three years old or more.
Doesn’t look good if your page is new, right?
Now I’m not saying it’s impossible to rank on the first page, just know that time is our friend here and the quicker we get the page indexed, the quicker we start the clock.
- Go to Google
- Enter the search operator site:yourownsite.com
This will bring up all pages that Google has found on your site.
Note: It may not show all current pages because the crawlers haven’t added them to their central filing system yet.
- To see whether a specific page is indexed, add its URL slug after your domain
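For example (the domain and slug here are placeholders; swap in your own):

```
site:yourownsite.com
site:yourownsite.com/my-first-blog-post
```

The first query lists every indexed page on the domain; the second checks one specific page.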
An alternative way is to use Google’s Search Console (formerly known as Google Webmaster Tools) and use the URL inspection tool.
From here you can search for the exact URL and the results will look like this:
In this case, the URL has been found and indexed. Congratulations!
Otherwise, it will look like this:
Here Google hasn’t found it, indicated by the message “URL is not on Google”.
If your website or page hasn’t been found, clicking the “Request Indexing” button will start the indexing process for you.
It’s a good idea to get in the habit of clicking “Request Indexing” whenever you have a new page published to your website.
What you’re saying to Google is that there’s something new for them to go check out and it would be worth their while to index it sooner. Remember, it’s all about the user experience.
Of course, if there are issues with older pages, simply requesting indexing won’t always fix them.
In which case here are 5 strategic ways you can get Google to index you faster.
A sitemap, as the name suggests, is a map of your site: a file that lists all the pages within a domain.
Google’s web crawlers are good but they’re not that good. Every now and again they need a little help to find new pages or content within that domain.
The sitemap also tells Google how often it needs to crawl your website to find new and updated information to index.
While Google may eventually find and index your pages on its own, including a sitemap can shorten the time it takes, from around 24 hours down to just a few minutes.
The best thing is if you’re using WordPress and have an SEO plugin (like RankMath) then creating a sitemap is fairly straightforward.
Take the sitemap link and add it within Google Search Console.
One thing to be mindful of, though: having a sitemap generated doesn’t guarantee that your pages will be indexed by search engines.
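For reference, here’s what a minimal sitemap file looks like (the URL and date are placeholders; a plugin like RankMath generates and updates this for you automatically):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourownsite.com/my-first-blog-post/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
</urlset>
```

Each `<url>` entry points crawlers to a page, and `<lastmod>` hints at when it last changed so Google knows whether a recrawl is worthwhile.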
We know so far that web crawlers like to go from link to link. So if your page doesn’t have any internal links pointing to it, you’re effectively breaking the chain (pardon the pun).
A tip here is to link to your new page from one of your most popular pages, the one gaining the most traffic.
Why? Because Google will likely recrawl a popular page more often than a non-popular one.
To get the most out of internal linking, make it a “follow” link and link out to relevant pages.
As an example, if you wanted to add an internal link to an SEO guide then it would make sense to want to add a link to SEO strategies as it is a relevant article to the audience and something they would be interested in.
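In HTML terms, a “follow” link is simply the default; it’s only a `rel="nofollow"` attribute that tells crawlers not to pass signals through the link. A sketch, with hypothetical URLs:

```html
<!-- A standard "follow" internal link (the default; nothing extra needed) -->
<a href="/seo-strategies/">our guide to SEO strategies</a>

<!-- A nofollow link: crawlers are asked not to pass ranking signals through it -->
<a href="/seo-strategies/" rel="nofollow">our guide to SEO strategies</a>
```

For internal links to pages you want indexed, stick with the first form.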
There are a few factors that determine how many pages Google will crawl on any given day: the health of the website, site speed, how many errors there are (use Google Search Console to check), and even how many links are pointing to the site.
According to Google, crawl budget mostly matters for larger websites, so the bigger your site is, the more attention you should pay to it.
If you have pages that just aren’t converting for you, it may make more sense to remove them altogether. The benefit here is that you free up your crawl budget.
If you’re able to obtain a link from a high-ranking website that points to one of your pages, you’re effectively telling Google that your site is valuable. It’s a similar thing to someone voting for you.
This can be beneficial because it prompts Google to look at your page and hopefully index it, even better if it ranks you at the same time.
Because Google sees value when someone links back to you, your site will get crawled more often, meaning a faster chance of getting indexed, and the snowball keeps rolling.
Do not get overwhelmed by this file name. The purpose of a robots.txt file is to tell the web crawlers which pages to crawl or not to crawl.
To check to see if you have any indexing issues go to yourwebsite.com/robots.txt and look for this bit of text:
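A rule that blocks your entire site looks like this:

```
User-agent: *
Disallow: /
```

Here `User-agent: *` means the rule applies to all crawlers, and `Disallow: /` blocks every path on the site.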
If you see a / in the Disallow line, it’s telling the bots not to crawl any pages on your site at all. But it’s OK: you can simply delete the / and the block is lifted.
Yes, it is that easy!
Advanced Tip: Use the Disallow directive for any pages you don’t want the bots to crawl, as a way to preserve your crawl budget.
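If you want to double-check what your rules actually block before deploying them, Python’s standard library ships a robots.txt parser. A minimal sketch, using a hypothetical robots.txt that disallows admin and tag pages (the domain and paths are placeholders):

```python
# Sketch: verify which paths a robots.txt blocks, using Python's
# standard-library parser (no network access needed; we parse a string).
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content: block low-value sections to
# preserve crawl budget, allow everything else.
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /tag/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Regular posts are crawlable; the disallowed sections are not.
print(parser.can_fetch("Googlebot", "https://yourownsite.com/my-first-blog-post/"))  # True
print(parser.can_fetch("Googlebot", "https://yourownsite.com/tag/seo/"))             # False
```

This only tells you what the rules say; it doesn’t guarantee how quickly Google will act on them, but it catches an accidental site-wide `Disallow: /` before it costs you weeks of indexing time.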
If you’re not found in the search engines, no one will be able to find you (unless you tell them) and it’ll be that much harder to obtain organic traffic.
But you don’t need to be a computer whiz to get started.
Tools like Google Search Console and SEO plugins give you full control over what gets indexed, along with plenty of information on how your website is performing.
Combine that with these indexing tactics and you’ll be on your way to getting indexed and crawled in record time.
What’s the quickest time you’ve been indexed by Google? Let me know in the comments below.
It’s time to be the pilot of your life and not just the passenger.
Chris Bournelis is a blogger and part-time web developer. He has been working in online business since 2015. Join Chris here at ChrisBournelis.com for the best SaaS reviews and tips to get the most out of your online business.