If you’re creating content then it may seem obvious that you need Google to actually know you exist.
You’ve just created your blog and written your first masterpiece. When you hit publish, you get that nervous feeling that it’s off onto the internet for everyone to read.
But there’s a problem. Unless your audience already knows about it, Google may take weeks or even months to find it! I’ve been there and trust me, it’s not fun waiting around. I would check my Google Analytics dashboard at least ten times a day to see if any traffic had come to my new blog post.
Fortunately, in this article, I’m going to show you the best ways to get Google to index your site and blog posts in record time.
It’s part of the fundamentals of being a webmaster: never overlook the SEO techniques that help you not only get indexed but also get ranked.
- What do the Terms Crawling and Indexing Actually Mean?
- How to Find Out if You’re Already Indexed on Google?
- How to Get Google to Index Your Site
- Bottom Line
A search engine works by regularly sending out crawlers (known as search engine spiders) to find websites and the web pages within them. Once a page is found, it is added to the Google index for filing.
It’s important to know that Google indexing and ranking are not the same thing. Once you’ve been indexed, you can be found via a relevant search term. Simply search Google itself, or use an alternative method through your Google Search Console admin page.
Where you end up on the search page is your ranking.
What we’re talking about here is telling Google that you actually exist.
A study by Ahrefs found that almost 60% of the pages holding top 10 positions on page one of the search results were three years old or more.
Doesn’t look good if your page is new, right?
Now I’m not saying it’s impossible to rank on the first page, just know that time is our friend here and the quicker we get the page indexed, the quicker we start the clock.
- The first step is to go to Google.
- Enter the search term modifier site:yourownsite.com
This will bring up all pages that Google’s search engine has found on your site.
Note: It may not show all current pages because the crawlers haven’t added them to their central filing system yet.
- To see if the web page you want is actually indexed, add the URL slug after your website in the search bar.
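For example, to check whether a specific post is indexed (the domain and slug here are placeholders), you would search:

```
site:yourownsite.com/my-first-blog-post
```

If the page appears in the results, it’s indexed; if nothing comes back, Google hasn’t added it yet.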
An alternative way is to use Google’s Search Console (formerly known as Google Webmaster Tools) and use the URL inspection tool.
From here you can search for the exact URL and the results will look like this:
In this case, the URL has been found and indexed. Congratulations!
Otherwise, it will look something like this:
Here Google hasn’t found it, noted by the words “URL is not on Google”. Other common statuses include:
- Discovered – currently not indexed.
- Crawled – currently not indexed.
- Blocked by robots.txt
For a full list of what these statuses mean and more, see Google’s Search Console Help documentation.
If you’ve found that your website or page hasn’t been indexed, clicking the “Request Indexing” button will start the indexing process for you.
If you’ve added an RSS feed to your home page, it’s likely that Google will crawl it but not index it. Seeing as we’re not normally looking to make money on feed pages, it makes sense that we don’t want them showing up in organic search queries.
It’s a good idea though to get in the habit of clicking the “Request Indexing” button whenever you have a new page published to your website or you have updated the page with new content.
What you’re saying to Google is that there’s something new for them to go check out and it would be worth their while to index it sooner. Keep in mind that it’s all about the user experience.
Of course, if there are issues with older blog content, simply requesting indexing may not fix the issue every time.
In which case here are some strategic ways you can get Google to index you faster.
A sitemap, as the name suggests, is a map of your site. It is an XML file that contains a list of all the pages and posts within a certain domain.
Google’s web crawlers are good but they’re not that good. Every now and again they need a little help to find new pages or content within that domain.
Sometimes also known as the XML sitemap, it tells search engine bots how often they need to crawl your website to find new and updated information to index.
While Google may eventually find your pages and index them on its own, including a sitemap can shorten the wait, sometimes from around 24 hours down to just a few minutes.
The best thing is if you’re using WordPress and have an SEO plugin (like RankMath) then creating a sitemap is fairly straightforward.
Then take the sitemap link and add it within Google Search Console.
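For reference, a minimal sitemap is just an XML list of URLs with last-modified dates (the domain and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourownsite.com/</loc>
    <lastmod>2021-06-01</lastmod>
  </url>
  <url>
    <loc>https://yourownsite.com/my-first-blog-post/</loc>
    <lastmod>2021-06-15</lastmod>
  </url>
</urlset>
```

An SEO plugin generates and updates this file for you automatically; the file’s URL is what you submit in Search Console.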
One thing to be mindful of, though: just because you have a sitemap generated doesn’t guarantee that your pages will be included in the search engines.
We know so far that web crawlers like to go from link to link. So if your page doesn’t have any internal links pointing to it, you’re effectively breaking the chain (pardon the pun).
Pro Tip: Create an internal link from one of your more important pages, the one gaining the most traffic, to your new page.
Why? Because Google will likely recrawl a popular page more often than a non-popular one.
To get the most out of internal linking, make it a “do-follow” link and link out to relevant pages.
As an example, if you wanted to add an internal link to an SEO guide then it would make sense to want to add a link to SEO strategies as it is a relevant article to the audience and something they would be interested in.
Finally, when it comes to adding internal links, be sure there are no broken links throughout your website’s content. Any indexed pages could take a hit in search rankings because the chain has effectively been broken.
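If you want to audit internal links yourself, the first step is collecting every link on a page. Here’s a minimal sketch using only Python’s standard library; the HTML snippet and paths are made up for illustration:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page content; in practice you would fetch each page's HTML.
html = (
    '<p>Read our <a href="/seo-guide">SEO guide</a> and '
    '<a href="/seo-strategies">SEO strategies</a> articles.</p>'
)

extractor = LinkExtractor()
extractor.feed(html)
print(extractor.links)  # ['/seo-guide', '/seo-strategies']
```

From here, you could request each collected URL and flag anything that doesn’t return a 200 response as a broken link.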
There are a few factors that determine how many pages Google will crawl on any given day.
Things like the health of the website, site speed, how many errors there are (use Google Search Console for this), and even how many links are pointing to the site.
According to Google, crawl budget is mainly a concern for larger websites, so the bigger your site grows, the more attention you may want to pay to it.
If you have pages that just aren’t converting for you, it may make more sense to remove them altogether. The benefit here is an increased crawl budget for the pages that matter.
If you’re able to obtain a link from a high-ranking website that points to one of your pages, you’re effectively telling Google that your site is valuable. It’s similar to someone casting a vote for you.
This can be beneficial because it encourages Google to look at the page and hopefully index it, or even better, rank it at the same time.
Because Google sees value when someone links back to you, your site will get crawled more often, meaning a faster chance of getting indexed, and the snowball continues.
Don’t be intimidated by this file name. The purpose of a robots.txt file is to tell web crawlers which pages to crawl or not to crawl.
To check to see if you have any indexing issues go to yourwebsite.com/robots.txt and look for this bit of text:
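A robots.txt that blocks crawlers from your entire site looks like this:

```
User-agent: *
Disallow: /
```

The `User-agent: *` line means the rule applies to all bots, and `Disallow: /` blocks every path on the domain.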
If you see a / in the Disallow section, it’s telling the bots not to crawl any pages on your site at all. The fix is simple: remove the / (or whichever rule is blocking the pages you want crawled).
Yes, it is that easy!
Advanced Tip: Use the disallow function for any pages you don’t want the bots to crawl as a way to boost your crawl budget.
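To sanity-check your rules before relying on the live file, Python’s standard library ships a robots.txt parser. A minimal sketch, assuming example.com and made-up paths:

```python
from urllib import robotparser

# A robots.txt that blocks everything (Disallow: /).
blocking = robotparser.RobotFileParser()
blocking.parse("User-agent: *\nDisallow: /".splitlines())

# A robots.txt that only blocks an admin area.
selective = robotparser.RobotFileParser()
selective.parse("User-agent: *\nDisallow: /wp-admin/".splitlines())

# can_fetch() reports whether a given crawler may fetch a URL.
print(blocking.can_fetch("*", "https://example.com/my-first-blog-post"))   # False
print(selective.can_fetch("*", "https://example.com/my-first-blog-post"))  # True
print(selective.can_fetch("*", "https://example.com/wp-admin/settings"))   # False
```

You can also point `RobotFileParser` at your live file with `set_url()` and `read()` to test the rules actually being served.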
If you’re not found in the search engines, no one will be able to find you (unless you tell them through your social media accounts), and it’ll be that much harder to gain organic traffic and climb the SEO rankings.
But you don’t need to be a computer whiz to get started, you just need a good content marketing strategy combined with good SEO practices. In other words, have high-quality content and inbound links.
Using tools like Google Search Console and SEO plugins means you have full control over what gets indexed, and they also give you plenty of information on how your website is performing.
This works with landing pages, an e-commerce site, a WordPress site, and even an entirely new site.
Something to be aware of is that not every page on your own site may get indexed at first.
However, with these tactics, you’ll be on your way to getting crawled and indexed in record time, with plenty of website visitors to show for it.
What’s the quickest time you’ve been indexed by Google? Let me know in the comments below.
It’s time to be the pilot of your life and not just the passenger.
Chris Bournelis is a blogger and part-time web developer. He has been working in online business since 2015. Join him here at ChrisBournelis.com for the best SaaS reviews and tips to get the most out of your online business.