How to Get Google to Index Your Website & Blog Pages in Under 15 Minutes
Yes, you can get Google to index your website in less than 15 minutes.
What if Google ranked your new blog posts and web pages every time you published them? What if people could find your fresh blog post on Google SERPs within a few seconds?
In today's SEO tutorial, you will learn not only why Google doesn't index your website and how to get search engines to crawl your site, but also how to get them to index your site in less than 15 minutes. (I will show you how to do it step by step later.)
I have tried a few methods to get my websites and new blog posts indexed faster, and I have found 3 effective methods to get Google to index your website and updated web pages quickly. In some cases, Google indexed URLs in less than 5 minutes.
Yes, less than 5 minutes.
Before we start learning how to get a new website and its pages indexed on Google, let's look at what Google indexing is and the reasons why Google does not crawl and index your web pages. These are important because they help you survive future Google updates and protect your content.
What Is Google Indexing, and How Does Google Index Web Pages?
Crawling and indexing is the complicated process search engines use to gather information from the internet. If you search for your Twitter handle on Google, you will find your Twitter profile listed with your name as the title. The same goes for other kinds of information, such as image search.
Basically, before indexing a website, search engines must crawl it. And before crawling a page, search engines must be able to follow a link to it. In many cases, webmasters don't want search engines to follow a link, and in those cases search engines cannot index the web page or rank it on SERPs.
Every search engine uses programs called web spiders or web crawlers (also known as search engine bots) to crawl web pages. Google's famous web crawler is known as Googlebot; it crawls content all around the internet and gathers useful information.
For the Bing search engine, it's Bingbot. Yahoo gathers its web results from Bing, so Bingbot supplies new content to both Bing and Yahoo. Baidu uses a web spider called Baiduspider, and Yandex uses YandexBot to find new content for the Yandex web index.
Why Doesn't Google Index Your Web Pages?
There are many reasons why Google may not index your web pages. As you already know, indexing is the second of the three steps of search engine ranking. First, Googlebot follows a link (URL) and crawls the web page to gather useful information; then it adds the page to Google's index. When a person searches for something on Google, Google applies different factors (famously known as Google ranking factors) and lists the results within a few milliseconds.
You can easily find out whether Google has already indexed your website and web pages by doing a site: search, for example: site:problogtricks.com
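If you want to script this check, here is a minimal Python sketch that simply builds the Google search URL for a site: query so you can open it in a browser (the domain and permalink are just examples; note that scraping the results programmatically is against Google's terms):

```python
from urllib.parse import quote_plus

def index_check_url(page_url):
    """Build a Google search URL for a site: query on the given URL or domain."""
    return "https://www.google.com/search?q=" + quote_plus("site:" + page_url)

# Check a whole domain, or one specific post
print(index_check_url("problogtricks.com"))
print(index_check_url("problogtricks.com/2014/11/image-seo-optimization-tips.html"))
```

If the result page shows no listings, Google has not indexed that URL yet.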
9 Reasons why Google doesn’t index your web pages
You are not updating your blog regularly
This is one of the top reasons why Google doesn't index your web pages. If you don't add new content or update older posts, how do you convince Google that your blog is still alive, not dead?
The easiest way to get Google to index your web pages every day is to update your blog. You don't need to publish new posts every day; instead, update older posts. Add up-to-date content and information, and promote it on social networks. That way you will not only convince Google that you are an active blogger but also drive some extra traffic.
I have found that by updating older content, you can increase those pages' search engine rankings. The reason? Google's content freshness factor.
When updating existing blog posts, don’t forget to follow these tactics as well.
- Make sure your content stays on topic: Sometimes your post title may mislead readers, so make sure your article delivers everything it promises.
- Make sure your article is grammatically correct: Error-filled articles will cost you reader loyalty and erode the credibility you have built over time. Use Grammarly or another grammar checker tool to check punctuation and find grammatical mistakes.
- Use synonyms to polish your article: Using words with the same meaning, or synonyms, helps your article rank for more keywords. Use these synonym finder tools to generate more equivalent words.
- Add more long-tail keywords: Long-tail keywords are much easier to rank for than head terms, so adding a few more relevant long-tail keywords won't harm your article's rankings but will increase targeted traffic.
- Add relevant links: This increases how efficiently Googlebot crawls your blog. Add a few relevant internal and outbound links in your article, but don't overdo it; the more links you add, the harder your content becomes to consume.
- Refresh outdated content: When you start updating posts, you will find that much of the content that worked a few years ago doesn't hold up today. For instance, images used a few years ago may no longer be relevant. Make sure your readers don't get outdated content.
Google has penalized your blog
There are only two ways your website can be penalized: a manual penalty or an algorithmic update.
Did you know Google manually penalizes over 400,000 websites each month?
The main reason many websites are penalized is a poor backlink profile. Google Panda and Google Penguin are two very important updates that help people find high-quality, relevant content in Google search results.
If Google has penalized your blog, you will likely see a huge, significant drop in traffic. Go to your analytics program (most people use Google Analytics; others use one or more of these real-time traffic analysis programs) and filter your overall traffic stats for the last six months or year.
If you see a substantial traffic drop, go to your Google Search Console dashboard and check whether Google has sent you an important message.
Your website has too many crawl errors
Having lots of crawl errors is a BIG reason why Google doesn't crawl your website and add new content to its index.
So go to Google Search Console and look for crawl errors such as Not Found (404), no response, and timeouts.
These errors not only lose you visitors but also damage the relationship between your website and Googlebot.
Fix crawl errors as soon as possible, then follow the third method below to ask Google to crawl your site again.
Robots can’t Index your web pages
Have you mistakenly blocked search engine bots from crawling and indexing your web pages?
If so, Google will not be able to index them. Make sure you didn't add a robots meta tag that prevents search engines from indexing important web pages, or use your Robots.txt file to block search engine bots from following links.
Google's webmaster tools provide useful information about your site's index status. You can use tools such as the Robots.txt testing tool to find out whether you have mistakenly blocked important web pages. You can see a live Robots.txt file over here.
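For reference, these are the two blocking mechanisms to check for. The meta tag sits in a page's <head>; the Robots.txt rules live at your site root (the /private/ path here is just a hypothetical example):

```html
<!-- Robots meta tag: tells all crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

```
# Robots.txt: blocks all crawlers from everything under /private/
User-agent: *
Disallow: /private/
```

If either of these covers an important page by mistake, Googlebot will skip it.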
Not enough quality backlinks pointing to your web pages
Even though Google PageRank may no longer affect search engine rankings, it still matters for faster indexing. Google indexes websites with higher PageRank more often: the more PageRank your website has, the more efficiently Google will index your web pages.
Low-quality and irrelevant backlinks won't help you increase rankings anymore. Instead, they will pave the way for a Google Penguin penalty on your website.
You can easily get your website indexed on Google by generating new pages for your domain on domain search engines, domain value estimator websites, and web page traffic analysis websites. One good way is to start with SEMrush. And use this free tool to build some backlinks to your website.
Poor interlinking strategy
Have you ever wondered why your quality older web pages don't get as much traffic as they used to?
The reason is that your interlinking strategy sucks. I have talked about interlinking at length in my Blogger SEO and Tumblr SEO guides. Interlinking helps decrease your website's bounce rate, increase user engagement, and share link juice among your web pages.
Internal linking is extremely powerful. Your web pages can rank for hundreds of long-tail keywords through proper internal-linking strategies alone.
Dead-end pages are a result of poor interlinking and not having a sitemap for your blog. The more dead pages you generate, the harder it is for search engines to find the important web pages that should be ranked on Google SERPs.
Link every web page to at least one other page. Also, create a sitemap/archive page on your blog, not only so search engines can find all your indexable content, but also so people can browse all your posts in depth.
Embedding a posts sitemap widget into your 404 (Not Found) error page will also increase crawl efficiency. If you use the WordPress CMS, you can use this Google XML sitemap generator plugin to create one for your blog.
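If you prefer to maintain a sitemap by hand, a minimal XML sitemap looks like this (the URL and date are placeholders; list every post page you want crawled):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/2014/11/image-seo-optimization-tips.html</loc>
    <lastmod>2014-11-20</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Save it as sitemap.xml at your site root and submit it in Google Search Console.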
Your Web host is down!
Choosing a good web hosting company to host your website makes blogging much easier. Even though there are lots of free WordPress web hosting services, you shouldn't ever opt for them.
Instead, use an inexpensive WordPress hosting company such as Bluehost or a WordPress-optimized hosting company like WP Engine.
You have de-indexed web pages that were showing in Google SERPs
Did you use Google's "Remove URLs" tool to remove a URL from the Google index? If so, there is little you can do other than change the URL, add a 301 permanent redirect from the old post permalink to the new one, and get search engines to crawl your site again.
When you work with Google Search Console, be careful and aware of exactly what you are doing. A wrong action can remove URLs from the Google index far faster than it takes to get them indexed.
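If your site runs on Apache, the 301 redirect can be a one-line rule in your .htaccess file (both paths below are hypothetical examples):

```apache
# .htaccess: permanently redirect the old permalink to the new one
Redirect 301 /2014/11/old-permalink.html /2014/11/new-permalink.html
```

WordPress users can achieve the same with a redirection plugin instead of editing .htaccess directly.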
Why should you index your content as quickly as possible on search engines?
Now you know several reasons why Google doesn't index your web pages or rank them on Google SERPs. Here are a few reasons why you should get search engines to index your site as quickly as possible.
- Rank your web pages quickly: Why wait weeks or months to see your fresh content rank on Google SERPs? How would it feel if Google ranked your new posts within a few minutes?
- Prevent duplicate content penalties: Someone might steal your content and work to rank it on Google. The most horrible part is when their pages start outranking your post in Google SERPs, and Google might penalize your website over the duplicate content. Therefore, the best thing to do right after publishing a new blog post is to get Googlebot to index the content as soon as possible.
- Increase overall keyword density: Google uses site relevancy and overall keyword density to rank web pages for search terms. You can find your website's highest-density keywords and keyword stemming in Google webmaster tools. By indexing new posts, you increase the overall keyword density and so boost the search engine rankings of older posts.
- Convince Google that your website is active: People love fresh content, and Google loves websites that update often and provide quality content. The next time you update an older post or publish a new one, try to get Googlebot to index your content fast. It's a good way to show Google that you update your blog regularly.
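To make "keyword density" concrete, here is a small Python sketch that counts each word's share of a page's total word count. This is only an illustration, not how Google actually measures relevancy:

```python
import re
from collections import Counter

def keyword_density(text):
    """Return each word's share of the total word count, as a fraction."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(words)
    total = len(words)
    return {word: count / total for word, count in counts.items()}

density = keyword_density("Image SEO tips: optimize image alt text for image search")
print(density["image"])  # "image" is 3 of 10 words → 0.3
```

Publishing and indexing more posts on the same topic shifts these proportions across the site as a whole, which is the effect the bullet above describes.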
3 Effective ways to index your website on Google
#1. Google Ping
The feature of RSS feed services that most SEOs like best is their ping service. If you use FeedBurner as the RSS feed service for your blog, you can use this method.
Step #1: Sign in to your FeedBurner account and go to Feed Title >> Publicize >> PingShot.
Step #2: Activate the PingShot service.
Step #3: Noindex your RSS feed, because you don't want to create duplicate content on authority websites.
Besides FeedBurner's PingShot service, you can use an online ping service to ping new URLs to RSS feed services and blog search engines. Here are a few online blog pinging services.
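Most ping services speak the standard weblogUpdates.ping XML-RPC protocol, so you can also ping programmatically. This Python sketch builds the request body with the standard library; the blog name and post URL are placeholders, and the Ping-O-Matic endpoint in the comment is just one common example:

```python
import xmlrpc.client

# Build the XML-RPC request body for the standard weblogUpdates.ping call.
# The blog name and post URL below are placeholders.
payload = xmlrpc.client.dumps(
    ("My Blog", "https://example.com/2014/11/new-post.html"),
    methodname="weblogUpdates.ping",
)
print(payload)

# To actually send the ping, POST the call to a ping service, for example:
# server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
# response = server.weblogUpdates.ping("My Blog", "https://example.com/2014/11/new-post.html")
```

This is what the online ping services above do on your behalf for dozens of endpoints at once.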
#2: Google Submit Your Content – Website Owner
This method is more efficient than pinging URLs to blog search engines. What you are actually doing here is telling Google to index your new URLs quickly. Google does not guarantee that every submitted URL will make it into the index; that said, I haven't seen it skip one yet.
Go to the Google webmasters Submit URL page and enter your web page address in the input box.
Then enter the human verification captcha code and click the "Submit Request" button.
You can also resubmit updated older post URLs to the Google index. This increases those posts' visibility, because Google likes fresh content.
#3: Fetch as Google – Google Webmaster Tools
This is another method to get Google to crawl and index your website quickly. I really like this method because I can have my entire website crawled anytime I want, and I can also see exactly when Google indexed the web pages.
I submitted my previous post, Image SEO – How to Rank Images on Search Engines, to the Google index. The result? Google indexed it within a few minutes, and when I searched for it again on Google, it showed that the page had been indexed 13 minutes earlier. Here are the steps for properly adding a web page to the Google index in Google webmaster tools.
Step #1: Make sure your post is not yet indexed by Google. You can easily check this by searching for your web page address on Google. In my case, I searched for the Image SEO post's permalink first.
As you can see, the post is not yet indexed by Googlebot. Make sure you didn't add a noindex tag to the web page, as it would prevent Googlebot from indexing it.
Step #2: Go to Google Search Console >> Crawl >> Fetch as Google.
Step #3: Enter the post permalink, excluding the homepage URL. I entered 2014/11/image-seo-optimization-tips.html in the input field, as in the screenshot below. Then click the Fetch button.
Step #4: In the next step, you will see a pop-up box where you choose the submit method. There are two ways to submit a URL to the Google index:
- Crawl only this URL: Google will crawl only that permalink and will not crawl other interlinked pages. Google allows you to submit a "crawl only this URL" request 500 times a month. If you have a small business blog, that is plenty. If you need more, you can combine method #2 in this post with this Google Webmaster Tools method intelligently.
- Crawl this URL and its direct links: Google will crawl the submitted URL and the pages it links to. You can make this request 10 times a month. It's very useful when you want to re-index your whole website.
As I had already requested an entire-site crawl a few days earlier, I didn't need to request it again, so I selected the first method, "Crawl only this URL", and clicked the "Go" button.
Step #5: After submitting the URL, you will see that Google is ready to add it to the index. Simply click the "Submit to Index" button.
After clicking the "Submit to Index" button, you will see which fetch method you used.
Step #6: Here's the Google SERP when I searched for the URL again after a 13-minute break.
Google Webmaster Tools' "Fetch as Google" tool is very useful for getting a web page indexed on Google faster. If you publish more than 500 posts in a month, you can use Google's external Submit URL tool as well.
It's not guaranteed that your submitted URL will be crawled immediately and shown on Google SERPs right away.
However, from an SEO's point of view, these tools are far more useful than the crappy tools that promise to get your web pages indexed on Google quickly.
Pro tip: Whenever you want your entire website indexed again, submit your sitemap or archive page URL. That way you can easily get Google to index all your post pages again. Just make sure all your post pages are included in the archive page.
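Before resubmitting, you can double-check which posts your sitemap actually lists with a few lines of Python. The inline XML here is a placeholder; in practice you would read your real sitemap.xml file instead:

```python
import xml.etree.ElementTree as ET

# Placeholder sitemap; load your real sitemap.xml in practice.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/2014/11/image-seo-optimization-tips.html</loc></url>
  <url><loc>https://example.com/2014/10/older-post.html</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

for url in sitemap_urls(SITEMAP_XML):
    print(url)
```

Any post missing from this list won't be picked up by a sitemap-based crawl request.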
The internet has become a busier place than ever, so you have to be careful when it comes to search engine optimization. There are thousands of spam websites looking to steal your valuable, hard-earned fresh content. Some do it as a hobby; others do it with the intent of getting your website penalized by Google.
The best way to protect your blog and build a strong foundation is to get search engines to index your site as fast as possible. By following the three methods above, you can get your website indexed on Google in less than ten minutes.
In most cases, I was able to get Google to crawl my new posts in under five minutes. I don't have screenshots to prove that now, but I think you'll agree it's possible.
So, how else can you get Google to index your website fast? Let me know in the comments below. I will respond to every comment.