
How to Get Google to Index Your Website & Blog Pages in Under 15 Minutes


Yes, you can get Google to index your website in less than 15 minutes.

What if Google ranked your new blog posts and web pages every time you published them? What if people could find your fresh blog post in Google's SERPs within a few seconds?

In today's SEO tutorial, you will learn not only why Google doesn't index your website and how to get search engines to crawl your site, but also how to get search engines to index your site in less than 15 minutes. (I will show you how to do it step by step later.)

I have tried a few methods to get my websites and new blog posts indexed faster, and I have found three effective ways to get Google to index your website and updated web pages quickly. In some cases, Google indexed URLs in less than 5 minutes.

Yes, less than 5 minutes.

Before we learn how to index a new website and its pages on Google, let's look at what Google indexing is and the reasons Google may not crawl and index your web pages. Understanding these will help you survive future Google updates and protect your content.

What is Google indexing, and how does Google index web pages?

Crawling and indexing is a complicated process that search engines use to gather information from the internet. If you search for your Twitter handle on Google, you will find your Twitter profile listed under your name. The same goes for other content, such as images.

Basically, before indexing a website, search engines must crawl it, and before crawling it, they must be able to follow a link to it. In many cases webmasters don't want search engines to follow a link; in those cases, search engines cannot index the web page or rank it on SERPs.

Each search engine uses programs called web spiders or web crawlers (also known as search engine bots) to crawl web pages. Google's famous web crawler is known as Googlebot; it crawls content all around the internet and gathers useful information.

For Bing, it's Bingbot. Yahoo gathers its web results from Bing, so Bingbot supplies new content to both Bing and Yahoo. Baidu uses a web spider called Baiduspider, and Yandex uses YandexBot to find new content for its index.

Why Does Google Not Index Your Web Pages?

There are many reasons Google might not index your web pages. As you already know, indexing is one of the three steps of search engine ranking. First, Googlebot follows a link (URL) and crawls the web page to extract useful information, then it adds the page to Google's index. When a person searches for something, Google applies various signals (famously known as Google ranking factors) and lists the results within a few milliseconds.

You can easily find out whether Google has already indexed your website and web pages by doing a site: search.

Ex: site:www.problogtricks.com

9 Reasons why Google doesn’t index your web pages

You are not updating your blog regularly

This is one of the top reasons why Google doesn't index your web pages. If you don't add new content or update older posts, how do you convince Google that your blog is still alive?

The easiest way to get Google to index your web pages every day is to update your blog. You don't need to publish new posts every day; instead, update older posts. Add up-to-date content and information, and promote it on social networks. You will not only convince Google that you are an active blogger but also drive some extra traffic.

I have found that updating older content can increase a page's search engine ranking. The reason? Google's freshness factor.

When updating existing blog posts, don’t forget to follow these tactics as well.

  • Make sure your content stays on topic: Sometimes your post title may mislead readers, so make sure the article delivers everything the title promises.
  • Make sure your article is grammatically correct: Publishing error-ridden articles will cost you readers' loyalty and the credibility you have built over time. Use Grammarly or another grammar checker to check punctuation and find grammatical mistakes.
  • Use synonyms to polish your article: Using words with the same meaning, or synonyms, helps your article rank for more keywords. Use these synonym finder tools to generate more equivalent words.
  • Add more long-tail keywords: Long-tail keywords are much easier to rank for than head terms. Adding a few more relevant long-tail keywords won't harm your article's rankings, but it will increase targeted traffic.
  • Add relevant links: This makes Googlebot's crawl of your blog more efficient. Add a few relevant inbound and outbound links to your article, but don't overuse this; the more links you add, the harder your content becomes to consume.
  • Refresh outdated material: When you start updating posts, you will find that much of the content that worked a few years ago doesn't hold up today; for instance, images used a few years ago may no longer fit. Make sure your readers don't get outdated content.

Google has penalized your blog

There are only two ways Google penalizes a website: a manual penalty or an algorithmic update.

Did you know Google manually penalizes over 400,000 websites each month?

The main reason many websites are penalized is a poor backlink profile. Google Panda and Google Penguin are two very important algorithm updates designed to surface high-quality, relevant content in Google's search results.

If Google has penalized your blog, you will likely see a huge, significant drop in traffic. Go to your analytics program (most people use Google Analytics; others use one or more real-time traffic analysis tools) and review your overall traffic stats over the last six months to a year.

If you see a substantial traffic drop, go to your Google Search Console dashboard and check whether Google has sent you an important message.

Crawl Errors

Having lots of crawl errors is a BIG reason why Google may not crawl your website and add new content to its index.

So go to Google Search Console and look for crawl errors such as Not Found (404), No Response, and Timeout.

These errors not only cost you visitors but also damage the relationship between your website and Googlebot.

Fix crawl errors as soon as possible, then follow the third method below to ask Google to crawl your site again.
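Before asking Google to re-crawl, you can audit your own URLs for the same kinds of errors Search Console reports. Here's a minimal sketch using only Python's standard library; the URLs are placeholders, and a status of 400 or above (or no response at all) is treated as a crawl error:

```python
import urllib.request
import urllib.error

def fetch_status(url, timeout=10):
    """Return the HTTP status code for a URL, or None if the host doesn't respond."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code   # server answered with an error status (404, 500, ...)
    except urllib.error.URLError:
        return None     # no response / timeout

def is_crawl_error(status):
    """True for statuses that would show up as crawl errors (or no response)."""
    return status is None or status >= 400

# Example: audit a few URLs, e.g. ones pulled from your sitemap
for url in ["https://example.com/", "https://example.com/old-post/"]:
    status = fetch_status(url)
    if is_crawl_error(status):
        print(f"{url} -> crawl error ({status})")
```

Run this over your sitemap URLs periodically and fix anything it flags before resubmitting to Google.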

Robots rules block your web pages

Have you mistakenly blocked search engine bots from crawling and indexing your web pages?

If so, Google will not be able to index them. Make sure you haven't added a robots meta tag that prevents search engines from indexing important web pages, or used a robots.txt file to block search engine bots from following their links.

Google's webmaster tools provide useful information about your site's index status. You can use tools such as the robots.txt testing tool to find out whether you have mistakenly blocked important web pages.
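You can also check robots.txt rules offline with Python's built-in `urllib.robotparser`. In this sketch the rules and URLs are made up for illustration; note that when a crawler has its own section (here, Googlebot), it follows that section rather than the `User-agent: *` one:

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content (assumed, for illustration only)
rules = """
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /drafts/
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Googlebot obeys its own section, so /drafts/ is blocked for it
print(rp.can_fetch("Googlebot", "https://example.com/drafts/post.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post.html"))    # True
```

To test your live file, call `rp.set_url("https://yoursite.com/robots.txt")` followed by `rp.read()` instead of `parse()`.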

Not enough quality backlinks pointing to your web pages

Even if we assume Google PageRank no longer determines search rankings, it still matters for getting pages indexed faster. Google crawls websites with higher PageRank more often; the more PageRank your website has, the more efficiently Google will index your web pages.

Low-quality, irrelevant backlinks won't help your rankings anymore. Instead, they will pave the way for a Google Penguin penalty on your website.

You can easily get your website indexed on Google by generating new pages for your domain on domain search engines, domain value estimator websites, and web traffic analysis sites. One good way is to start with SEMrush. And use this free tool to build some backlinks to your website.

Poor interlinking strategies

Have you ever wondered why your quality older web pages don't get as much traffic as they used to?

The reason may be that your interlinking strategy is weak. I have covered interlinking in depth in my Blogger SEO and Tumblr SEO guides. Interlinking helps decrease your website's bounce rate, increase user engagement, and share link juice among your pages.

Internal linking is very powerful. Your web pages can rank for hundreds of long-tail keywords through proper internal-linking strategies alone.

Dead Pages

Dead pages are a result of poor interlinking and not having a sitemap for your blog. If your blog accumulates dead pages, search engines won't find important web pages that should be ranking in Google's SERPs.

Link to every web page from at least one other page. Also, create a sitemap/archive page on your blog, not only so search engines can find all your indexable content but also so people can browse all your posts in depth.

Embedding a posts sitemap widget in your 404 (Not Found) page will also improve crawl efficiency. If you use WordPress, you can use this Google XML sitemap generator plugin to create one for your blog.
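If you're not on WordPress, a sitemap is simple enough to generate yourself. Here's a minimal sketch that builds an XML sitemap following the sitemaps.org protocol from a list of URLs (the URLs are placeholders):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap (sitemaps.org protocol) from a list of URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    body = ET.tostring(urlset, encoding="unicode")
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + body

xml = build_sitemap([
    "https://example.com/",
    "https://example.com/blog/first-post/",
])
print(xml)
```

Save the output as sitemap.xml at your site root and submit it in Search Console. The full protocol also supports optional `lastmod`, `changefreq`, and `priority` elements per URL.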

Your Web host is down!

Choosing a good web hosting company makes running your blog much easier. Even though there are lots of free WordPress hosting services, you shouldn't ever opt for them.

Either use an inexpensive WordPress hosting company such as Bluehost or a WordPress-optimized hosting company like WPEngine.

Bonus Offer!

Get your WordPress blog hosted on Bluehost just for $2.95/mo with this Bluehost coupon.

You have de-indexed web pages that were showing in Google's SERPs

Did you use Google's "Remove URLs" tool to remove a URL from the Google index? If so, there isn't much you can do other than change the URL, add a 301 permanent redirect from the old permalink to the new one, and get search engines to crawl your site again.

When you work with Google Search Console, be careful and be aware of exactly what you are doing. A wrong action can remove URLs from Google's index far faster than it took to get them indexed.
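If your blog runs on an Apache server (as most shared WordPress hosts do), the 301 redirect described above can be added to your .htaccess file. A minimal sketch; the slugs and domain here are placeholders:

```apache
# Permanently redirect the old permalink to the new one
Redirect 301 /old-post-slug/ https://www.example.com/new-post-slug/
```

A 301 tells Googlebot the move is permanent, so the old URL's signals are passed to the new one over time.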

Why should you get your content indexed on search engines as quickly as possible?

Now you know several reasons why Google might not index your web pages or rank them in its SERPs. Here are a few reasons why you should get search engines to index your site as quickly as possible.

  • Rank your web pages quickly: Why wait weeks or months to see your fresh content rank in Google's SERPs? How would it feel if Google ranked your new posts within a few minutes?
  • Prevent duplicate-content penalties: Someone might steal your content and work to rank it on Google. The most horrible outcome is when their pages start outranking your post in Google's SERPs, and Google penalizes your website for duplicate content. So the best thing to do right after publishing a new blog post is to get Googlebot to index the content as soon as possible.
  • Increase overall keyword density: Google uses site relevancy and overall keyword density to rank web pages for search terms. You can find your website's high-density keywords and keyword stemming in Google's webmaster tools. By indexing new posts, you increase the overall keyword density and, in turn, the search rankings of older posts.
  • Convince Google that your website is active: People love fresh content, and so does Google; it favors websites that update often and provide quality content. The next time you update an older post or publish a new one, try to get Googlebot to index your content fast. That gives Google a good reason to believe you update your blog regularly.

3 Effective ways to index your website on Google

#1. Google Ping

The thing most SEOs like best about RSS feed services is their ping feature. If you use FeedBurner as the RSS feed service for your blog, you can use this method.

Step #1: Sign in to your FeedBurner account and go to Feed Title >> Publicize >> PingShot.

Step #2: Activate the PingShot service.

Pingshot

Step #3: Set your RSS feed to noindex, because you don't want to create duplicate content on authority websites.

Noindex RSS feed

Besides FeedBurner's PingShot service, you can use an online ping service to push new URLs to RSS feed services and blog search engines. Here are a few online blog pinging services.

  1. Google Ping
  2. Pingler
  3. Pingomatic
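Under the hood, most of these services speak the standard weblogUpdates XML-RPC protocol, so you can ping them yourself. A minimal sketch with Python's stdlib; the blog name, URL, and the commented-out endpoint are placeholders:

```python
import xmlrpc.client

# Build a weblogUpdates.ping request payload (blog name and URL are placeholders)
payload = xmlrpc.client.dumps(
    ("My Blog", "https://example.com/"),
    methodname="weblogUpdates.ping",
)
print(payload)

# To actually send the ping, point a ServerProxy at a ping endpoint, e.g.:
# server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
# result = server.weblogUpdates.ping("My Blog", "https://example.com/")
```

Services that support the extended form also accept the URL of your changed page and your feed URL as extra parameters.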

#2: Submit Your Content to Google – Website Owner

This method is more efficient than pinging URLs to blog search engines. What you actually do here is tell Google to index your new URLs quickly. Google doesn't guarantee that it will index every submitted URL, but I haven't seen it skip one.

Go to Google's Submit URL page for webmasters and add your web page address in the input box.

Submit Your Content - Google Webmasters

Then enter the human verification CAPTCHA code and click the "Submit Request" button.

You can also resubmit updated older post URLs to the Google index. This can increase a post's visibility, because Google likes fresh content.

#3: Fetch as Google – Google Webmaster Tools

This is another way to get Google to crawl and index your website quickly. I really like this method because I can have my entire website crawled anytime I want, and I can see exactly when Google indexed the pages.

I submitted my previous post, Image SEO – How to rank Images on Search Engines, to the Google index. The result? Google indexed it within a few minutes, and when I searched for it again on Google, it showed that the page had been indexed 13 minutes earlier. Here are the steps for properly adding a web page to the Google index in Google webmaster tools.

Step #1: Make sure Google hasn't already indexed your post. You can check easily by searching for the page's address on Google. In my case, I searched for the Image SEO post's permalink first.

Not yet indexed

As you can see, Googlebot has not yet indexed the post. Make sure you didn't add a noindex tag to the page, as that would prevent Googlebot from indexing it.

Step #2: Go to Google Search Console >> Crawl >> Fetch as Google.

Fetch as Google - Google Webmaster Tools

Step #3: Enter the post permalink, excluding the homepage URL. I entered 2014/11/image-seo-optimization-tips.html in the input field, as in the screenshot below. Then click the Fetch button.

Fetch as Google

Step #4: In the next step, you will see a pop-up box where you choose the submit method. There are two ways to submit a URL to the Google index:

  1. Crawl only this URL: Google will crawl only this permalink and will not crawl the pages it links to. Google allows you to submit a 'crawl only this URL' request 500 times a month. If you run a small business blog, that amount is enough; if you need more, you can combine method #2 from this post with this tool intelligently.
  2. Crawl this URL and its direct links: Google will crawl the submitted URL and the pages it links to. You can make this request 10 times a month. It's very useful when you want to re-index your whole website.

Choose submit method - Fetch as Google

As I had already requested a full-site crawl a few days earlier, I didn't need to request it again. So I selected the first option, 'Crawl only this URL', and clicked the "Go" button.

Step #5: After submitting the URL, you will see that Google is ready to add it to the index. Simply click the "Submit to Index" button.

submit to index

After clicking the 'Submit to Index' button, you will see which fetch method you used.

URL submitted to Index - complete

Step #6: Here's the Google SERP when I searched for the URL again after a 13-minute break.

Indexed web page in google serp

Google webmaster tools' "Fetch as Google" feature is very useful when it comes to indexing a web page on Google faster. If you publish more than 500 posts a month, you can also use Google's public Submit URL tool.

There's no guarantee that a submitted URL will be crawled immediately and shown in Google's SERPs right away.

However, from an SEO's point of view, these tools are far more useful than the crappy tools that promise to get your web pages indexed on Google quickly.

Pro tip: whenever you want your entire website indexed again, submit your sitemap or archive page URL. That way you can easily get Google to re-index all your post pages. Make sure the archive page includes all of your posts.

Conclusion

The internet has become a busier place than ever, so you have to be careful when it comes to search engine optimization. There are thousands of spam websites looking to steal your valuable, hard-earned fresh content. Some do it as a hobby; others do it with the intent of getting your website penalized by Google.

The best way to protect your blog and build a strong foundation is to get search engines to index your site as fast as possible. By following the three methods above, you can get your website indexed on Google in less than ten minutes.

In most cases, I was able to get Google to crawl my new posts in under five minutes. I don't have screenshots to prove that now, but I hope you can see that it's possible.

So, how else can you get Google to index your website fast? Let me know in the comments below. I'll respond to every comment.

Chamal Rathnayaka
 

A blogger, viral hacker, and internet marketer since 2012, Chamal Rathnayaka is the founder of this very site, and he shares his experience and knowledge of internet marketing on the Pitiya blog. Send him a message at Pitiya.com/contact.

  • Karen Bradly says:

    Great article. I have recently learned how important indexing is and I appreciate you explaining to me some of the reasons why my page is not getting indexed faster. Thanks 🙂

    • Yes, Karen, web page indexing is very important in search engine optimization. The quicker your posts are indexed, the quicker you'll get traffic from search engines. Glad you found this article useful. Thanks for the comment 🙂

  • Informative blog on indexing a website. But if I update one blog post once or twice, it wouldn't affect the site's performance, would it?

    • Yes, the more you update the blog, the more efficiently Google will index it. Google likes fresh content. Thanks for sharing your ideas!

  • pradeep says:

    hi, thanks Chamal this is really helpful tips for me…now i can do it in fast way..
    thanks

    • Pradeep, glad this guide helped you learn how to get web pages indexed. Every time you publish a blog post or update an existing one, follow this method.

  • Jenny says:

    Hi, thanks for your guidance, but I have a question. Do you have any suggestions on how to get web 2.0 sites indexed? I know that Blogspot, WordPress, and Tumblr get indexed fast, but other sites like beep.com get indexed slowly. Even though I submit them to Google, ping them, and share them on social sites, they don't get indexed.

    Thank you.

    • If the site is new to the internet, it can take a few minutes to hours, or sometimes days, to get indexed in search engines. If you don't see Google indexing your website, create some social signals (especially Google+ likes and shares; a URL in a YouTube description also works pretty well) through content sharing networks. Then verify your website with GWT and use Fetch as Googlebot. Your web 2.0 website/pages should be indexed by Google within a few minutes.

      Make sure you didn't block web spiders from crawling the pages on your site. Using the Fetch as Googlebot tool on pages with noindex tags in the header violates Google's terms.

    • Jenny says:

      Many thanks for your help. I will try to share social sites

    • Jenny, follow the tips above. Surely your website will be listed in Google's SERPs within a few minutes. Let me know if I can help in any other way. Thanks!

  • Sam Torres says:

    Chamal, thanks for writing these lines. I was trying to get my posts indexed quicker for a long time. Now, thanks to your method, I am able to do that. Hopefully this works for an already established website.

    • Yes, Sam, these tips work very well for already-established websites too. Give it a try and let me know if it doesn't work.

  • Raja says:

    Chamal Priyadarshana Sir,

    How can I index all my web pages within an hour? I have over 200 web pages but only 3 are indexed, so how can I get them indexed quickly?

    • Submit your blog’s sitemap page to Google.

      • Raja says:

        Chamal Priyadarshana Sir,

        How can I get a sitemap for my blog, and how can I get all my blog pages indexed quickly? Please guide me through the steps.

  • Hi Chamal,

    Indeed, this is a great article. Yes, if you want to rank your pages on Google, then you have to submit a sitemap to Google.

    This is a very crucial part of search engine optimization. Without it, your website will never rank on search engines.

    Once again Thanks for sharing this wonderful article. 🙂

    ~Swapnil Kharche

    • Hi Swapnil,

      Yes, of course, submitting a sitemap is very important in most cases.

      Thanks for sharing your thoughts Swapnil!

  • Helpful article. It is really a common problem that Google does not index our posts quickly after we publish them, and it is not a good sign, because if Google does not index our posts or pages, we will not appear in SERPs. So we need proper guidance. I think your post is full of useful information. Thanks for sharing.

    • Yes, you are right, Munna. If search engines don't see our pages, how could we expect to receive organic traffic? We can't.

      Glad you found this information helpful.

  • jasmy Fenze says:

    Very helpful article. It is a common problem that Google does not index our posts quickly after publishing. I think your post will surely help solve this common issue. Thanks a ton.

  • Yelesh says:

    In Search Console I can find only three menu items: Home, All Messages, and Other Resources. Initially all the options appeared. Please help me fix the problem.

    • I believe you're on the Home page of Google Search Console, where you can see all your verified websites. It looks like you haven't verified your website with Google yet. First add and verify your website in Google Search Console, and then follow the steps.
