Did you create a new article for your blog and use Webmaster Tools to get the page indexed and crawled, as most SEO professionals do? Even after you submit your sitemap.xml to Webmaster Tools, Google does not guarantee that all your links will be indexed. There is also a chance that your content was copied from another genuine website that is already indexed and ranked. So how do you know whether the pages you submitted are allowed or blocked, and whether they are indexed in the Google search engine? Well, here is another great tool from Google Webmaster Tools that lets you manually check whether your links pass the robots.txt tester or are blocked.
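For context, robots.txt is a plain-text file at the root of your site that tells crawlers which paths they may fetch. A typical file looks like the sketch below; the exact paths are only examples, not a recommendation for your site:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```

`User-agent` names the crawler the rules apply to (`*` means all crawlers), and `Disallow`/`Allow` list the path prefixes that are blocked or permitted.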
How can you get Google to index more of your sitemap URLs? Matt Cutts explains in this video.
How to use the robots.txt tester
1. Log in to your webmaster account.
2. Click Crawl (the fourth option in the menu), then select robots.txt Tester. (Remember, to get your page indexed faster, add your link using Fetch as Google; click this link to read an article by SEO professional Deepak Kanakaraju.)
3. In the robots.txt Tester you will find a field to enter your link address, as shown in the images below.
4. Now you can manually test your links as Googlebot. You also have the choice of Googlebot-News or Googlebot-Image, but use the standard Googlebot to check regular links.
5. The robots.txt Tester also lets you download your robots.txt file, view the uploaded version, and submit an updated file to Webmaster Tools.
In these five steps you can check and update your robots.txt file, and find out whether any of your links are blocked.
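The manual check above has a programmatic analogue in Python's standard library: `urllib.robotparser` applies the same allow/block rules that the tester does. Here is a minimal sketch; the robots.txt rules and URLs are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The same question the tester answers: is this URL allowed for Googlebot?
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))    # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/my-article")) # True
```

In a real script you would call `parser.set_url("https://yoursite.com/robots.txt")` followed by `parser.read()` to fetch the live file instead of parsing an inline string.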
Select Ask Google to update to submit your robots.txt file; it will be updated within a minute, and once it is updated you will see a confirmation timestamp like the one below.