Does Googlebot index pages for search?
Googlebot indexing and crawling
No, Googlebot does not index pages for the Google search results. Googlebot is the crawler behind the fetch feature in Google Webmaster Tools.
The Indexer, not Googlebot, indexes pages for Google search.
Googlebot is Google's web crawling robot, which retrieves pages found on the World Wide Web and pages submitted via Webmaster Tools. Googlebot is basically a thief, downloading and copying web pages: at times with the webmaster's permission (Webmaster Tools verification and sitemap submission) and at other times without it.
Googlebot is not a little spider or monkey running around cyberspace, as we all imagine. In reality, Googlebot does not move anywhere; it works like a web browser, requesting web pages from web servers, downloading them, then handing them over to Google's Indexer for indexing.
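To picture the fetch step the paragraph above describes, here is a minimal sketch in Python of what any crawler's request looks like. Googlebot's real fetcher is proprietary; the user-agent string is Googlebot's published one, but `fetch_page` and the example URL are purely illustrative.

```python
from urllib.request import Request, urlopen

# Googlebot's published user-agent string (how it identifies itself to servers).
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_page(url):
    """Request a page the way any HTTP client would: no 'moving around',
    just an ordinary GET request followed by downloading the response."""
    request = Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    with urlopen(request) as response:  # the server returns the raw HTML
        return response.read().decode("utf-8", errors="replace")

# The downloaded HTML would then be handed over to an indexer for analysis:
# html = fetch_page("https://example.com/")
```

A crawler is just an HTTP client in a loop: fetch a page, extract its links, queue those links for fetching, and pass the content along for indexing.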
Pages fetched in Webmaster Tools, whether from your sitemap or via the Googlebot fetch feature, are simply accessible pages: pages Googlebot was able to find and download.
Those fetched pages are then handed over to the Indexer for analysis, which weighs them against the so-called 200+ signals Google uses to decide whether they are worthy of the search index.
So don't confuse the two. Googlebot fetches and the Indexer indexes.
That said, Google continually crawls the web looking for new links, so in reality it should automatically find your site through a link in an open directory or, for that matter, somewhere else.
To speed up the crawling process, it's advisable to ask Google to crawl your site: open a Webmaster Tools account using your Gmail account and submit a sitemap for Googlebot's attention. That by itself will speed up the crawling and indexing process.
How to check if your site is indexed by the Google Indexer
You first need to open a Webmaster Tools account, verify your site and submit a sitemap (under Sitemaps). Then, in the dashboard of that specific account, you will see Sitemaps, which displays the number of pages submitted to Googlebot.
Under Index Status in the left navigation, Google displays a graph of all pages indexed. It could take a couple of weeks before a new site's indexed pages display or, for that matter, perform in the Google search results.
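For reference, the sitemap you submit is just an XML file listing the pages you want crawled. The following sketch builds a bare-bones sitemap in Python; the example.com URLs are placeholders, not real pages.

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def build_sitemap(urls):
    """Build a minimal XML sitemap: a <urlset> containing one <url>/<loc>
    entry per page. Real sitemaps may also carry lastmod, changefreq, etc."""
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        loc = SubElement(SubElement(urlset, "url"), "loc")
        loc.text = page
    return tostring(urlset, encoding="unicode")

# Placeholder URLs -- substitute your own pages before submitting.
sitemap_xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap_xml)
```

Save the output as sitemap.xml at your site's root, then point Webmaster Tools at it under Sitemaps.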
The site:yoursite.com operator in the Google search results is not an accurate way of checking indexed pages. It usually displays pages the moment you use the fetch feature (Fetch as Googlebot is under Crawl in the left navigation), so the site:yoursite.com operator shows fetched as well as indexed pages.
Always remember that Google doesn't guarantee that every page crawled/fetched by Googlebot will be indexed in the search results. Pages need to add value to the web to appear in the index; if pages are scraped, copied, thin or junk, Google won't index them.
To avoid such issues, take the time to read the Google Quality Guidelines, or read our articles, specifically the one on the 200+ crawling factors Google uses when weighing your site for the search results.
Or read Google's article about adding a site to Google for indexing.
Can I fetch HTTPS URLs with Fetch as Googlebot?
Yes, in Webmaster Tools you can fetch HTTPS URLs with the Googlebot fetch feature; include the full protocol in the URL, then submit to the index.
The HTTPS site first needs to be verified, the same as any other site. If you have not proven ownership through verification, Fetch as Googlebot won't work.