A WordPress website can be configured to discourage search engines from crawling and indexing it: when the "Search engine visibility" setting is enabled, current versions of WordPress output a noindex robots meta tag (older versions used a blocking robots.txt directive).
This is helpful while the website is in development, as it keeps the unfinished site out of search engine results. After launching your website, however, one of the first things you should do is turn this setting off so that search engines can crawl and index the site.
It’s easy. Log in to your WordPress dashboard and go to Settings > Reading. Uncheck the box labeled "Discourage search engines from indexing this site" and click Save Changes.
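To confirm the change took effect, you can check your homepage’s HTML for a robots meta tag containing "noindex". A minimal sketch using Python’s standard-library HTML parser (the sample HTML strings below are illustrative, not fetched from a real site):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.robots_content = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_content.append(attrs.get("content", "").lower())

def is_noindex(html):
    """Return True if the page asks search engines not to index it."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in content for content in parser.robots_content)

# A page served while "discourage search engines" was still checked:
blocked = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
# The same page after unchecking the box and saving:
allowed = '<html><head><title>Home</title></head></html>'

print(is_noindex(blocked))  # True
print(is_noindex(allowed))  # False
```

In practice you would fetch your live homepage (for example with `urllib.request`) and pass its HTML to `is_noindex`; if it still returns True, the setting has not been saved or a caching plugin is serving a stale page.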
Allow 24-48 hours for the search engines to respond to this update.
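You can also verify that your robots.txt no longer blocks crawlers. The sketch below uses Python’s standard-library `urllib.robotparser` on two sample rule sets: a fully blocking file, and the open rules WordPress serves by default once indexing is allowed (only `/wp-admin/` is disallowed, with an exception for `admin-ajax.php`):

```python
from urllib.robotparser import RobotFileParser

def crawl_allowed(robots_txt, url, agent="Googlebot"):
    """Return True if the given robots.txt rules allow `agent` to fetch `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

# Rules that block all crawling of the site:
blocking_rules = "User-agent: *\nDisallow: /\n"
# WordPress's default rules once search engines are allowed:
open_rules = (
    "User-agent: *\n"
    "Disallow: /wp-admin/\n"
    "Allow: /wp-admin/admin-ajax.php\n"
)

print(crawl_allowed(blocking_rules, "https://example.com/"))  # False
print(crawl_allowed(open_rules, "https://example.com/"))      # True
```

To check your own site, fetch `https://yourdomain.com/robots.txt` and pass its contents to `crawl_allowed`; note that robots.txt governs crawling only, so a page can still carry a noindex meta tag even when crawling is allowed.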
Still not indexed? If the domain and website are brand new, search engines may simply not know they exist yet. Start by verifying the site with Google Search Console and Bing Webmaster Tools. You will also want to earn high-quality links pointing to your new website; explore our tips on link building to get started.