I have built and published quite a few websites and never had the following issue:
Google is not indexing my website. Whenever I submit a page in Google Search Console, it reports "blocked by robots.txt", even though the robots.txt allows every crawler (User-agent: * and Allow: /). The robots.txt is accessible at mydomain.com/robots.txt and the site's sitemap is accessible at mydomain.com/sitemap.
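For reference, the robots.txt is nothing more than the standard allow-everything file described above (mydomain.com stands in for the real domain throughout this post):

```
User-agent: *
Allow: /
```

One way to double-check that these rules actually parse as an allow for Googlebot is Python's standard-library robot parser. A minimal sketch, assuming the file is served at the URL shown:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the live robots.txt (mydomain.com is a placeholder).
rp = RobotFileParser()
rp.set_url("https://mydomain.com/robots.txt")
rp.read()

# Prints True if the parsed rules allow Googlebot to fetch the homepage.
print(rp.can_fetch("Googlebot", "https://mydomain.com/"))
```

This returns True for me, so the file itself seems fine as far as I can tell.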
I have tried it with two different hosting providers, Dreamhost.com and Fastcomet.com, but the issue persists and I cannot see why. The domains are registered with Namecheap.com, which I have used for many other sites for years.
I use Grav CMS, a terrific flat-file CMS, which usually works flawlessly, so I don't think the CMS is causing the problem.
Below is a screenshot of the error message in Google Search Console. The robots.txt cannot be the culprit, since crawlers are explicitly allowed access.
Lastly, the domain itself does not show up in Google's search results at all. Normally, when Google is not allowed to crawl a domain, it still lists the URL, just without the accompanying description.
