My page makes an AJAX call that loads fresh HTML content from a separate URL.
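For context, the pattern is roughly the following sketch (the /fragments/latest endpoint and the container ID are placeholders, not my actual URLs):

```typescript
// Fetch an HTML fragment from a separate URL and inject it into the page.
// The fragment has no <head> of its own, so it relies on the parent page's
// stylesheets and viewport declaration once inserted into the DOM.
async function loadFragment(containerId: string, url: string): Promise<void> {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Fragment request failed: ${response.status}`);
  }
  const html = await response.text();
  const container = document.getElementById(containerId);
  if (container) {
    container.innerHTML = html;
  }
}

// "/fragments/latest" is a placeholder for the real AJAX URL.
loadFragment("content", "/fragments/latest");
```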
Google has now spidered these AJAX URLs and runs its Search Console analysis over them as if each were a standalone page.
In the live environment, the HTML returned from the AJAX script picks up the styles and viewport declaration from the parent page and renders as expected.
When Googlebot spiders one of these URLs as a standalone page, the resulting HTML fails on mobile usability, because the styles and viewport information are not available.
Should I prevent Googlebot from spidering the AJAX URLs using the robots.txt file, or is there a better solution for minimising Search Console errors?
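To illustrate the robots.txt option I have in mind, something like this, where /fragments/ is a placeholder for wherever the AJAX URLs actually live:

```
# Block Googlebot from crawling the AJAX fragment URLs
# (/fragments/ is a placeholder path)
User-agent: Googlebot
Disallow: /fragments/
```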
Hunting around for a solution to this question has turned up nothing directly. The closest option I have found is for the AJAX endpoint to send the rendered content as JSON rather than HTML, but I may have misunderstood the purpose of using JSON.
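As I understand the JSON suggestion, it would mean changing the endpoint to return data rather than markup, with the client building the HTML itself, roughly like this (the interface and field names are invented for illustration):

```typescript
// Hypothetical shape of the JSON the endpoint might return instead of HTML.
interface FragmentData {
  title: string;
  body: string;
}

// Fetch JSON and build the markup client-side, so the endpoint never
// serves a bare HTML document that Googlebot could treat as a page.
async function loadFragmentAsJson(containerId: string, url: string): Promise<void> {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`Fragment request failed: ${response.status}`);
  }
  const data: FragmentData = await response.json();
  const container = document.getElementById(containerId);
  if (container) {
    const heading = document.createElement("h2");
    heading.textContent = data.title;
    const paragraph = document.createElement("p");
    paragraph.textContent = data.body;
    container.replaceChildren(heading, paragraph);
  }
}
```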