URLs without much traffic have no CrUX metrics of their own when tested with the PageSpeed Insights tool; instead, the test displays the metrics for the origin (domain). About the tested URL itself, the tool says:
> There is insufficient real-user data for this URL.
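To check this programmatically for a specific URL, you can query the CrUX API, which returns a 404 when there is insufficient data for the queried URL. Here is a minimal sketch in Python, assuming you have a valid API key (the environment variable name is a placeholder):

```python
# Minimal sketch: probe the CrUX API to see whether a URL has its own
# field data or only origin-level data. Assumes a valid API key in
# CRUX_API_KEY (placeholder variable name).
import os
import requests

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def has_url_level_data(url: str, api_key: str) -> bool:
    """Return True if CrUX has a record for this exact URL."""
    resp = requests.post(
        CRUX_ENDPOINT,
        params={"key": api_key},
        json={"url": url},
    )
    # The CrUX API responds with 404 when the URL lacks sufficient
    # real-user data; 200 means a URL-level record exists.
    return resp.status_code == 200

if __name__ == "__main__":
    api_key = os.environ["CRUX_API_KEY"]
    page = "https://example.com/some-low-traffic-page"
    if has_url_level_data(page, api_key):
        print(f"{page} has its own CrUX metrics")
    else:
        print(f"{page} falls back to origin-level metrics")
```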
The CrUX documentation says:
> A page is determined to be sufficiently popular if it has a minimum number of visitors. ... An exact number is not disclosed.
I would like to know: how much data is sufficient? Does anyone have empirical values or estimates?
Edit:
Let me describe the background of my question:
In general, I'm looking for criteria to identify index bloat. There are two metrics describing URLs that count as good/useful:
- The number of URLs in search results, not according to a site: query, but rather the number reported by Search Console;
- The number of URLs with top-100 rankings, as reported by other tools, which doesn't always match GSC.

The two metrics don't always correlate, but knowing both helps to get an average.
In the Core Web Vitals report delivered by GSC there is, for mobile and desktop, the same number of URLs (the sum of good, poor, and needs improvement).
And this is my idea: do URLs that are not listed in the GSC Core Web Vitals report belong to index bloat?
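Here is a rough sketch of how one might test that idea, assuming you have exported the relevant URL lists from GSC and your rank tracker (the file names are placeholders):

```python
# Rough sketch of the idea above: URLs that GSC reports as indexed but
# that never show up in the Core Web Vitals report (i.e. have no CrUX
# data) and have no top-100 rankings are candidates for index bloat.

def load_urls(path: str) -> set[str]:
    """Read one URL per line from an exported report."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

indexed = load_urls("gsc_indexed_urls.txt")        # GSC indexing/coverage export
cwv_listed = load_urls("gsc_cwv_report_urls.txt")  # URLs in the CWV report
ranking = load_urls("tool_top100_urls.txt")        # top-100 URLs from a rank tracker

# Indexed but absent from both the CWV report and the rankings:
bloat_candidates = indexed - cwv_listed - ranking
print(f"{len(bloat_candidates)} of {len(indexed)} indexed URLs look like bloat")
```

One caveat: the GSC Core Web Vitals report groups similar URLs and only lists examples, so a URL's absence there is a weak signal on its own.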
(Disclosure: I helped write those CrUX docs)
You can look at your analytics data to get a sense for how many unique visitors you typically need to be included in the CrUX dataset, but there are a few reasons why this won't always be accurate:

- Your analytics tool counts all users, while CrUX only includes eligible Chrome users (for example, those who have opted in to usage-statistics reporting), so the two populations differ.
- The popularity threshold itself is deliberately undisclosed and may change over time.
- CrUX data is aggregated over a trailing collection window, so a URL can drop in and out of the dataset as its traffic fluctuates.
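As an illustration of that analytics-based estimate, here is a minimal sketch; the visitor counts and the CrUX check are placeholders (see the API probe earlier), and the result is only an upper bound among the URLs you happen to observe:

```python
# Minimal sketch: join analytics data (unique visitors per URL) with a
# CrUX-presence probe, then look for the rough visitor count at which
# URLs start to have their own CrUX records.
from typing import Callable, Optional

# Placeholder data; replace with your analytics export.
monthly_unique_visitors = {
    "https://example.com/popular": 50_000,
    "https://example.com/mid": 2_000,
    "https://example.com/quiet": 40,
}

def estimate_threshold(
    visitors: dict[str, int],
    in_crux: Callable[[str], bool],
) -> Optional[int]:
    """Smallest visitor count among URLs that do appear in CrUX.

    This is only an upper bound on the real threshold: the true cutoff
    could sit anywhere below the smallest in-CrUX count you observe.
    """
    counts = [v for url, v in visitors.items() if in_crux(url)]
    return min(counts) if counts else None

if __name__ == "__main__":
    # Stand-in for a real CrUX check; replace with the API probe above.
    fake_in_crux = lambda url: monthly_unique_visitors[url] >= 1_000
    print("Estimated threshold:",
          estimate_threshold(monthly_unique_visitors, fake_in_crux))
```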
So in short, it's complicated. It's a byproduct of both the complexity of the CrUX dataset and deliberate design choices to protect sensitive data.