I'm struggling with DLP: each month I have to scan 1,000 rows from each of more than 30,000 BigQuery tables. Instead of granting BigQuery permissions to the DLP managed service account of each individual project, we would rather use one "master" project and authorize only its DLP managed service account.
That would let us manage the jobs from a single project instead of having to check status/configuration across many different projects.
I tried to create a job trigger for each of our tables using a template, but the limit is 1,000 triggers.
So I'm wondering what the right strategy would be. Do I have to create the jobs each month, one per table, as in the sketch below?
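For context, this is roughly what one of those per-table monthly jobs would look like from the master project using the google-cloud-dlp Python client (a minimal sketch; the project, dataset, table and template names are placeholders, not our real ones):

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()

# Inspect job: sample 1,000 rows of one BigQuery table in another project,
# using an inspect template and writing findings back to the master project.
inspect_job = {
    "storage_config": {
        "big_query_options": {
            "table_reference": {
                "project_id": "data-project",      # project that owns the table
                "dataset_id": "my_dataset",
                "table_id": "my_table",
            },
            "rows_limit": 1000,                    # only scan 1,000 rows
            "sample_method": "RANDOM_START",
        }
    },
    "inspect_template_name": "projects/master-project/inspectTemplates/my-template",
    "actions": [
        {
            "save_findings": {
                "output_config": {
                    "table": {
                        "project_id": "master-project",
                        "dataset_id": "dlp_results",
                        "table_id": "findings",
                    }
                }
            }
        }
    ],
}

job = dlp.create_dlp_job(
    request={
        "parent": "projects/master-project/locations/global",
        "inspect_job": inspect_job,
    }
)
print(job.name)
```

Multiplied by 30,000+ tables every month, that is a lot of jobs to create and monitor, which is why I'm looking for a better strategy.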
Have you considered Data Profiling? It lets you set up organization- or project-level profiling of ALL your tables so that you don't have to manage the orchestration. It gives you column-level details to identify the likely info types found within each table. You can choose the update cadence based on either table data changes or schema changes (or no updates at all).
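If it helps, here's a rough sketch of what a project-level discovery (profiling) configuration could look like with the same Python client. The project, location, and template names are placeholders and the exact fields may differ slightly depending on your client library version, so treat it as a starting point rather than a definitive setup:

```python
from google.cloud import dlp_v2

dlp = dlp_v2.DlpServiceClient()

# Discovery config: profile all BigQuery tables in the project and
# re-profile them when data or schema changes, at most monthly.
discovery_config = {
    "display_name": "monthly-profiling",
    "status": "RUNNING",
    "inspect_templates": [
        "projects/master-project/inspectTemplates/my-template"  # placeholder
    ],
    "targets": [
        {
            "big_query_target": {
                # Profile every table not matched by a more specific target.
                "filter": {"other_tables": {}},
                "cadence": {
                    "schema_modified_cadence": {"frequency": "UPDATE_FREQUENCY_MONTHLY"},
                    "table_modified_cadence": {"frequency": "UPDATE_FREQUENCY_MONTHLY"},
                },
            }
        }
    ],
}

config = dlp.create_discovery_config(
    request={
        "parent": "projects/master-project/locations/global",
        "discovery_config": discovery_config,
    }
)
print(config.name)
```

An org-level config works the same way but is created under the organization parent instead of a single project, which matches your goal of managing everything from one place.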