Let's say that I have 100 functions or "resources" in an application on GCP, such as:
```python
def login(): ...

def forgot_password(): ...

def edit_payment(): ...
```
I would like to apply rate limits at the resource level for all of these, with something as simple as a decorator on the function (or an annotation in its OpenAPI docs), conceptually like this:
```python
@rate(quantity=10, perMinDuration=60)
def login(): ...

@rate(quantity=10, perMinDuration=600)
def forgot_password(): ...
```
The limit would apply per user, falling back to per IP when the user is not logged in. In other words, for each function/resource that needs one, I would declare a rate limit along the following lines:
```yaml
rateLimit:
  by: User
  resourceName: login
  quantity: 10
  duration: 60
```
It seems that Cloud Armor allows something like this: https://cloud.google.com/armor/docs/rate-limiting-overview. However, setting it up would be tedious: every single rule apparently has to be added individually in the console or via the CLI, and with 100 resources (some of which change over time) that becomes a nightmare to manage.
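One way to reduce the tedium would be to generate the Cloud Armor rules from a declarative list. A rough sketch, assuming the flag names from the Cloud Armor rate-limiting docs (verify against your `gcloud` version; the policy name and URL paths here are made up):

```python
# Hypothetical per-resource limits, mirroring the rateLimit spec above.
LIMITS = [
    {"resource": "login",           "quantity": 10, "duration": 60},
    {"resource": "forgot_password", "quantity": 10, "duration": 600},
]

def armor_rule(limit, priority, policy="my-policy"):
    """Render one `gcloud` command that throttles a single resource path."""
    return (
        f"gcloud compute security-policies rules create {priority} "
        f"--security-policy={policy} "
        f"--expression=\"request.path.matches('/{limit['resource']}')\" "
        f"--action=throttle --conform-action=allow --exceed-action=deny-429 "
        f"--rate-limit-threshold-count={limit['quantity']} "
        f"--rate-limit-threshold-interval-sec={limit['duration']} "
        f"--enforce-on-key=IP"
    )

# One rule per resource, with distinct priorities.
for prio, limit in enumerate(LIMITS, start=1000):
    print(armor_rule(limit, prio))
```

Note that `--enforce-on-key=IP` only gives the per-IP half of what I want; I don't see an obvious way to key on my application's notion of a logged-in user, which is part of why I'm asking.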
Is there a GCP product (perhaps even Cloud Armor in a way I'm not familiar with) that can be used to enforce rate-limits at the resource x user level similar to the above scenario?
And finally, one more thing: these are end users, and none of them will have GCP projects, so this point almost rules out using Cloud Endpoints to accomplish this:

> For using the Cloud Endpoints quotas, you have to use API keys. Here, the key is used for identifying the project that consumes your API.
For the purposes of this question, assume 1M users (i.e., far too many for creating a project per user to make any sense).