I have a Rails app (version 5.2). It's a public site, unfortunately, which means bot city. We have an extensive taxonomy across 1,000+ resource records, and a "drill down" function that bots also have access to.
We have several taxonomies (country, audience, project, sector, etc.), and the drilldown offers a link for each taxonomy value, which can be well over 50 links on the initial page. The drilldown starts on a selected category page (mysite.org/categoryA/). That page shows all resources for the category (say 100 of them) plus a link for every taxonomy value that appears among those 100 resources. Clicking a drilldown link runs a search over those 100 records and generates a search results page with a new, smaller set of taxonomy links, since it is drilling down within the original 100 resources. We can go down four levels.
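For concreteness, the drilldown endpoint behaves roughly like the sketch below. This is illustrative only; the route, controller and column names are not our actual code, just the shape of the thing:

```ruby
# config/routes.rb (illustrative)
get "/:category",    to: "categories#show"
get "/searchresults", to: "searches#show"

# app/controllers/searches_controller.rb (illustrative sketch)
class SearchesController < ApplicationController
  def show
    # Each drilldown click narrows the previous result set by one more
    # taxonomy value, e.g. ?category=categoryA&country=fr&audience=ngo
    @resources = Resource.where(category: params[:category])
    %i[country audience project sector].each do |facet|
      @resources = @resources.where(facet => params[facet]) if params[facet].present?
    end
    # The view then renders a fresh set of drilldown links for whatever
    # taxonomy values remain in @resources -- which is exactly what the
    # bots keep following, level after level.
  end
end
```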
Bot activity is killing our site, mostly because bots follow every drilldown link and then every drilldown link on each successive level. For now I've blocked all searches (the "/searchresults" URL), but I want to block that path for bots only, so human visitors (if they still exist) can keep running searches and drilldowns.
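One partial measure I'm looking at is a rack-attack rule scoped to that one path, so the rest of the site stays crawlable. A rough sketch of what I mean, assuming the rack-attack gem (the user-agent regex and the throttle limits are placeholders, not tuned values):

```ruby
# Gemfile
gem "rack-attack"

# config/initializers/rack_attack.rb
# (older rack-attack versions also need `config.middleware.use Rack::Attack`
#  in application.rb; the throttle below relies on Rails.cache working)
class Rack::Attack
  BOT_UA = /bot|crawl|spider|slurp/i

  # Block anything that self-identifies as a bot, but only on the
  # search/drilldown endpoint; normal pages remain reachable.
  blocklist("bots hitting searchresults") do |req|
    req.path.start_with?("/searchresults") && req.user_agent.to_s =~ BOT_UA
  end

  # Belt and braces: throttle any single IP hammering the drilldown,
  # to catch bots that fake a browser user agent.
  throttle("searchresults per ip", limit: 30, period: 5.minutes) do |req|
    req.ip if req.path.start_with?("/searchresults")
  end
end
```

This only helps against bots that declare themselves or come from a small set of IPs, which is why I'm still looking for other ideas.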
One idea is to front the entire drilldown section with a reCAPTCHA form. Awkward, but feasible: every page with drilldown links would have a "drill down" button that opens a modal with the reCAPTCHA challenge.
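Roughly what I have in mind, as a sketch only; it assumes the recaptcha gem and a session flag I would add, neither of which exists in the app today:

```ruby
# app/controllers/searches_controller.rb (sketch of the reCAPTCHA gate)
class SearchesController < ApplicationController
  def show
    # Ask for the challenge once per session, so a human solves it on
    # their first drilldown and can then keep drilling freely.
    unless session[:human_verified]
      if verify_recaptcha   # provided by the recaptcha gem
        session[:human_verified] = true
      else
        # Re-render the modal/challenge page instead of running the search.
        render :captcha_challenge, status: :forbidden and return
      end
    end
    # ... run the drilldown search as before ...
  end
end
```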
I am trying to see if there is anything else worth trying. It's all public access, so there are no logins and no user framework. I use Cloudflare; for now I'm blocking IPs, which is a pain, and the Cloudflare rate limiter seems ineffective.
Thanks for any ideas.