Allowing Googlebot to crawl a Rails 7 website restricted to Switzerland


I have a Rails 7 website that I want to restrict to visitors from Switzerland only. To do this, I check the visitor's country code in the `ApplicationController` and block access if they are not from Switzerland. However, I also want to allow Googlebot to crawl the site for SEO purposes. To accomplish this, I've added a check in the `block_unless_swiss` method that allows access if the user agent contains the string 'Googlebot'. Is this the correct way to allow Googlebot to crawl a restricted website, or is there a better approach?

class ApplicationController < ActionController::Base
  before_action :block_unless_swiss

  def block_unless_swiss
    return if Rails.env.development?

    # Look up the visitor's country code from their IP address
    country_code = request.location.country_code
    user_agent = request.user_agent.to_s
    # Allow access if the visitor is from Switzerland or claims to be Googlebot
    return if country_code == 'CH' || user_agent.include?('Googlebot')

    render 'errors/403', status: :forbidden
  end
end
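
Note that a User-Agent check alone is easy to defeat, since any client can send a `Googlebot` user agent. Google documents a stronger check: reverse-resolve the requesting IP, confirm the hostname ends in `googlebot.com` or `google.com`, then forward-resolve that hostname and confirm it maps back to the same IP. A minimal sketch of that verification, using Ruby's standard `resolv` library (the method name `verified_googlebot?` is my own; in a real app you would also want to cache results, since two DNS lookups per request is slow):

```ruby
require 'resolv'

# Verify a claimed Googlebot by double reverse/forward DNS lookup.
# Returns false for any IP that does not reverse-resolve to a
# googlebot.com / google.com hostname, or whose hostname does not
# forward-resolve back to the same IP.
def verified_googlebot?(ip)
  host = Resolv.getname(ip) # reverse DNS lookup
  return false unless host.end_with?('.googlebot.com', '.google.com')

  # Forward-confirm: the hostname must resolve back to the original IP
  Resolv.getaddresses(host).include?(ip)
rescue Resolv::ResolvError
  false
end
```

In the controller you could then use `verified_googlebot?(request.remote_ip)` instead of the user-agent string match (ideally only performing the DNS check when the user agent claims to be Googlebot, and caching the verdict per IP).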