Site links have started giving 403 errors through Facebook debugger


Over the last couple of days, I've noticed some odd behaviour in the Facebook debugger (which is also used for Threads).

For instance, this URL refuses to show a preview in the debugger:

https://www.hot-dinners.com/Features/Hot-Dinners-recommends/new-london-restaurants-opening-in-march-2024

The error that comes back is "URL returned a bad HTTP response code" (i.e. a 403 error).

It's happening with other pages too - some of which appeared to work in the last crawl. I talked to the host and they say nothing has changed on their end. Can anyone work out if there's an issue with this page that I'm not seeing - or point me in the right direction to get more info? Thanks!

Direct link to debugger: https://developers.facebook.com/tools/debug/?q=https%3A%2F%2Fwww.hot-dinners.com%2FFeatures%2FHot-Dinners-recommends%2Fnew-london-restaurants-opening-in-march-2024

3 Answers

Answered by itoctopus

The first step is to check your server logs: why is the server returning a 403 to Facebook? Is the scraper's IP blocked? Is the bot blocked in the .htaccess file? Is there a mod_security rule that your host recently installed? Do you have a security plugin that might be causing it? If you do, try disabling that plugin for a few moments and see if that fixes it.

If you can't find the traffic anywhere in your logs, then it is being blocked at a higher level, possibly by your hosting platform's data center firewall. It might also be a temporary hiccup somewhere that will resolve itself.
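A quick way to narrow this down is to request the page yourself with Facebook's crawler user agent and compare the result with a normal browser request: if only the crawler UA gets a 403, the block is user-agent based (e.g. .htaccess, a security plugin, or a mod_security rule). The sketch below assumes the commonly published facebookexternalhit user-agent string and the URL from the question - check your access logs for the exact string Facebook actually sends.

```python
# Minimal sketch: compare the response for a normal browser user agent vs.
# Facebook's crawler user agent. The crawler UA string below is an assumption
# based on Facebook's published identification; adjust it to match your logs.
import urllib.error
import urllib.request

URL = ("https://www.hot-dinners.com/Features/Hot-Dinners-recommends/"
       "new-london-restaurants-opening-in-march-2024")

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "facebook": "facebookexternalhit/1.1 (+http://www.facebook.com/externalhit_uatext.php)",
}

for name, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=15) as resp:
            print(f"{name}: HTTP {resp.getcode()}")  # e.g. 200 means not blocked
    except urllib.error.HTTPError as err:
        print(f"{name}: HTTP {err.code}")  # a 403 only for "facebook" points at UA blocking
```

Note that this test comes from your own IP, so it can only reveal user-agent based blocking; an IP-range block on Facebook's crawler would still have to be found in the server or firewall logs.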

Answered by Luis Flores

It seems to be a Facebook error; we have been experiencing the same issue since yesterday. Our server and pages are fine.

Answered by Gavin Hanly

General update to this: the issue was indeed on Facebook's side. You now need to explicitly allow "facebookexternalhit" in your robots.txt. It takes a few hours for Facebook to crawl the new robots.txt file, but it will eventually work.
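For reference, a minimal sketch of that change is below; the facebookexternalhit user-agent token comes from this answer, and the trailing comment stands in for whatever rules your robots.txt already contains.

```
# Explicitly allow Facebook's link-preview crawler
User-agent: facebookexternalhit
Allow: /

# ... keep any existing rules for other user agents below ...
```

Once the file is updated, the Sharing Debugger's "Scrape Again" button will re-fetch the page on demand, although as noted above it can still take a few hours for Facebook to pick up the new robots.txt itself.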