Laravel Goutte crawler timeout does not work with "Too many redirects"


I am parsing remote websites with Laravel Goutte, and if a remote website has a "Too many redirects" error (I can see it in Google Chrome), the crawler just freezes and nothing happens.

use Goutte\Client;
use Symfony\Component\HttpClient\HttpClient;

try {
    $client = new Client(HttpClient::create(['timeout' => 5, 'max_redirects' => 1]));
    $client->request('GET', 'https://' . $domain_name);
} catch (\Exception $e) {
    $this->error($e->getMessage());
}

I expected that if the website could not be reached within 5 seconds, the request would be dropped and the exception caught, but nothing happens for a very long time (it just freezes on the request line). How can I handle websites with "Too many redirects" errors?
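Update: here is a sketch of a workaround I am considering, assuming Goutte 4.x on top of Symfony HttpClient 4.4+. My understanding is that 'timeout' is only an inactivity timeout, which a server that keeps answering redirects quickly never triggers, so 'max_duration' is used to cap the total request time instead. Also, as far as I can tell, Goutte's Client (Symfony BrowserKit) follows redirects itself and ignores the HttpClient-level 'max_redirects' option, so the limit has to be set on the browser with setMaxRedirects():

use Goutte\Client;
use Symfony\Component\HttpClient\HttpClient;

try {
    // 'max_duration' caps the total time of the request; 'timeout' alone
    // only fires on inactivity, which a fast redirect loop never hits.
    $client = new Client(HttpClient::create([
        'timeout'      => 5,
        'max_duration' => 5,
    ]));

    // Limit redirects on the BrowserKit side, which is where Goutte
    // actually follows them (unlimited by default, so a loop never ends).
    $client->setMaxRedirects(1);

    $client->request('GET', 'https://' . $domain_name);
} catch (\Exception $e) {
    // A redirect loop should now surface as a catchable exception
    // ("The maximum number of redirections was reached.").
    $this->error($e->getMessage());
}

With this, a looping site should fail fast with a catchable exception instead of hanging, but I have not been able to verify it against the domains in question.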

P.S. I don't know if I'm allowed to post the domain I'm querying (I tried many domains that give this error); if so, I can post it in the comments.
