How to solve HTTP error 429 "too many requests"


I tried the following code, but after scraping a few pages it fails with HTTP error 429, "too many requests". Is there any way to add a delay to the loop so the job can finish (one way to throttle the requests is sketched after the code below)? Note that the code works for a single page, and all the helper functions I created run correctly.

    library(rvest)      # read_html(), html_nodes(), html_attr(), html_text2()
    library(magrittr)   # provides the %>% pipe

    CARS <- data.frame()

    for (page_result in seq(from = 1, to = 2)) {
      link <- paste0("https://www.car.gr/classifieds/cars/?condition=used&modified=15&offer_type=sale&pg=",
                     page_result, "&registration-from=2000&significant_damage=f")

      # read the listing page once (the extra curl::curl() call was redundant)
      page <- read_html(link)

      Title <- page %>% html_nodes(".title") %>% html_text2()
      Price <- page %>% html_nodes(".price-fmt") %>% html_text2()

      car_links <- page %>% html_nodes(".row-anchor") %>%
        html_attr("href") %>% paste0("https://www.car.gr", .)

      # each of these helpers makes one request per car link
      color          <- sapply(car_links, get_color)
      gearing_system <- sapply(car_links, get_gearing_sys)
      HP             <- sapply(car_links, get_car_HP)
      CC             <- sapply(car_links, get_car_CC)
      FUEL           <- sapply(car_links, get_car_fuel)
      KM             <- sapply(car_links, get_car_km)
      DATE           <- sapply(car_links, get_car_km)   # NOTE: reuses get_car_km; probably meant to call a date helper
      CATEGORY       <- sapply(car_links, get_car_category)

      # bind this page's columns as new rows of the results data frame
      CARS <- rbind(CARS, data.frame(Title, Price, color, gearing_system,
                                     HP, CC, FUEL, KM, DATE, CATEGORY))

      cat("Page", page_result, "of 2\n")
    }
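
To address the 429s themselves, the usual fix is to slow down and back off: pause between requests and, if the server still answers 429, wait (honouring the Retry-After header when present) before retrying. Below is a minimal sketch of such a helper using httr. The function name `polite_read_html`, its `base_delay`/`max_tries` parameters, and the user-agent string are assumptions, not part of the original code; you would call it in place of `read_html()` both in the loop above and inside the `get_*` helpers, since those make one request per car link.

    library(httr)    # GET(), status_code(), headers(), content(), user_agent()
    library(rvest)   # read_html()

    # Hypothetical helper: fetch a page politely, pausing before every request
    # and backing off when the server replies with HTTP 429.
    polite_read_html <- function(url, base_delay = 2, max_tries = 5) {
      for (attempt in seq_len(max_tries)) {
        Sys.sleep(base_delay)                      # fixed pause before each request
        resp <- GET(url, user_agent("car-scraper (personal project)"))
        if (status_code(resp) != 429) {
          return(read_html(content(resp, as = "text", encoding = "UTF-8")))
        }
        # 429: honour Retry-After if the server sends it, otherwise back off exponentially
        wait <- suppressWarnings(as.numeric(headers(resp)[["retry-after"]]))
        if (length(wait) == 0 || is.na(wait)) wait <- base_delay * 2^attempt
        message("Got 429 for ", url, "; waiting ", wait, " seconds before retrying")
        Sys.sleep(wait)
      }
      stop("Still rate-limited after ", max_tries, " attempts: ", url)
    }

Often a plain `Sys.sleep(1)` or `Sys.sleep(2)` between detail-page requests (i.e. inside the `get_*` functions) is already enough; the retry/backoff part only matters if the site keeps rate-limiting despite the pause.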