I am web scraping with RSelenium. After the browser hangs on a website for a few minutes, I get the following error message:
"Error in .Call(R_curl_fetch_memory, enc2utf8(url), handle, nonblocking) : reached elapsed time limit"
It always hangs during the following operation:
remDr$navigate(URL)
After catching the error with an error handler, I want to close the current window and start a new webdriver. Unfortunately, I can no longer connect to the current window to perform any operation, so I also can't find a way to close it: the operation was aborted because of an application callback, and I suspect the connection was reset.
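For reference, the error handling I have in mind looks roughly like this. This is only a sketch: `safe_navigate` is a helper name I made up, and it assumes the "reached elapsed time limit" condition actually surfaces as a catchable error and that the Selenium server on port 4444 is still reachable.

```r
library(RSelenium)

# Hypothetical helper: navigate, and on a timeout error discard the dead
# session and return a freshly opened driver instead.
safe_navigate <- function(remDr, url) {
  tryCatch({
    remDr$navigate(url)
    remDr
  }, error = function(e) {
    # The old session is usually unreachable at this point; closeall()
    # is attempted anyway in case the server still responds.
    try(remDr$closeall(), silent = TRUE)
    newDr <- remoteDriver(browserName = "chrome", port = 4444L)
    newDr$open()
    newDr$navigate(url)
    newDr
  })
}
```

The problem is that in my case even the `closeall()` inside the handler fails, so the old window stays open.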
Since the website I am scraping hangs quite often, after a while tens of windows are open, which slows everything down.
I don't know if it is relevant, but the webdriver is set up as follows:

prefs <- list("profile.managed_default_content_settings.images" = 2L,
              "profile.default_content_settings.popups" = 0L)
cprof <- list(chromeOptions = list(prefs = prefs,
                                   excludeSwitches = list("disable-popup-blocking"),
                                   w3c = FALSE))
remDr <- remoteDriver(browserName = "chrome", extraCapabilities = cprof, port = 4444L)
Any help is appreciated.
I have tried the following ways to close the window, but nothing helps:
- remDr$close()
- remDr$quit()
- remDr$closeWindow()
- remDr$closeall()
Each attempt only returns a message that no connection could be made to the server.
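Since none of the session-level calls get through, my current workaround idea is to kill the orphaned browser processes at the operating-system level before starting a fresh driver. This is a sketch only; the process names (`chrome.exe`, `chromedriver.exe`) are what a default Windows Chrome setup uses and may differ on your machine.

```r
# Kill leftover Chrome/chromedriver processes when the Selenium session
# is unreachable, then a new remoteDriver can be started cleanly.
if (.Platform$OS.type == "windows") {
  system("taskkill /F /IM chrome.exe /T",
         ignore.stdout = TRUE, ignore.stderr = TRUE)
  system("taskkill /F /IM chromedriver.exe /T",
         ignore.stdout = TRUE, ignore.stderr = TRUE)
} else {
  # Linux/macOS equivalent
  system("pkill -f chromedriver", ignore.stderr = TRUE)
}
```

This feels heavy-handed, though, so a cleaner way to close the dead window would be preferable.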
If you inspect the Network tab in the browser's developer tools, you will see that the site fetches its data from an API. You can do the same, like so: