I am trying to get all links with pagination. I am using Selenium. Since there are more than ten thousand records, it will take time to get all the links and some information from them. I want to use pagination: after taking ten links, it should click the next button and take the next ten links. If you have any suggestion other than Selenium, I am happy to hear it. Here is my code; how can I integrate pagination into it? Thanks for your help.
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from selenium.webdriver.common.by import By
import pandas as pd
import undetected_chromedriver as uc
import time
website='https://clinicaltrials.gov/ct2/results?cond=&term=&cntry=TR&state=&city=&dist='
path=r"C:\Users\kaant\Downloads\chromedriver.exe"
service=Service(executable_path=path)
#driver=uc.Chrome(service=service)
#driver.get(website)
options = uc.ChromeOptions()
options.headless = True
driver = uc.Chrome(options=options)  # pass the options, otherwise they are ignored
driver.get(website)
driver.maximize_window()
time.sleep(2)  # give the results table time to render

# collect every link in the results table of the current page
country_links = [element.get_attribute("href")
                 for element in driver.find_elements(By.XPATH, ".//td/a")]

# keep every other link starting from the second, plus the last one
cc = country_links[1:-1:2]
cc.append(country_links[-1])
Then it should go to the next page and do the same.
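Roughly, I imagine a loop like the sketch below, continuing from the code above; the locator for the next-page button is only a guess, I have not checked the real page for it:

# Rough sketch of the pagination loop I have in mind; continues from the code above.
# NOTE: the "next page" locator below is a guess, not taken from the real page.
all_links = []
while True:
    time.sleep(2)  # give the results table time to render
    page_links = [a.get_attribute("href")
                  for a in driver.find_elements(By.XPATH, ".//td/a")]
    all_links.extend(page_links[1::2])  # keep every other link, as above
    # hypothetical locator for the "next" button
    next_buttons = driver.find_elements(By.CSS_SELECTOR, "a.paginate_button.next")
    if not next_buttons or "disabled" in (next_buttons[0].get_attribute("class") or ""):
        break  # no next button, or it is disabled: last page reached
    next_buttons[0].click()

print(len(all_links))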
You can use their Ajax API to download the data, for example:
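As a rough sketch (the endpoint, query expression, and response keys below are assumptions based on the public study_fields API, not necessarily the exact Ajax call the results page makes):

import requests

# Assumed endpoint and parameters: the public ClinicalTrials.gov "study_fields"
# query API, paged by rank (max_rnk - min_rnk + 1 records per request).
URL = "https://clinicaltrials.gov/api/query/study_fields"
PAGE_SIZE = 100

params = {
    "expr": "SEARCH[Location](AREA[LocationCountry]Turkey)",  # assumed expression for cntry=TR
    "fields": "NCTId,BriefTitle,OverallStatus",
    "min_rnk": 1,
    "max_rnk": PAGE_SIZE,
    "fmt": "json",
}

studies = []
while True:
    response = requests.get(URL, params=params, timeout=30)
    response.raise_for_status()
    payload = response.json()["StudyFieldsResponse"]
    batch = payload.get("StudyFields", [])
    if not batch:
        break  # no more records in this rank window
    studies.extend(batch)
    # advance the rank window to fetch the next "page"
    params["min_rnk"] += PAGE_SIZE
    params["max_rnk"] += PAGE_SIZE

for study in studies[:10]:
    nct_id = study["NCTId"][0]  # each field comes back as a list of strings
    # the study page URL follows the NCT id
    print(f"https://clinicaltrials.gov/ct2/show/{nct_id}", study["BriefTitle"][0])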
Prints: