I am trying to search for multiple terms on Wikidata and then parse the results locally with Python.
I am currently looping through a list of terms and running the following code:
import requests

base_url = "https://www.wikidata.org/w/api.php"
term_list = ["term a", "term b"]

for search_term in term_list:
    payload = {
        "action": "query",
        "list": "search",
        "srsearch": search_term,
        "language": "en",
        "format": "json",
        "origin": "*",
    }
    res = requests.get(base_url, params=payload)
    data = res.json()
This takes a lot of time, since each iteration makes a separate HTTP request.
Is there a way to send a batch of terms to the Wikidata API in a single request, saving me time and sparing the API some load?
Edit:
After digging deeper in Phabricator, it seems this isn't actually possible (https://phabricator.wikimedia.org/T194016). If anyone has more information on it, that would be very useful.
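Since the API apparently offers no batch search endpoint, one possible workaround is to keep the per-term requests but issue them concurrently and reuse a single connection. The sketch below (the function names `search`, `extract_titles`, and `search_all` are my own, not part of any API) uses a shared `requests.Session` and a small `ThreadPoolExecutor`; it assumes the same `action=query&list=search` call as above.

```python
# Hypothetical workaround: no batch endpoint exists, so run the
# individual searches concurrently over one reused connection.
from concurrent.futures import ThreadPoolExecutor

import requests

BASE_URL = "https://www.wikidata.org/w/api.php"


def search(session, term):
    """Run one full-text search; return the parsed JSON payload."""
    payload = {
        "action": "query",
        "list": "search",
        "srsearch": term,
        "format": "json",
        "origin": "*",
    }
    res = session.get(BASE_URL, params=payload, timeout=10)
    res.raise_for_status()
    return res.json()


def extract_titles(data):
    """Pull the page titles out of one search response."""
    return [hit["title"] for hit in data.get("query", {}).get("search", [])]


def search_all(terms, max_workers=5):
    """Fan the searches out over a small thread pool.

    Keep max_workers modest to stay polite to the API; the Session
    lets requests reuse the underlying TCP connection.
    """
    with requests.Session() as session:
        with ThreadPoolExecutor(max_workers=max_workers) as pool:
            results = list(pool.map(lambda t: search(session, t), terms))
    return dict(zip(terms, results))


if __name__ == "__main__":
    for term, data in search_all(["term a", "term b"]).items():
        print(term, extract_titles(data))
```

This doesn't reduce the number of requests, only the wall-clock time, so it may not help with the "saving resources to the API" part of the question.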