According to the plan you chose, you will have access to a specific number of concurrent requests. This means that you can only send that many requests at the same time.

For example, if you need to make 100 requests and your plan allows a concurrency of 5, you can send 5 requests at the same time. The simplest way to take advantage of this concurrency is to set up 5 workers / threads and have each of them send 20 requests.

Below you'll find some resources that can help you do that.

Python

import requests
from multiprocessing.dummy import Pool as ThreadPool

def request_scrapingbee(url):
    # Forward the target URL to the ScrapingBee API endpoint.
    r = requests.get(
        url="https://app.scrapingbee.com/api/v1/",
        params={
            "api_key": "",
            "url": url,
        },
    )
    response = {
        "statusCode": r.status_code,
        "body": r.text,
        "url": url,
    }
    return response

concurrency = 2
pool = ThreadPool(concurrency)

# pool.map distributes the URLs across the worker threads,
# keeping at most `concurrency` requests in flight at once.
urls = ["", ""]
results = pool.map(request_scrapingbee, urls)
pool.close()
pool.join()

for result in results:
    print(result)
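As an alternative to multiprocessing.dummy, the standard library's concurrent.futures module offers the same thread-pool pattern with a context manager that handles cleanup for you. The sketch below uses a hypothetical fetch function as a stand-in for request_scrapingbee above, so it runs without network access; in real use you would call the ScrapingBee API inside it.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    # Hypothetical stand-in for request_scrapingbee so the sketch
    # runs offline; replace the body with the real API call.
    return {"url": url, "statusCode": 200}

# 100 URLs processed with a concurrency of 5, matching the
# 100-requests / 5-workers example from the text.
urls = [f"https://example.com/page/{i}" for i in range(100)]
concurrency = 5

# The executor keeps at most `concurrency` calls in flight at once,
# so no manual splitting into batches of 20 is needed.
with ThreadPoolExecutor(max_workers=concurrency) as pool:
    results = list(pool.map(fetch, urls))

print(len(results))  # 100
```

Because the with block waits for all workers to finish, there is no need for explicit close() / join() calls.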