Depending on the plan you chose, you have access to a specific number of concurrent requests. This means that you can only send that many requests at the same time.

For example, if you need to make 100 requests and have an allowed concurrency of 5, you can send 5 requests at the same time. The simplest way to take advantage of this concurrency is to set up 5 workers / threads and have each of them send 20 requests.
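The split described above can be sketched with Python's built-in thread pool. In this hypothetical example, fake_request stands in for a real HTTP call and simply records which worker thread handled each of the 100 items, so you can see the work being shared across 5 workers:

```python
import threading
from collections import Counter
from multiprocessing.dummy import Pool as ThreadPool

def fake_request(i):
    # Placeholder for a real request: return the name of the
    # worker thread that processed this item.
    return threading.current_thread().name

items = list(range(100))        # 100 "requests" to make
pool = ThreadPool(5)            # allowed concurrency of 5
results = pool.map(fake_request, items)
pool.close()
pool.join()

# Count how many items each worker handled -- roughly 20 each.
per_worker = Counter(results)
print(per_worker)
```

All 100 items are processed, but never more than 5 at the same time, because the pool only ever runs 5 worker threads.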

Below you'll find some resources that can help you do that.


import requests
from multiprocessing.dummy import Pool as ThreadPool

def request_scrapingbee(url):
    # Fetch the target URL through the ScrapingBee API.
    r = requests.get(
        "https://app.scrapingbee.com/api/v1/",
        params={
            "api_key": "",
            "url": url,
        },
    )
    response = {
        "statusCode": r.status_code,
        "body": r.text,
        "url": url,
    }
    return response

concurrency = 2
pool = ThreadPool(concurrency)

urls = ["", ""]
results = pool.map(request_scrapingbee, urls)

for result in results:
    print(result)