Environment
- Cloud provider: AWS
- Instance type: t3.medium (2 vCPU, 4 GB RAM)
- Runtime: Python 3.7, Flask, Gunicorn with the gevent worker class
- Process model: 8 workers × 16 threads each (note: Gunicorn applies `--threads` only to the gthread worker class, so with `-k gevent` the thread setting is effectively inert)
- Port: 5001
```shell
gunicorn main:app \
  -k gevent \
  -w 8 \
  --threads 16 \
  --bind :5001 \
  --timeout 3600 \
  --keep-alive 65 \
  --log-level info \
  --access-logfile - \
  --error-logfile - \
  --capture-output
```
Endpoints under test
| Service | Write endpoint | Read endpoint |
| --- | --- | --- |
| S3 | GET /write_to_s3 | GET /reade_to_s3 |
| DocumentDB | GET /wreite_to_documentdb | GET /reade_to_documentdb |
Test matrix
| Total requests | Concurrency |
| --- | --- |
| 500 | 100 |
| 1000 | 300 |
| 5000 | 1000 |
Benchmarking tool
Apache Bench (ab) with flags:
```shell
ab -n <total> -c <concurrency> http://127.0.0.1:5001/<endpoint>
```
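To avoid typing each invocation by hand, the test matrix above can be expanded into the full list of ab commands. A minimal sketch (the helper name and host default are my own, not part of the benchmark code):

```python
# Hypothetical helper: expands the test matrix into the ab command lines above.
MATRIX = [(500, 100), (1000, 300), (5000, 1000)]  # (total requests, concurrency)

def ab_commands(endpoint, host="http://127.0.0.1:5001"):
    """Return one ab invocation per (requests, concurrency) pair."""
    return [f"ab -n {n} -c {c} {host}/{endpoint}" for n, c in MATRIX]

for cmd in ab_commands("reade_to_s3"):
    print(cmd)
```

Each printed line can be pasted into a shell, or the list fed to `subprocess.run` to drive a full sweep.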
Key metrics
- Requests per second (RPS) = Complete requests / Time taken for tests
- Time per request (mean, per concurrent user) = Time taken / (Complete requests / Concurrency)
- Time per request (mean, across all requests) = Time taken / Complete requests
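As a worked example of the three metrics, take made-up sample numbers (500 completed requests in 2.5 s at concurrency 100; these figures are illustrative, not from a real run):

```python
# Sample figures (illustrative only).
complete_requests = 500
time_taken = 2.5      # seconds
concurrency = 100

rps = complete_requests / time_taken                                      # 200.0 req/s
mean_per_user = time_taken / (complete_requests / concurrency) * 1000     # 500.0 ms
mean_across_all = time_taken / complete_requests * 1000                   # ≈ 5 ms

print(rps, mean_per_user, mean_across_all)
```

These match the figures ab prints as "Requests per second", "Time per request (mean)", and "Time per request (mean, across all concurrent requests)".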
Reference clients
S3 helper
```python
import boto3
from botocore.exceptions import ClientError


class S3Facade:
    """Thin wrapper around the boto3 S3 client used by the write/read endpoints."""

    def __init__(self, bucket):
        self.bucket = bucket
        self.cli = boto3.client("s3", region_name="us-east-1")

    def upload(self, key, payload):
        return self.cli.put_object(Bucket=self.bucket, Key=key, Body=payload)

    def download(self, key):
        try:
            return self.cli.get_object(Bucket=self.bucket, Key=key)
        except ClientError:
            # Swallow client-side errors (missing key, throttling);
            # callers receive an empty dict instead.
            return {}

    def remove(self, key):
        return self.cli.delete_object(Bucket=self.bucket, Key=key)


if __name__ == "__main__":
    s3 = S3Facade("webtool-backend-legislation-qa")
    with open("s2022_85s_aos.xml", "rb") as fp:
        s3.upload("c0de6c8cda6794e00c86764af713edd7", fp.read().decode())
```
DocumentDB helper
```python
from pymongo import MongoClient


class DocStore:
    """Minimal DocumentDB client holding a single collection handle."""

    def __init__(self, db, coll):
        self.coll = MongoClient(
            "noahark-docudb-cluster.ce4ki3t0skfm.us-east-1.docdb.amazonaws.com",
            username="root",
            password="p5Vae0kgur2BpgZK",
            retryWrites=False,  # DocumentDB does not support retryable writes
            maxConnecting=20,
        )[db][coll]

    def insert(self, doc):
        return self.coll.insert_one(doc)

    def fetch(self, filt=None, **kw):
        return self.coll.find_one(filter=filt, **kw)


if __name__ == "__main__":
    store = DocStore("pressure_measurement", "object")
    print(store.fetch({"object_id": "c0de6c8cda6794e00c86764af713edd7"}))
```
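Amazon DocumentDB clusters enforce TLS by default, so the bare MongoClient call above typically needs extra connection options. A hedged sketch of the usual additions (the CA bundle filename and option values are assumptions, not taken from the helper above):

```python
# Hypothetical: MongoClient kwargs commonly required to reach DocumentDB
# over TLS. The CA bundle is downloaded separately from AWS; the path here
# is an assumption.
def docdb_client_kwargs(ca_bundle="global-bundle.pem"):
    return {
        "tls": True,
        "tlsCAFile": ca_bundle,
        "retryWrites": False,  # DocumentDB does not support retryable writes
        "replicaSet": "rs0",   # default replica set name on DocumentDB
    }
```

These would be splatted into the constructor, e.g. `MongoClient(host, username=..., password=..., **docdb_client_kwargs())`.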
Sample run results
S3 write path (500 req, 100 concurrent)
ERROR: botocore.exceptions.ClientError: An error occurred (SlowDown) when calling the PutObject operation (reached max retries: 4): Please reduce your request rate.
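The SlowDown response is S3 throttling the writers, and boto3's built-in retries (4 here) were exhausted. One generic mitigation is exponential backoff with jitter around the upload call; a minimal sketch (the `is_throttled` predicate, attempt count, and delay constants are assumptions, not part of the benchmark code):

```python
import random
import time

def retry_with_backoff(fn, is_throttled, max_attempts=5, base_delay=0.1):
    """Call fn(), retrying throttled failures with exponential backoff + jitter."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception as exc:
            if not is_throttled(exc) or attempt == max_attempts - 1:
                raise  # non-throttling error, or out of attempts
            # Full jitter: sleep a random amount up to base * 2^attempt.
            time.sleep(random.uniform(0, base_delay * 2 ** attempt))
```

For the S3 path this could wrap the upload, e.g. `retry_with_backoff(lambda: s3.upload(key, body), lambda e: "SlowDown" in str(e))`, at the cost of higher tail latency under load.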
S3 read path
```shell
ab -n 500 -c 100 http://127.0.0.1:5001/reade_to_s3
ab -n 1000 -c 300 http://127.0.0.1:5001/reade_to_s3
ab -n 5000 -c 1000 http://127.0.0.1:5001/reade_to_s3
```
DocumentDB write path
```shell
ab -n 500 -c 100 http://127.0.0.1:5001/wreite_to_documentdb
ab -n 1000 -c 300 http://127.0.0.1:5001/wreite_to_documentdb
ab -n 5000 -c 1000 http://127.0.0.1:5001/wreite_to_documentdb
```
DocumentDB read path
```shell
ab -n 500 -c 100 http://127.0.0.1:5001/reade_to_documentdb
ab -n 1000 -c 300 http://127.0.0.1:5001/reade_to_documentdb
ab -n 5000 -c 1000 http://127.0.0.1:5001/reade_to_documentdb
```