I have a FastAPI application using Uvicorn, SQLAlchemy, and MySQL. When I send many requests to the API, I encounter the error:
sqlalchemy.exc.TimeoutError: QueuePool limit of size 10 overflow 20 reached
Configuration Details: I've configured the database settings as follows:
engine = create_engine(settings.DATABASE_URI, pool_pre_ping=True, pool_size=10, max_overflow=20)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Endpoint Database Access: Database access within endpoints is managed using a dependency named get_db() as follows:
from database import SessionLocal

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
I expected the application to handle requests sequentially, since I'm running Uvicorn with a single worker. However, I'm puzzled to see more than one connection checked out from the pool at the same time. I've reviewed my configuration and usage patterns but can't pinpoint why this concurrency is occurring.
Following up on @FiddlingAway's comment: if a request hits an error, for example, the connection may never be closed, and leaked connections build up until the pool is exhausted. You can use a context manager to handle your database session and ensure it's always closed after you've finished with it:
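Here is a minimal sketch of that approach, assuming SQLAlchemy 1.4+ (where `Session` supports the `with` protocol). The SQLite URI is only for illustration; substitute your `settings.DATABASE_URI` and pool settings:

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

# Illustrative engine; in your app this would be
# create_engine(settings.DATABASE_URI, pool_pre_ping=True,
#               pool_size=10, max_overflow=20)
engine = create_engine("sqlite://", pool_pre_ping=True)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

def get_db():
    # `with` guarantees db.close() runs on normal exit *and* when an
    # exception is thrown into the generator, so the connection is
    # returned to the pool instead of leaking.
    with SessionLocal() as db:
        yield db
```

When an endpoint raises, FastAPI throws the exception into the generator at the `yield`; the `with` block's exit handler then closes the session before the exception propagates, which is exactly the guarantee the bare `try`/`finally` was meant to provide.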