
Issue with Django DATABASES CONN_MAX_AGE #428

@martinlehoux

Description


From time to time, my Postgres database gets cluttered with idle connections, which can lead to new connection failures and make my application unavailable.

So my solution was to set idle_session_timeout and idle_in_transaction_session_timeout to 15 minutes on the database, which should be fine because Django is configured with CONN_MAX_AGE = 600 (10 minutes).
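For context, this is roughly the setup I mean (the database name is a placeholder):

# settings.py (relevant part)
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",  # placeholder
        "CONN_MAX_AGE": 600,  # reuse connections for up to 10 minutes
    }
}

# on the PostgreSQL side (idle_session_timeout requires PostgreSQL 14+):
#   idle_session_timeout = '15min'
#   idle_in_transaction_session_timeout = '15min'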

But now my worker, which runs with --pool gevent, fails from time to time, and the stack trace shows:

# celery/worker/request.py
self.task.backend.mark_as_revoked(...)
...
# django_celery_results/backends/database.py
self.TaskModel._default_manager.store_result(**task_props)
...
# psycopg/connection.py
def _check_connection_ok(...):
	raise e.OperationalError("the connection is closed")

My understanding is that the connection was closed by the database before the application closed it. My investigation showed that Django manages the connection lifecycle with two signals:

# django/db/__init__.py
def close_old_connections(**kwargs):
    for conn in connections.all(initialized_only=True):
        conn.close_if_unusable_or_obsolete()


signals.request_started.connect(close_old_connections)
signals.request_finished.connect(close_old_connections)

But I couldn't find such a mechanism in Django Celery Results, so my questions are:

  • Is there a mechanism to manage the Postgres connection lifecycle in Django Celery Results?
  • If there is, is there something I'm missing to make CONN_MAX_AGE work?
  • If not, wouldn't it be good to add one? With some help, I think I could do something resembling what Django does with signals (see the sketch after this list).
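For what it's worth, here is roughly what I have in mind. It is only an untested sketch that mirrors Django's request signals with Celery's task_prerun/task_postrun signals; it probably would not cover the mark_as_revoked path in my traceback, since that runs outside the task body.

# loaded at worker startup, e.g. in the app's celery.py (sketch, untested)
from celery.signals import task_prerun, task_postrun
from django.db import close_old_connections


@task_prerun.connect
def close_connections_before_task(**kwargs):
    # drop connections that exceeded CONN_MAX_AGE or were closed by the server
    close_old_connections()


@task_postrun.connect
def close_connections_after_task(**kwargs):
    close_old_connections()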
