Lock isn't deleted if task killed by hard time limit #40

Open

Description

@Strawl

If I set a hard time limit and a task exceeds that limit, the task gets killed, but celery-singleton doesn't delete the lock, so it keeps me from queuing new instances of that task.
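For context, my setup is roughly the following (module, broker URL and task name here are placeholders, not my real project; the relevant part is base=Singleton combined with a hard time_limit):

from celery import Celery
from celery_singleton import Singleton

# placeholder app/broker configuration
app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task(base=Singleton, time_limit=300)  # hard time limit of 300s, as in the log below
def some_task():
    # long-running work; when it runs past 300s the pool worker is SIGKILLed,
    # so the task body never returns and the singleton lock is never released
    ...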

Proof:
Hard time limit exceeded:

[2021-10-01 07:33:44,026: ERROR/MainProcess] Hard time limit (300s) exceeded for tasks.<some_task>[04c8d60c-c7c7-43f0-ad8c-0d478da732a0]
[2021-10-01 07:33:44,138: ERROR/MainProcess] Process 'ForkPoolWorker-3' pid:3944596 exited with 'signal 9 (SIGKILL)'

Then Celery says this task still exists when it is queued again:

Traceback (most recent call last):
  File "/home/build/dev/ansible-integration/project_overview/env/lib/python3.6/site-packages/celery/app/trace.py", line 515, in trace_task
    priority=task_priority
  File "/home/build/dev/ansible-integration/project_overview/env/lib/python3.6/site-packages/celery/canvas.py", line 219, in apply_async
    return _apply(args, kwargs, **options)
  File "/home/build/dev/ansible-integration/project_overview/env/lib/python3.6/site-packages/celery_singleton/singleton.py", line 116, in apply_async
    return self.on_duplicate(existing_task_id)
  File "/home/build/dev/ansible-integration/project_overview/env/lib/python3.6/site-packages/celery_singleton/singleton.py", line 141, in on_duplicate
    task_id=existing_task_id,
celery_singleton.exceptions.CelerySingletonException: Attempted to queue a duplicate of task ID 04c8d60c-c7c7-43f0-ad8c-0d478da732a0

For now I'll try using soft time limits, which should solve my problem altogether: that way my tasks can finish up normally, so the lock is deleted from celery-singleton's storage. I don't think this is really that big of an issue, but please address it somewhere so people know. A sketch of what I mean is below.
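Sketch of the workaround (the work and cleanup functions are placeholders): with a soft time limit, Celery raises SoftTimeLimitExceeded inside the task instead of SIGKILLing the process, so the task exits through the normal success/failure handling and the singleton lock gets released.

from celery.exceptions import SoftTimeLimitExceeded

@app.task(base=Singleton, soft_time_limit=300)
def some_task():
    try:
        do_long_running_work()  # placeholder for the actual work
    except SoftTimeLimitExceeded:
        clean_up()  # placeholder; the task still exits via the normal failure path,
                    # so celery-singleton can remove its lock

If I'm reading the celery-singleton README right, there is also a clear_locks helper that can be hooked to the worker_ready signal to wipe stale locks when a worker starts, which would at least recover from this situation after a restart:

from celery.signals import worker_ready
from celery_singleton import clear_locks

@worker_ready.connect
def unlock_all(**kwargs):
    clear_locks(app)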
