Problem
The DownloadTracker._downloads dictionary grows without bound. Completed downloads remain in memory forever, causing a memory leak in long-running applications.
Current behavior:
```python
# tracker.py - downloads never removed
self._downloads[url] = DownloadInfo(...)  # Stays forever
```

Impact on long-running apps:
- 100,000 downloads × ~500 bytes/info = ~50MB minimum
- Grows indefinitely with no cleanup mechanism
- Speed metrics are cleared, but the DownloadInfo entries persist
Proposed Solutions
Option A: Manual clear method (simplest, non-breaking)
```python
def clear_completed(self, max_age: timedelta | None = None) -> int:
    """Remove completed downloads from memory.

    Args:
        max_age: Only clear downloads that completed more than this
            duration ago. None = clear all completed downloads.

    Returns:
        Number of downloads cleared.
    """
    # Implementation
```
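A minimal sketch of what the body could look like, assuming each DownloadInfo exposes a `status` field and a `completed_at` timestamp (both attribute names are assumptions; adjust to the real model):

```python
from datetime import datetime, timedelta, timezone

# Inside DownloadTracker (sketch only)
def clear_completed(self, max_age: timedelta | None = None) -> int:
    now = datetime.now(timezone.utc)
    to_remove = [
        url
        for url, info in self._downloads.items()
        # `status` and `completed_at` are assumed attribute names
        if info.status == "completed"
        and (max_age is None or now - info.completed_at >= max_age)
    ]
    for url in to_remove:
        del self._downloads[url]
    return len(to_remove)
```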
Option B: Automatic cleanup with retention policy

```python
class DownloadTracker:
    def __init__(
        self,
        retention_policy: RetentionPolicy | None = None,  # New param
        logger: "loguru.Logger" = get_logger(__name__),
    ):
        # Auto-cleanup completed downloads after N hours
        ...
```
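One possible shape for the policy object, purely illustrative (RetentionPolicy, its fields, and _apply_retention are assumptions, not existing API):

```python
from dataclasses import dataclass
from datetime import timedelta

@dataclass(frozen=True)
class RetentionPolicy:
    """Hypothetical policy controlling automatic cleanup."""
    max_age: timedelta = timedelta(hours=24)  # drop completed entries older than this
    max_entries: int | None = None            # optional hard cap on tracked downloads

# Inside DownloadTracker (sketch only), called after each download finishes:
def _apply_retention(self) -> None:
    if self._retention_policy is None:
        return
    self.clear_completed(max_age=self._retention_policy.max_age)
    if self._retention_policy.max_entries is not None:
        # Evict oldest entries beyond the cap (assumes insertion order ~ age).
        while len(self._downloads) > self._retention_policy.max_entries:
            del self._downloads[next(iter(self._downloads))]
```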
Option C: Bounded cache with LRU eviction
Use an LRU-style bounded cache (in the spirit of functools.lru_cache) for download info storage, evicting the least recently used entries once a size limit is reached.
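functools.lru_cache only memoizes function calls, so a bounded store would more likely be hand-rolled on top of an OrderedDict; a rough sketch (the class name and max_entries default are illustrative assumptions):

```python
from collections import OrderedDict

class BoundedDownloadStore:
    """LRU-style bounded mapping from URL to DownloadInfo (illustrative only)."""

    def __init__(self, max_entries: int = 10_000) -> None:
        self._max_entries = max_entries
        self._entries: OrderedDict[str, "DownloadInfo"] = OrderedDict()

    def put(self, url: str, info: "DownloadInfo") -> None:
        self._entries[url] = info
        self._entries.move_to_end(url)          # mark as most recently used
        if len(self._entries) > self._max_entries:
            self._entries.popitem(last=False)   # evict least recently used

    def get(self, url: str) -> "DownloadInfo | None":
        info = self._entries.get(url)
        if info is not None:
            self._entries.move_to_end(url)
        return info
```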
Leaning towards
Option A for 0.x releases, then consider Option B for 1.0 if usage patterns warrant it.
Tasks
- Implement clear_completed() method
- Add clear_failed() method (optional)
- Add tests for clearing logic
- Document memory management in README
- Add example in docs showing periodic cleanup (see the sketch after this list)
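For that last docs item, the example could be as small as a background timer that calls clear_completed() on a schedule; this assumes Option A lands and is only a sketch (the function name and interval are illustrative):

```python
import threading
from datetime import timedelta

def start_periodic_cleanup(tracker: "DownloadTracker", interval_seconds: float = 3600.0) -> threading.Timer:
    """Run tracker.clear_completed() once per interval (sketch)."""
    def _run() -> None:
        # Drop anything that finished more than an hour ago (arbitrary example value).
        tracker.clear_completed(max_age=timedelta(hours=1))
        start_periodic_cleanup(tracker, interval_seconds)  # reschedule

    timer = threading.Timer(interval_seconds, _run)
    timer.daemon = True
    timer.start()
    return timer
```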
Priority
Medium - Not critical for 1.0 if documented, but should be addressed.
Labels
enhancement, memory, medium-priority