diff --git a/README.md b/README.md index 59f1b98..28923a7 100644 --- a/README.md +++ b/README.md @@ -38,6 +38,7 @@ Looking to **upgrade from V1 to V2**? Look [here](#upgrading-from-v1-to-v2) - [REMOVE_SLOW](#remove_slow) - [REMOVE_STALLED](#remove_stalled) - [REMOVE_UNMONITORED](#remove_unmonitored) + - [REMOVE_DONE_SEEDING](#remove_done_seeding) - [SEARCH_CUTOFF_UNMET](#search_unmet_cutoff) - [SEARCH_MISSING](#search_missing) - [DETECT_DELETIONS](#detect_deletions) @@ -49,8 +50,6 @@ Looking to **upgrade from V1 to V2**? Look [here](#upgrading-from-v1-to-v2) - [WHISPARR](#whisparr) - [Downloaders](#download-clients) - [QBITTORRENT](#qbittorrent) - - [SABNZBD](#sabnzbd) -- [Disclaimer](#disclaimer) ## Overview @@ -69,6 +68,7 @@ Feature overview: - Removing downloads that are repeatedly have been found to be slow (remove_slow) - Removing downloads that are stalled (remove_stalled) - Removing downloads belonging to movies/series/albums etc. that have been marked as "unmonitored" (remove_unmonitored) +- Removing completed downloads from your download client that match certain criteria (remove_done_seeding) - Periodically searching for better content on movies/series/albums etc. where cutoff has not been reached yet (search_cutoff_unmet) - Periodically searching for missing content that has not yet been found (search_missing) @@ -227,6 +227,11 @@ services: # As written above, these can also be set as Job Defaults so you don't have to specify them as granular as below. # REMOVE_BAD_FILES: | # keep_archives: True + # REMOVE_DONE_SEEDING: | + # target_tags: + # - "Obsolete" + # target_categories: + # - "autobrr" # REMOVE_FAILED_DOWNLOADS: True # REMOVE_FAILED_IMPORTS: | # message_patterns: @@ -326,9 +331,9 @@ Decluttarr v2 is a major update with a cleaner config format and powerful new fe - ๐Ÿงผ **Bad files handling**: Added ability to not download potentially malicious files and files such as trailers / samples - ๐ŸŒ **Adaptive slowness**: Slow downloads-removal can be dynamically turned on/off depending on overall bandwidth usage - ๐Ÿ“„ **Log files**: Logs can now be retrieved from a log file -- ๐Ÿ“Œ **Removal behavior**: Rather than removing downloads, they can now also be tagged for later removal (ie. to allow for seed targets to be reached first). This can be done separately for private and public trackers +- ๐Ÿ—‘๏ธ **Removal behavior**: Rather than removing downloads, they can now also be tagged for later removal (ie. to allow for seed targets to be reached first). This can be done separately for private and public trackers - ๐Ÿ“Œ **Deletion detection**: If movies or tv shows get deleted (for instance via Plex), decluttarr can notice that and refresh the respective item - +- โ›“๏ธ **Being a good seeder**: A new job allows you to wait with the removal until your seed goals have been achieved --- ### โš ๏ธ Breaking Changes @@ -407,7 +412,7 @@ Configures the general behavior of the application (across all features) - Allows you to configure download client names that will be skipped by decluttarr Note: The names provided here have to 100% match with how you have named your download clients in your *arr application(s) - Type: List of strings -- Is Mandatory: No (Defaults to [], i.e. nothing ignored]) +- Is Mandatory: No (Defaults to [], i.e. nothing ignored) #### PRIVATE_TRACKER_HANDLING / PUBLIC_TRACKER_HANDLING @@ -496,6 +501,30 @@ This is the interesting section. 
It defines which job you want decluttarr to run
- This may be helpful if you use a tool such as [unpackerr](https://github.com/Unpackerr/unpackerr) that can handle it
- However, you may also find that these packages may contain bad/malicious files (which will not removed by decluttarr)

+#### REMOVE_DONE_SEEDING
+
+- Removes downloads that have finished downloading and are done seeding from the download client's queue, provided they match your selection criteria (tags and/or categories).
+- "Done seeding" means that the ratio limit or seeding-time limit for the download has been reached
+- These limits come from the download-specific overrides or, if none are set, from your download client's global settings
+- Type: Boolean or Dict
+- Permissible Values:
+  - If Bool: True, False
+  - If Dict:
+    - `target_tags`: List of tag names to match
+    - `target_categories`: List of category names to match
+- Matching logic:
+  - Requires at least one of `target_tags` or `target_categories`. If neither is provided, the configured obsolete tag is used as the target tag
+  - A torrent must be completed AND match at least one criterion: (category IN `target_categories`) OR (any tag IN `target_tags`)
+  - If both tags and categories are provided, they are combined with OR
+- Is Mandatory: No (Defaults to False)
+- Notes:
+  - This job currently only supports qBittorrent
+  - Works well together with `obsolete_tag`: have other jobs tag torrents (e.g., "Obsolete") and let this job remove them once they are completed
+  - Why not simply enable "Remove torrent and its files" upon reaching seeding goals in your download client?
+    - That setting is discouraged by the *arrs, and they will warn you about it
+    - You get more granular control
+    - You can use this job to clean up after other apps such as autobrr that do not have any torrent-management features
+
#### REMOVE_FAILED_DOWNLOADS

- Steers whether downloads that are marked as "failed" are removed from the queue
@@ -584,6 +613,7 @@ This is the interesting section. 
It defines which job you want decluttarr to run - Permissible Values: True, False - Is Mandatory: No (Defaults to False) + #### SEARCH_UNMET_CUTOFF - Steers whether searches are automatically triggered for items that are wanted and have not yet met the cutoff diff --git a/config/config_example.yaml b/config/config_example.yaml index 59bcb15..71ec755 100644 --- a/config/config_example.yaml +++ b/config/config_example.yaml @@ -17,6 +17,11 @@ job_defaults: jobs: remove_bad_files: # keep_archives: true + remove_done_seeding: + # target_tags: + # - "Obsolete" + # target_categories: + # - "autobrr" remove_failed_downloads: remove_failed_imports: message_patterns: diff --git a/main.py b/main.py index 464468d..6b989ad 100644 --- a/main.py +++ b/main.py @@ -67,9 +67,11 @@ async def main(): await job_manager.run_jobs(arr) logger.verbose("") + # Run download client jobs (these run independently of *arr instances) + await job_manager.run_download_client_jobs() + # Wait for the next run await wait_next_run() - return if __name__ == "__main__": diff --git a/src/job_manager.py b/src/job_manager.py index ba8c819..978d10b 100644 --- a/src/job_manager.py +++ b/src/job_manager.py @@ -1,5 +1,6 @@ # Cleans the download queue from src.jobs.remove_bad_files import RemoveBadFiles +from src.jobs.remove_done_seeding import RemoveDoneSeeding from src.jobs.remove_failed_downloads import RemoveFailedDownloads from src.jobs.remove_failed_imports import RemoveFailedImports from src.jobs.remove_metadata_missing import RemoveMetadataMissing @@ -9,6 +10,7 @@ from src.jobs.remove_stalled import RemoveStalled from src.jobs.remove_unmonitored import RemoveUnmonitored from src.jobs.search_handler import SearchHandler +from src.settings._download_clients import DOWNLOAD_CLIENT_TYPES from src.utils.log_setup import logger from src.utils.queue_manager import QueueManager @@ -25,6 +27,39 @@ async def run_jobs(self, arr): await self.removal_jobs() await self.search_jobs() + async def run_download_client_jobs(self): + """Run jobs that operate on download clients directly.""" + if not await self._download_clients_connected(): + return None + + items_detected = 0 + for download_client_type in DOWNLOAD_CLIENT_TYPES: + download_clients = getattr( + self.settings.download_clients, + download_client_type, + [], + ) + + for client in download_clients: + # Get jobs for this client + download_client_jobs = self._get_download_client_jobs_for_client( + client, + download_client_type, + ) + + if not any(job.job.enabled for job in download_client_jobs): + continue + + logger.info( + f"*** Running jobs on {client.name} ({client.base_url}) ***", + ) + + for download_client_job in download_client_jobs: + if download_client_job.job.enabled: + items_detected += await download_client_job.run() + + return items_detected + async def removal_jobs(self): # Check removal jobs removal_jobs = self._get_removal_jobs() @@ -72,7 +107,7 @@ async def search_jobs(self): async def _queue_has_items(self): logger.debug( - f"job_manager.py/_queue_has_items (Before any removal jobs): Checking if any items in full queue" + "job_manager.py/_queue_has_items (Before any removal jobs): Checking if any items in full queue", ) queue_manager = QueueManager(self.arr, self.settings) full_queue = await queue_manager.get_queue_items("full") @@ -99,11 +134,11 @@ async def _download_clients_connected(self): async def _check_client_connection_status(self, clients): for client in clients: logger.debug( - f"job_manager.py/_check_client_connection_status: Checking if {client.name} is 
connected" + f"job_manager.py/_check_client_connection_status: Checking if {client.name} is connected", ) if not await client.check_connected(): logger.warning( - f">>> {client.name} is disconnected. Skipping queue cleaning on {self.arr.name}." + f">>> {client.name} is disconnected. Skipping queue cleaning on {self.arr.name}.", ) return False return True @@ -133,3 +168,26 @@ def _get_removal_jobs(self): removal_job_class(self.arr, self.settings, removal_job_name), ) return jobs + + def _get_download_client_jobs_for_client(self, client, client_type): + """ + Return a list of download client job instances for a specific download client. + + Each job is included if the corresponding attribute exists and is truthy in settings.jobs. + """ + download_client_job_classes = { + "remove_done_seeding": RemoveDoneSeeding, + } + + jobs = [] + for job_name, job_class in download_client_job_classes.items(): + if getattr(self.settings.jobs, job_name, False): + jobs.append( + job_class( + client, + client_type, + self.settings, + job_name, + ), + ) + return jobs diff --git a/src/jobs/download_client_removal_job.py b/src/jobs/download_client_removal_job.py new file mode 100644 index 0000000..a6ad8ca --- /dev/null +++ b/src/jobs/download_client_removal_job.py @@ -0,0 +1,119 @@ +from abc import ABC, abstractmethod + +from src.utils.log_setup import logger + + +class DownloadClientRemovalJob(ABC): + """Base class for removal jobs that run on download clients directly.""" + + job_name = None + + def __init__( + self, + download_client: object, + download_client_type: str, + settings: object, + job_name: str, + ) -> None: + self.download_client = download_client + self.download_client_type = download_client_type + self.settings = settings + self.job_name = job_name + self.job = getattr(self.settings.jobs, self.job_name) + + async def run(self) -> int: + """Run the download client job.""" + if not self.job.enabled: + return 0 + + logger.debug( + f"download_client_job.py/run: Launching job '{self.job_name}' on {self.download_client.name} " + f"({self.download_client_type})", + ) + + all_items = await self._get_all_items() + if not all_items: + return 0 + + items_to_remove = await self._get_items_to_remove(all_items) + + # Filter out protected items + items_to_remove = self._filter_protected_items(items_to_remove) + + if not items_to_remove: + logger.debug(f"No items to remove for job '{self.job_name}'.") + return 0 + + # Remove the affected items + await self._remove_items(items_to_remove) + + return len(items_to_remove) + + async def _get_all_items(self) -> list: + """Get all items from the download client.""" + try: + if self.download_client_type == "qbittorrent": + return await self.download_client.get_qbit_items() + if self.download_client_type == "sabnzbd": + return await self.download_client.get_history_items() + except Exception as e: + logger.error( + f"Error fetching items from {self.download_client.name}: {e}", + ) + return [] + + def _filter_protected_items(self, items: list) -> list: + """Filter out items that are protected by tags or categories.""" + protected_tag = getattr(self.settings.general, "protected_tag", None) + if not protected_tag: + return items + + filtered_items = [] + for item in items: + is_protected = False + item_name = item.get("name", "unknown") + if self.download_client_type == "qbittorrent": + tags = item.get("tags", "").split(",") + tags = [tag.strip() for tag in tags if tag.strip()] + category = item.get("category", "") + if protected_tag in tags or protected_tag == category: + 
is_protected = True + elif self.download_client_type == "sabnzbd": + category = item.get("category", "") + if protected_tag == category: + is_protected = True + + if is_protected: + logger.debug(f"Ignoring protected item: {item_name}") + else: + filtered_items.append(item) + + return filtered_items + + @abstractmethod + async def _get_items_to_remove(self, items: list) -> list: + """Return a list of items to remove from the download client.""" + + async def _remove_items(self, items: list) -> None: + """Remove the affected items from the download client.""" + if self.settings.general.test_run: + logger.info("Test run is enabled. Skipping actual removal.") + for item in items: + item_name = item.get("name", "unknown") + logger.info(f"Would have removed download: {item_name}") + return + + for item in items: + item_name = item.get("name", "unknown") + try: + if self.download_client_type == "qbittorrent": + download_hash = item["hash"] + await self.download_client.remove_download(download_hash) + elif self.download_client_type == "sabnzbd": + nzo_id = item["nzo_id"] + await self.download_client.remove_download(nzo_id) + + logger.info(f"Removed download: {item_name}") + + except Exception as e: + logger.error(f"Failed to remove {item_name}: {e}") diff --git a/src/jobs/remove_done_seeding.py b/src/jobs/remove_done_seeding.py new file mode 100644 index 0000000..e83541f --- /dev/null +++ b/src/jobs/remove_done_seeding.py @@ -0,0 +1,104 @@ +"""Removes completed torrents that have specific tags/categories.""" + +from typing import ClassVar + +from src.jobs.download_client_removal_job import DownloadClientRemovalJob +from src.utils.log_setup import logger + +COMPLETED_STATES = [ + "stoppedUP", + "pausedUP", # Older qBittorrent versions +] + + +class RemoveDoneSeeding(DownloadClientRemovalJob): + """Job to remove completed torrents that match specific tags or categories.""" + + SUPPORTED_CLIENTS: ClassVar[list[str]] = ["qbittorrent"] + + async def run(self) -> int: + if self.download_client_type not in self.SUPPORTED_CLIENTS: + logger.debug( + f"remove_done_seeding.py/run: Skipping job '{self.job_name}' for unsupported client {self.download_client.name}.", + ) + return 0 + + return await super().run() + + async def _get_items_to_remove(self, items: list) -> list: + """ + Filters a list of items from a download client and returns those + that should be removed based on completion status and other criteria. 
+ """ + target_tags, target_categories = self._get_targets() + + if not target_tags and not target_categories: + logger.debug( + "remove_done_seeding.py/_get_items_to_remove: No target tags or categories specified for remove_done_seeding job.", + ) + return [] + + items_to_remove = [ + item + for item in items + if self._is_completed(item) + and self._meets_target_criteria(item, target_tags, target_categories) + ] + + for item in items_to_remove: + logger.debug( + f"remove_done_seeding.py/_get_items_to_remove: Found completed item to remove: {item.get('name', 'unknown')}", + ) + + return items_to_remove + + def _get_limit(self, item: dict, specific_key: str, global_key: str) -> float: + """Get a limit from item, falling back to a global key.""" + limit = item.get(specific_key, -1) + if limit <= 0: + limit = item.get(global_key, -1) + return limit + + def _is_completed(self, item: dict) -> bool: + """Check if an item has met its seeding goals.""" + state = item.get("state", "") + if state not in COMPLETED_STATES: + return False + + # Additional sanity checks for ratio and seeding time + ratio = item.get("ratio", 0) + seeding_time = item.get("seeding_time", 0) + + ratio_limit = self._get_limit(item, "ratio_limit", "max_ratio") + seeding_time_limit = self._get_limit( + item, + "seeding_time_limit", + "max_seeding_time", + ) + + ratio_limit_met = ratio >= ratio_limit > 0 + seeding_time_limit_met = seeding_time >= seeding_time_limit > 0 + + return ratio_limit_met or seeding_time_limit_met + + def _meets_target_criteria( + self, + item: dict, + target_tags: list, + target_categories: list, + ) -> bool: + """Check if an item has the required tags or categories for removal.""" + item_category = item.get("category", "") + if item_category in target_categories: + return True + + tags = item.get("tags", "").split(",") + item_tags = {tag.strip() for tag in tags if tag.strip()} + + return bool(item_tags.intersection(target_tags)) + + def _get_targets(self) -> tuple[list, list]: + """Get the list of tags and categories to look for from job settings.""" + tags = getattr(self.job, "target_tags", []) + categories = getattr(self.job, "target_categories", []) + return tags, categories diff --git a/src/settings/_download_clients_qbit.py b/src/settings/_download_clients_qbit.py index 9332f22..ddcb74a 100644 --- a/src/settings/_download_clients_qbit.py +++ b/src/settings/_download_clients_qbit.py @@ -61,7 +61,7 @@ def __init__( self.name = name if not self.name: logger.verbose( - "No name provided for qbittorrent client, assuming 'qBitorrent'. If the name used in your *arr is different, please correct either the name in your *arr, or set the name in your config" + "No name provided for qbittorrent client, assuming 'qBitorrent'. 
If the name used in your *arr is different, please correct either the name in your *arr, or set the name in your config", ) self.name = "qBittorrent" @@ -82,7 +82,7 @@ def _connection_error(): try: logger.debug( - "_download_clients_qBit.py/refresh_cookie: Refreshing qBit cookie" + "_download_clients_qBit.py/refresh_cookie: Refreshing qBit cookie", ) endpoint = f"{self.api_url}/auth/login" data = { @@ -113,11 +113,14 @@ async def fetch_version(self): logger.debug("_download_clients_qBit.py/fetch_version: Getting qBit Version") endpoint = f"{self.api_url}/app/version" response = await make_request( - "get", endpoint, self.settings, cookies=self.cookie + "get", + endpoint, + self.settings, + cookies=self.cookie, ) self.version = response.text[1:] # Remove the '_v' prefix logger.debug( - f"_download_clients_qBit.py/fetch_version: qBit version={self.version}" + f"_download_clients_qBit.py/fetch_version: qBit version={self.version}", ) async def validate_version(self): @@ -138,7 +141,7 @@ async def validate_version(self): async def create_tag(self, tag: str): """Ensure a tag exists in qBittorrent; create it if it doesn't.""" logger.debug( - "_download_clients_qBit.py/create_tag: Checking if tag '{tag}' exists (and creating it if not)" + "_download_clients_qBit.py/create_tag: Checking if tag '{tag}' exists (and creating it if not)", ) url = f"{self.api_url}/torrents/tags" response = await make_request("get", url, self.settings, cookies=self.cookie) @@ -169,7 +172,7 @@ async def set_unwanted_folder(self): """Set the 'unwanted folder' setting in qBittorrent if needed.""" if self.settings.jobs.remove_bad_files: logger.debug( - "_download_clients_qBit.py/set_unwanted_folder: Checking preferences and setting use_unwanted_folder if not already set" + "_download_clients_qBit.py/set_unwanted_folder: Checking preferences and setting use_unwanted_folder if not already set", ) endpoint = f"{self.api_url}/app/preferences" response = await make_request( @@ -197,7 +200,7 @@ async def check_qbit_reachability(self): """Check if the qBittorrent URL is reachable.""" try: logger.debug( - "_download_clients_qBit.py/check_qbit_reachability: Checking if qbit is reachable" + "_download_clients_qBit.py/check_qbit_reachability: Checking if qbit is reachable", ) endpoint = f"{self.api_url}/auth/login" data = { @@ -223,7 +226,7 @@ async def check_qbit_reachability(self): async def check_connected(self): """Check if the qBittorrent is connected to internet.""" logger.debug( - "_download_clients_qBit.py/check_qbit_reachability: Checking if qbit is connected to the internet" + "_download_clients_qBit.py/check_qbit_reachability: Checking if qbit is connected to the internet", ) qbit_connection_status = ( ( @@ -268,7 +271,7 @@ async def get_protected_and_private(self): # Fetch all torrents logger.debug( - "_download_clients_qBit/get_protected_and_private: Checking if torrents have protected tag" + "_download_clients_qBit/get_protected_and_private: Checking if torrents have protected tag", ) qbit_items = await self.get_qbit_items() @@ -287,7 +290,7 @@ async def get_protected_and_private(self): private_downloads.append(qbit_item["hash"].upper()) else: logger.debug( - "_download_clients_qBit/get_protected_and_private: Checking if torrents are private (only done for old qbit versions)" + "_download_clients_qBit/get_protected_and_private: Checking if torrents are private (only done for old qbit versions)", ) qbit_item_props = await make_request( "get", @@ -325,7 +328,7 @@ async def set_tag(self, tags, hashes): tags_str = 
",".join(tags) logger.debug( - "_download_clients_qBit/set_tag: Setting tag(s) {tags_str} to {hashes_str}" + "_download_clients_qBit/set_tag: Setting tag(s) {tags_str} to {hashes_str}", ) # Prepare the data for the request @@ -374,7 +377,7 @@ async def get_torrent_files(self, download_id): async def set_torrent_file_priority(self, download_id, file_id, priority=0): logger.debug( - "_download_clients_qBit/set_torrent_file_priority: Setting download priority for torrent file" + "_download_clients_qBit/set_torrent_file_priority: Setting download priority for torrent file", ) data = { "hash": download_id.lower(), @@ -416,5 +419,22 @@ async def warn_no_bandwidth_limit_set(self): "๐Ÿ’ก Tip: No global download speed limit is set in your qBittorrent instance. " "If you configure one, the 'remove_slow' check will automatically disable itself " "when your bandwidth is fully utilized. This prevents slow downloads from being mistakenly removed โ€” " - "not because they lack seeds, but because your own download capacity is saturated." + "not because they lack seeds, but because your own download capacity is saturated.", ) + + async def remove_download(self, download_hash: str, delete_files: bool = True): + """Remove a torrent from qBittorrent.""" + logger.debug( + f"_download_clients_qBit/remove_download: Removing torrent {download_hash}", + ) + data = { + "hashes": download_hash.lower(), + "deleteFiles": "true" if delete_files else "false", + } + await make_request( + "post", + f"{self.api_url}/torrents/delete", + self.settings, + data=data, + cookies=self.cookie, + ) diff --git a/src/settings/_jobs.py b/src/settings/_jobs.py index a873744..c71eaab 100644 --- a/src/settings/_jobs.py +++ b/src/settings/_jobs.py @@ -13,6 +13,7 @@ class JobParams: min_speed: int max_concurrent_searches: int min_days_between_searches: int + target_tags: list def __init__( self, @@ -23,6 +24,7 @@ def __init__( min_speed=None, max_concurrent_searches=None, min_days_between_searches=None, + target_tags=None, ): self.enabled = enabled self.keep_archives = keep_archives @@ -31,6 +33,7 @@ def __init__( self.min_speed = min_speed self.max_concurrent_searches = max_concurrent_searches self.min_days_between_searches = min_days_between_searches + self.target_tags = target_tags # Remove attributes that are None to keep the object clean self._remove_none_attributes() @@ -51,9 +54,11 @@ class JobDefaults: min_days_between_searches: int = 7 min_speed: int = 100 message_patterns = ["*"] + target_tags = [] - def __init__(self, config): + def __init__(self, config, settings): job_defaults_config = config.get("job_defaults", {}) + self.target_tags.append(settings.general.obsolete_tag) self.max_strikes = job_defaults_config.get("max_strikes", self.max_strikes) self.max_concurrent_searches = job_defaults_config.get("max_concurrent_searches", self.max_concurrent_searches) self.min_days_between_searches = job_defaults_config.get( @@ -66,14 +71,15 @@ def __init__(self, config): class Jobs: """Represent all jobs explicitly.""" - def __init__(self, config): - self.job_defaults = JobDefaults(config) + def __init__(self, config, settings): + self.job_defaults = JobDefaults(config, settings) self._set_job_defaults() self._set_job_configs(config) del self.job_defaults def _set_job_defaults(self): self.remove_bad_files = JobParams(keep_archives=self.job_defaults.keep_archives) + self.remove_done_seeding = JobParams(target_tags=self.job_defaults.target_tags) self.remove_failed_downloads = JobParams() self.remove_failed_imports = JobParams( 
message_patterns=self.job_defaults.message_patterns, diff --git a/src/settings/_user_config.py b/src/settings/_user_config.py index 40ba41b..6cc4ac8 100644 --- a/src/settings/_user_config.py +++ b/src/settings/_user_config.py @@ -28,6 +28,7 @@ ], "jobs": [ "REMOVE_BAD_FILES", + "REMOVE_DONE_SEEDING", "REMOVE_FAILED_DOWNLOADS", "REMOVE_FAILED_IMPORTS", "REMOVE_METADATA_MISSING", @@ -73,6 +74,7 @@ def _load_from_env() -> dict: Returns: dict: Config sections with parsed env var values. + """ config = {} diff --git a/src/settings/settings.py b/src/settings/settings.py index f60e66b..e13a143 100644 --- a/src/settings/settings.py +++ b/src/settings/settings.py @@ -16,7 +16,7 @@ def __init__(self): self.envs = Envs() config = get_user_config(self) self.general = General(config) - self.jobs = Jobs(config) + self.jobs = Jobs(config, self) self.download_clients = DownloadClients(config, self) self.instances = ArrInstances(config, self) configure_logging(self) diff --git a/tests/jobs/test_remove_completed.py b/tests/jobs/test_remove_completed.py new file mode 100644 index 0000000..b7a2681 --- /dev/null +++ b/tests/jobs/test_remove_completed.py @@ -0,0 +1,254 @@ + # pylint: disable=W0212 +"""Tests for the remove_done_seeding job.""" + +from unittest.mock import AsyncMock, MagicMock, patch + +import pytest + +from src.jobs.remove_done_seeding import COMPLETED_STATES, RemoveDoneSeeding + + +def create_mock_settings(target_tags=None, target_categories=None): + """Create mock settings for testing.""" + settings = MagicMock() + settings.jobs = MagicMock() + settings.jobs.remove_done_seeding.enabled = True + settings.jobs.remove_done_seeding.target_tags = target_tags or [] + settings.jobs.remove_done_seeding.target_categories = target_categories or [] + settings.general = MagicMock() + settings.general.protected_tag = "protected" + return settings + + +def create_mock_download_client(items, client_name="mock_client_name"): + """Create a mock download client.""" + client = MagicMock() + client.get_qbit_items = AsyncMock(return_value=items) + client.name = client_name + return client + + +# Default item properties for tests +ITEM_DEFAULTS = { + "progress": 1, + "ratio": 0, + "ratio_limit": -1, + "seeding_time": 0, + "seeding_time_limit": -1, + "tags": "", + "category": "movies", + "state": "stoppedUP", +} + + +@pytest.mark.asyncio +@pytest.mark.parametrize( + ("item_properties", "target_tags", "target_categories", "should_be_removed"), + [ + # Ratio limit met, matching tag and category + ( + {"ratio": 2, "ratio_limit": 2, "tags": "tag1"}, + ["tag1"], + ["movies"], + True, + ), + # Seeding time limit met, matching tag and category + ( + {"seeding_time": 100, "seeding_time_limit": 100, "tags": "tag1"}, + ["tag1"], + ["movies"], + True, + ), + # Neither limit met + ({"ratio": 1, "ratio_limit": 2}, ["tag1"], ["movies"], False), + # Progress less than 1 (should not be considered completed) + ( + {"progress": 0.5, "state": "downloading"}, + ["tag1"], + ["movies"], + False, + ), + # No matching tags or categories + ( + {"ratio": 2, "ratio_limit": 2, "tags": "other", "category": "tv"}, + ["tag1"], + ["movies"], + False, + ), + # Matching category, but not completed + ({"category": "tv", "state": "downloading"}, [], ["tv"], False), + # Matching tag, but not completed + ({"tags": "tag2", "state": "downloading"}, ["tag2"], [], False), + # Matching category and completed (ratio) + ( + {"ratio": 2, "ratio_limit": 2, "category": "tv"}, + [], + ["tv"], + True, + ), + # Matching tag and completed (seeding time) + ( + 
{"seeding_time": 100, "seeding_time_limit": 100, "tags": "tag2"}, + ["tag2"], + [], + True, + ), + # No targets specified + ({"ratio": 2, "ratio_limit": 2}, [], [], False), + # Item with multiple tags, one is a target + ( + {"tags": "tag1,tag2", "ratio": 2, "ratio_limit": 2}, + ["tag2"], + [], + True, + ), + # Item with a tag that is a substring of a target tag (should not match) + ({"tags": "tag", "ratio": 2, "ratio_limit": 2}, ["tag1"], [], False), + # Item with a category that is a substring of a target (should not match) + ( + {"category": "movie", "ratio": 2, "ratio_limit": 2}, + [], + ["movies"], + False, + ), + # Test with another completed state + ( + {"ratio": 2, "ratio_limit": 2, "state": "pausedUP"}, + ["tag1"], + ["movies"], + True, + ), + ], +) +async def test_remove_done_seeding_logic( + item_properties: dict, + target_tags: list, + target_categories: list, + should_be_removed: bool, +): + """Test the logic of the remove_done_seeding job with various scenarios.""" + item = {**ITEM_DEFAULTS, **item_properties, "name": "test_item"} + + settings = create_mock_settings(target_tags, target_categories) + client = create_mock_download_client([item]) + + job = RemoveDoneSeeding(client, "qbittorrent", settings, "remove_done_seeding") + + items_to_remove = await job._get_items_to_remove(await client.get_qbit_items()) + + if should_be_removed: + assert len(items_to_remove) == 1 + assert items_to_remove[0]["name"] == "test_item" + else: + assert len(items_to_remove) == 0 + + +@pytest.mark.asyncio +async def test_remove_done_seeding_skipped_for_sabnzbd(): + """Test that the remove_done_seeding job is skipped for SABnzbd clients.""" + settings = create_mock_settings() + client = create_mock_download_client([], client_name="mock_client_name") + job = RemoveDoneSeeding(client, "sabnzbd", settings, "remove_done_seeding") + + # Test that the job returns 0 for unsupported clients + result = await job.run() + assert result == 0 + + # Verify that no client methods were called since the job should be skipped + client.get_qbit_items.assert_not_called() + + +@pytest.mark.asyncio +async def test_remove_done_seeding_test_run_enabled(): + """Test that no items are actually removed when test_run is enabled.""" + item = { + **ITEM_DEFAULTS, + "ratio": 2, + "ratio_limit": 2, + "name": "test_item", + "tags": "tag1", + } + settings = create_mock_settings(target_tags=["tag1"]) + settings.general.test_run = True + client = create_mock_download_client([item]) + job = RemoveDoneSeeding(client, "qbittorrent", settings, "remove_done_seeding") + + with patch.object( + client, + "remove_download", + new_callable=AsyncMock, + ) as mock_client_remove: + result = await job.run() + + # The job should still report the number of items it would have removed + assert result == 1 + + # But no actual removal should occur on the client + mock_client_remove.assert_not_called() + + +@pytest.mark.asyncio +@pytest.mark.parametrize("protected_on", ["tag", "category"]) +async def test_remove_done_seeding_with_protected_item(protected_on): + """Test that items with a protected tag or category are not removed.""" + item_properties = {"ratio": 2, "ratio_limit": 2, "name": "protected_item"} + target_tags = ["tag1"] + target_categories = ["movies"] + + if protected_on == "tag": + item_properties["tags"] = "protected" + # Also add a targetable tag to ensure it's the protection that stops it + item_properties["tags"] += ",tag1" + else: + item_properties["category"] = "protected" + target_categories = ["protected"] + + item = 
{**ITEM_DEFAULTS, **item_properties} + + settings = create_mock_settings( + target_tags=target_tags, + target_categories=target_categories, + ) + client = create_mock_download_client([item]) + job = RemoveDoneSeeding(client, "qbittorrent", settings, "remove_done_seeding") + + with patch.object( + job, + "_remove_items", + new_callable=AsyncMock, + ) as mock_remove: + result = await job.run() + assert result == 0 # No items should be removed + mock_remove.assert_not_called() + + +@pytest.mark.asyncio +async def test_is_completed_logic(): + """Test the internal _is_completed logic with different states and limits.""" + job = RemoveDoneSeeding( + MagicMock(), "qbittorrent", MagicMock(), "remove_done_seeding" + ) + + # Completed states + for state in COMPLETED_STATES: + # Ratio met + assert job._is_completed( + {"state": state, "ratio": 2, "ratio_limit": 2}, + ), f"Failed for state {state} with ratio met" + # Seeding time met + assert job._is_completed( + {"state": state, "seeding_time": 100, "seeding_time_limit": 100}, + ), f"Failed for state {state} with seeding time met" + # Neither met + assert not job._is_completed( + {"state": state, "ratio": 1, "ratio_limit": 2}, + ), f"Failed for state {state} with neither limit met" + # Limits not set + assert not job._is_completed( + {"state": state, "ratio": 1, "ratio_limit": -1}, + ), f"Failed for state {state} with no ratio limit" + + # Non-completed states + assert not job._is_completed( + {"state": "downloading", "ratio": 2, "ratio_limit": 1}, + ), "Failed for non-completed state"
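For reviewers who want to try the new job outside of pytest, here is a minimal sketch that exercises `RemoveDoneSeeding` end to end using the same mocking pattern as the tests above. The torrent dict carries only the qBittorrent `torrents/info` fields the job actually reads; the release name, tag, and category values are made up, and `test_run` is enabled so nothing would actually be deleted.

```python
import asyncio
from unittest.mock import AsyncMock, MagicMock

from src.jobs.remove_done_seeding import RemoveDoneSeeding

# One torrent as the job sees it: only the qBittorrent torrents/info fields
# that remove_done_seeding actually reads are included (values are made up).
torrent = {
    "name": "Some.Release.1080p",
    "state": "stoppedUP",     # finished; qBittorrent has stopped it
    "ratio": 2.1,
    "ratio_limit": 2.0,       # per-torrent override; <= 0 would fall back to max_ratio
    "seeding_time": 0,
    "seeding_time_limit": -1,
    "tags": "Obsolete",
    "category": "autobrr",
}

# Settings stub, mirroring create_mock_settings() in the test file above.
settings = MagicMock()
settings.jobs.remove_done_seeding.enabled = True
settings.jobs.remove_done_seeding.target_tags = ["Obsolete"]
settings.jobs.remove_done_seeding.target_categories = ["autobrr"]
settings.general.protected_tag = None
settings.general.test_run = True  # dry run: log what would be removed, delete nothing

# Download client stub exposing the two methods the job calls.
client = MagicMock()
client.name = "qBittorrent"
client.get_qbit_items = AsyncMock(return_value=[torrent])
client.remove_download = AsyncMock()

job = RemoveDoneSeeding(client, "qbittorrent", settings, "remove_done_seeding")
print(asyncio.run(job.run()))  # -> 1: the torrent is completed and matches both targets
```

Dropping `test_run` would make `_remove_items` call the new `remove_download` helper, which issues a `torrents/delete` request against the qBittorrent Web API.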
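The limit fallback that `_is_completed` relies on can also be illustrated in isolation. The standalone mirror of `_get_limit` below is only a sketch with illustrative numbers; it assumes qBittorrent's convention that a per-torrent `ratio_limit` of `-2` means "use the global limit" and `-1` means "no limit", which is why any value `<= 0` falls back to the global key.

```python
def resolve_limit(item: dict, specific_key: str, global_key: str) -> float:
    """Per-torrent override first; anything <= 0 falls back to the global value."""
    limit = item.get(specific_key, -1)
    if limit <= 0:
        limit = item.get(global_key, -1)
    return limit


# A ratio_limit of -2 ("use global limit") falls back to max_ratio:
print(resolve_limit({"ratio_limit": -2, "max_ratio": 2.0}, "ratio_limit", "max_ratio"))  # 2.0
# A positive per-torrent override wins over the global value:
print(resolve_limit({"ratio_limit": 1.5, "max_ratio": 2.0}, "ratio_limit", "max_ratio"))  # 1.5
# Neither set (-1 everywhere) leaves the limit at -1, so _is_completed treats it as "not met":
print(resolve_limit({"ratio_limit": -1, "max_ratio": -1}, "ratio_limit", "max_ratio"))   # -1
```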