
Handle duplicately named album downloads #23

@LockeBirdsey

Description


This is likely a pretty rare edge-case 😅

Background:

An artist has multiple releases that all share the same name, and releases multiple times a year (or the year of release can't be determined). For example: https://homeisinyourarms.bandcamp.com/music

Problems:

  1. When using more than one download job,

    skip_err!(fs::create_dir_all(&path));

     will likely emit an error (if you haven't downloaded from this artist before) due to concurrent directory creation (see the create_dir_all docs). A sketch of tolerating that error follows this list.

  2. If using a single job (or when downloading new releases from this artist), the directory will simply get overwritten. Additionally, it will create a 'false-positive' cache entry.
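
For illustration only (`ensure_dir` is a made-up helper, not bandsnatch's actual code), the error from problem (1) could be tolerated by treating a directory that another job already created as success. Note this only silences the error; the two albums would still end up sharing one directory:

    use std::fs;
    use std::io::{self, ErrorKind};
    use std::path::Path;

    /// Hypothetical helper: create the album directory, but treat a directory
    /// that another download job created first as success rather than an error.
    fn ensure_dir(path: &Path) -> io::Result<()> {
        match fs::create_dir_all(path) {
            Ok(()) => Ok(()),
            // Another job won the race; the directory exists, so carry on.
            Err(e) if e.kind() == ErrorKind::AlreadyExists && path.is_dir() => Ok(()),
            Err(e) => Err(e),
        }
    }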

Solutions:

There aren't really any decent solutions to this problem. The album id could be added to the directory name. Alternatively, existing directories could be checked for and a duplicate counter appended to the directory name, although that would only address problem (2). However, if the user's end goal is to serve the files with a media server, both of these solutions would only work for untagged files (I guess WAV/FLAC/AIFF), and even that depends on how the server determines artist/album info. Both ideas are sketched below.
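
A rough sketch of both ideas (the helper names and the `album_id` parameter are assumptions, not bandsnatch's actual API):

    use std::path::{Path, PathBuf};

    // Option A: make the directory name unique up front by appending the album id.
    fn dir_with_album_id(base: &Path, album_title: &str, album_id: u64) -> PathBuf {
        base.join(format!("{} [{}]", album_title, album_id))
    }

    // Option B: probe for existing directories and bump a counter
    // ("Album", "Album (2)", "Album (3)", ...). This only helps for problem (2):
    // with multiple jobs, two downloads can still race between the exists()
    // check and the directory creation.
    fn dir_with_counter(base: &Path, album_title: &str) -> PathBuf {
        let mut candidate = base.join(album_title);
        let mut n = 2;
        while candidate.exists() {
            candidate = base.join(format!("{} ({})", album_title, n));
            n += 1;
        }
        candidate
    }

Either way, a media server reading embedded tags would still see identical artist/album info, which is the caveat above.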

Workaround:

  1. Run bandsnatch with a single job
  2. Terminate after the first conflicting album is downloaded
  3. Manually edit the dir name to something non-conflicting
  4. Repeat from step 1 until all conflicting albums are downloaded

Bonus step: edit the tags in the files to accommodate how your media server interprets the artist/album information

Not a pretty workaround, but again, this is likely a super rare edge case, so it's debatable whether this is even worth fixing. If you think it is, I can make the PRs.
