locked in function "get_prompt_embedding" #42

@lxxie298

Description

Run output:

[rank5]: 2025-06-06 16:59:49 | i2v-train | INFO | Initializing models...
[rank5]: 2025-06-06 16:59:49 | i2v-train | INFO | Initializing dataset and dataloader...
[rank3]: 2025-06-06 16:59:49 | i2v-train | INFO | Initializing models...
[rank3]: 2025-06-06 16:59:49 | i2v-train | INFO | Initializing dataset and dataloader...
[rank3]: 2025-06-06 17:00:02 | i2v-train | INFO | Precomputing embedding ...
[rank1]: 2025-06-06 17:00:02 | i2v-train | INFO | Precomputing embedding ...
[rank6]: 2025-06-06 17:00:02 | i2v-train | INFO | Precomputing embedding ...
[rank5]: 2025-06-06 17:00:02 | i2v-train | INFO | Precomputing embedding ...
[rank4]: 2025-06-06 17:00:02 | i2v-train | INFO | Precomputing embedding ...
base i2v
get_prompt_embedding ..
base i2v
get_prompt_embedding ..
base i2v
get_prompt_embedding ..
base i2v
get_prompt_embedding ..
base i2v
get_prompt_embedding ..
lock
lock
lock
lock
lock
[rank2]: 2025-06-06 17:00:05 | i2v-train | INFO | Precomputing embedding ...
[rank0]: 2025-06-06 17:00:05 | i2v-train | INFO | Precomputing embedding ...
[rank7]: 2025-06-06 17:00:05 | i2v-train | INFO | Precomputing embedding ...
base i2v
get_prompt_embedding ..
lock
base i2v
get_prompt_embedding ..
lock
base i2v
get_prompt_embedding ..
lock

Problem location:

def get_prompt_embedding(encode_fn: Callable, prompt: str, cache_dir: Path) -> torch.Tensor:
    """Get prompt embedding from cache or create new one if not exists.

    Args:
        encode_fn: Function to project prompt to embedding.
        prompt: Text prompt to be embedded
        cache_dir: Base directory for caching embeddings

    Returns:
        torch.Tensor: Prompt embedding with shape [seq_len, hidden_size]
    """
    print("get_prompt_embedding ..")
    prompt_embeddings_dir = cache_dir / "prompt_embeddings"
    prompt_embeddings_dir.mkdir(parents=True, exist_ok=True)

    prompt_hash = str(hashlib.sha256(prompt.encode()).hexdigest())
    prompt_embedding_path = prompt_embeddings_dir / (prompt_hash + ".safetensors")
    lock = FileLock(str(prompt_embedding_path) + ".lock")
    print("lock") #####################################################
    with lock:
        if prompt_embedding_path.exists():
            print(f"found existed embedding path: {prompt_embedding_path}") #####################################################
            prompt_embedding = load_file(prompt_embedding_path)["prompt_embedding"]
            _logger.debug(
                f"Loaded prompt embedding from {prompt_embedding_path}",
                main_only=False,
            )
        else:
            print("text encoding ...") #####################################################
            print(encode_fn.device)
            prompt_embedding = encode_fn(prompt)
            assert prompt_embedding.ndim == 2
            # shape of prompt_embedding: [seq_len, hidden_size]

            prompt_embedding = prompt_embedding.to("cpu")
            save_file({"prompt_embedding": prompt_embedding}, prompt_embedding_path)
            _logger.info(
                f"Saved prompt embedding to {prompt_embedding_path}",
                main_only=False,
            )

    return prompt_embedding

As shown in the output above, every rank prints "lock" and then hangs: the code inside the "with lock" block is never executed.
How can I fix this problem?
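
To narrow this down, one idea is to acquire the lock with an explicit timeout so a silent hang turns into an error. Below is a minimal sketch, assuming the lock comes from the filelock package (FileLock.acquire(timeout=...) raises filelock.Timeout when the lock cannot be obtained); the lock path here is just a placeholder for one of the generated .safetensors.lock files:

from filelock import FileLock, Timeout

# Placeholder path: point this at one of the generated
# <cache_dir>/prompt_embeddings/<hash>.safetensors.lock files.
lock_path = "/path/to/cache/prompt_embeddings/example.safetensors.lock"

lock = FileLock(lock_path)
try:
    # acquire() raises filelock.Timeout if the lock is not obtained in time,
    # so a hang becomes an explicit, per-rank error instead of a freeze.
    with lock.acquire(timeout=30):
        print("lock acquired")  # if this never prints, acquisition itself is what blocks
except Timeout:
    print(f"could not acquire {lock_path} within 30s")

If the Timeout fires on every rank, the problem is lock acquisition itself rather than the code inside the block.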
