
probe: probes should follow soft_probe_prompt_cap #1562

@leondz


suffix.GCGCached always runs all of its stored cached prompts.

run.soft_probe_prompt_cap sets a cap on how many distinct prompts each probe should offer.

suffix.GCGCached should not issue more prompts than run.soft_probe_prompt_cap.

To see the issue, try a config like:

{
	"plugins": {
		"probe_spec": "suffix"
	},
	"run": {
		"soft_probe_prompt_cap": 10,
		"generations": 1
	}
}

Note that more than 10 prompts are issued.

When addressing this, use a mechanism for randomly selecting prompts like that in probes.base._prune_data().
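As a rough sketch of the intended behaviour, the fix could randomly sample at most soft_probe_prompt_cap prompts from the cached set. This is a minimal illustration only: the function name prune_prompts and its signature are hypothetical, and the actual probes.base._prune_data() implementation in garak may differ.

```python
import random


def prune_prompts(prompts, cap, seed=None):
    """Return at most `cap` randomly selected prompts.

    Hypothetical sketch of the behaviour described in the issue;
    not the actual probes.base._prune_data() implementation.
    """
    if cap is None or cap < 1 or len(prompts) <= cap:
        # No cap set, or already within the cap: keep everything
        return list(prompts)
    rng = random.Random(seed)
    # Sample without replacement so no prompt is issued twice
    return rng.sample(list(prompts), cap)


# Example: cap 25 cached GCG suffix prompts at soft_probe_prompt_cap=10
cached_prompts = [f"prompt {i}" for i in range(25)]
capped = prune_prompts(cached_prompts, cap=10, seed=0)
print(len(capped))  # 10
```

A fixed seed keeps runs reproducible while still varying the selection across seeds.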
