
Expert prompt mode (slugs)

Marcus Green edited this page Oct 2, 2025 · 15 revisions

On this page, "slug" means a keyword wrapped in double square brackets, which is treated as an instruction during the processing of the text.

By default every prompt in an AI Text question gets some manipulation and "wrapping" before it is delivered to the external LLM.

For example, where the prompt is

"Explain if there is anything wrong with the grammar and spelling in the text."

This will be wrapped as

"in [[Yesterday I went to the park]] analyse the part between [[ and ]] as follows: Explain if there is anything wrong with the grammar and spelling in the text. Set marks to null in the json object. Return only a JSON object which enumerates a set of 2 elements. The JSON object should be in this format: {"feedback":"string","marks":"number"} where marks is a single number summing all marks. Also show the marks as part of the feedback. translate the feedback to the language en"
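The default wrapping above can be sketched roughly as follows. This is an illustrative sketch only, not the plugin's actual code (which is PHP); the function and parameter names are assumptions for illustration.

```python
def wrap_prompt(student_response: str, prompt: str, language: str = "en") -> str:
    """Sketch of the default wrapping applied to every AI Text prompt."""
    json_format = '{"feedback":"string","marks":"number"}'
    return (
        # The student's response is embedded between [[ and ]] markers.
        f"in [[{student_response}]] analyse the part between [[ and ]] as follows: "
        # The teacher's prompt is inserted verbatim.
        f"{prompt} "
        # Boilerplate instructions forcing a JSON reply and feedback language.
        "Set marks to null in the json object. "
        "Return only a JSON object which enumerates a set of 2 elements. "
        f"The JSON object should be in this format: {json_format} "
        "where marks is a single number summing all marks. "
        "Also show the marks as part of the feedback. "
        f"translate the feedback to the language {language}"
    )
```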

This is helpful for people who are novices at prompting, but it limits what more experienced people can do and also limits what can be done in terms of innovation.

This proposal is for a new slug, [[expert]], which will bypass some of the wrapping and send the prompt to the external LLM largely as written. It will only take effect if the prompt also includes the [[response]] slug, which contains the response from the student. Each prompt will then have to supply, by hand, any instructions about returning the result wrapped in JSON.

Example

Given a question text of

"Write an English sentence in the past tense"

And a prompt of

[[expert]] Given the response [[response]] to the question [[questiontext]] give an analysis of the English grammar.

And if a student enters

"Yesterday I go prk",

The prompt actually sent to the LLM would be

"Given the response 'Yesterday I go prk' to the question 'Write an English sentence in the past tense' give an analysis of the English grammar."

Note how there is no reference to JSON or marking; such instructions could be added "by hand" to the prompt.

There will be a new plugin setting called expertmode. By default it is unchecked, and the [[expert]] slug will be ignored.
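The gating implied by the expertmode setting can be sketched like this. The setting name expertmode comes from the proposal; the surrounding function is an assumption for illustration.

```python
def should_use_expert_mode(prompt: str, expertmode_enabled: bool) -> bool:
    """Sketch: decide whether expert mode applies to a given prompt."""
    return (
        # The admin setting must be checked (it is unchecked by default).
        expertmode_enabled
        # The prompt must opt in with the [[expert]] slug.
        and "[[expert]]" in prompt
        # The [[response]] slug is also required for [[expert]] to work.
        and "[[response]]" in prompt
    )
```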
