
Selecting a Back End

Marcus Green edited this page Jan 5, 2026 · 5 revisions

AI Text supports three "back ends", that is, the code that actually talks to the external LLM. These are:

  • the core Moodle AI subsystem
  • local_ai_manager
  • tool_aiconnect

Tool AI Connect

tool_aiconnect was forked from an existing plugin by Enovation specifically for AI Text and can be regarded as experimental or legacy.

https://github.com/marcusgreen/moodle-tool_aiconnect

It has had less testing with AI Text than the other back ends and does not include the token counting, cost, or restriction features that the other back ends provide. It does, however, offer simplicity plus a feature that allows a per-question selection of models. This can be useful if, for example, you want access to one model that specialises in maths and another that specialises in languages, or because different models come with different financial costs. With any of the other back ends this option is not visible.

Core AI Subsystem

Core Moodle supports the following LLM providers:

  • OpenAI
  • Azure
  • Ollama

Additional providers are available at https://moodle.org/plugins/browse.php?list=category&id=90

Note that many third-party LLM systems support the OpenAI API standard, so it may be possible to access other systems using the Ollama provider and some configuration details.
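To illustrate why this works: services that follow the OpenAI API standard all accept the same chat-completion request shape, so only the base URL, model name, and credentials differ between providers. A minimal sketch in Python (the URL and model name below are placeholders, not real configuration values from any of these plugins):

```python
import json

# Placeholder values: any OpenAI-compatible service accepts the same
# request shape; only these settings change between providers.
BASE_URL = "http://localhost:11434/v1"  # e.g. a local Ollama server
MODEL = "llama3"                        # placeholder model name

def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }

# The payload would be POSTed to BASE_URL + "/chat/completions".
payload = build_chat_request("Give feedback on this essay.")
print(json.dumps(payload, indent=2))
```

The same payload works against OpenAI itself, Azure OpenAI, or a local Ollama instance; swapping providers is a matter of pointing the base URL and model name elsewhere.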

Local AI Manager

This back end is developed by ByCS, the education organisation of the German state of Bavaria, which supports the Moodle accounts of over 1.5 million school students. It was built specifically to meet their requirements, but it is a comprehensive and impressive back end and can be run "in parallel" with plugins using the core AI subsystem.

You can find it here https://github.com/bycs-lp/moodle-local_ai_manager

I used an LLM to generate some documentation for it, but treat it with the standard reservations about anything created by an LLM: https://github.com/marcusgreen/moodle-local_ai_manager/wiki
