So far, much of the effort has gone into supporting different backends (OpenAI, Anthropic, etc.). What I'm talking about here is the front end of the stack.
I'm wondering whether it would be useful to anyone to expose the RLM as a generally compatible API (e.g. OpenAI-style), so it could be plugged into existing tools such as coding agents.
I've looked at LiteLLM, which has custom provider support.
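For context, here's a rough sketch of what I mean by a "generally compatible API": a tiny stdlib-only HTTP server that mimics the OpenAI `/v1/chat/completions` shape and forwards to the RLM. The `run_rlm` function below is a hypothetical placeholder, not the real implementation — just enough to show the wire format that existing tools expect.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def run_rlm(messages):
    # Hypothetical stand-in for the actual RLM call -- replace with the
    # real entry point. Takes OpenAI-style messages, returns a string.
    return "stub response for: " + messages[-1]["content"]


class ChatCompletionsHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        req = json.loads(self.rfile.read(length))
        text = run_rlm(req.get("messages", []))
        # Minimal OpenAI-compatible chat.completion response body.
        body = json.dumps({
            "id": "rlm-0",
            "object": "chat.completion",
            "model": req.get("model", "rlm"),
            "choices": [{
                "index": 0,
                "message": {"role": "assistant", "content": text},
                "finish_reason": "stop",
            }],
        }).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)
```

You'd then serve it with `HTTPServer(("127.0.0.1", 8000), ChatCompletionsHandler).serve_forever()` and point any OpenAI-client tool at that base URL. A real version would obviously need streaming (SSE), auth, and error handling — this is just the happy path to illustrate the plug-in surface.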
@alexzhang13 thoughts?
Edit: I've hacked together an impl (not nearly prod-ready, see here: diff) but it worked!