LlamaResearcher is your friendly research companion built on top of Llama 4, powered by Groq, LinkUp, LlamaIndex, Gradio, FastAPI and Redis.
Required: Docker and docker compose
The first step, common to both the Docker and the source code setup approaches, is to clone the repository and move into it:
```bash
git clone https://github.com/AstraBert/llama-4-researcher.git
cd llama-4-researcher
```
Once there, you can proceed as follows:
- Add the `groq_api_key`, `internal_api_key` and `linkup_api_key` variables, as well as the variables needed to connect to a Postgres database, to the `.env.example` file, then rename the file to `.env`. Get these keys:
  - On the Groq Console
  - On the Linkup Dashboard
  - You can create your own internal key
  - You can create your own variables to connect to a Postgres database or, if you are using Supabase, you can get them by clicking on the "Connection" widget at the top of the page
  - You can get your Supabase URL and Supabase API key on Supabase
```bash
mv .env.example .env
```
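For reference, here is a minimal sketch of how these variables could be read from the `.env` file in Python; the Postgres variable names used below (`pg_host`, `pg_user`, etc.) are placeholders chosen for illustration, not the project's actual names:

```python
# Minimal sketch: load the API keys and (hypothetical) Postgres variables from .env
# Requires: pip install python-dotenv
import os
from dotenv import load_dotenv

load_dotenv()  # reads the .env file in the current directory

groq_api_key = os.getenv("groq_api_key")
internal_api_key = os.getenv("internal_api_key")
linkup_api_key = os.getenv("linkup_api_key")

# Placeholder names for the Postgres connection; use whatever names you put in .env
pg_host = os.getenv("pg_host")
pg_user = os.getenv("pg_user")
pg_password = os.getenv("pg_password")
pg_database = os.getenv("pg_database")

missing = [name for name, value in {
    "groq_api_key": groq_api_key,
    "internal_api_key": internal_api_key,
    "linkup_api_key": linkup_api_key,
}.items() if not value]
if missing:
    raise RuntimeError(f"Missing required environment variables: {missing}")
```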
- In your Supabase Dashboard, open the SQL Editor from the left sidebar and run the following command:
```sql
CREATE TABLE IF NOT EXISTS public.users (
  id SERIAL PRIMARY KEY,
  created_at TIMESTAMP DEFAULT NOW(),
  username TEXT DEFAULT NULL,
  email TEXT DEFAULT NULL,
  password TEXT DEFAULT NULL
);
```
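To illustrate what this table is for, here is a minimal sketch of inserting a user row with the supabase-py client. It is only an example of the table's usage, not necessarily how LlamaResearcher's registration service works internally; the `SUPABASE_URL` and `SUPABASE_KEY` variable names are assumptions, and in a real setup the password should be hashed before it is stored:

```python
# Minimal sketch: insert a user row into public.users with supabase-py
# Requires: pip install supabase
import os
from supabase import create_client

supabase = create_client(os.getenv("SUPABASE_URL"), os.getenv("SUPABASE_KEY"))

# NOTE: store a hash of the password in production, never the plain text
response = (
    supabase.table("users")
    .insert({"username": "jane", "email": "jane@example.com", "password": "<hashed-password>"})
    .execute()
)
print(response.data)  # the inserted row, including the generated id and created_at
```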
- You can now launch the containers with the following commands:
```bash
docker compose -f compose.local.yaml up llama_redis -d
docker compose -f compose.local.yaml up llama_register -d
docker compose -f compose.local.yaml up llama_app -d
```
You will see the application running on http://localhost:8000 and the registration page at http://localhost:7860, and you will be able to use both. Depending on your connection and your hardware, the setup might take some time (up to 15 minutes), but this only happens the first time you run it!
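If you want to check programmatically that both services are reachable once the containers are up, a small sketch like the following can help; it assumes nothing beyond the two local ports mentioned above:

```python
# Minimal sketch: poll the two local endpoints until they respond
# Requires: pip install requests
import time
import requests

ENDPOINTS = ["http://localhost:8000", "http://localhost:7860"]

for url in ENDPOINTS:
    for attempt in range(30):  # wait up to ~5 minutes per service
        try:
            requests.get(url, timeout=5)
            print(f"{url} is up")
            break
        except requests.ConnectionError:
            time.sleep(10)
    else:
        print(f"{url} did not respond in time")
```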
- Redis is used for API rate limiting (see the sketch after this list)
- Supabase manages user registration and sign-in
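To give an idea of how Redis typically backs rate limiting, here is a minimal fixed-window sketch with redis-py; the key naming, the limit and the window size are illustrative assumptions, not the values LlamaResearcher actually uses:

```python
# Minimal sketch: fixed-window rate limiting with redis-py
# Requires: pip install redis
import redis

r = redis.Redis(host="localhost", port=6379, db=0)

# Hypothetical policy: 20 requests per user per 60-second window
LIMIT = 20
WINDOW_SECONDS = 60

def allow_request(user_id: str) -> bool:
    key = f"ratelimit:{user_id}"
    count = r.incr(key)                    # atomically count this request
    if count == 1:
        r.expire(key, WINDOW_SECONDS)      # start the window on the first request
    return count <= LIMIT

if __name__ == "__main__":
    print(allow_request("demo-user"))  # True until the window limit is hit
```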
You must have a Postgres instance running externally; it stores the analytics of the searches that LlamaResearcher performs.
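As an example of inspecting those analytics, here is a minimal sketch that connects to Postgres with psycopg2. The connection variable names match the placeholders used earlier, and the `searches` table name is purely hypothetical; check your database for the actual table the app writes to:

```python
# Minimal sketch: connect to the external Postgres instance and inspect search analytics
# Requires: pip install psycopg2-binary
import os
import psycopg2

conn = psycopg2.connect(
    host=os.getenv("pg_host"),        # placeholder variable names, see the .env step above
    user=os.getenv("pg_user"),
    password=os.getenv("pg_password"),
    dbname=os.getenv("pg_database"),
)

with conn, conn.cursor() as cur:
    # 'searches' is a hypothetical table name used only for illustration
    cur.execute("SELECT * FROM searches ORDER BY 1 DESC LIMIT 10;")
    for row in cur.fetchall():
        print(row)

conn.close()
```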
- Your request is first checked for safety by a guard model, `llama-3-8b-guard`, provided by Groq
- If the prompt is safe, it is routed to the ResearcherAgent, a function-calling agent
- The ResearcherAgent first expands the query into three sub-queries, which are then used for web search
- The web is deep-searched for every sub-query with LinkUp
- The information retrieved from the web is evaluated for relevancy against the original user prompt
- Once the agent has gathered all the information, it writes the final essay and returns it to the user (a high-level sketch of this pipeline follows)
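The sketch below outlines this control flow in Python. The helper functions (`expand_query`, `deep_search`, `is_relevant`, `write_essay`) are simplified stand-ins for the actual agent tools, and the guard call uses the model name as cited in this README, so treat it as an illustration of the pipeline rather than the project's implementation:

```python
# High-level sketch of the LlamaResearcher pipeline described above.
# The helpers below are illustrative stand-ins, not the project's actual tools.
# Requires: pip install groq
import os
from groq import Groq

client = Groq(api_key=os.getenv("groq_api_key"))

def is_safe(prompt: str) -> bool:
    """Ask the guard model whether the prompt is safe to process."""
    # Model name as referenced in this README; check Groq's model list for the exact id.
    response = client.chat.completions.create(
        model="llama-3-8b-guard",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content.strip().lower().startswith("safe")

# --- Hypothetical agent tools (placeholders for the real ResearcherAgent tools) ---
def expand_query(prompt: str) -> list[str]:
    """Expand the user prompt into three sub-queries (stubbed here)."""
    return [f"{prompt} (angle {i})" for i in range(1, 4)]

def deep_search(sub_query: str) -> list[str]:
    """Deep-search the web for a sub-query via LinkUp (stubbed here)."""
    return [f"result for: {sub_query}"]

def is_relevant(snippet: str, prompt: str) -> bool:
    """Evaluate a retrieved snippet for relevancy against the original prompt (stubbed)."""
    return True

def write_essay(prompt: str, sources: list[str]) -> str:
    """Write the final essay from the gathered sources (stubbed)."""
    return f"Essay answering '{prompt}' based on {len(sources)} sources."

def research(prompt: str) -> str:
    if not is_safe(prompt):
        return "Sorry, this request was flagged as unsafe."
    sources = []
    for sub_query in expand_query(prompt):
        for snippet in deep_search(sub_query):
            if is_relevant(snippet, prompt):
                sources.append(snippet)
    return write_essay(prompt, sources)

if __name__ == "__main__":
    print(research("What are the latest advances in battery recycling?"))
```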
Contributions are always welcome! Follow the contribution guidelines reported here.
The software is provided under the MIT license.