SDK for building verifiable AI Agents on Flare using Confidential Space Trusted Execution Environments (TEEs).
Warning
Flare AI Kit is currently under active development (alpha stage).
Interfaces, APIs, and functionalities may change frequently and potentially in backward-incompatible ways before a stable release. Use with caution.
- Verifiable execution: run logic inside Intel TDX TEEs via GCP Confidential Space.
- Multi-agent consensus: Majority, Tournament, and Consensus Learning via the Google Agent2Agent (A2A) protocol.
- Agent framework: built on the Google ADK with tool-calling, orchestration, and evaluation.
- Flare integration: FTSO, FDC, FAssets, plus ecosystem dApps (Sceptre, SparkDEX, ...).
- Social connectors: X, Telegram, Farcaster.
The kit is composed of modular engines for agents, social feeds, onchain data, and consensus.
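To illustrate the consensus idea, here is a minimal, self-contained majority-vote sketch in plain Python. It is not the kit's API; the function name and the tie-breaking rule are assumptions made for this example only:

```python
from collections import Counter

def majority_vote(answers: list[str]) -> str:
    """Pick the most common answer; ties go to the earliest answer.

    Illustrative only -- not part of the Flare AI Kit API.
    """
    if not answers:
        raise ValueError("no answers to vote on")
    counts = Counter(answers)
    top = max(counts.values())
    # Iterate in input order so ties resolve deterministically.
    return next(a for a in answers if counts[a] == top)

# Three "agents" answer the same question; the majority answer wins.
print(majority_vote(["4100", "4242", "4242"]))  # -> 4242
```

The kit's Tournament and Clustering strategies generalize this pattern, but the core shape (collect independent agent outputs, then aggregate) is the same.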
flowchart TD
A["Flare AI Kit"]
%% Agent Framework subgraph
subgraph AgentFramework["Agent Framework"]
B["Google ADK"]
B --o LLM["Gemini<br>GPT<br>Grok<br>+200 models"]
end
%% VectorRAG Engine subgraph
subgraph VectorRAG["VectorRAG Engine"]
C["Qdrant"]
C --o SOURCE[DevHub<br>Flare News<br>Flare Governance]
end
%% Secure Enclave subgraph
subgraph SecureEnclave["Secure Enclave"]
E["Confidential Space"]
E --> HW[Intel TDX]
HW --o RA[RA-verify<br>RA-TLS]
end
%% Ecosystem Engine subgraph
subgraph EcosystemEngine["Ecosystem Engine"]
F[Ecosystem Engine]
F --> PR[Protocols]
PR --o PROTOS["FTSO<br>FDC<br>FAssets"]
F --> AP[Applications]
AP --o APPS[SparkDEX<br>OpenOcean<br>Kinetic<br>Cyclo]
end
%% Social Engine subgraph
subgraph SocialEngine["Social Engine"]
G["Social"]
G --o SOC["X<br>Telegram<br>Farcaster<br>Slack"]
end
%% Consensus Engine subgraph
subgraph ConsensusEngine["Consensus Engine"]
H["Consensus"]
H --o ALGOS[Majority<br>Tournament<br>Clustering]
end
%% Connections to Flare AI Kit central node
A --> B
A --> C
A --> E
A --> F
A --> G
A --> H
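The Secure Enclave path above (Confidential Space → Intel TDX → RA-verify/RA-TLS) ultimately yields a signed attestation token. As a rough sketch of what a verifier handles, the snippet below decodes (without verifying!) the payload of a JWT-shaped token. The token and claim names here are fabricated for illustration; a real attestation token must have its signature checked against Google's published keys before any claim is trusted:

```python
import base64
import json

def decode_jwt_payload(token: str) -> dict:
    """Decode (NOT verify) the payload segment of a JWT-shaped token."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# Build a fake token shaped like an attestation JWT, purely for illustration.
claims = {"iss": "https://confidentialcomputing.googleapis.com",
          "hwmodel": "INTEL_TDX"}
body = base64.urlsafe_b64encode(json.dumps(claims).encode()).rstrip(b"=").decode()
fake_token = f"header.{body}.signature"

print(decode_jwt_payload(fake_token)["hwmodel"])  # -> INTEL_TDX
```

In practice, RA-TLS binds a token like this to the TLS channel so a remote client knows it is talking to code running inside the enclave.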
Prerequisites
- uv with Python >= 3.12
- Docker
- (For deployment) An authenticated gcloud CLI
- Clone & configure:

git clone --recursive https://github.com/flare-foundation/flare-ai-kit.git
cd flare-ai-kit
cp .env.example .env  # add API keys and settings
- Install:
uv sync --all-extras
Run the following commands to format, lint, type-check, and test your code before committing.
# Format, lint, type-check, test
uv run ruff format && uv run ruff check --fix && uv run pyright && uv run pytest

Build and run with Docker:

docker build -t flare-ai-kit .
docker run --rm --env-file .env flare-ai-kit

The repository also includes a parametric Dockerfile for running specific scripts with only the dependencies they need:
# Build and run PDF ingestion script
docker build -t fai-script-pdf \
--build-arg EXTRAS=pdf \
--build-arg SCRIPT=ingest_pdf.py .
docker run --rm -it \
-v "$PWD/scripts/data:/app/scripts/data" \
--env-file .env \
fai-script-pdf

Available EXTRAS: pdf, rag, a2a, ftso, da, fassets, social, tee, wallet, ingestion
See Docker Scripts Guide for detailed usage instructions.
Prerequisite: an authenticated gcloud CLI.
- Configure GCP: set all `GCP__*` variables in your `.env` file.
- Deploy:
chmod +x gcloud-deploy.sh
./gcloud-deploy.sh      # verbose: ./gcloud-deploy.sh -v
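For reference, the `GCP__*` settings might look like the fragment below. The variable names here are placeholders invented for this example; consult `.env.example` for the exact keys the kit expects:

```ini
# Illustrative only -- names are placeholders, see .env.example for real keys.
GCP__PROJECT_ID=my-gcp-project
GCP__REGION=us-central1
GCP__SERVICE_ACCOUNT=agent-runner@my-gcp-project.iam.gserviceaccount.com
```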
See CONTRIBUTING.md. We use Conventional Commits, Ruff/Pyright gates, and pytest. Please include tests and update docs for any user-visible changes.
This project is open-source and licensed under the Apache License 2.0. See LICENSE file.