4 changes: 3 additions & 1 deletion .gitignore
@@ -15,6 +15,8 @@ wheels/
.vscode
.DS_Store

# Python client
*.pem

# sqlite
*.db
*.db
121 changes: 121 additions & 0 deletions examples/ra-tls-client/README.md
@@ -0,0 +1,121 @@
# Install and run

Install the locked dependencies
```uv sync --frozen --all-extras```

Install flare-ai-kit in editable mode with the required extras
```uv pip install -e "../..[rag,social,tee,a2a,ingestion]"```

Start the RA-TLS backend server
```uv run python -m backend.ra_tls_main```

Run the test client
```uv run python -m client.client_test```

# Docker commands

Build the example. Note: this must be executed from the flare-ai-kit root.
```docker build --no-cache --progress=plain -t flare-ai-kit-ra-tls-client -f examples/ra-tls-client/Dockerfile .```

Run the container and start the servers
```docker run --rm -p 80:80 -it --env-file .env flare-ai-kit-ra-tls-client```

Clear the Docker build cache
```docker builder prune --all```

List containers
```docker ps```

Copy the self-signed certificate from the Docker container
```docker cp <CONTAINER ID>:/app/certs/server_cert.pem ./server_cert.pem```


## Clear the Docker build cache, build, and then run the container
```zsh
docker builder prune --all &&
docker build --no-cache --progress=plain -t flare-ai-kit-ra-tls-client -f examples/ra-tls-client/Dockerfile . > build.log 2>&1 &&
docker run --rm -p 4433:4433 -it --env-file .env flare-ai-kit-ra-tls-client
```

# Debug

Check if the port is exposed
```netstat -a -n | grep 8080```

Connect with curl using the self-signed certificate
```curl -v --cacert server_cert.pem https://localhost:8080/api/routes/chat```
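
The same request can be scripted in Python. The sketch below is a minimal example using `requests`; the JSON payload shape is an assumption, so adapt it to the chat route's actual schema.

```python
# Sketch: call the chat route over HTTPS, trusting the self-signed certificate
# copied out of the container. The JSON payload shape is an assumption.
import requests

resp = requests.post(
    "https://localhost:8080/api/routes/chat",
    json={"message": "hello"},
    verify="server_cert.pem",  # certificate copied from the container
    timeout=30,
)
print(resp.status_code, resp.text)
```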

# Use self-signed certificate

List Docker containers to get the CONTAINER ID, then copy the certificate out of the container
```
docker ps
docker cp <CONTAINER ID>:/app/certs/server_cert.pem ./server_cert.pem
```
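
As an alternative to copying the file out of the container, the certificate the running server presents can be fetched directly with Python's standard `ssl` module. This is a sketch; the host and port are assumptions and should match your `docker run -p` mapping.

```python
# Sketch: save the certificate presented by the running server to a PEM file.
# Host and port are assumptions; match them to your published port mapping.
import ssl

pem = ssl.get_server_certificate(("localhost", 4433))
with open("server_cert.pem", "w", encoding="utf-8") as f:
    f.write(pem)
```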

Add the certificate to your browser


# Copy key files to the clipboard (macOS)

```cat Dockerfile pyproject.toml settings.py supervisord.conf | pbcopy```

# Generate certificates

```
pip install cryptography
mkdir certs
python generate_cert.py --ip 34.123.45.67 --output-dir certs
```
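
The `generate_cert.py` script itself isn't reproduced in this README; the sketch below shows roughly what such a script might do with the `cryptography` library, assuming it writes `server_cert.pem` and `server_key.pem` into the output directory (filenames and defaults are assumptions).

```python
# Minimal sketch of a generate_cert.py-style script using `cryptography`.
# The real script in this example may differ; filenames and defaults are assumptions.
import argparse
import datetime
import ipaddress
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID


def main() -> None:
    parser = argparse.ArgumentParser(description="Generate a self-signed TLS certificate")
    parser.add_argument("--ip", required=True, help="IP address to place in the SAN")
    parser.add_argument("--output-dir", default="certs")
    args = parser.parse_args()

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, args.ip)])
    now = datetime.datetime.now(datetime.timezone.utc)
    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)  # self-signed: issuer == subject
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(now)
        .not_valid_after(now + datetime.timedelta(days=365))
        .add_extension(
            x509.SubjectAlternativeName([x509.IPAddress(ipaddress.ip_address(args.ip))]),
            critical=False,
        )
        .sign(key, hashes.SHA256())
    )

    out = Path(args.output_dir)
    out.mkdir(parents=True, exist_ok=True)
    (out / "server_key.pem").write_bytes(
        key.private_bytes(
            serialization.Encoding.PEM,
            serialization.PrivateFormat.TraditionalOpenSSL,
            serialization.NoEncryption(),
        )
    )
    (out / "server_cert.pem").write_bytes(cert.public_bytes(serialization.Encoding.PEM))


if __name__ == "__main__":
    main()
```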

# Create TEE
```zsh
export INSTANCE_NAME="flare-ai-sdk-beardofginger"
export TEE_IMAGE_REFERENCE="ghcr.io/simonjonsson87/flare-ai-kit:tee-dev"
export GEMINI_API_KEY="<your-gemini-api-key>"
export GEMINI_MODEL="gemini-2.0-flash"
export WEB3_PROVIDER_URL="https://flare-api.flare.network/ext/C/rpc"

gcloud compute instances create $INSTANCE_NAME \
--project=verifiable-ai-hackathon \
--zone=us-central1-c \
--machine-type=n2d-standard-2 \
--network-interface=network-tier=PREMIUM,nic-type=GVNIC,stack-type=IPV4_ONLY,subnet=default \
--metadata=tee-image-reference=$TEE_IMAGE_REFERENCE,\
tee-container-log-redirect=true,\
tee-env-GEMINI_API_KEY=$GEMINI_API_KEY,\
tee-env-GEMINI_MODEL=$GEMINI_MODEL,\
tee-env-WEB3_PROVIDER_URL=$WEB3_PROVIDER_URL,\
tee-env-SIMULATE_ATTESTATION=false \
--maintenance-policy=MIGRATE \
--provisioning-model=STANDARD \
--service-account=confidential-sa@verifiable-ai-hackathon.iam.gserviceaccount.com \
--scopes=https://www.googleapis.com/auth/cloud-platform \
--min-cpu-platform="AMD Milan" \
--tags=flare-ai,http-server,https-server \
--create-disk=auto-delete=yes,\
boot=yes,\
device-name=$INSTANCE_NAME,\
image=projects/confidential-space-images/global/images/confidential-space-debug-250100,\
mode=rw,\
size=11,\
type=pd-standard \
--shielded-secure-boot \
--shielded-vtpm \
--shielded-integrity-monitoring \
--reservation-affinity=any \
--confidential-compute-type=SEV
```


# Running without Docker

Set up the .venv
```python3.12 -m venv .venv```

Activate .venv
```source .venv/bin/activate```

Install flare_ai_kit in editable mode
```pip install -e ../../```

Install libraries
```uv sync --all-extras```

Install modified tlslite-ng
```pip install git+https://github.com/simonjonsson87/tlslite-ng.git```
