Learn how to call an AI agent and pay for it in CELO.
This tutorial is based on the guide for the Olas mech-client.
This tutorial guides you through integrating Olas into your dApp by setting up a backend, configuring the environment, and connecting the frontend. By the end, you'll have a working project that uses an Olas mech to answer prompts and interact with blockchain tools.
Before you start, ensure you have the following installed:
- Python 3.8+
- Node.js and Yarn
- Poetry (for Python dependency management)
- A MetaMask wallet to export a private key
- A Quicknode account (for RPC and WSS endpoints)
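The prerequisites above can be sanity-checked from a terminal. This is just a convenience sketch; it warns rather than fails if a tool is missing:

```shell
# Report the installed version of each required tool, or warn if it is missing.
for tool in python3 node yarn poetry; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $("$tool" --version 2>&1 | head -n 1)"
  else
    echo "warning: $tool not found"
  fi
done
```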
This example project is built using the Celo Composer. You can complete the quickstart by following this guide.
Follow the steps below to call the mech from any dApp.
Inside your dApp's directory, create a new backend folder and set up a Python project:

```shell
poetry new backend
cd backend
```

Activate the Poetry shell:

```shell
poetry shell
```

Install the mech-client package:

```shell
poetry add mech-client
```
Create a file to store your private key securely:

```shell
touch ethereum_private_key.txt
```

Export your private key from MetaMask and save it in `ethereum_private_key.txt`. The account should be prefunded with some CELO; this agent only runs on Celo Mainnet.

Add the file to your `.gitignore` to prevent accidental uploads:

```shell
echo ethereum_private_key.txt >> .gitignore
```
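Before running anything, it can help to sanity-check the key file. The helper below is purely illustrative (it is not part of mech-client); it only verifies that the file holds a single 64-character hex string, with an optional `0x` prefix:

```python
import re
from pathlib import Path

def looks_like_private_key(path: str = "ethereum_private_key.txt") -> bool:
    """Heuristic check: a single line of 64 hex characters, optionally 0x-prefixed."""
    text = Path(path).read_text().strip()
    return re.fullmatch(r"(0x)?[0-9a-fA-F]{64}", text) is not None
```

Calling `looks_like_private_key()` before you run the script can catch a pasted-in mnemonic or an empty file early, instead of failing later inside the client.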
Create an `.env` file to store your RPC and WSS endpoints. We recommend using Quicknode:

```shell
MECHX_CHAIN_RPC=https://proud-proud-layer.celo-mainnet.quiknode.pro/<your-key>
MECHX_WSS_ENDPOINT=wss://proud-proud-layer.celo-mainnet.quiknode.pro/<your-key>
```

Create a script file:

```shell
touch my_script.py
```

Edit `my_script.py`:
```python
from mech_client.interact import interact, ConfirmationType

def get_prompt(prompt_text):
    agent_id = 2
    tool_name = "openai-gpt-3.5-turbo"  # Replace with your tool
    chain_config = "celo"
    private_key_path = "ethereum_private_key.txt"

    result = interact(
        prompt=prompt_text,
        agent_id=agent_id,
        tool=tool_name,
        chain_config=chain_config,
        confirmation_type=ConfirmationType.ON_CHAIN,
        private_key_path=private_key_path,
    )
    return result
```

If you encounter the error:
```
ModuleNotFoundError: No module named 'pkg_resources'
```

resolve it by upgrading setuptools:

```shell
pip install --upgrade setuptools
```

Execute the script:

```shell
python my_script.py
```

Install Flask and Flask-CORS:

```shell
pip install Flask flask-cors
```

Create `app.py` and add the following:
```python
from flask import Flask, jsonify, request
from flask_cors import CORS

from my_script import get_prompt

app = Flask(__name__)
CORS(app)

@app.route('/get-prompt', methods=['GET'])
def get_chat_gpt_request():
    prompt = request.args.get('prompt', 'Write a Haiku about web3 hackathons?')
    try:
        response = get_prompt(prompt)
        return jsonify({"success": True, "response": response}), 200
    except Exception as e:
        return jsonify({"success": False, "error": str(e)}), 500

if __name__ == '__main__':
    app.run(debug=True)
```

Inside your Next.js project, create `pages/api/get-prompt.ts`:
```typescript
import type { NextApiRequest, NextApiResponse } from 'next';
import axios from 'axios';

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  const prompt = req.query.prompt;
  try {
    const { data } = await axios.get(`http://127.0.0.1:5000/get-prompt?prompt=${prompt}`);
    res.status(200).json(data);
  } catch (error) {
    // `error` is `unknown` in a catch clause, so narrow it before reading `.message`.
    const message = error instanceof Error ? error.message : 'Unknown error';
    res.status(500).json({ message });
  }
}
```

In your React component, define state and handlers:
```typescript
const [yourPrompt, setYourPrompt] = useState('');
const [response, setResponse] = useState<string | null>(null);

async function fetchPromptData(prompt: string) {
  const res = await fetch(`/api/get-prompt?prompt=${encodeURIComponent(prompt)}`);
  const data = await res.json();
  setResponse(data.response);
}

const handleFetchClick = () => {
  if (yourPrompt) fetchPromptData(yourPrompt);
};
```

Add these elements to your component:
```tsx
<div>
  <input
    type="text"
    value={yourPrompt}
    onChange={(e) => setYourPrompt(e.target.value)}
  />
  <button onClick={handleFetchClick}>Fetch Prompt</button>
  {response && <p>Response: {response}</p>}
</div>
```

- Testing: Use Postman or `curl` to test your API before frontend integration.
- Deployment: Consider hosting your backend using services like Heroku or AWS.
- Extensions: Explore more tools available in the Olas Mech library.
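For the testing tip above, you can also exercise the Flask route in-process, without Postman or a running mech, by stubbing out `get_prompt` and using Flask's built-in test client. A minimal sketch (the stub and inline app construction are illustrative, not your real `app.py`):

```python
from flask import Flask, jsonify, request

# Stand-in for my_script.get_prompt so no on-chain request is made.
def get_prompt(prompt_text):
    return f"stubbed response for: {prompt_text}"

app = Flask(__name__)

@app.route('/get-prompt', methods=['GET'])
def get_chat_gpt_request():
    prompt = request.args.get('prompt', 'Write a Haiku about web3 hackathons?')
    try:
        response = get_prompt(prompt)
        return jsonify({"success": True, "response": response}), 200
    except Exception as e:
        return jsonify({"success": False, "error": str(e)}), 500

# Flask's test client calls the route directly, no server needed.
client = app.test_client()
payload = client.get('/get-prompt?prompt=hello').get_json()
print(payload)
```

This keeps the request/response shape identical to the real endpoint, so the frontend code can be developed against it before any CELO is spent.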