
Commit d34cb37 (1 parent: 2f8c623)

feat(models): add model tuning support

File tree: 11 files changed, +205 −66 lines

.env.example

Lines changed: 6 additions & 1 deletion

@@ -1,2 +1,7 @@
 # Local variables
-GEMINI_API_KEY=your-api-key
+GEMINI_API_KEY=your-api-key
+# Tuning parameters
+TUNING_SOURCE_MODEL=models/gemini-1.5-flash-001-tuning
+TUNING_EPOCH_COUNT=100
+TUNING_BATCH_SIZE=4
+TUNING_LEARNING_RATE=0.001

README.md

Lines changed: 20 additions & 1 deletion

@@ -1 +1,20 @@
-# Flare AI Social
+# Flare AI Social
+
+## 🏗️ Build & Run Instructions
+
+You can deploy Flare AI DeFAI using Docker (recommended) or set up the backend and frontend manually.
+
+1. **Prepare the Environment File:**
+   Rename `.env.example` to `.env` and update the variables accordingly.
+
+2. **Tune a new model**
+
+   ```bash
+   uv run start-tuning
+   ```
+
+3. **Test the new model**
+
+   ```bash
+   uv run start-social
+   ```

pyproject.toml

Lines changed: 1 addition & 0 deletions

@@ -20,6 +20,7 @@ dependencies = [
 
 [project.scripts]
 start-social = "flare_ai_social.main:start"
+start-tuning = "flare_ai_social.tune_model:start"
 
 [build-system]
 requires = ["hatchling"]

src/data/training_data.json

Lines changed: 26 additions & 0 deletions

@@ -0,0 +1,26 @@
+[
+  {
+    "text_input": ".@Ripple x @Chainlink: $RLUSD has adopted the Chainlink standard for verifiable data to fuel DeFi adoption with real-time, secure pricing data. The future of stablecoins is here",
+    "output": "I'm at a loss as to why this would be interesting. Ripple needs an oracle so that RLUSD can be used in lending protocols on Ethereum. Flare doesn't provide oracles to other networks yet. It's something that may come but there are much bigger opportunities to pursue first: FAssets and native staking for XRP and BTC to name but two."
+  },
+  {
+    "text_input": "Man I hope I didn't make a mistake in going heavy with #flare and the @HugoPhilion team. Looking like U.S. based #Crypto like #XRP will get tax breaks on capital gains. Hugo, any plans to move to the USA?",
+    "output": "No one has any knowledge of the rules around this yet. Rest assured if there is an opportunity for us to make sure that our American community aren't penalised for holding Flare relative to other tokens we will take it."
+  },
+  {
+    "text_input": "Where can I short $TRUMP Coin? 😂",
+    "output": "On Flare pretty soon you will be able to go long or short with leverage via @SparkDexAI perps."
+  },
+  {
+    "text_input": "Uhhh, sorry guys, did we forget we are building the tech for the future? Some decentralized system that will benefit the people of this planet and save them from tyranical govs. banks and powerful multinationals? It wasn't meant to be used for thousands of memecoins... hello...",
+    "output": "I understand that it looks like the market is going in the wrong direction but zoom out. Billions of dollars of value are assigned to things that often seem frivolous, such as art works, jewellery and fashion. There is no reason to think that the same human impulses that give these items value will not also be at play in a decentralized setting. Flare exists to allow people to build what they wish in the best way possible with maximum decentralization, whether that is perps on a hot meme coin, institutional finance or the future of AI. We are here for all of it."
+  },
+  {
+    "text_input": "Regarding enshrined Oracles like Flare, Supra, et al... Conceptually I love it. It's a strong business assumption. But as a dev there are definitely scenarios where I'd want features of other L1s that the enshrined L1s do not offer. Whomever lands first with an Irresistible L1, epic marketing, easy onboarding, and a lost cost, easy-to-use, trust-less Oracle system - may just end up own a gigantic eco-system. The potential is huge. I can see devs flocking here 📈",
+    "output": "What else do you want to see on Flare?"
+  },
+  {
+    "text_input": "For all my coffee snobs out there what is the absolute best coffee/espresso machine?",
+    "output": "Breville Touch if you want an easy life. Decent DE1Pro if you want to go mental."
+  }
+]
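Each entry pairs a prompt (`text_input`) with the persona's reply (`output`), the two-column shape consumed by Gemini model tuning. A minimal stdlib-only sketch of validating that shape before kicking off a tuning run (`validate_training_rows` is a hypothetical helper, not part of this commit):

```python
import json

REQUIRED_KEYS = {"text_input", "output"}


def validate_training_rows(raw: str) -> list[dict[str, str]]:
    """Parse the JSON dataset and check every row has both required keys."""
    rows = json.loads(raw)
    for i, row in enumerate(rows):
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            raise ValueError(f"row {i} missing keys: {sorted(missing)}")
    return rows


# Tiny inline sample mirroring the committed file's structure.
sample = '[{"text_input": "gm", "output": "gm"}]'
rows = validate_training_rows(sample)
print(len(rows))  # 1
```

Catching a malformed row here fails fast, before any paid tuning job is started.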

src/flare_ai_social/__init__.py

Lines changed: 0 additions & 2 deletions

@@ -1,2 +0,0 @@
-def main() -> None:
-    print("Hello from flare-ai-social!")

src/flare_ai_social/ai/base.py

Lines changed: 10 additions & 6 deletions

@@ -1,6 +1,6 @@
 from abc import ABC, abstractmethod
 from dataclasses import dataclass
-from typing import Any, Literal, Protocol, TypedDict, runtime_checkable
+from typing import Any, Protocol, TypedDict, runtime_checkable
 
 import httpx
 import requests
@@ -74,19 +74,23 @@ def send_message(self, msg: str) -> ModelResponse:
     """
 
 
+class Message(TypedDict):
+    role: str
+    content: str
+
+
 class CompletionRequest(TypedDict):
     model: str
     prompt: str
-
-
-class Message(TypedDict):
-    role: Literal["user", "assistant", "system"]
-    content: str
+    max_tokens: int
+    temperature: float
 
 
 class ChatRequest(TypedDict):
     model: str
     messages: list[Message]
+    max_tokens: int
+    temperature: float
 
 
 class BaseRouter:
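The reshuffled TypedDicts define the request payloads the routers build. A self-contained sketch of how they are used after this change (the field values are illustrative, not from the commit); note that widening `role` from `Literal["user", "assistant", "system"]` to plain `str` trades away static checking of role values for flexibility:

```python
from typing import TypedDict


class Message(TypedDict):
    role: str  # previously Literal["user", "assistant", "system"]
    content: str


class ChatRequest(TypedDict):
    model: str
    messages: list[Message]
    max_tokens: int
    temperature: float


# TypedDicts are plain dicts at runtime; the annotations only guide type checkers.
request: ChatRequest = {
    "model": "pugo-hillion",
    "messages": [{"role": "user", "content": "What else do you want to see on Flare?"}],
    "max_tokens": 256,
    "temperature": 0.7,
}
print(request["messages"][0]["role"])  # user
```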

src/flare_ai_social/main.py

Lines changed: 2 additions & 4 deletions

@@ -7,9 +7,7 @@
 genai.configure(api_key=settings.gemini_api_key)
 
 
-def start(tuned_model_id: str) -> None:
-    tuned_model_id = "pugo-hillion"
-
+def start(tuned_model_id: str = "pugo-hillion") -> None:
     tuned_models = [m.name for m in genai.list_tuned_models()]
     logger.info("available tuned models", tuned_models=tuned_models)
 
@@ -27,4 +25,4 @@ def start(tuned_model_id: str) -> None:
 
 
 if __name__ == "__main__":
-    start(tuned_model_id="pugo-hillion")
+    start()
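This change fixes a real bug, not just style: the old `start` accepted a `tuned_model_id` argument and then immediately reassigned it, so callers could never select a different tuned model. A minimal sketch of the before/after behavior (`start_before`/`start_after` are illustrative stand-ins with the `genai` calls omitted):

```python
# Before: the parameter is accepted but immediately clobbered,
# so any value the caller passes is silently discarded.
def start_before(tuned_model_id: str) -> str:
    tuned_model_id = "pugo-hillion"
    return tuned_model_id


# After: a default argument keeps the convenience while honoring overrides.
def start_after(tuned_model_id: str = "pugo-hillion") -> str:
    return tuned_model_id


print(start_before("my-experiment"))  # pugo-hillion
print(start_after("my-experiment"))   # my-experiment
```

The default also lets the `__main__` block shrink to a bare `start()`.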
src/flare_ai_social/prompts/__init__.py

Lines changed: 1 addition & 2 deletions

@@ -1,4 +1,3 @@
-from .dataset import training_dataset
 from .templates import SYSTEM_INSTRUCTION
 
-__all__ = ["SYSTEM_INSTRUCTION", "training_dataset"]
+__all__ = ["SYSTEM_INSTRUCTION"]

src/flare_ai_social/prompts/dataset.py

Lines changed: 0 additions & 26 deletions
This file was deleted.

src/flare_ai_social/settings.py

Lines changed: 14 additions & 0 deletions

@@ -1,3 +1,5 @@
+from pathlib import Path
+
 import structlog
 from pydantic_settings import BaseSettings, SettingsConfigDict
 
@@ -11,6 +13,18 @@ class Settings(BaseSettings):
 
     # API key for accessing Google's Gemini AI service
     gemini_api_key: str = ""
+    # Tuning dataset path
+    tuning_dataset_path: Path = (
+        Path(__file__).parent.parent / "data" / "training_data.json"
+    )
+    # Base model to tune upon
+    tuning_source_model: str = "models/gemini-1.5-flash-001-tuning"
+    # Number of epochs to tune for
+    tuning_epoch_count: int = 100
+    # Batch size
+    tuning_batch_size: int = 4
+    # Learning rate
+    tuning_learning_rate: float = 0.001
 
     model_config = SettingsConfigDict(
         env_file=".env",
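Because `Settings` is a pydantic-settings `BaseSettings`, each of these defaults can be overridden by the matching upper-cased variable from `.env` (`TUNING_EPOCH_COUNT`, `TUNING_BATCH_SIZE`, ...), which is why the commit also extends `.env.example`. A stdlib-only sketch approximating that override behavior, so the mapping is concrete without pulling in pydantic (`TuningSettings` and `load_settings` are illustrative, not part of the commit):

```python
from dataclasses import dataclass


@dataclass
class TuningSettings:
    """Defaults mirroring the new fields in settings.py."""
    tuning_source_model: str = "models/gemini-1.5-flash-001-tuning"
    tuning_epoch_count: int = 100
    tuning_batch_size: int = 4
    tuning_learning_rate: float = 0.001


# Env vars arrive as strings, so each field needs an explicit cast.
_CASTS = {
    "tuning_source_model": str,
    "tuning_epoch_count": int,
    "tuning_batch_size": int,
    "tuning_learning_rate": float,
}


def load_settings(env: dict[str, str]) -> TuningSettings:
    """Field names map to upper-cased env var names, as pydantic-settings does."""
    overrides = {
        name: cast(env[name.upper()])
        for name, cast in _CASTS.items()
        if name.upper() in env
    }
    return TuningSettings(**overrides)


settings = load_settings({"TUNING_EPOCH_COUNT": "50"})
print(settings.tuning_epoch_count)  # 50
```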
