
Commit 23399e1

refactoring
1 parent 8ef2817 commit 23399e1

File tree: 13 files changed, +105 -214 lines


.gitignore (+2 -2)

@@ -165,8 +165,8 @@ node_modules
 
 scratch
 my_history.txt
-./my_config.yaml
+/my_config.yaml
 my_calendar.yaml
 prompts/*
-!prompts/private_prompt.txt
+!prompts/system_prompt.txt
 calendar_cache.pkl

README.md (+26 -32)

@@ -1,11 +1,4 @@
-Here is the GitHub documentation for the AI Assistant project with proper Markdown formatting:
-
----
-
-# Computer
-
-I named this project "computer", because in the long run I want to implement hotword "Hei Computer" to trigger start of conversation.
-
+# Bobik - AI Assistant
 
 This project wraps most of langchain features and agent functionality and abstracts them into configuration file.
 As a user, you need to set up necessary api keys in .env file and define your models in my_config.yaml file.
@@ -43,6 +36,7 @@ Features:
 --- google search
 --- wikipedia
 - piping stdin input as question
+- pasting input from clipboard - Use following template "my question here: <paste>"
 - llm providers
 -- openai
 -- groq
@@ -68,7 +62,7 @@ To install the AI Assistant, follow these steps:
 2. Clone the repository from GitHub.
 3. Install the required packages using `pip install -r requirements.txt`.
 4. See `examples` to set up necessary environment variables and config yaml file.
-5. Run the `computer.py` script to start the AI Assistant.
+5. Run the `run.py` script to start the AI Assistant.
 
 ## Windows
 Main functionality on Windows is working fine.
@@ -94,10 +88,10 @@ See `examples` folder for more info.
 #### State change
 The App pre-parsers can be used to change the model, input or output method and more things with the first words in the message. Here are some examples:
 
-- To change the model to `gpt-3.5-turbo`, you can use the following pre-parser command: `computer.py gpt-3.5-turbo speak listen tell me a story`.
-- To change the input and output method to `voice`, you can use the following pre-parser command: `computer.py verbal`.
-- To change only input: `computer.py listen`. Remember to have same name here that is configured in `my_config.yaml` file under `io_input`, `io_output` and `models`.
-- Or just use default method that mostly should be set up as `text`: `computer.py When I will have time for a jogging session. Check my calendar events and weather so it is not raining and I have no meetings then.`.
+- To change the model to `gpt-3.5-turbo`, you can use the following pre-parser command: `run.py gpt-3.5-turbo speak listen tell me a story`.
+- To change the input and output method to `voice`, you can use the following pre-parser command: `run.py verbal`.
+- To change only input: `run.py listen`. Remember to have same name here that is configured in `my_config.yaml` file under `io_input`, `io_output` and `models`.
+- Or just use default method that mostly should be set up as `text`: `run.py When I will have time for a jogging session. Check my calendar events and weather so it is not raining and I have no meetings then.`.
 
 Note: These pre-parser commands should be included at the beginning of the first message.
 
@@ -141,14 +135,14 @@ class MotivationTool(BaseTool):
 ```
 3. Register the custom tool with the `ToolLoader` class by adding the following line to the `app/tool_loader.py` file:
 ```python
-computer = Computer()
-# ... see computer.py for more details
-computer.load_config_and_state()
-computer.load_options()
-computer.load_state_change_parser()
-computer.load_manager()
-computer.tool_provider.add_tool(MotivationTool())
-computer.start(False, "I need something to cheer me up.")
+app = App()
+# ... see app.py for more details
+app.load_config_and_state()
+app.load_options()
+app.load_state_change_parser()
+app.load_manager()
+app.tool_provider.add_tool(MotivationTool())
+app.start(False, "I need something to cheer me up.")
 ```
 
 It is good idea to create your own main python script. Then origin future updates will not destroy your code.
@@ -157,14 +151,14 @@ It is good idea to create your own main python script. Then origin future update
 
 ### Using the Tool as Is
 
-To use the AI Assistant as is, simply run the `computer.py` script and start interacting with it through the command line interface.
+To use the AI Assistant as is, simply run the `app.py` script and start interacting with it through the command line interface.
 It will run in the loop using default `agent` (depends on model configuration) mode with memory attached.
 
 ### Using the Tool as a Library
 
 You can also use the AI Assistant as a library in your own Python projects.
 To do this, you need to import the necessary classes and functions from the `app` package and
-create an instance of the `Computer` or `ConversationManager` class.
+create an instance of the `App` or `ConversationManager` class.
 
 The `ConversationManager` class is responsible for managing the conversation between the user and the AI Assistant.
 It provides various methods for sending messages, handling events, and customizing the behavior of the AI Assistant.
@@ -182,21 +176,21 @@ Here are some examples of how you can extend the code with your own tools:
 Here are some examples of how you can execute the project:
 
 - Using the `--quiet` mode:
-If you want to run the AI Assistant in quiet mode, you can use the `--quiet` flag when running the `computer.py` script.
+If you want to run the AI Assistant in quiet mode, you can use the `--quiet` flag when running the `app.py` script.
 In quiet mode, the AI Assistant will not print any messages to the console, except for LLM final answer.
 ```bash
-python computer.py --quiet
+python app.py --quiet
 ```
 - Using different models:
-If you want to use a different language model than the one that is enabled by default, you can use the model name when running the `computer.py`.
+If you want to use a different language model than the one that is enabled by default, you can use the model name when running the `app.py`.
 The question pre-parser will hard match first parameters with available configuration and will change application state.
 After that application will realize that state was changed and will reload the configuration, model and agent.
 
 - Switching models during usage:
 During usage, you can change models and history will be kept. So you can ask a story from one model and then switch to another model and ask for summary.
 This works only when `--once` parameter is not used.
 ```bash
-computer.py --quiet groq tell me 3 sentence story
+app.py --quiet groq tell me 3 sentence story
 > Here is a 3 sentence story: As the sun set over the Berlin skyline, a young artist named Lena sat on the banks of the Spree River, her paintbrush dancing across the canvas as she tried to capture the vibrant colors of the city. Meanwhile, a group of friends laughed and chatted as they strolled along the riverbank, enjoying the warm summer evening. In the distance, the sounds of a street performer's guitar drifted through the air, adding to the lively atmosphere of the city.
 > Master: gpt4o
 > Master: summarize the story in 1 sentence
@@ -205,7 +199,7 @@ computer.py --quiet groq tell me 3 sentence story
 
 There are also built in tools that will understand that you want to change model from the message.
 ```bash
-computer.py groq
+app.py groq
 phrase 'groq' detected.
 Changed model to groq
 Got 1 args: ['groq']
@@ -223,7 +217,7 @@ Thought: Do I need to use a tool? No
 AI: Model switched to gpt3. I'm ready to assist you. How can I help you today?
 
 > Finished chain.
-Computer: Model switched to gpt3. I'm ready to assist you. How can I help you today?
+Bobik: Model switched to gpt3. I'm ready to assist you. How can I help you today?
 text → gpt3 (gpt-3.5-turbo) → write
 Master: |
 ```
@@ -235,15 +229,15 @@ Same thing is implemented for:
 - turn on/off tools (agent vs no agent)
 
 ### Using the `--once` parameter:
-If you want to run the AI Assistant only once and then exit, you can use the `--once` flag when running the `computer.py` script.
+If you want to run the AI Assistant only once and then exit, you can use the `--once` flag when running the `app.py` script.
 The `--once` flag should be followed by the message that you want to send to the AI Assistant.
 ```bash
-python computer.py --once "What's the weather like today?"
+python app.py --once "What's the weather like today?"
 ```
 The `--once` parameter is useful when you want to use the AI Assistant to perform a specific task and then exit, without having to interact with it through the command line interface.
 It is good idea to combine `--once` together with `--quiet` parameter. Then you will get only answer without any additional information.
 ```bash
-cat my_code.py | ./computer.py --once --quiet Reformat given python code > my_code_reformatted.py
+cat my_code.py | ./app.py --once --quiet Reformat given python code > my_code_reformatted.py
 ```
 
 It is good idea to prepare specific model for these kind of tasks and in those tools use specialized `prompt`

computer.py → app/app.py (+1 -30)

@@ -1,9 +1,3 @@
-#!/usr/bin/env python
-# -*- coding: utf-8 -*-
-
-# get rid of deprecation warning stdout.
-import warnings ; warnings.warn = lambda *args,**kwargs: None
-
 import asyncio
 import yaml
 import sys
@@ -24,7 +18,7 @@
 load_dotenv()
 
 
-class Computer:
+class App:
     def __init__(self, config_file: str = ""):
         self.config_file = config_file
         self.manager = None
@@ -149,26 +143,3 @@ def stdin_input(self) -> str:
         stdin_input = "\n\n" + stdin_input
 
         return stdin_input
-
-
-def computer_run():
-    app = Computer()
-    app.load_config_and_state()
-    app.load_options()
-    app.load_state_change_parser()
-
-    loop, quiet, first_question = app.process_arguments()
-    stdin_input = app.stdin_input()
-
-    if quiet:
-        app.state.is_quiet = quiet
-
-    app.load_manager()
-    if not loop:
-        app.manager.reload_agent()
-
-    app.start(loop, first_question + stdin_input)
-
-
-if __name__ == "__main__":
-    computer_run()
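
The `computer_run()` entry point is dropped from `app/app.py`, and the README now points users at `run.py` and suggests keeping a main script of your own so upstream updates do not overwrite it. Below is a minimal sketch of such a user-owned script, assuming the renamed `App` class keeps the same methods the removed entry point called; the file name `my_run.py` is illustrative, not the repository's actual `run.py`.

```python
#!/usr/bin/env python
# my_run.py - hypothetical user-owned entry script. It mirrors the wiring of
# the removed computer_run() function, adapted to the renamed App class;
# method names come from the old entry point and may differ in run.py.
from app.app import App


def main():
    app = App()
    app.load_config_and_state()
    app.load_options()
    app.load_state_change_parser()

    # parse CLI flags (--once, --quiet, model name, first question)
    loop, quiet, first_question = app.process_arguments()
    stdin_input = app.stdin_input()

    if quiet:
        app.state.is_quiet = quiet

    app.load_manager()
    if not loop:
        app.manager.reload_agent()

    # loop=False answers once and exits; True keeps the conversation open
    app.start(loop, first_question + stdin_input)


if __name__ == "__main__":
    main()
```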

app/io_input.py (+8)

@@ -1,5 +1,6 @@
 import asyncio
 import logging
+import pyperclip
 import os
 if os.name == 'nt':
     from pyreadline import Readline
@@ -111,6 +112,13 @@ async def get_input(self):
            )
        else:
            text = input(f"{self.config.user_name}: ")
+           split_text = text.split(":")
+           if len(split_text) > 1:
+               print("###", )
+               clipboard_content = pyperclip.paste()
+               if clipboard_content != "" and text != clipboard_content and split_text[1].strip() in clipboard_content:
+                   text = f"{text[0]}. Use clipboard."
+
        self.handle_full_sentence(text)
 
    def handle_full_sentence(self, text):
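
This new branch in `get_input()` backs the clipboard feature advertised in the README ("my question here: <paste>"): when the typed line contains a colon and the text after the colon is found in the clipboard, the message is rewritten to tell the model to use the clipboard instead of repeating the pasted content. A rough standalone sketch of that matching idea follows, assuming `pyperclip` is available; the helper name is made up, and where the committed code keeps only the first character of the input via `text[0]`, the sketch keeps the whole part before the colon.

```python
import pyperclip  # same clipboard library the commit imports


def apply_clipboard_template(text: str) -> str:
    """Sketch of the clipboard check added to get_input(); illustrative only."""
    split_text = text.split(":")
    if len(split_text) > 1:
        clipboard_content = pyperclip.paste()
        # only rewrite when the pasted fragment really matches the clipboard
        if clipboard_content and text != clipboard_content and split_text[1].strip() in clipboard_content:
            # keep the question part and defer the long content to the clipboard
            return f"{split_text[0]}. Use clipboard."
    return text


if __name__ == "__main__":
    print(apply_clipboard_template("summarize this: some copied paragraph"))
```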

app/llm_provider.py (+2)

@@ -68,6 +68,7 @@ def get_model(self):
            temperature=self.state.temperature,
            base_url=self.config.lmstudio_provider_settings["base_url"],
            openai_api_key="not-needed",
+           max_tokens=4096,
        )
        return model
 
@@ -76,6 +77,7 @@ def get_model(self):
            model=model_name,
            temperature=self.state.temperature,
            base_url=self.config.ollama_settings["url"],
+           #max_tokens=8192,
        )
 
    raise ValueError(f"model {self.state.llm_model} was not found. Probably it dont have api key set.")
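
The LM Studio branch now caps responses with `max_tokens=4096`, and the same cap is left commented out for Ollama. For context, here is a minimal sketch of how such a cap is passed to an OpenAI-compatible local endpoint, assuming the provider branch builds a langchain `ChatOpenAI` client; the model name and URL are placeholders and the import path depends on the installed langchain version.

```python
from langchain_openai import ChatOpenAI

# LM Studio serves an OpenAI-compatible API; the key is not checked locally.
model = ChatOpenAI(
    model="local-model",                  # placeholder model name
    temperature=0,
    base_url="http://localhost:1234/v1",  # LM Studio's usual default endpoint
    api_key="not-needed",
    max_tokens=4096,                      # response cap added in this commit
)

print(model.invoke("Say hi in one word.").content)
```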

app/tools/my_calendar.py (+4 -1)

@@ -11,7 +11,10 @@ class CalendarEventTool(BaseTool):
     """Tool for fetching calendar events."""
 
     name: str = "calendar_events"
-    description: str = "Use this tool to fetch upcoming calendar events. Tool have one optional argument 'date'."
+    description: str = (
+        "Use this tool to fetch upcoming calendar events. "
+        "Tool have one optional argument that can have values like 'now', 'today', 'tomorrow' or specific date in format 'YYYY-MM-DD'."
+    )
     calendar: Calendar = None
 
     def _run(
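
The calendar tool's description now spells out the accepted argument values: 'now', 'today', 'tomorrow', or a concrete 'YYYY-MM-DD' date (the weather tool below gets the same wording). A small sketch of how such a value could be normalized before filtering, assuming a helper of this shape; the tools' actual `_run` parsing may differ.

```python
from datetime import date, timedelta


def normalize_date_arg(arg: str | None) -> str | None:
    """Map 'now'/'today'/'tomorrow'/'YYYY-MM-DD' to an ISO date string, or None for no filter."""
    if not arg:
        return None
    value = arg.strip().lower()
    if value in ("now", "today"):
        return date.today().isoformat()
    if value == "tomorrow":
        return (date.today() + timedelta(days=1)).isoformat()
    # anything else is expected to already be a YYYY-MM-DD date
    return date.fromisoformat(value).isoformat()


print(normalize_date_arg("tomorrow"))    # next day's date, e.g. 2024-07-16
print(normalize_date_arg("2024-12-24"))  # 2024-12-24
```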

app/tools/weather.py (+19 -19)

@@ -14,7 +14,7 @@ class WeatherTool(BaseTool):
     description: str = (
         "A wrapper around Weather Search. "
         "Useful for when you need to know current or upcoming weather. "
-        "Tool have one optional argument 'date'."
+        "Tool have one optional argument that can have values like 'now', 'today', 'tomorrow' or specific date in format 'YYYY-MM-DD'."
     )
     cache: dict = {}
     config: Configuration = None
@@ -52,25 +52,25 @@ def _run(
        weather_info.append("Location: ".join(location_info))
        weather_info.append("")
 
-       weather_info.append("# Current Weather:")
-       for current_condition in data['current_condition']:
-           for description in current_condition['weatherDesc']:
-               weather_info.append(f"- Condition: {description['value']}")
-           weather_info.append(f"- Temperature (°C): {current_condition['temp_C']}")
-           weather_info.append(f"- Humidity: {current_condition['humidity']}")
-           weather_info.append(f"- Cloud Cover (%): {current_condition['cloudcover']}")
-           weather_info.append(f"- Wind Speed (km/h): {current_condition['windspeedKmph']}")
-       weather_info.append("# Forecast:")
-       for current_condition in data['weather']:
+       if filter_date is None or filter_date_date == now.strftime("%Y-%m-%d"):
+           weather_info.append("# Current Weather:")
+           for current_condition in data['current_condition']:
+               for description in current_condition['weatherDesc']:
+                   weather_info.append(f"- Condition: {description['value']}")
+               weather_info.append(f"- Temperature (°C): {current_condition['temp_C']}")
+               weather_info.append(f"- Humidity: {current_condition['humidity']}")
+               weather_info.append(f"- Cloud Cover (%): {current_condition['cloudcover']}")
+               weather_info.append(f"- Wind Speed (km/h): {current_condition['windspeedKmph']}")
 
-           if filter_date is not None and filter_date_date != current_condition['date']:
-               continue
-
-           weather_info.append(f"- {current_condition['date']}")
-           for description in current_condition['hourly']:
-               time = str(description['time']).zfill(4)
-               weather_info.append(
-                   f"-- {time[:2]}:{time[2:]}: {description['weatherDesc'][0]['value']}, {description['tempC']}°C, {description['chanceofrain']}% rain, {description['windspeedKmph']} km/h")
+           weather_info.append("# Forecast:")
+           for current_condition in data['weather']:
+               if filter_date is not None and filter_date_date != current_condition['date']:
+                   continue
+               weather_info.append(f"- {current_condition['date']}")
+               for description in current_condition['hourly']:
+                   time = str(description['time']).zfill(4)
+                   weather_info.append(
+                       f"-- {time[:2]}:{time[2:]}: {description['weatherDesc'][0]['value']}, {description['tempC']}°C, {description['chanceofrain']}% rain, {description['windspeedKmph']} km/h")
 
        self.cache[filter_date_date] = "\n".join(weather_info)
 

examples/1_minimal_groq/my_config.yaml (+1 -1)

@@ -10,7 +10,7 @@ agent:
   prompts:
     - default
   temperature: 0
-  name: Computer
+  name: Bobik
   max_tries: 3
   sleep_seconds_between_tries: 2
   agent_type: conversational-react-description

examples/2_minimal_groq_with_voice/my_config.yaml (+1 -1)

@@ -10,7 +10,7 @@ agent:
   prompts:
     - default
   temperature: 0
-  name: Computer
+  name: Bobik
   max_tries: 3
   sleep_seconds_between_tries: 2
   agent_type: conversational-react-description

examples/3_full/my_config.yaml (+1 -1)

@@ -12,7 +12,7 @@ agent:
     - default
     # - private
   temperature: 0
-  name: Computer
+  name: Bobik
   max_tries: 3
   sleep_seconds_between_tries: 2
   agent_type: conversational-react-description
