setup_workbench.log
==== Ollama Workbench Setup Started ====
Wed May 21 17:01:30 EDT 2025 - Starting setup
Wed May 21 17:01:30 EDT 2025 - CHECKPOINT: Activating existing virtual environment.
Wed May 21 17:01:30 EDT 2025 - CHECKPOINT: Uninstalling all NumPy, torch, torchvision, torchaudio.
Wed May 21 17:01:33 EDT 2025 - CHECKPOINT: Cleaning up .pyc and __pycache__ files.
Wed May 21 17:01:35 EDT 2025 - CHECKPOINT: Upgrading pip.
Wed May 21 17:01:36 EDT 2025 - CHECKPOINT: Installing compatible versions of NumPy, PyTorch, Streamlit.
Wed May 21 17:02:04 EDT 2025 - CHECKPOINT: Installing all other dependencies.
Wed May 21 17:02:25 EDT 2025 - CHECKPOINT: Verifying installed versions.
Wed May 21 17:02:40 EDT 2025 - CHECKPOINT: Ensuring Ollama server is running.
==== Checking Ollama Server Status ====
CHECKPOINT: Verifying Ollama server is running
CHECKPOINT: Ollama server is already running
CHECKPOINT: Testing Ollama API connection...
CHECKPOINT: Ollama API is responding
CHECKPOINT: Ollama server check complete
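
The check above only confirms that the Ollama API answers over HTTP. A minimal sketch of an equivalent probe, assuming Ollama's default listen address of http://localhost:11434 (the helper below is illustrative, not taken from the setup script):

    import urllib.request
    import urllib.error

    OLLAMA_URL = "http://localhost:11434"  # Ollama's default listen address

    def ollama_responding(timeout: float = 2.0) -> bool:
        """Return True if the local Ollama server answers an HTTP request."""
        try:
            # /api/tags lists installed models; any 200 response means the API is up.
            with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=timeout) as resp:
                return resp.status == 200
        except (urllib.error.URLError, OSError):
            return False

    if __name__ == "__main__":
        print("Ollama API is responding" if ollama_responding() else "Ollama server is not reachable")
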
Wed May 21 17:02:40 EDT 2025 - CHECKPOINT: Launching Ollama Workbench.
You can now view your Streamlit app in your browser.
Local URL: http://localhost:8501
Network URL: http://192.168.1.16:8501
/Volumes/FILES/code/Ollama-Workbench/venv/lib/python3.11/site-packages/flaml/__init__.py:20: UserWarning: flaml.automl is not available. Please install flaml[automl] to enable AutoML functionalities.
warnings.warn("flaml.automl is not available. Please install flaml[automl] to enable AutoML functionalities.")
pygame 2.6.1 (SDL 2.28.4, Python 3.11.9)
Hello from the pygame community. https://www.pygame.org/contribute.html
* Serving Flask app 'openai_compatibility'
* Debug mode: off
Address already in use
Port 8000 is in use by another program. Either identify and stop that program, or start the server with a different port.
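
The port-8000 message recurs each time the 'openai_compatibility' Flask app starts while something else already holds that port. A minimal sketch, not from the Workbench code, of how the conflict could be detected up front and a free port chosen instead (all names below are illustrative):

    import socket

    def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
        """Return True if something is already listening on host:port."""
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            return s.connect_ex((host, port)) == 0

    def pick_free_port() -> int:
        """Ask the OS for an ephemeral TCP port that is currently unused."""
        with socket.socket() as s:
            s.bind(("", 0))
            return s.getsockname()[1]

    if __name__ == "__main__":
        port = 8000 if not port_in_use(8000) else pick_free_port()
        print(f"Starting the compatibility server on port {port}")
        # app.run(port=port)  # hypothetical: hand the chosen port to Flask instead of the fixed default
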
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'openai_compatibility'
* Debug mode: off
Address already in use
Port 8000 is in use by another program. Either identify and stop that program, or start the server with a different port.
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'openai_compatibility'
* Debug mode: off
Address already in use
Port 8000 is in use by another program. Either identify and stop that program, or start the server with a different port.
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
2025-05-22 05:56:37.532 Please replace `st.experimental_rerun` with `st.rerun`.
`st.experimental_rerun` will be removed after 2024-04-01.
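
The deprecation warning above comes from Streamlit: st.experimental_rerun was renamed to st.rerun and the old name is scheduled for removal. A version-tolerant sketch (the helper name is illustrative) that avoids the warning on newer releases while still working on older ones:

    import streamlit as st

    def do_rerun() -> None:
        """Trigger a script rerun on both new and old Streamlit releases."""
        if hasattr(st, "rerun"):        # newer Streamlit exposes st.rerun
            st.rerun()
        else:                           # older releases only have the deprecated name
            st.experimental_rerun()
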
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
2025-05-22 05:56:41.954 Please replace `st.experimental_rerun` with `st.rerun`.
`st.experimental_rerun` will be removed after 2024-04-01.
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'openai_compatibility'
* Debug mode: off
Address already in use
Port 8000 is in use by another program. Either identify and stop that program, or start the server with a different port.
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'openai_compatibility'
* Debug mode: off
Address already in use
Port 8000 is in use by another program. Either identify and stop that program, or start the server with a different port.
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
* Serving Flask app 'integrated_main'
* Debug mode: off
Stopping...
Stopping...
Exception ignored in: <module 'threading' from '/opt/homebrew/Caskroom/miniconda/base/envs/ollamaworkbench/lib/python3.11/threading.py'>
Traceback (most recent call last):
File "/opt/homebrew/Caskroom/miniconda/base/envs/ollamaworkbench/lib/python3.11/threading.py", line 1590, in _shutdown
lock.acquire()
File "/Volumes/FILES/code/Ollama-Workbench/venv/lib/python3.11/site-packages/streamlit/web/bootstrap.py", line 69, in signal_handler
server.stop()
File "/Volumes/FILES/code/Ollama-Workbench/venv/lib/python3.11/site-packages/streamlit/web/server/server.py", line 397, in stop
self._runtime.stop()
File "/Volumes/FILES/code/Ollama-Workbench/venv/lib/python3.11/site-packages/streamlit/runtime/runtime.py", line 308, in stop
async_objs.eventloop.call_soon_threadsafe(stop_on_eventloop)
File "/opt/homebrew/Caskroom/miniconda/base/envs/ollamaworkbench/lib/python3.11/asyncio/base_events.py", line 807, in call_soon_threadsafe
self._check_closed()
File "/opt/homebrew/Caskroom/miniconda/base/envs/ollamaworkbench/lib/python3.11/asyncio/base_events.py", line 520, in _check_closed
raise RuntimeError('Event loop is closed')
RuntimeError: Event loop is closed
Thu May 22 06:08:58 EDT 2025 - CHECKPOINT: Installing additional dependencies for robust Ollama utilities.
Thu May 22 06:08:59 EDT 2025 - CHECKPOINT: Verifying installed versions.
Thu May 22 06:09:02 EDT 2025 - CHECKPOINT: Ensuring Ollama server is running.
==== Checking Ollama Server Status ====
CHECKPOINT: Verifying Ollama server is running
CHECKPOINT: Ollama server is already running
CHECKPOINT: Testing Ollama API connection...
CHECKPOINT: Ollama API is responding
CHECKPOINT: Ollama server check complete
Thu May 22 06:09:02 EDT 2025 - CHECKPOINT: Launching Ollama Workbench.
You can now view your Streamlit app in your browser.
Local URL: http://localhost:8501
Network URL: http://192.168.1.16:8501
/Volumes/FILES/code/Ollama-Workbench/venv/lib/python3.11/site-packages/flaml/__init__.py:20: UserWarning: flaml.automl is not available. Please install flaml[automl] to enable AutoML functionalities.
warnings.warn("flaml.automl is not available. Please install flaml[automl] to enable AutoML functionalities.")
pygame 2.6.1 (SDL 2.28.4, Python 3.11.9)
Hello from the pygame community. https://www.pygame.org/contribute.html
* Serving Flask app 'openai_compatibility'
* Debug mode: off
Address already in use
Port 8000 is in use by another program. Either identify and stop that program, or start the server with a different port.
* Serving Flask app 'integrated_main'
* Debug mode: off
2025-05-22 06:09:12.459 Uncaught app exception
Traceback (most recent call last):
File "/Volumes/FILES/code/Ollama-Workbench/venv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 541, in _run_script
exec(code, module.__dict__)
File "/Volumes/FILES/code/Ollama-Workbench/integrated_main.py", line 934, in <module>
main()
File "/Volumes/FILES/code/Ollama-Workbench/integrated_main.py", line 880, in main
params = dict(st.query_params)
^^^^^^^^^^^^^^^
AttributeError: module 'streamlit' has no attribute 'query_params'
2025-05-22 06:09:34.531 Uncaught app exception
Traceback (most recent call last):
File "/Volumes/FILES/code/Ollama-Workbench/venv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 541, in _run_script
exec(code, module.__dict__)
File "/Volumes/FILES/code/Ollama-Workbench/integrated_main.py", line 934, in <module>
main()
File "/Volumes/FILES/code/Ollama-Workbench/integrated_main.py", line 880, in main
params = dict(st.query_params)
^^^^^^^^^^^^^^^
AttributeError: module 'streamlit' has no attribute 'query_params'
* Serving Flask app 'integrated_main'
* Debug mode: off
2025-05-22 06:10:17.566 Uncaught app exception
Traceback (most recent call last):
File "/Volumes/FILES/code/Ollama-Workbench/venv/lib/python3.11/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 541, in _run_script
exec(code, module.__dict__)
File "/Volumes/FILES/code/Ollama-Workbench/integrated_main.py", line 934, in <module>
main()
File "/Volumes/FILES/code/Ollama-Workbench/integrated_main.py", line 880, in main
params = dict(st.query_params)
^^^^^^^^^^^^^^^
AttributeError: module 'streamlit' has no attribute 'query_params'
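
The repeated traceback above indicates the installed Streamlit predates the st.query_params property; older releases expose query parameters only through st.experimental_get_query_params(), which returns a dict of lists. A hedged compatibility sketch for the failing call in integrated_main.py (the helper name is illustrative, not the project's code):

    import streamlit as st

    def get_query_params() -> dict:
        """Return URL query parameters on both new and old Streamlit releases."""
        if hasattr(st, "query_params"):            # newer Streamlit: dict-like property
            return dict(st.query_params)
        raw = st.experimental_get_query_params()   # older API: Dict[str, List[str]]
        # Flatten single-value entries so callers see plain strings where possible.
        return {k: v[0] if len(v) == 1 else v for k, v in raw.items()}

    params = get_query_params()
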
* Serving Flask app 'integrated_main'
* Debug mode: off
2025-05-23 08:27:25,786 - INFO - 🚀 Starting Ollama Workbench setup...
2025-05-23 08:27:25,787 - INFO -
==================================================
2025-05-23 08:27:25,787 - INFO - 📋 STEP: Prerequisites
2025-05-23 08:27:25,787 - INFO - ==================================================
2025-05-23 08:27:25,787 - INFO - 🔍 Checking system prerequisites...
2025-05-23 08:27:25,787 - INFO - ✅ Python 3.11.9
2025-05-23 08:27:25,815 - INFO - ✅ All required commands available
2025-05-23 08:27:25,815 - INFO - ✅ Prerequisites completed successfully
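
The prerequisite step above reports the Python version and that all required external commands are present. A rough sketch of such a check, assuming shutil.which-style lookups (the command list below is a guess, not the script's actual list):

    import shutil
    import sys

    REQUIRED_COMMANDS = ["ollama", "git", "curl"]   # illustrative only

    def check_prerequisites() -> bool:
        """Report the interpreter version and verify external commands exist on PATH."""
        print(f"Python {sys.version.split()[0]}")
        missing = [cmd for cmd in REQUIRED_COMMANDS if shutil.which(cmd) is None]
        if missing:
            print(f"Missing commands: {', '.join(missing)}")
            return False
        print("All required commands available")
        return True
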
2025-05-23 08:27:25,815 - INFO -
==================================================
2025-05-23 08:27:25,815 - INFO - 📋 STEP: Virtual Environment
2025-05-23 08:27:25,815 - INFO - ==================================================
2025-05-23 08:27:25,815 - INFO - 🐍 Setting up virtual environment...
2025-05-23 08:27:25,815 - INFO - ♻️ Virtual environment already exists
2025-05-23 08:27:25,815 - INFO - ✅ Virtual environment ready
2025-05-23 08:27:25,815 - INFO - ✅ Virtual Environment completed successfully
2025-05-23 08:27:25,815 - INFO -
==================================================
2025-05-23 08:27:25,815 - INFO - 📋 STEP: Dependencies
2025-05-23 08:27:25,815 - INFO - ==================================================
2025-05-23 08:27:25,815 - INFO - 📦 Installing dependencies...
2025-05-23 08:27:26,639 - INFO - ✅ Pip upgraded
2025-05-23 08:27:26,639 - INFO - 📥 Installing packages from requirements.txt...
2025-05-23 08:28:05,726 - INFO - ✅ All dependencies installed successfully
2025-05-23 08:28:05,726 - INFO - ✅ Dependencies completed successfully
2025-05-23 08:28:05,726 - INFO -
==================================================
2025-05-23 08:28:05,726 - INFO - 📋 STEP: Ollama Installation
2025-05-23 08:28:05,726 - INFO - ==================================================
2025-05-23 08:28:05,726 - INFO - 🦙 Checking Ollama installation...
2025-05-23 08:28:05,780 - INFO - ✅ Ollama already installed
2025-05-23 08:28:05,780 - INFO - ✅ Ollama Installation completed successfully
2025-05-23 08:28:05,780 - INFO -
==================================================
2025-05-23 08:28:05,780 - INFO - 📋 STEP: Directory Setup
2025-05-23 08:28:05,780 - INFO - ==================================================
2025-05-23 08:28:05,780 - INFO - 📁 Setting up directories...
2025-05-23 08:28:05,781 - INFO - ✅ Directory: data
2025-05-23 08:28:05,781 - INFO - ✅ Directory: uploads
2025-05-23 08:28:05,781 - INFO - ✅ Directory: models
2025-05-23 08:28:05,781 - INFO - ✅ Directory: projects
2025-05-23 08:28:05,781 - INFO - ✅ Directory: cache
2025-05-23 08:28:05,781 - INFO - ✅ Directory: security
2025-05-23 08:28:05,781 - INFO - ✅ Directory: security/audit
2025-05-23 08:28:05,781 - INFO - ✅ Directory: logs
2025-05-23 08:28:05,781 - INFO - ✅ Directory: sessions
2025-05-23 08:28:05,781 - INFO - ✅ Directory Setup completed successfully
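
The directory list above can be created idempotently; a minimal sketch of the step using only the names reported in this log:

    from pathlib import Path

    DIRECTORIES = ["data", "uploads", "models", "projects", "cache",
                   "security", "security/audit", "logs", "sessions"]

    for name in DIRECTORIES:
        # parents=True handles nested paths such as security/audit;
        # exist_ok=True makes the step safe to re-run.
        Path(name).mkdir(parents=True, exist_ok=True)
        print(f"Directory: {name}")
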
2025-05-23 08:28:05,781 - INFO -
==================================================
2025-05-23 08:28:05,781 - INFO - 📋 STEP: Security Framework
2025-05-23 08:28:05,781 - INFO - ==================================================
2025-05-23 08:28:05,781 - INFO - 🔐 Initializing security framework...
2025-05-23 08:28:06,806 - INFO - ✅ Security framework initialized
2025-05-23 08:28:06,806 - INFO - ✅ Security Framework completed successfully
2025-05-23 08:28:06,806 - INFO -
==================================================
2025-05-23 08:28:06,806 - INFO - 📋 STEP: Configuration
2025-05-23 08:28:06,806 - INFO - ==================================================
2025-05-23 08:28:06,806 - INFO - ⚙️ Creating configuration...
2025-05-23 08:28:06,807 - INFO - ✅ Configuration created
2025-05-23 08:28:06,807 - INFO - ✅ Configuration completed successfully
2025-05-23 08:28:06,807 - INFO -
==================================================
2025-05-23 08:28:06,807 - INFO - 📋 STEP: Ollama Server
2025-05-23 08:28:06,807 - INFO - ==================================================
2025-05-23 08:28:06,807 - INFO - 🚀 Starting Ollama server...
2025-05-23 08:28:06,966 - INFO - ✅ Ollama server already running
2025-05-23 08:28:06,966 - INFO - ✅ Ollama Server completed successfully
2025-05-23 08:28:06,966 - INFO -
==================================================
2025-05-23 08:28:06,966 - INFO - 📋 STEP: Default Model
2025-05-23 08:28:06,966 - INFO - ==================================================
2025-05-23 08:28:06,966 - INFO - 📥 Pulling default model...
2025-05-23 08:30:10,919 - INFO - ✅ Default model (llama3.2:1b) downloaded
2025-05-23 08:30:10,921 - INFO - ✅ Default Model completed successfully
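
Pulling the default model took roughly two minutes in this run. A minimal sketch of how the same pull could be issued from Python by shelling out to the Ollama CLI (the wrapper is illustrative; the setup script may do this differently):

    import subprocess

    def pull_model(name: str = "llama3.2:1b") -> None:
        """Download a model with the Ollama CLI; raises CalledProcessError on failure."""
        subprocess.run(["ollama", "pull", name], check=True)

    if __name__ == "__main__":
        pull_model()
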
2025-05-23 08:30:10,922 - INFO -
==================================================
2025-05-23 08:30:10,922 - INFO - 📋 STEP: Startup Script
2025-05-23 08:30:10,923 - INFO - ==================================================
2025-05-23 08:30:10,923 - INFO - 📜 Creating startup script...
2025-05-23 08:30:10,925 - INFO - ✅ Startup script created: start_workbench.sh
2025-05-23 08:30:10,925 - INFO - ✅ Startup Script completed successfully
2025-05-23 08:30:10,925 - INFO -
==================================================
2025-05-23 08:30:10,925 - INFO - 📋 STEP: Basic Tests
2025-05-23 08:30:10,926 - INFO - ==================================================
2025-05-23 08:30:10,926 - INFO - 🧪 Running basic tests...
2025-05-23 08:30:11,708 - INFO - ✅ Basic tests passed
2025-05-23 08:30:11,709 - INFO - ✅ Basic Tests completed successfully
2025-05-23 08:30:11,709 - INFO -
🎉 Setup completed successfully!