[BUG] llamactl does not work on Windows: encoding error #574

@Sun-ZhenXing

Description

My Windows default encoding is GBK. As is well known, this is a long-standing Windows tradition (no UTF-8 by default).
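
For context, the failure is two-sided (this is my reading of the tracebacks, not of the process_utils source): the stream-pump thread writes child-process output to a console stream whose codec is GBK, which cannot represent characters such as Vite's '\u279c' arrow, and it reads child output as UTF-8 while a child running on a GBK console can emit GBK bytes. A minimal stdlib-only sketch of both directions:

```python
import io

# Encode side: a GBK text stream cannot represent Vite's '\u279c' arrow.
try:
    gbk_writer = io.TextIOWrapper(io.BytesIO(), encoding="gbk")
    gbk_writer.write("\u279c  Local:   http://localhost:5173/\n")
except UnicodeEncodeError as exc:
    print(exc)  # 'gbk' codec can't encode character '\u279c' ...

# Decode side: a reader opened as UTF-8 chokes on GBK bytes; 0xa9 is not a
# valid UTF-8 start byte, exactly as in the second traceback below.
try:
    utf8_reader = io.TextIOWrapper(io.BytesIO(b"\xa9 GBK console output\n"), encoding="utf-8")
    utf8_reader.readline()
except UnicodeDecodeError as exc:
    print(exc)  # 'utf-8' codec can't decode byte 0xa9 ...
```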

[uv sync] > uv sync --no-dev --inexact
[uv sync] Resolved 60 packages in 1ms
[uv sync] Audited 18 packages in 0.23ms
[pnpm install] > pnpm install
[pnpm install] Lockfile is up to date, resolution step is skipped
[pnpm install] Already up to date
[pnpm install] (node:14124) [DEP0169] DeprecationWarning: `url.parse()` behavior is not standardized and prone to errors that have security implications. Use the WHATWG URL API instead. CVEs are not issued for `url.parse()` vulnerabilities.
[pnpm install] (Use `node --trace-deprecation ...` to show where the warning was created)
[pnpm install]
[pnpm install] ╭ Warning ─────────────────────────────────────────────────────────────────────╮
[pnpm install] │                                                                              │
[pnpm install] │   Ignored build scripts: es5-ext, esbuild.                                   │
[pnpm install] │   Run "pnpm approve-builds" to pick which dependencies should be allowed     │
[pnpm install] │   to run scripts.                                                            │
[pnpm install] │                                                                              │
[pnpm install] ╰──────────────────────────────────────────────────────────────────────────────╯
[pnpm install]
[pnpm install] Done in 561ms using pnpm v10.11.0
> uv run --no-progress python -m llama_deploy.appserver.app --proxy-ui --reload --deployment-file D:\workspace\AlexProjects\Playground\showcase --open-browser
[npm run dev] > npm run dev
03:43:30 [info     ] Will watch for changes in these directories: ['D:\\workspace\\AlexProjects\\Playground\\showcase\\src'] [uvicorn.error]
03:43:30 [info     ] Uvicorn running on http://127.0.0.1:4501 (Press CTRL+C to quit) [uvicorn.error]
03:43:30 [info     ] Started reloader process [13272] using WatchFiles [uvicorn.error]
[npm run dev]
[npm run dev] > [email protected] dev
[npm run dev] > vite
[npm run dev]
[npm run dev]
[npm run dev]   VITE v5.4.21  ready in 213 ms
[npm run dev]
Exception in thread Thread-1 (_stream_source):
Traceback (most recent call last):
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\threading.py", line 1043, in _bootstrap_inner
    self.run()
    ~~~~~~~~^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\threading.py", line 994, in run
    self._target(*self._args, **self._kwargs)
    ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\workspace\AlexProjects\Playground\showcase\.venv\Lib\site-packages\llama_deploy\appserver\process_utils.py", line 206, in _stream_source
    writer.write(out)
    ~~~~~~~~~~~~^^^^^
UnicodeEncodeError: 'gbk' codec can't encode character '\u279c' in position 21: illegal multibyte sequence
03:43:32 [info     ] Started server process [19268] [uvicorn.error]
03:43:32 [info     ] Waiting for application startup. [uvicorn.error]
03:43:32 [info     ] Using local sqlite persistence for workflows [root]
03:43:32 [info     ] Application startup complete.  [uvicorn.error]
03:43:33 [info     ] GET /                          [app.access] duration_ms=572.01 status_code=307
03:43:34 [info     ] GET /deployments/app/          [app.access] duration_ms=381.65 status_code=307
03:44:11 [info     ] Shutting down                  [uvicorn.error]
03:44:11 [error    ] Proxy error:                   [llama_deploy.appserver.routers.ui_proxy] request_id=dFeSPQ3D
03:44:11 [error    ] Proxy error:                   [llama_deploy.appserver.routers.ui_proxy] request_id=yvyvQMBz
03:44:11 [error    ] Proxy error:                   [llama_deploy.appserver.routers.ui_proxy] request_id=mL2JeqGE
03:44:11 [error    ] Proxy error:                   [llama_deploy.appserver.routers.ui_proxy] request_id=ImONzhvM
03:44:11 [error    ] Proxy error:                   [llama_deploy.appserver.routers.ui_proxy] request_id=tLhDPr8b
03:44:12 [error    ] Cancel 13 running task(s), timeout graceful shutdown exceeded [uvicorn.error]
03:44:12 [info     ] Finished server process [19268] [uvicorn.error]
03:44:12 [info     ] Shutting down Workflow server. Cancelling 0 handlers. [root]
03:44:12 [error    ] Traceback (most recent call last):
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\asyncio\runners.py", line 195, in run
    return runner.run(main)
           ~~~~~~~~~~^^^^^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\asyncio\base_events.py", line 712, in run_until_complete
    self.run_forever()
    ~~~~~~~~~~~~~~~~^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\asyncio\base_events.py", line 683, in run_forever
    self._run_once()
    ~~~~~~~~~~~~~~^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\asyncio\base_events.py", line 2050, in _run_once
    handle._run()
    ~~~~~~~~~~~^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\asyncio\events.py", line 89, in _run
    self._context.run(self._callback, *self._args)
    ~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\workspace\AlexProjects\Playground\showcase\.venv\Lib\site-packages\uvicorn\server.py", line 70, in serve
    with self.capture_signals():
         ~~~~~~~~~~~~~~~~~~~~^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\contextlib.py", line 148, in __exit__
    next(self.gen)
    ~~~~^^^^^^^^^^
  File "D:\workspace\AlexProjects\Playground\showcase\.venv\Lib\site-packages\uvicorn\server.py", line 331, in capture_signals
    signal.raise_signal(captured_signal)
    ~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\asyncio\runners.py", line 157, in _on_sigint
    raise KeyboardInterrupt()
KeyboardInterrupt

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\workspace\AlexProjects\Playground\showcase\.venv\Lib\site-packages\starlette\routing.py", line 701, in lifespan
    await receive()
  File "D:\workspace\AlexProjects\Playground\showcase\.venv\Lib\site-packages\uvicorn\lifespan\on.py", line 137, in receive
    return await self.receive_queue.get()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\asyncio\queues.py", line 186, in get
    await getter
asyncio.exceptions.CancelledError
 [uvicorn.error]
Exception in thread Thread-9 (_stream_source):
Traceback (most recent call last):
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\threading.py", line 1043, in _bootstrap_inner
    self.run()
    ~~~~~~~~^^
  File "C:\Users\10069\AppData\Roaming\uv\python\cpython-3.13.7-windows-x86_64-none\Lib\threading.py", line 994, in run
    self._target(*self._args, **self._kwargs)
    ~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\10069\AppData\Roaming\uv\tools\llamactl\Lib\site-packages\llama_deploy\appserver\process_utils.py", line 202, in _stream_source
    for line in iter(source.readline, ""):
                ~~~~^^^^^^^^^^^^^^^^^^^^^
  File "<frozen codecs>", line 325, in decode
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa9 in position 75: invalid start byte
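
A possible mitigation, sketched below under the assumption that process_utils pumps the child's stdout line by line into the parent's stdout (this is not the actual _stream_source code): read the pipe as bytes, decode with errors="replace", and reconfigure the destination stream so unencodable characters degrade to '?' instead of killing the thread. Setting PYTHONUTF8=1 (or PYTHONIOENCODING=utf-8) before launching llamactl may avoid the encode error on my side, but it would not help when a child process itself emits GBK bytes.

```python
import subprocess
import sys
import threading


def pump_lines(raw_source, writer=sys.stdout) -> None:
    """Hypothetical tolerant pump loop (not llama_deploy's _stream_source)."""
    if hasattr(writer, "reconfigure"):
        # '\u279c' becomes '?' on a GBK console instead of raising.
        writer.reconfigure(errors="replace")
    for raw_line in iter(raw_source.readline, b""):
        # Invalid UTF-8 bytes (e.g. 0xa9 from a GBK child) become U+FFFD.
        writer.write(raw_line.decode("utf-8", errors="replace"))
        writer.flush()


if __name__ == "__main__":
    # Stream `npm run dev` output without dying on codec mismatches.
    proc = subprocess.Popen(
        "npm run dev", shell=True,
        stdout=subprocess.PIPE, stderr=subprocess.STDOUT,
    )
    threading.Thread(target=pump_lines, args=(proc.stdout,), daemon=True).start()
    proc.wait()
```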
