**Describe the bug**
When the glm-5 model is used with thinking and streaming enabled, memory compression configured via `CompressionConfig` fails.
**To Reproduce**
1. Use `OpenAIChatModel` with the glm-5 model on a `ReActAgent`.
2. Enable memory compression on the agent, without configuring a separate compression model (so the agent's glm-5 model is also used for compression):

```python
ReActAgent.CompressionConfig(
    enable=True,
    agent_token_counter=CharTokenCounter(),
    trigger_threshold=config["trigger_threshold"],
    keep_recent=config["keep_recent"],
)
```

3. Run the agent with thinking and streaming enabled; compression then fails with the error below.
**Error messages**
```
ERROR app\services\chat_service.py:604 | 2026-04-03 11:48:25 | [_agent_stream] Exception in agent stream: All 2 models in fallback chain failed. Last error: ValueError: expected value at line 1 column 1
Traceback (most recent call last):
  File "D:\traeProjects\agent-baseAnti\app\services\chat_service.py", line 495, in _agent_stream_generator
    async for response_msg, is_last in msg_stream:
    ...<96 lines>...
    }
  File "D:\software_setup\python3.13\Lib\site-packages\agentscope\pipeline\_functional.py", line 192, in stream_printing_messages
    raise exception from None
  File "D:\software_setup\python3.13\Lib\site-packages\agentscope\agent\_agent_base.py", line 455, in __call__
    reply_msg = await self.reply(*args, **kwargs)
                ^^^^^^^^^^^^^^^^^^^^
    ...<4 lines>...
    )
    ^
  File "D:\software_setup\python3.13\Lib\site-packages\agentscope\tracing\_trace.py", line 390, in wrapper
    return await func(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\software_setup\python3.13\Lib\site-packages\agentscope\agent\_react_agent.py", line 434, in reply
    await self._compress_memory_if_needed()
  File "D:\software_setup\python3.13\Lib\site-packages\agentscope\agent\_react_agent.py", line 1102, in _compress_memory_if_needed
    async for chunk in res:
        last_chunk = chunk
  File "D:\traeProjects\agent-baseAnti\app\agent\fallback_chain.py", line 193, in _execute_streaming
    raise RuntimeError(
    ...<2 lines>...
    ) from last_error
RuntimeError: All 2 models in fallback chain failed. Last error: ValueError: expected value at line 1 column 1
```
The key error message to pay attention to is: `expected value at line 1 column 1`
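For context: that wording is characteristic of a JSON parser being handed an empty or non-JSON string, for example a streaming chunk that carries only thinking/reasoning content and no JSON payload. This is an assumption about the root cause, not confirmed from the traceback; a minimal illustration of the same failure class using Python's standard `json` module (not AgentScope code):

```python
import json

# Feeding an empty string to a JSON parser fails at the very first
# position, i.e. "line 1 column 1" -- the same position reported in
# the ValueError above.
try:
    json.loads("")
except json.JSONDecodeError as e:
    print(e)  # Expecting value: line 1 column 1 (char 0)
```

If the compression path assumes every streamed chunk contains parseable content, a thinking-only chunk from glm-5 could trigger exactly this kind of error.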
**Environment**
- AgentScope Version: 1.0.16
- Python Version: 3.13
- OS: Windows 11
Additional context
Add any other context about the problem here.