# Feature Overview
The Tracing module traces components and arbitrary functions. It consists of two parts: Log and Report. The Log part writes records in Dashscope Log format, while the Report part uses the OpenTelemetry SDK to report tracing information.

# Usage
## Logging
1. Configure environment variables (enabled by default):
```shell
export TRACE_ENABLE_LOG=true
```
2. Add the decorator to any function, for example:
```python
from agentscope_runtime.engine.tracing import trace, TraceType

@trace(trace_type=TraceType.LLM, trace_name="llm_func")
def llm_func():
    pass
```
Output:
```text
{"time": "2025-08-13 11:23:41.808", "step": "llm_func_start", "model": "", "user_id": "", "code": "", "message": "", "task_id": "", "request_id": "", "context": {}, "interval": {"type": "llm_func_start", "cost": 0}, "ds_service_id": "test_id", "ds_service_name": "test_name"}
{"time": "2025-08-13 11:23:41.808", "step": "llm_func_end", "model": "", "user_id": "", "code": "", "message": "", "task_id": "", "request_id": "", "context": {}, "interval": {"type": "llm_func_end", "cost": "0.000"}, "ds_service_id": "test_id", "ds_service_name": "test_name"}
```
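Conceptually, the decorator wraps the call and emits a start record before the function runs and an end record (with the elapsed cost) afterwards. The following is a minimal, hypothetical sketch of that mechanism, not the module's actual implementation; `trace_log` is an illustrative name and the records carry only a subset of the fields shown above:

```python
import functools
import json
import time


def trace_log(trace_name):
    """Hypothetical sketch of a start/end logging decorator."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.time()
            # Emit the "<name>_start" record before the call.
            print(json.dumps({
                "step": f"{trace_name}_start",
                "interval": {"type": f"{trace_name}_start", "cost": 0},
            }))
            result = func(*args, **kwargs)
            # Emit the "<name>_end" record with the elapsed cost.
            print(json.dumps({
                "step": f"{trace_name}_end",
                "interval": {
                    "type": f"{trace_name}_end",
                    "cost": f"{time.time() - start:.3f}",
                },
            }))
            return result
        return wrapper
    return decorator


@trace_log("llm_func")
def llm_func():
    pass
```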

3. Custom logging (prerequisite: the function must accept `**kwargs`):
```python
from agentscope_runtime.engine.tracing import trace, TraceType

@trace(trace_type=TraceType.LLM, trace_name="llm_func")
def llm_func(**kwargs):
    trace_event = kwargs.pop("trace_event", None)
    if trace_event:
        # Custom string message
        trace_event.on_log("hello")

        # Formatted step message
        trace_event.on_log(
            "",
            **{
                "step_suffix": "mid_result",
                "payload": {
                    "output": "hello",
                },
            },
        )
```
Output:
```text
{"time": "2025-08-13 11:27:14.727", "step": "llm_func_start", "model": "", "user_id": "", "code": "", "message": "", "task_id": "", "request_id": "", "context": {}, "interval": {"type": "llm_func_start", "cost": 0}, "ds_service_id": "test_id", "ds_service_name": "test_name"}
{"time": "2025-08-13 11:27:14.728", "step": "", "model": "", "user_id": "", "code": "", "message": "hello", "task_id": "", "request_id": "", "context": {}, "interval": {"type": "", "cost": "0"}, "ds_service_id": "test_id", "ds_service_name": "test_name"}
{"time": "2025-08-13 11:27:14.728", "step": "llm_func_mid_result", "model": "", "user_id": "", "code": "", "message": "", "task_id": "", "request_id": "", "context": {"output": "hello"}, "interval": {"type": "llm_func_mid_result", "cost": "0.000"}, "ds_service_id": "test_id", "ds_service_name": "test_name"}
{"time": "2025-08-13 11:27:14.728", "step": "llm_func_end", "model": "", "user_id": "", "code": "", "message": "", "task_id": "", "request_id": "", "context": {}, "interval": {"type": "llm_func_end", "cost": "0.000"}, "ds_service_id": "test_id", "ds_service_name": "test_name"}
```
## Reporting
1. Configure environment variables (disabled by default):
```shell
export TRACE_ENABLE_LOG=false
export TRACE_ENABLE_REPORT=true
export TRACE_AUTHENTICATION={YOUR_AUTHENTICATION}
export TRACE_ENDPOINT={YOUR_ENDPOINT}
```
2. Add the decorator to non-streaming functions, for example:

```python
from agentscope_runtime.engine.tracing import trace, TraceType

@trace(trace_type=TraceType.LLM,
       trace_name="llm_func")
def llm_func(args: str):
    return args + "hello"
```

3. Add the decorator to streaming functions, for example:
```python
from agentscope_runtime.engine.tracing import trace, TraceType
from agentscope_runtime.engine.tracing.message_util import (
    get_finish_reason,
    merge_incremental_chunk,
)

@trace(trace_type=TraceType.LLM,
       trace_name="llm_func",
       get_finish_reason_func=get_finish_reason,
       merge_output_func=merge_incremental_chunk)
def llm_func(args: str):
    for i in range(10):
        yield i
```
The `get_finish_reason_func` and `merge_output_func` arguments are optional custom processing functions; they default to `get_finish_reason` and `merge_incremental_chunk` from `message_util.py`.

`get_finish_reason_func` extracts the finish_reason from a chunk and is used to determine whether the streaming output has ended. Example:
```python
from typing import Optional

from openai.types.chat import ChatCompletionChunk


def get_finish_reason(response: ChatCompletionChunk) -> Optional[str]:
    finish_reason = None
    if hasattr(response, "choices") and len(response.choices) > 0:
        if response.choices[0].finish_reason:
            finish_reason = response.choices[0].finish_reason

    return finish_reason
```
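Because the helper only inspects `choices[0].finish_reason`, its behavior can be sanity-checked without the OpenAI SDK by passing duck-typed stand-ins for chunks. This is only a sketch: the `SimpleNamespace` objects below are not real `ChatCompletionChunk`s, and the type annotation is relaxed accordingly:

```python
from types import SimpleNamespace
from typing import Optional


def get_finish_reason(response) -> Optional[str]:
    # Same logic as above, with the annotation relaxed so that
    # plain stand-in objects can be passed in.
    finish_reason = None
    if hasattr(response, "choices") and len(response.choices) > 0:
        if response.choices[0].finish_reason:
            finish_reason = response.choices[0].finish_reason
    return finish_reason


# A mid-stream chunk has no finish_reason; the final chunk does.
mid_chunk = SimpleNamespace(choices=[SimpleNamespace(finish_reason=None)])
last_chunk = SimpleNamespace(choices=[SimpleNamespace(finish_reason="stop")])
```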

`merge_output_func` merges the streamed chunks to construct the final output. Example:
```python
from typing import List, Optional

from openai.types.chat import ChatCompletionChunk


def merge_incremental_chunk(
    responses: List[ChatCompletionChunk],
) -> Optional[ChatCompletionChunk]:
    # The last chunk carries the usage or finish reason.
    merged = ChatCompletionChunk(**responses[-1].__dict__)

    # If the last chunk only carries usage info, take the choices
    # from the finish-reason chunk that precedes it.
    if not merged.choices and len(responses) > 1:
        merged.choices = responses[-2].choices

    for resp in reversed(responses[:-1]):
        for i, j in zip(merged.choices, resp.choices):
            if isinstance(i.delta.content, str) and isinstance(
                j.delta.content,
                str,
            ):
                i.delta.content = j.delta.content + i.delta.content
        if merged.usage and resp.usage:
            merged.usage.total_tokens += resp.usage.total_tokens

    return merged
```
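The merging idea can also be illustrated without the OpenAI types: concatenate the incremental `delta.content` pieces in arrival order and keep the last non-empty finish_reason. The following is a simplified, hypothetical sketch using dataclass stand-ins (`Delta`, `Choice`, `Chunk`, and `merge_chunks` are illustrative names, not part of the module):

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass
class Delta:
    content: Optional[str] = None


@dataclass
class Choice:
    delta: Delta = field(default_factory=Delta)
    finish_reason: Optional[str] = None


@dataclass
class Chunk:
    choices: List[Choice] = field(default_factory=list)


def merge_chunks(chunks: List[Chunk]) -> Tuple[str, Optional[str]]:
    """Concatenate streamed deltas and pick up the finish reason."""
    merged_text = ""
    finish_reason = None
    for chunk in chunks:
        choice = chunk.choices[0]
        if isinstance(choice.delta.content, str):
            merged_text += choice.delta.content
        if choice.finish_reason:
            finish_reason = choice.finish_reason
    return merged_text, finish_reason


# A stream of two text deltas followed by the finish-reason chunk.
stream = [
    Chunk([Choice(Delta("Hel"))]),
    Chunk([Choice(Delta("lo"))]),
    Chunk([Choice(Delta(None), "stop")]),
]
```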
4. Setting request_id and common attributes

`request_id` binds the context of a request across spans. Common attributes are shared span attributes: every span under the request carries them.

**Automatic request_id setting**: If `TracingUtil.set_request_id` is not called at the beginning of request processing, the system automatically generates a unique request_id and sets it on the root span.

**Manual setting**: Set the request_id and common attributes in a function **not decorated with `@trace`**, for example immediately after the request information has been parsed:

```python
from agentscope_runtime.engine.tracing import TracingUtil

common_attributes = {
    "gen_ai.user.id": "user_id",
    "bailian.app.id": "app_id",
    "bailian.app.owner_id": "app_id",
    "bailian.app.env": "pre",
    "bailian.app.workspace": "workspace",
}
TracingUtil.set_request_id("request_id")
TracingUtil.set_common_attributes(common_attributes)
```
5. Custom reporting (prerequisite: the function must accept `**kwargs`):
```python
import json

from agentscope_runtime.engine.tracing import trace, TraceType


@trace(trace_type=TraceType.LLM, trace_name="llm_func")
def llm_func(**kwargs):
    trace_event = kwargs.pop("trace_event", None)
    if trace_event:
        # Set a string attribute
        trace_event.set_attribute("key", "value")
        # Set a dict attribute (serialized to JSON)
        trace_event.set_attribute(
            "func_7.key",
            json.dumps({"key0": "value0", "key1": "value1"}),
        )
```